We don't actually need to set a ContextOverride unless the ContextLoc and the
L paren/brace/bracket are on different lines. Combined with the fact that we
only set one if the L and R parens/braces/brackets are on different lines too,
this guarantees there will be at most one override applicable on any given
line, which lets us simplify the logic somewhat.
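A purely illustrative Swift layout (an assumed example, not from the original
change): here the ContextLoc ('foo') and the '(' are on the same line, so no
override is needed even though the ')' falls on a later line.

  func foo(a: Int, b: Int) {}

  foo(a: 1,
      b: 2)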
TLDR: This will allow me, in a forthcoming commit, to eliminate the extra run
of the peephole optimizer used to seed the phi argument analysis, since it
removes the potential data invalidation issues. That being said, I am also
formalizing the language/model in the pass a little in this commit, so I wanted
to give a longer description below.
----
In this commit, I am shifting some of the language around this optimization
away from phi nodes specifically and towards a notion that Andy and I have
discussed called "Joined Live Ranges". The idea is that a "Live Range" in OSSA
is an owned value together with its set of forwarding uses. The owned value is,
in a certain sense, the equivalence class representative of all of the values
derived from it via forwarding instructions.
This works well as long as each "forwarding" instruction derives its ownership
from a single operand. Some instructions without that property are branch,
tuple, and struct. Since such a "forwarding" instruction derives its ownership
by merging the ownership of multiple operands, if more than one of its operands
is a non-trivial owned operand, we are in a sense joining together the live
ranges of those operands. The main implication is that if we want to convert
the joined live range from owned to guaranteed, we can only do so if we can
also do it for all of its incoming values at the same time.
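A minimal Swift-level sketch of the idea (an assumed example, not taken from
the pass's tests):

  final class Payload {}

  func makePair(_ a: __owned Payload, _ b: __owned Payload) -> (Payload, Payload) {
    // Under OSSA, forming this tuple is a forwarding instruction with two
    // non-trivial owned operands. Its result joins the live ranges of `a` and
    // `b`: converting either incoming value from owned to guaranteed is only
    // sound if the other one can be converted at the same time.
    return (a, b)
  }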
This creates a conundrum, though, since this pass has been written as a
peephole pass, which is ill-suited to being converted into this form. So
instead, the pass seeds an analysis with peepholes that /could/ have converted
a value from owned to guaranteed except for a specific joined live range.
Then, after we reach a fixed point, we use that information from the failed
peepholes to try to eliminate the joined live ranges.
This is a good approach, but if implemented in certain ways it exposes the
risk of touching invalidated data. Specifically, before this patch, the
implementation assumed that each branch instruction (currently the only
supported kind of joined live range instruction) could be paired with multiple
owned value introducers. This was because, initially, we allowed tuple and
struct instructions with multiple non-trivial operands to act as owned value
introducers. That implied we could not rely on storing just a bool per joined
live range operand, since we would need to know that all introducers
associated with the operand were optimizable.
After some thought/discussion, I realized that this sort of optimization can
also apply to tuple/struct/similar instructions. This led to the shift in
thinking that we could instead talk about general joined live ranges and handle
tuple/struct/similar instructions just as if they were branch/phi! As such, if
all such instructions are treated as separate LiveRanges, then we know that our
branch can only have a single owned value introducer, since we do not look
through /any/ joined live ranges when computing LiveRanges.
This realization then allows us to store a single bool marking an operand of
the joined live range as one that we /may/ be able to optimize. That in turn
lets us simplify the state we track and avoid storing pointers to instructions
that may be eliminated. And since we have a joined live range use, we know that
any value used by the joined live range cannot be eliminated by the peephole
optimizer.
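A hypothetical sketch of the simplified state (the names here are invented for
illustration and are not the pass's actual types):

  // One flag per joined-live-range operand is now enough, because each
  // operand can have at most a single owned value introducer.
  struct JoinedLiveRangeOperandState {
    // True if that single introducer could have been converted from owned to
    // guaranteed, had the joined live range not blocked the peephole.
    var mayBeOptimizable = false
  }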
This fixes an immediate bug with subst-to-orig conversion of
parameter functions that I'm surprised isn't otherwise tested.
More importantly, it preserves valuable information that should
let us handle a much wider variety of variant representations
that aren't necessarily expressed in the AbstractionPattern.
Previously, we were using `getPlainType` to match
the parameter type against the key path's base
type. This gave us the external parameter type,
which, for a variadic parameter, would be the
element type. However, the code we generate
expects the internal parameter type, which is
provided by `getParameterType`.
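A small Swift example of the distinction (an assumed illustration, not the
original test case):

  func sum(_ xs: Int...) -> Int {
    // The external/plain type of the parameter is its element type, Int, but
    // inside the body `xs` has the internal parameter type [Int], which is
    // what `getParameterType` reports.
    return xs.reduce(0, +)
  }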
Resolves rdar://problem/59445486.
The compatibility header may #import the bridging header when
-import-underlying-module is specified. How these two headers are located
relative to each other depends on the project's settings. To accommodate this,
we should allow users to specify the bridging header directory for header
generation purposes.
rdar://59110975
- In member completions, when 'callAsFunction' decls are found, suggest
call patterns
- In call pattern completions, fall back to searching for 'callAsFunction' if
the base type is not a function type (see the sketch below)
rdar://problem/59792682
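A sketch of the new behavior (the `Adder` type is an assumed example):

  struct Adder {
    var base: Int
    func callAsFunction(_ x: Int) -> Int { base + x }
  }

  let add = Adder(base: 10)
  // Member completion after `add.` can now suggest the call pattern for
  // callAsFunction, and call-pattern completion for `add(...)` finds
  // callAsFunction even though `Adder` itself is not a function type.
  let fifteen = add(5)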
If the substituted type for a conformance found via a superclass constraint is a subclass of that
superclass, then we should represent that with an InheritedProtocolConformance rather than with
the original root conformance that applies to the superclass. If we don't do this, then we end up
with spurious inequalities in generic signatures that ought to be equivalent, because some
paths use the inherited conformance and some don't, as in SR-12330 | rdar://problem/60174186.
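A minimal Swift illustration of the situation (an assumed example, not the
code from SR-12330):

  protocol P {}
  class Base: P {}
  class Derived: Base {}

  // T conforms to P only via the superclass constraint T : Base.
  func use<T: Base>(_ value: T) {}

  // Substituting T := Derived should produce an inherited conformance
  // (Derived: P, inherited from Base: P) rather than reusing the root
  // conformance for Base directly.
  use(Derived())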
Commit 7b30370e5bcf569fcdc15204d4c592163fd78cb3 moved the Sysroot
attribute to the CompileUnit, which broke the build.
(cherry picked from commit 728e8a1bde)
Doing this requires us to re-introduce the concept of the contextual generic signature to SIL type lowering, but hopefully just in a few places.
As the FIXME notes, I found a problem here for substituted function types, but I need to land this first to fix ProcedureKit in the source-compatibility test suite.
We are checking the unknown consuming use vector, so the new name makes it
clearer that we are not talking about destroying consumers.
I also eliminated passing the phiToIncomingValueMultiMap in favor of a bool
argument, since that is how we are using it today. The multimap was a vestige
of an earlier version of the patch that had to be sunk.
For some reason, the changed caller in CS wasn't actually going to use any of the types in the constraint system from the entrypoint it was calling. Switch over to using the constraint-system-based entrypoint so we can pick up expression types consistently. Then, move the TypeChecker entrypoint onto ConstraintSystem to reduce the duplication here.
The remaining callers of buildCheckedRefExpr should be migrated.
EscapeAnalysis::mayReleaseContent was recently changed to assert on
address-type arguments. The assert ensures that callers directly pass
the reference being released. If the caller does not have the precise
reference being released, it opens the door to bugs in which the
EscapeAnalysis query looks up the wrong connection graph node.
The original AliasAnalysis logic is just a workaround for the fact
that we don't have information about which builtins may release
reference-type arguments.
Fixes <rdar://60190962> Escape analysis crashes with "an address is
never a reference" error with -O -thread=sanitize
This is a bad pattern that we are trying to eliminate from the compiler. On a
side note, this commit also eliminates an unfortunate instance of a raw
SILBuilder!
- Rename several symbols to make it clearer whether the ranges they deal with
are open or closed.
- Add comments documenting the implementation of OutdentChecker::hasOutdent
- Fix a bug where code after a doc comment block of the '/**' style was being
indented by 1 space (see the second snippet below).
- Fix IsInStringLiteral not being set if the indent target was in a string
segment of an interpolated multiline string.
- Update OutdentChecker::hasOutdent to propagate indent contexts from
parent parens/brackets/braces to child parens/brackets/braces that start
later on the same line (as FormatWalker already does). This makes the
braces in the example below 'inherit' a ContextLoc from their parent
square brackets, which have a ContextLoc at 'foo', so the whole
expression is correctly considered 'outdenting':
foo(a: "hello"
b: "hello")[x: {
print("hello")
}]
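For the '/**'-style doc comment fix above, a minimal illustration (an assumed
example): the declaration following the comment is no longer indented by an
extra space.

  /**
   This style of doc comment previously caused the next line to be
   indented by one extra space.
   */
  func documented() {}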
We were always dropping the error status when returning from parseExprImpl. We
were also incorrectly keeping error status after recovering by finding the
right close token in parseList. This change fixes both, and also updates a few
callers of parseList that assumed that, when they reported a failure parsing an
element, the list as a whole would get error status, which isn't true due to
recovery.