Rewrite the parameter list to use (U)Int if that was one of the substituted parameters. This is required because Windows will otherwise incorrectly map Int -> long long -> Int64.
Clients can explicitly ask for the opened existential type on the archetype's generic environment,
or use `getExistentialType` to obtain a specific archetype's upper bounds.
Situations like `T(...) { ... }` where `T` is a callable type
and the trailing closure belongs to `.callAsFunction` should be
rewritten as `T.init().callAsFunction { ... }`.
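A minimal sketch of the shape this covers (the type and closure body below are hypothetical):

struct Runner {
  func callAsFunction(_ body: () -> Void) { body() }
}

// `Runner() { print("run") }` parses as an initializer call with a trailing
// closure, but the closure belongs to `callAsFunction`, so the expression is
// rewritten to the explicit form:
Runner.init().callAsFunction { print("run") }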
`shouldCoerceToContextualType` used `solution.getType(ASTNode)`,
which returns a type that still has type variables in it. To
properly check whether the result type needs a coercion, it has
to be resolved first, which is done via
`solution.getResultType(ASTNode)`.
Resolves: rdar://88285682
Opened archetypes can be created in the constraint system, and the
existential type it wraps can contain type variables. This can happen
when the existential type is inferred through a typealias inside a
generic type, and a member reference whose base is the opened existential
gets bound before binding the generic arguments of the parent type.
However, simplifying opened archetypes to replace type variables is
not yet supported, which leads to type variables escaping the constraint
system. We can support cases where the underlying existential type doesn't
depend on the type variables by canonicalizing it when opening the
existential. Cases where the underlying type requires resolved generic
arguments are still unsupported for now.
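A rough sketch of the supported shape (all names below are hypothetical): the existential is reached through a typealias nested in a generic type, but the existential itself does not mention the generic parameter:

protocol P { func run() }

struct Wrapper<T> {
  typealias Callback = any P
  var value: T
  var callback: Callback
}

struct Impl: P { func run() {} }

// While solving this expression, `T` is still a type variable when `run()`
// opens the existential behind `Callback`; since `any P` does not depend on
// `T`, canonicalizing it at the point of opening is sufficient.
Wrapper(value: 42, callback: Impl()).callback.run()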
Nested archetypes are represented by their base archetype kinds (primary,
opened, or opaque type) with an interface type that is a nested type,
as represented by a DependentMemberType. This provides a more uniform
representation of archetypes throughout the frontend.
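For illustration (the protocol and function below are hypothetical, not from the patch):

protocol Container {
  associatedtype Element
  var first: Element? { get }
}

// Inside this function, `C` is a primary archetype. Its nested type
// `C.Element` is now represented by the same primary archetype kind whose
// interface type is the DependentMemberType `C.Element`, rather than by a
// separate "nested archetype" node.
func firstElement<C: Container>(of container: C) -> C.Element? {
  return container.first
}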
We were never setting these opaque type substitutions, and code
generation was silently failing. Now that we assert, move the code
into the proper common location so we always set opaque type
substitutions on properties.
Fixes rdar://86800325.
Open opaque types and record them within the "opened types" of the
constraint system, then use that information to compute the set of
substitutions needed for the opaque type declaration using the normal
mechanism of the constraint solver. Record these substitutions within
the underlying-to-opaque conversion.
Use the recorded substitutions in the underlying-to-opaque conversion
to set the underlying substitutions for the opaque type declaration
itself, rather than reconstructing the substitutions in an ad hoc manner
that does not account for structural opaque result types.
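For reference, a structural opaque result type of the kind the ad hoc reconstruction missed (illustrative example, not taken from the patch):

protocol P {}
extension Int: P {}

// The opaque types sit inside a larger structural return type, so their
// underlying substitutions have to come from the solver's recorded opened
// types rather than being rebuilt by hand.
func pair() -> (some P, [some P]) {
  return (1, [2])
}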
Instead of checking the frontend flag directly, use a dedicated
method on the constraint system to determine whether a closure
participated in inference or not.
Insert an implicit conversion from pack types to tuples with equivalent parallel structure. That means (see the sketch after this list):
1) The tuple must have the same arity
2) The tuple may not have any argument labels
3) The tuple may not have any variadic or inout components
4) The tuple must have the same element types as the pack
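A minimal sketch of a conversion satisfying these rules, written with the parameter-pack spelling that eventually shipped (`each`/`repeat`) rather than the experimental `T...` syntax used in this patchset:

func tuplify<each T>(_ xs: repeat each T) -> (repeat each T) {
  // Pack -> tuple: same arity, no labels, no variadic or inout elements,
  // and element types identical to the pack's.
  return (repeat each xs)
}

let t: (String, Int, Bool) = tuplify("hello", 42, true)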
The heart of this patchset: An inference scheme for variadic generic functions and associated pack expansions.
A traditional variadic function looks like
func foo<T>(_ xs: T...) {}
Along with the corresponding function type
<T>(T [variadic]) -> Void
which the constraint system only has to solve for one type variable. Hence it opens <T> as a type variable and uses each argument to the function to try to solve for <T>. This approach cannot work for variadic generics, as each argument would try to bind to the same <T> and the constraint system would lose coherence in the one case we need it: when the arguments all have different types.
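Concretely, with the `foo` above:

foo(1, 2, 3)        // OK: every argument binds T := Int
foo(1, "two", 3.0)  // error: no single T works for Int, String, and Double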
Instead, notice when we encounter expansion types:
func print<T...>(_ xs: T...)
print("Macs say Hello in", 42, "different languages")
We open this type as
print : ($t0...) -> Void
Now for the brand new stuff: We need to create and bind a pack to $t0, which will trigger the expansion we need for CSApply to see a coherent view of the world. This means we need to examine the argument list and construct the pack type <$t1, $t2, $t3, ...> - one type variable per argument - and bind it to $t0. There's also the catch that each argument that references the opened type $t0 needs to have the same parallel structure, including its arity. The algorithm is thus:
For input type F<... ($t0...), ..., ($tn...) ...> and an apply site F(a0, ..., an), we walk the type `F` and record an entry in a mapping for each opened variadic generic parameter. Now, for each argument ai in (a0, ..., an), we create a fresh type variable corresponding to the argument ai, and record an additional entry in the parameter pack's type elements corresponding to that argument.
Concretely, suppose we have
func print2<T..., U...>(first: T..., second: U...) {}
print2(first: "", 42, (), second: [42])
We open print2 as
print2 : ($t0..., $t1...) -> Void
And construct a mapping
$t0 => <$t2, $t3, $t4> // eventually <String, Int, Void>
$t1 => <$t5> // eventually [Int]
We then yield the entries of this map back to the solver, which constructs a bind constraint for each `=>` in the mapping above. The pack type is thus immediately substituted into the corresponding pack expansion type, which produces a fully-expanded pack type of the correct arity that the solver can actually use to make forward progress.
Applying the solution is as simple as pulling out the pack type and coercing arguments element-wise into a pack expression.
The new type, called ExistentialType, is not yet used in type resolution.
Later, existential types written with `any` will resolve to this type, and
bare protocol names will resolve to this type depending on context.
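For illustration (the protocol is hypothetical), the two spellings in question:

protocol Shape { func area() -> Double }

// Once adopted in type resolution, the `any` spelling (and, depending on
// context, the bare protocol name) will produce the new ExistentialType.
func describe(_ a: any Shape, _ b: Shape) {
  print(a.area(), b.area())
}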
- Frontend: Implicitly import `_StringProcessing` when frontend flag `-enable-experimental-string-processing` is set.
- Type checker: Set a regex literal expression's type as `_StringProcessing.Regex<(Substring, DynamicCaptures)>`. `(Substring, DynamicCaptures)` is a temporary `Match` type that will help get us to an end-to-end working system. This will be replaced by actual type inference based on a regex's pattern in a follow-up patch (soon).
- SILGen: Lower a regex literal expression to a call to `_StringProcessing.Regex.init(_regexString:)`.
- String processing runtime: Add `Regex`, `DynamicCaptures` (matching actual APIs in apple/swift-experimental-string-processing), and `Regex(_regexString:)`.
Upcoming:
- Build `_MatchingEngine` and `_StringProcessing` modules with sources from apple/swift-experimental-string-processing.
- Replace `DynamicCaptures` with inferred capture types.
With `-enable-experimental-string-processing`,
start lexing `'` delimiters as regex literals (this
is just a placeholder delimiter for now). Their
contents get passed to the libswift library, which
can return an error string to be emitted, or null
on success.
The libswift side isn't yet hooked up to the Swift
regex parser, so for now just emit a dummy
diagnostic for regexes starting with quantifiers.
If successful, build an AST node which will be
emitted as an implicit call to an
`init(_regexString:)` initializer of an in-scope
`Regex` decl (which will eventually be a known
stdlib decl).
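A rough sketch of the behavior with the flag enabled (the patterns below are illustrative, and the `'` delimiter is only a placeholder):

let greeting = 'hello'
// Lexed as a regex literal; the AST node is an implicit call to the in-scope
// `Regex` type's `init(_regexString:)` with the text "hello".

let invalid = '+hello'
// Starts with a quantifier, so the libswift hook currently reports a dummy
// diagnostic instead of accepting the literal.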
Now that CSApply just uses components, we can
better split up the key path constructors to either
accept a set of resolved components, or a parsed
root or path.
The logic here could form AST loops due to passing
in `anchor` for the key path's parsed path.
However, setting a parsed path here seems to be a
holdover from the CSDiag days, so set the path to
`nullptr` and rip out the rest of the synthesis
and SanitizeExpr logic for it.
rdar://85236369
This logic cannot live in `ActorIsolationChecker` because
all of the relevant information is only accessible through
the constraint system while applying solutions.
Use the newly added `isDistributedThunk` check to determine
whether the given call is to a remote distributed actor and,
if so, mark it as implicitly throwing.
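A minimal sketch of the effect at a call site, using the Distributed module as it later shipped (the actor, method, and actor system below are illustrative):

import Distributed

distributed actor Greeter {
  typealias ActorSystem = LocalTestingDistributedActorSystem
  distributed func greet(_ name: String) -> String { "Hello, \(name)!" }
}

func test(_ greeter: Greeter) async throws {
  // `greet` is not declared `throws`, but a call that may go through the
  // distributed thunk to a remote actor is marked implicitly throwing
  // (and async), so it requires `try await` here.
  let message = try await greeter.greet("Swift")
  print(message)
}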
Resolves: rdar://83610106