There's a general problem where a SubscriptExpr has an argument
that's a LoadExpr loading a tuple from an lvalue. For some reason
we don't construct the ParenExpr in this case, which confused
CSDiag.
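For illustration, a minimal Swift sketch of the kind of expression
involved (the names here are hypothetical):
  struct Grid {
    subscript(_ position: (Int, Int)) -> Int { return 0 }
  }
  var coordinates = (1, 2)   // 'var' makes this an lvalue, so reading it is a LoadExpr
  coordinates.0 += 1
  _ = Grid()[coordinates]    // SubscriptExpr whose argument is the tuple loaded from the lvalue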
Also, in Swift 3 mode, add a total hack to fudge things in
matchCallArguments() in the case where we erroneously lost
ParenType sugar.
In FailureDiagnosis::visitApplyExpr and CalleeCandidateInfo, try to look
through ImplicitlyUnwrappedOptional function types, which improves
diagnostics for calls with invalid arguments.
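A hedged sketch of the kind of call this affects (hypothetical names;
the last line is intentionally ill-formed):
  let handler: ((Int) -> Void)! = { (_: Int) in }
  handler("not an Int")   // invalid argument; diagnostics now look through the IUO function type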
Resolves: SR-3248.
- In functions called from resolveType(), consistently
use a Type() return value to indicate 'unsatisfied
dependency', and ErrorType to indicate failure.
- Plumb the unsatisfiedDependency callback through the
resolution of the arguments of BoundGenericTypes, and
also pass down the options.
- Before doing a conformance check on the argument of a
BoundGenericType, kick off a TypeCheckSuperclass request
if the type in question is a class (see the sketch after
this list). This ensures we don't recurse through
NominalTypeDecl::prepareConformanceTable(), which wants
to see a class with a valid superclass.
- The ResolveTypeOfDecl request was assuming that
the request was satisfied after calling validateDecl().
This is not the case when the ITC is invoked from a
recursive call to validateDecl(); hack around this by
returning *true* from isResolveTypeDeclSatisfied().
Otherwise we assert in satisfy(), and we can't make
forward progress in this case anyway.
- Fix a bug in cycle breaking; it seems if we don't invoke
the cycle break callback on all pending requests, we end
up looping forever in an outer call to satisfy().
- Remove unused TR_GlobalTypeAlias option.
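As referenced above, a hypothetical Swift input that exercises the
superclass-before-conformance ordering:
  protocol P {}
  class Base {}
  class Derived: Base, P {}
  struct Box<T: P> {}
  let box = Box<Derived>()   // checking Derived: P wants Derived's superclass resolved first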
This method gets the GenericTypeDecl for a typealias, nominal type, or
extension thereof. While the result is typed as GenericTypeDecl, it's
not always generic, so rename it accordingly.
An audit of the callers illustrated that they should be using
different entrypoints anyway, so fix all of the callers and make this
function private.
The code here is unprincipled, and mixes archetypes from
multiple sources:
1) The callee's generic environment
2) The caller's generic environment
3) Any random archetypes appearing in the original types of
typealiases, which could come from any generic environment
Initially, an UncurriedCandidate's type starts out as #1,
and then we sometimes set it to type #2 if the candidate is
a method on a base class.
We get #3 by virtue of walking the original types of
SubstitutedTypes created as part of dependent member type
substitution.
However, it turns out the real reason the SubstitutedType
walk existed was so that #2 archetypes that appear in the
UncurriedCandidate's type map to themselves. This doesn't
require looking at the original type of a SubstitutedType
at all; instead we just walk the UncurriedCandidate's type
normally, collecting archetypes.
If we do this, the code no longer (erroneously) picks up random
archetypes from typealiases, which changes the output in
the RangeDiagnostics test. While we can debate whether the new
diagnostics are better or worse (I think it's either a wash,
or slightly better), they changed because more in-depth
diagnostic code is now executing, without things breaking
at random as they did before. I suspect this is a good thing.
First, ensure all ParamDecls that are synthesized from scratch are given
both a contextual type and an interface type.
For ParamDecls written in source, add a new recordParamType() method to
GenericTypeResolver. This calls setType() or setInterfaceType() as
appropriate.
Interestingly enough a handful of diagnostics in the test suite have
improved. I'm not sure why, but I'll take it.
The ParamDecl::createUnboundSelf() method is now only used in the parser,
and no longer sets the type of the self parameter to the unbound generic
type. This was wrong anyway, since the type was always being overwritten.
This allows us to remove DeclContext::getSelfTypeOfContext().
Also, ensure that FuncDecl::getBodyResultTypeLoc() always has an interface
type for synthesized declarations, eliminating a mapTypeOutOfContext()
call when computing the function interface type in configureInterfaceType().
Finally, clean up the logic for resolving the DynamicSelfType. We now
get the interface or contextual type of 'Self' via the resolver, instead
of always getting the contextual type and patching it up inside
configureInterfaceType().
After recent changes, this asserts on all decls that are not VarDecls,
so we can just enforce that statically now. Interestingly, this turns
up some dead code which would have asserted immediately if called.
Also, replace AnyFunctionRef::getType() with
AnyFunctionRef::getInterfaceType(), since the old
AnyFunctionRef::getType() would just assert when called on
a Decl.
Consider an expression with an implicit 'self.' reference:
  extension Sequence {
    func test() -> Int {
      return max(1, 2)
    }
  }
We currently shadow names that appear in nested scopes, so the
top-level two-argument min()/max() functions are shadowed by the
versions of min()/max() in Sequence (which don't take the same
number of arguments). This patch aims to improve the situation
on this front and produce better diagnostics when that happens,
hinting to the user about what went wrong and where the matching
function is declared.
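For reference, such a call can always be disambiguated by qualifying
the top-level function with its module name:
  extension Sequence {
    func test() -> Int {
      return Swift.max(1, 2)   // explicitly refer to the top-level two-argument max()
    }
  }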
Resolves <rdar://problem/25341015>.
A pointless use of polymorphism -- the result values are not
interchangeable in any practical sense:
- For GenericTypeParamDecls, this returned getDeclaredInterfaceType(),
which is an interface type.
- For AssociatedTypeDecls, this returned the sugared AssociatedTypeType,
which desugars to an archetype.
- For TypeAliasDecls, this returned TypeAliasDecl::getAliasType(),
which desugars to a type containing archetypes.
- For NominalTypeDecls, this returned NominalTypeDecl::getDeclaredType(),
which is the unbound generic type, a special case used for inferring
generic arguments when they're not written in source.
This triggered with an existing test that was run against
some other changes I was working on, but the current code
does not demonstrate the problem.
However, it's "obviously" more "correct", so here we go.
When collecting callee candidates, get the interface type and map it
into context, rather than calling getType() directly.
This produces almost the same result as getType(), except for two
differences:
1) where getType() would return a PolymorphicFunctionType with
a reference to the original generic parameters, CSDiag now just
sees a simple FunctionType with discombobulated archetypes.
This is fine, since CSDiag does not do anything specific with
PolymorphicFunctionTypes.
2) substGenericArgs() calls Type::subst(), which produces
SubstitutedType sugar. This would confuse CSDiag, which was not
prepared to see SubstitutedTypes that arise from this particular
call of Type::subst().
As mentioned in a previous commit message, CSDiag should be
refactored to get substitutions via a call to getMemberSubstitutions(),
instead of 'recovering' them from result of getTypeOfMember().
Until this can be fixed, I'm just manually stripping off any
SubstitutedTypes that appear from the call to substGenericArgs().
A better approach here would be to redo the callee diagnostics
stuff to look at interface types, plumbing through the correct
generic signatures. Then questions like 'what protocols does this
generic parameter conform to' can be asked of the signature and
interface type, instead of looking at archetypes.
When matching a potential callee type against the actual argument
types in a call expression, CSDiag would walk both types in parallel,
noting differences and recording archetype substitutions in a map.
We would also walk the callee type and attempt to recover substitutions
that came from a 'self.foo' method application where self is a generic
nominal type. This relied on the fact that Type::subst() leaves behind
the old types in the form of SubstitutedType sugar.
Only one of the two call sites of findGenericSubstitutions() used the
second feature, so move the SubstitutedType walk over to that call site
instead of doing it for both.
Unfortunately the SubstitutedType walk is somewhat fragile, because
multiple subst() calls with different generic contexts will leave behind
an unpredictable structure of SubstitutedTypes. The code in CSDiag has
some heuristics to bail out when things don't make sense, but I think it
would be simpler to just call getMemberSubstitutions() and pass around
the substitution map together with the substituted type and original decl.
Notably, the only other place we rely on SubstitutedType showing up in
the result of Type::subst() is accessibility and availability checking
for nested typealiases, like 'Foo<T>.Bar', so it might be wise to redo
these passes to operate on the types of decls before we apply generic
parameters. Then we could remove SubstitutedType altogether.
Previously, getInterfaceType() would return getType() if no
interface type was set. Instead, always set an interface type
explicitly.
Eventually we want to remove getType() altogether, and this
brings us one step closer to this goal.
Note that ParamDecls are exempt from this treatment, because
they don't have a proper interface type yet. Cleaning this up
requires more effort.
Add special logic to FailureDiagnosis::visitApplyExpr to
handle situations like the following:
  struct S {
    mutating func f(_ i: Int) {}
    func f(_ f: Float) {}
  }
The struct has an overloaded method "f" whose single argument has
a different type in each overload, and one of the overloads is
marked "mutating", which means it can only be applied to an lvalue
base type. So when the struct is used like this:
  let answer: Int = 42
  S().f(answer)
the constraint system generator is going to pick `f(_ f: Float)`
as the only possible overload candidate, because the "base" of the
call is immutable and contextual information about the argument
type is not available yet. This leads to an incorrect contextual
conversion failure diagnostic, because the type of the argument is
going to be resolved as (Int) no matter what.
To work around that and improve the diagnostics in such cases, we
try to collect all unviable candidates for a given call and check
whether at least one of them matches the established argument type
before even trying to re-check the argument expression.
Resolves: <rdar://problem/28051973>.
Make SolverState manage whether the ConstraintSystem it belongs to has a
current SolverState.
Also a couple minor formatting fixes for ternary expressions involving
solverState.
This function had a weird, pre-ProtocolConformanceRef interface that
returned true when the type conformed to the protocol, then had a
separate indirect return value for the concrete conformance (if there
is one). Refactor this API, and the similar
TypeChecker::containsProtocol(), to produce an optional
ProtocolConformanceRef, which is far more idiomatic and easier to
use. Push ProtocolConformanceRef into a few more places. Should be NFC
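Roughly the shape of the change, shown as a hypothetical Swift
analogue (the real entry points are C++; all names below are
illustrative):
  struct TypeSketch {}
  struct ProtocolSketch {}
  struct ConformanceSketch {}
  // Before: a Bool result plus a separate indirect return for the conformance.
  func conformsToProtocolOld(_ type: TypeSketch, _ proto: ProtocolSketch,
                             conformance: inout ConformanceSketch?) -> Bool {
    conformance = ConformanceSketch()
    return true
  }
  // After: a single optional result carries both answers.
  func conformsToProtocolNew(_ type: TypeSketch, _ proto: ProtocolSketch) -> ConformanceSketch? {
    return ConformanceSketch()
  }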
Expand the contextual conversion failure check used to fix
rdar://28909024 to cover the case where we have a single-argument
mismatch and there is a type parameter deduction as well.
When trying to diagnose ambiguity in the constraint system, check
if any of the unresolved type variables are related to generic
parameters, and if so, try to diagnose the problem based on the
number of constraints associated with each of the unresolved
generic parameters.
The number of constraints related to a particular generic parameter
is a significant indicator of the problem: if there are no
constraints associated with it, it can never be resolved. This
helps diagnose situations like struct S<A, B> { init(_ a: A) {} },
where type B has no constraints associated with it.
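For example (a minimal sketch of the situation just described):
  struct S<A, B> {
    init(_ a: A) {}
  }
  let s = S(42)   // A is deduced as Int, but nothing constrains B, so it can never be resolved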
As an extension of SR-2208 apply contextual conversion failure checking
to all of the expressions diagnosed via FailureDiagnosis::visitApplyExpr.
Resolves <rdar://problem/28909024>.
The 'literalConformanceProto' field of
TypeVariableType::Implementation didn't take into account equivalence
classes of type variables. Eliminate it, and either look at the actual
expressions (for optimizing constraints during constraint generation)
or the actual constraints on a given type variable (for determining
whether to include optionals in the set of potential type variable
bindings).
(cherry picked from commit 6bdd9cfae5)
This reverts commit 6bdd9cfae5. This
commit *appears* to be breaking something in Dollar involving
inference with array literals and 'nil'; pull it back for more
investigation.
The 'literalConformanceProto' field of
TypeVariableType::Implementation didn't take into account equivalence
classes of type variables. Eliminate it, and either look at the actual
expressions (for optimizing constraints during constraint generation)
or the actual constraints on a given type variable (for determining
whether to include optionals in the set of potential type variable
bindings).
We had a few places that were performing ad hoc variants of
ConstraintSystem::getFixedTypeRecursive(); simplify its interface
so we can use it everywhere consistently. Fixes rdar://problem/27261929.
Similar to “isTypeParameter”, this new entry point determines whether the type is a type variable or a nested type of one. The latter case isn’t actually formed yet, so this is NFC, staging the trivial bits of this change.