When checking whether a particular protocol conformance satisfies all of
the protocol's requirements, we were suppressing substitution failures.
In some cases, this would mean that we marked a conformance "invalid"
without ever emitting a diagnostic, which would lead to downstream crashes.
Instead, treat substitution failures somewhat more lazily. If we encounter
one while performing the checking, put the conformance into a "delayed" list
rather than failing immediately. Teach the top-level type checking
loop to re-check these conformances, emitting a diagnostic if they
fail the second time around.
Fixes rdar://problem/35082483 and likely other issues that slipped
through the type checker or blew up in unpredictable ways.
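A minimal sketch of the delay-and-recheck pattern described above, using hypothetical names (Conformance, checkConformance, finishTypeChecking) that stand in for the type checker's actual data structures:

// Hypothetical sketch; these names are illustrative, not the compiler's API.
enum CheckResult {
  case success
  case substitutionFailure
}

struct Conformance {
  let description: String
  let check: () -> CheckResult
}

var delayedConformances: [Conformance] = []

func checkConformance(_ conformance: Conformance) {
  if case .substitutionFailure = conformance.check() {
    // Don't mark the conformance invalid yet; queue it for a second pass.
    delayedConformances.append(conformance)
  }
}

func finishTypeChecking(diagnose: (String) -> Void) {
  // Top-level loop: re-check delayed conformances and diagnose only
  // the ones that fail the second time around.
  for conformance in delayedConformances {
    if case .substitutionFailure = conformance.check() {
      diagnose("conformance \(conformance.description) has unsatisfied requirements")
    }
  }
  delayedConformances.removeAll()
}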
When forming a specialized protocol conformance, we substitute into the
conditional requirements. Allow this substitution to look into the module
to find conformances, which might be required to accurately represent
the requirements. Otherwise, we can silently end up dropping them.
We should rethink this notion of eagerly substituting conditional
requirements, and consider whether clients should always handle this
substitution. For now, fixes rdar://problem/35837054.
Allow conformance lookup in module context for conditional requirements
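For illustration, a conditional conformance at the source level (the types here are hypothetical); forming its specialized conformance requires finding the conformances that satisfy the conditional requirements, and those conformances may live in another module:

protocol Summable {
  static func + (lhs: Self, rhs: Self) -> Self
}

struct Pair<T> {
  var first: T
  var second: T
}

// Conditional conformance: `T: Summable` is the conditional requirement
// recorded on the conformance.
extension Pair: Summable where T: Summable {
  static func + (lhs: Pair, rhs: Pair) -> Pair {
    return Pair(first: lhs.first + rhs.first, second: lhs.second + rhs.second)
  }
}

// Satisfies the conditional requirement for Pair<Int>; in general this
// conformance may be declared in a different module, which is why the
// substitution needs to be able to look into the module.
extension Int: Summable {}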
A while ago, we commented out an obvious optimization when accessing the
associated types of a protocol conformance to get better coverage of the
general substitution path. Bring that code back, avoiding the creation of
substitution maps in a few cases.
When we substitute into an inherited conformance, make sure that we
follow the superclass chain from the new conforming type up to the
matching superclass *before* doing the substitution.
Fixes rdar://problem/35632543.
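To illustrate the situation with hypothetical types: the conformance of the subclass comes from a specialization of its superclass, so substitution has to walk the superclass chain before it can be applied.

protocol Container {
  associatedtype Element
  func first() -> Element?
}

class Base<T>: Container {
  var items: [T] = []
  func first() -> T? { return items.first }
}

// `Derived<U>: Container` is an inherited conformance: it is the
// conformance `Base<[U]>: Container` viewed through the subclass.
class Derived<U>: Base<[U]> {}

// Substituting U := Int must first follow the superclass chain
// Derived<Int> -> Base<[Int]> and only then substitute into the
// underlying conformance.
func firstElement<U>(_ d: Derived<U>) -> [U]? {
  return d.first()
}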
Substitute the type arguments of the conforming type into the conditional
requirements of a specialized conformance, so that they reflect the
requirements specific to that specialization.
Fixes rdar://problem/34944286.
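A concrete illustration using the standard library's conditional conformance of Array to Equatable (Swift 4.1 and later):

// The standard library declares:
//   extension Array: Equatable where Element: Equatable
//
// The unspecialized conformance records the conditional requirement
//   Element: Equatable
// After substituting Element := String, the specialized conformance
// `Array<String>: Equatable` should record
//   String: Equatable
// rather than the unsubstituted `Element: Equatable`.
let xs: [String] = ["a", "b"]
let ys: [String] = ["a", "b"]
let same = xs == ys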
Rather than storing contextual types in the type witnesses and associated
conformances of NormalProtocolConformance, store only interface types.
@huonw did most of the work here, and @DougGregor patched things up to
complete the change.
When calling an accessor, one has to pull the witness tables for each
conditional conformance requirement into a(n appropriately ordered) buffer that
is passed to the accessor. This is simple enough if the appropriate
specializations of the relevant conformances are known, which the compiler
didn't track deeply enough until now.
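A Swift-level example (hypothetical names) of code that exercises this: calling a protocol accessor on a value whose conformance is conditional requires passing the witness tables for the conditional requirements along with the conformance.

protocol Labeled {
  var label: String { get }
}

struct Wrapper<T> {
  var value: T
}

// Conditional conformance: `Wrapper<T>: Labeled` requires `T: Labeled`.
extension Wrapper: Labeled where T: Labeled {
  var label: String { return "Wrapper(\(value.label))" }
}

extension String: Labeled {
  var label: String { return self }
}

// Calling the `label` accessor through the protocol needs the witness table
// for `Wrapper<String>: Labeled`, which in turn must be built with the
// witness table satisfying the conditional requirement `String: Labeled`.
func describe<L: Labeled>(_ item: L) -> String {
  return item.label
}

let text = describe(Wrapper(value: "hello"))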
Previously ProtocolConformance::subst would crash because it was receiving
things with an unexpected relationship between the conformance's type and the
substituted self type. The compiler doesn't quite properly model "abstract"
inherited conformances, so we end up using normal conformances instead, and we
need to work around this in some cases.
Introduce GenericSignature::requirementsNotSatisfiedBy(otherSig) to
compute the set of requirements in a generic signature that aren't satisfied
by some other generic signature. This is used both for conditional
conformances (the conditional requirements) and for name mangling of
constrained extensions/protocol conformances.
This makes it possible to determine which requirements make a conformance
conditional, i.e., which requirements aren't already implied by the
conforming type's own generic signature.
Additionally, use this to assert that a few builtin protocols aren't
conditionally-conformed-to, something we won't support for now.
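For illustration: the generic signature of a constrained extension contains requirements beyond those of the extended type's own signature, and those extra requirements are exactly what this computes (the types below are ordinary Swift, not compiler API).

struct Stack<Element> {
  var storage: [Element] = []
}

// The type's generic signature is <Element>.
// The extension's generic signature is <Element where Element: Equatable>.
// The requirement not satisfied by the type's own signature --
// Element: Equatable -- is what makes the extension constrained; for a
// conditional conformance, the analogous extra requirements are its
// conditional requirements.
extension Stack where Element: Equatable {
  func contains(_ element: Element) -> Bool {
    return storage.contains(element)
  }
}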
When setting the signature conformances in a NormalProtocolConformance,
do so progressively rather than waiting until all of them are computed.
This allows later requirements to refer to earlier conformances.
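A sketch of why the order matters, using a hypothetical protocol: checking the later same-type requirement needs the conformance recorded for the earlier associated-type requirement.

protocol Producer {
  associatedtype Iterator: IteratorProtocol
  associatedtype Output where Output == Iterator.Element

  func makeIterator() -> Iterator
}

// While checking `Chars: Producer`, resolving the requirement
// `Output == Iterator.Element` relies on the conformance
// `String.Iterator: IteratorProtocol`, which satisfies the earlier
// requirement, already being in place.
struct Chars: Producer {
  typealias Iterator = String.Iterator
  typealias Output = Character

  let text: String
  func makeIterator() -> Iterator {
    return text.makeIterator()
  }
}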
If an enum has a raw type and a SynthesizedProtocolAttr, don't add
the same conformance twice, since that triggers an assertion.
Instead, make sure each conformance is added only once. We cannot just
drop the SynthesizedProtocolAttr in this case, because we need to
pull the LazyConformanceLoader out of it.
In almost all other places, 'resolver' means the ASTContext's LazyResolver,
which is just an abstract base class that the TypeChecker implements to
break a circular dependency.
But here it is something totally different, used to lazily populate
imported and deserialized conformances, not parsed conformances.
Rather than pretend that the requirement signature of a protocol is a
full, well-formed generic signature that one can meaningfully query,
treat it as a flat set of requirements. Nearly all clients already did
this, but make it official. NFC
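For reference, a sketch of what the requirement signature contains for a small, hypothetical protocol; treating it as a flat list is what this change makes official.

protocol Keyed {
  associatedtype Key: Hashable
  associatedtype Value
  associatedtype Storage: Sequence where Storage.Element == Value
}

// The requirement signature of `Keyed` is just the flat list of requirements
//   Self.Key: Hashable
//   Self.Storage: Sequence
//   Self.Storage.Element == Self.Value
// and clients should treat it as exactly that list rather than as a full
// generic signature to be queried.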
Substitutions for calling a specialized declaration.
For full generality, this really ought to be a Witness, but the current
use cases where we're constructing calls to specialized witnesses never
need to call a generic requirement, and I'm not sure how to apply
substitutions to a Witness with a synthetic environment.
Extend SubstOptions, which controls how substitution is performed, to
allow the caller of subst() to provide a callback that can supply a
type witness for a normal protocol conformance undergoing type witness
inference. In effect, this allows us to
provide tentative bindings for type witnesses so we can see the
effects of substitution.
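To illustrate type witness inference at the source level (the callback itself is a compiler-internal hook and isn't shown here):

protocol Transformer {
  associatedtype Input
  associatedtype Output
  func transform(_ value: Input) -> Output
}

// Neither Input nor Output is written explicitly; the type checker infers
// the type witnesses Input := String and Output := Int from `transform`.
// While doing so, it substitutes *tentative* witness bindings into other
// requirements to see whether they hold, which is what the subst()
// callback described above makes possible.
struct WordCounter: Transformer {
  func transform(_ value: String) -> Int {
    return value.split(separator: " ").count
  }
}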
SubstitutionMap::lookupConformance() would map archetypes out
of context to compute a conformance path. Do the same thing
in SubstitutionMap::lookupSubstitution().
The DenseMap of replacement types in a SubstitutionMap now
always has GenericTypeParamTypes as keys.
This simplifies some code and brings us one step closer to
a more efficient representation of SubstitutionMaps.
Enums with the ns_error_domain attribute represent codes for NSError,
which means Swift developers will expect to interact with them in
terms of Error. SE-0112 improved bridging for these enums to generate
a struct with the following form:
struct MyError: Error {
  @objc enum Code: RawRepresentable {
    case outOfMemory
    case fileNotFound
  }
  var userInfo: [NSObject: AnyObject] { get }
  static var outOfMemory: Code { get }
  static var fileNotFound: Code { get }
}
where MyError.Code corresponds to the original MyError enum defined in
Objective-C. Until recently, both the enum and the synthesized struct
were marked as having the original enum as their "Clang node", but
that leads to problems: the struct isn't really ObjC-compatible, and
the two decls have the same USR. (The latter had already been worked
around.)
This commit changes the struct to be merely considered a synthesized
"external definition", with no associated Clang node. This meant
auditing everywhere that looks for a Clang node and deciding which of
those places apply to external definitions in general.
There is one regression in quality here: the generated struct is no
longer printed as part of the Swift interface for a header file, since
it's not actually a decl with a corresponding Clang node. The previous
change to AST printing mitigates this a little by at least indicating
that the enum has become a nested "Code" type.
Instead of just falling back to module lookup any time conformance
substitution fails, only do so in the case where we know substitution doesn't work.
This should shake out some more bugs, hopefully without causing
too much pain.
Looking up the conformance @dynamic_self C<T> : P simply returns
the normal conformance C<T> : P.
If we later apply a substitution map to the conformance to
specialize T, we would call isSpecialized() on the substituted
type, which would return false. We would then hit an assertion
because the type @dynamic_self C<Int> is not equal to the
(erroneously unsubstituted) conformance type C<T>.
Tweak the logic slightly to avoid this.
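The kind of source-level pattern involved (illustrative types): a method on a generic class whose return type is Self.

protocol Cloneable {
  func clone() -> Self
}

class C<T>: Cloneable {
  let value: T
  required init(value: T) { self.value = value }

  // The return type `Self` is the dynamic Self type of C<T>, printed by the
  // compiler as `@dynamic_self C<T>`. Looking up `@dynamic_self C<T> : Cloneable`
  // yields the normal conformance `C<T> : Cloneable`; when that conformance is
  // later specialized (say, for C<Int>), T still needs to be substituted even
  // though the dynamic Self type is not itself a "specialized" type.
  func clone() -> Self {
    return type(of: self).init(value: value)
  }
}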