Many clients of the conformance lookup operations would prefer to get
an invalid conformance (i.e., there is no conformance) rather than a
missing conformance. Parameterize the conformance lookup operations so
that most callers won't see missing conformances, by filtering them
out at the end. Opt in those callers that do want to see missing
conformances, so that they can be diagnosed.
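A rough sketch of the filtering pattern in Swift, with made-up names (the compiler's real lookup API is C++ and differs in detail):

```swift
// Hypothetical stand-ins for the lookup result; not the compiler's real types.
enum ConformanceLookupResult {
    case invalid                // no conformance at all
    case missing                // placeholder conformance kept around for diagnostics
    case concrete(String)       // stands in for an actual conformance

    var isMissing: Bool {
        if case .missing = self { return true }
        return false
    }
}

// Most callers never see a missing conformance: it is filtered out at the end.
// Callers that want to diagnose missing conformances opt in via the flag.
func lookupConformance(allowMissing: Bool = false) -> ConformanceLookupResult {
    let raw = rawLookup()
    if raw.isMissing && !allowMissing {
        return .invalid
    }
    return raw
}

func rawLookup() -> ConformanceLookupResult {
    .missing   // stand-in for the actual conformance search
}
```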
Start treating the null {Can}GenericSignature as a regular signature
with no requirements and no parameters. This not only makes for a much
safer abstraction, but also allows us to simplify many of the clients of
GenericSignature that previously had to check for null before
using the abstraction.
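A tiny Swift sketch of the idea, with made-up types (the real GenericSignature is a C++ pointer wrapper): the wrapper tolerates a null underlying value and behaves like a signature with no parameters and no requirements, so callers no longer need an explicit null check.

```swift
struct GenericSignatureSketch {
    // Stand-in for the underlying pointer; nil models the "null signature".
    var storage: (params: [String], requirements: [String])?

    // A null signature behaves like an empty one instead of being a trap.
    var genericParams: [String] { storage?.params ?? [] }
    var requirements: [String] { storage?.requirements ?? [] }
}

let nullSig = GenericSignatureSketch(storage: nil)
print(nullSig.genericParams.isEmpty)   // true; no null check needed at the call site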
This is just a straight port of the existing code in the GSB, with minimal changes.
It could be made more efficient in the future by trafficking in Terms rather than
Types, avoiding some intermediate conversion and canonicalization steps.
We compute the canonical type by first simplifying the type term, and
then checking if it is a concrete type. If there's no concrete type,
we convert the simplified term back to an interface type and return
that; otherwise, we canonicalize any structural sub-components of
the concrete type that contain interface types, and so on.
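A rough Swift sketch of that loop, with placeholder term and type representations (my own names; the real implementation works on rewrite-system terms in C++):

```swift
indirect enum Ty {
    case interface(String)        // a type parameter like "T.A", written as a string term
    case concrete(String, [Ty])   // a nominal type with structural components, e.g. Array<...>
}

typealias Term = String

// Stubs standing in for the rewrite system.
func simplify(_ term: Term) -> Term { term }              // apply rewrite rules to the term
func concreteType(for term: Term) -> Ty? { nil }          // non-nil if a concrete-type rule applies
func interfaceType(for term: Term) -> Ty { .interface(term) }

func getCanonicalType(_ type: Ty) -> Ty {
    switch type {
    case .interface(let term):
        let simplified = simplify(term)
        if let concrete = concreteType(for: simplified) {
            // The concrete type may itself contain interface types in its
            // structural components, so canonicalize those recursively.
            return getCanonicalType(concrete)
        }
        // No concrete type: convert the simplified term back to an interface type.
        return interfaceType(for: simplified)
    case .concrete(let name, let args):
        return .concrete(name, args.map(getCanonicalType))
    }
}
```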
Due to a quirk of how the existing declaration checker works, we also
need to handle "purely concrete" member types, e.g. if I have a
signature `<T where T == Foo>` and we're asked to canonicalize the
type `T.[P:A]`, where `Foo` conforms to `P`.
This comes up because we can derive the signature `<T where T == Foo>`
from a generic signature like `<T where T : P>`; adding the
concrete requirement 'T == Foo' renders 'T : P' redundant. We then
want to take interface types written against the original signature
and canonicalize them with respect to the derived signature.
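A concrete Swift example of this situation (my own names, following the description above):

```swift
protocol P {
    associatedtype A
}

struct Foo: P {
    typealias A = Int
}

struct Container<T: P> {}   // generic signature <T where T : P>

// The extension's signature is derived from <T where T : P> by adding
// T == Foo, which makes the T : P requirement redundant.
extension Container where T == Foo {
    // T.A (spelled T.[P:A] internally) was written against the original
    // signature; it must still canonicalize here, to Foo.A, i.e. Int.
    func takesA(_ value: T.A) -> Int { value }
}
```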
The problem is that `T.[P:A]` is not a valid term in the rewrite system
for `<T where T == Foo>`, since we do not have the requirement T : P.
A more principled solution would build a substitution map when
building a derived generic signature that adds new requirements;
interface types would first be substituted before being canonicalized
in the new signature.
For now, we handle this with a two-step process: we split the term up
into its longest valid prefix, which must resolve to a concrete type,
and the remaining suffix, which we use to perform a concrete
substitution using subst().
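A rough sketch of that split in Swift, using strings as stand-in terms (hypothetical helpers; the real code works on rewrite-system terms and calls subst() in C++):

```swift
// Stand-ins for the rewrite system: only the prefix ["T"] is a valid term
// here, and it resolves to the concrete type Foo via the T == Foo rule.
func resolveConcreteType(ofValidPrefix prefix: [String]) -> String? {
    prefix == ["T"] ? "Foo" : nil
}

// Stand-in for applying a member-type suffix via substitution.
func applyMemberType(_ base: String, _ member: String) -> String {
    "\(base).\(member)"
}

// Split the term at its longest valid prefix, resolve that prefix to a
// concrete type, then substitute the remaining suffix into it.
func canonicalizePurelyConcrete(term: [String]) -> String? {
    var prefix = term
    var suffix: [String] = []
    while !prefix.isEmpty {
        if let concrete = resolveConcreteType(ofValidPrefix: prefix) {
            return suffix.reduce(concrete, applyMemberType)
        }
        suffix.insert(prefix.removeLast(), at: 0)
    }
    return nil
}

// ["T", "A"] models T.[P:A]; the result is "Foo.A".
print(canonicalizePurelyConcrete(term: ["T", "A"]) ?? "<none>")
```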
This becomes a utility that maps the requirement's types into the
generic environment if needed before calling out to the new
Requirement::isSatisfied().
The new approach is to not look at RequirementSources at all. Instead,
we exhaustively enumerate all conformance access paths, beginning
from the root conformance requirements in the signature, then following
the conformance requirements in those protocols' requirement
signatures, and so on.
We enumerate conformance access paths in breadth-first order by
length until we find the one we want. The results are memoized.
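A minimal Swift sketch of that enumeration (hypothetical types and names; the real implementation is C++ inside the compiler). Paths grow in breadth-first order by length, each one extended with the conformance requirements of the last protocol's requirement signature, and the first (shortest) path found to each conformance is cached:

```swift
struct ConformanceStep: Hashable {
    let subject: String     // e.g. "T" or "Self"
    let proto: String       // e.g. "Collection"
}

func findConformancePath(
    roots: [ConformanceStep],                               // root requirements of the signature
    requirementSignature: (String) -> [ConformanceStep],    // conformance reqs of a protocol
    target: ConformanceStep,
    memo: inout [ConformanceStep: [ConformanceStep]]
) -> [ConformanceStep]? {
    if let cached = memo[target] { return cached }

    // Breadth-first by path length; assumes the target is actually reachable,
    // which holds by construction for a valid conformance in the signature.
    var queue: [[ConformanceStep]] = roots.map { [$0] }
    var index = 0
    while index < queue.count {
        let path = queue[index]
        index += 1
        let last = path.last!
        if memo[last] == nil { memo[last] = path }    // memoize the shortest path
        if last == target { return path }
        for next in requirementSignature(last.proto) {
            queue.append(path + [next])
        }
    }
    return nil
}
```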
This fixes a regression with another change I'm working on. The
test case does not fail with this PR alone, but I'm adding it now
anyway.
Doing this when computing a canonical signature didn't really
make sense because canonical signatures are not canonicalized
any more strongly _with respect to the builder_; they just
canonicalize their requirement types.
Instead, let's do these checks after creating the signature in
computeGenericSignature().
The old behavior had another undesirable property: since the
canonicalization was done by registerGenericSignatureBuilder(),
we would always build a new GSB from scratch for every
signature we computed.
The new location also means we do these checks for protocol
requirement signatures as well. This flags a previously-fixed
crasher where we still emit bogus same-type requirements in
the requirement signature, so I moved that test back into
the unfixed state.
Previously we would look for a derived source before an explicit one,
on account of the explicit one possibly being redundant. However, the
presence of 'self-derived' sources meant that we had to call
getMinimalConformanceSource() to ensure the derived sources
were actually usable and would not produce an infinite conformance
access path.
I'd like to remove getMinimalConformanceSource() now that we have
an alternate algorithm to identify redundant explicit requirements.
Instead, we can handle the explicit case first, by checking for a
conformance requirement in the generic signature -- its presence
means it was not redundant, by construction.
Once we've handled that case, we know we're going to use a derived
source, and finding the shortest one seems to be good enough.
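A small sketch of that ordering in Swift (hypothetical names; strings stand in for requirements and path elements):

```swift
enum ConformanceSource {
    case explicitRoot(String)        // the requirement appears in the generic signature
    case derived(path: [String])     // reached through other requirements
}

func conformanceSource(
    for requirement: String,
    signatureRequirements: Set<String>,
    shortestDerivedPath: (String) -> [String]
) -> ConformanceSource {
    // Explicit case first: if the requirement survived minimization into the
    // generic signature, it is non-redundant by construction.
    if signatureRequirements.contains(requirement) {
        return .explicitRoot(requirement)
    }
    // Otherwise a derived source must exist; the shortest one is good enough.
    return .derived(path: shortestDerivedPath(requirement))
}
```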
This fixes the IRGen crash in https://bugs.swift.org/browse/SR-11153;
the requirement signatures in that test still have unnecessary
same-type requirements printed, so I added a separate RUN: line
for those, and it's marked as known-failing with 'not %FileCheck'.
A new implementation from "first principles". The idea is that
for a given conformance, we either have an explicit source
which forms the root of the requirement path, or a derived
source, which we 'factor' into a parent type/parent protocol
pair, and a requirement from that protocol's requirement signature.
We recursively compute the conformance access path of the
parent type and parent protocol, and append the path element
for the requirement.
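An illustrative example of the factoring, using the standard library's Hashable: Equatable relationship (the path notation in the comments is my own paraphrase):

```swift
func needsEquatable<U: Equatable>(_: U.Type) {}

// In the signature <T where T : Hashable>, the conformance access path for
// T : Equatable factors into:
//   - the parent path, here just the explicit root (T: Hashable), and
//   - the requirement Self : Equatable from Hashable's requirement signature,
// giving the path (T: Hashable)(Self: Equatable).
func example<T: Hashable>(_ type: T.Type) {
    needsEquatable(type)
}
```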
This fixes a long-standing crasher, and eliminates two hacks,
the 'usesRequirementSource' flag in RequirementSource, and
the 'HadAnyRedundantConstraints' flag in GenericSignatureBuilder.
Fixes https://bugs.swift.org/browse/SR-7371 / rdar://problem/39239511