To remove some callers of 'is<InOutType>' after Sema, start using what will soon be a structural invariant - the only expressions that can possibly have 'inout' type are semantically InOut expressions.
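As a quick illustration (the function and variable names below are
invented), the '&' argument is the kind of expression that should be
the only one carrying 'inout' type after Sema:

  func increment(_ value: inout Int) {
    value += 1
  }

  var counter = 0
  increment(&counter)   // '&counter' is the InOut expression; nothing
                        // else here should ever have 'inout' type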
In anticipation of future attributes, and perhaps the ability to
declare lvalues with specifiers other than 'let' and 'var', expand
the "isLet" bit into a more general "specifier" field.
The outside representation already went to a flat set of requirements;
make the internal representation match so we aren't tempted to use the
requirement signature as inputs to a generic signature.
Rather than pretend that the requirement signature of a protocol is a
full, well-formed generic signature that one can meaningfully query,
treat it as a flat set of requirements. Nearly all clients already did
this, but make it official. NFC
Using the attribute in this position is a relic from the Swift 2
days, and fixing it required letting invalid code fall through to
Sema instead of being diagnosed in Parse proper. Treat 'var'
in this position like 'let' by simply offering to remove it
instead of extracting it into a separate variable.
Push the getName method from ValueDecl down to only those types that are
guaranteed to have a name that is backed by an identifier and that will
not be special.
The -enable-testing flag makes ValueDecl::getEffectiveAccess()
say that internal declarations are public.
This would lead us to emit spurious diagnostics when, for example, a
default argument of an internal function referenced a private symbol,
even though that is something we actually want to allow.
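A rough sketch of the kind of code affected (names invented); this
should be accepted even when building with -enable-testing:

  private let defaultTimeout = 30

  // The default argument of an internal function may reference a
  // private symbol; -enable-testing should not change that.
  internal func connect(timeout: Int = defaultTimeout) {
    _ = timeout
  }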
This is a second revision of the patch -- instead of changing
getEffectiveAccess() to take an extra parameter, this changes
getFormalAccessScope() instead.
Fixes <rdar://problem/32592973>.
The -enable-testing flag makes ValueDecl::getEffectiveAccess()
say that internal declarations are public.
This would lead us to emit spurious diagnostics when, for example, a
default argument of an internal function referenced a private symbol,
even though that is something we actually want to allow.
Hack around this by adding a new 'forLinkage' parameter to
getEffectiveAccess(). When this is false, we ignore the
-enable-testing flag, and only look for the @_versioned
attribute.
I'm not very happy with the fix, because it only complicates
the subtle behaviors of getFormalAccess(), getEffectiveAccess()
and getFormalAccessScope() further. But refactoring this is
a bigger change than I'm willing to put into swift-4.0-branch.
Fixes <rdar://problem/32592973>.
This changes `getBaseName()` on `DeclName` to return a `DeclBaseName`
instead of an `Identifier`. All places that will continue to be
expecting an `Identifier` are changed to call `getBaseIdentifier` which
will later assert that the `DeclName` is actually backed by an
identifier and not a special name.
For transitional purposes, a conversion operator from `DeclBaseName` to
`Identifier` has been added that will be removed again once migration
to DeclBaseName has been completed in other parts of the compiler.
Unify approach to printing declaration names
Printing a declaration's name using `<<` and `getBaseName()` is now
independent of the return type of `getBaseName()`, which will change
from `Identifier` to `DeclBaseName` in the future.
Protocol requirements involving same-type-to-Self constraints cannot
be witnessed by declarations in non-final classes that have the same
form of same-type requirement to the corresponding class type, because
it creates a soundness hole with subclasses:
  protocol Q {
    func foo<T: P>(_: T, _: T.T) where T.T == Self
  }

  class C: Q {
    func foo<T: P>(_: T, _: C) where T.T == C {}
  }

  class D: C {
    // in D, T.T == D does not hold
  }
Warn about this in Swift 3 compatibility mode, error on it in Swift 4
mode. When possible, provide a note + Fix-It suggesting that the
same-type constraint might be weakened to a superclass constraint
(which works with subclassing).
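Continuing the example above (and still assuming whatever protocol
'P' that snippet presumes), the suggested weakening would look roughly
like this:

  class C: Q {
    // superclass constraint instead of a same-type constraint:
    func foo<T: P>(_: T, _: C) where T.T: C {}
  }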
Fixes rdar://problem/30398503.
The AST verifier was causing deserialization of generic environments,
which slows things down considerably and affects our ability to test
for laziness in deserialization. Prevent it from doing so---and only
do the extra checking if something else deserialized the generic
environment already.
... except that there are some cases where it happens through means
that are harder to control (e.g., the AST walker for patterns) and
need more thought.
Resolves: https://bugs.swift.org/browse/SR-4426
* Make IfConfigDecl be able to hold ASTNodes
* Parse #if as IfConfigDecl
* Stop enclosing top-level #if in a TopLevelCodeDecl.
* Eliminate IfConfigStmt
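For reference, a top-level #if clause can mix declarations and
statements, which is presumably why the clauses now hold ASTNodes; a
minimal sketch:

  #if DEBUG
  let greeting = "debug build"      // a declaration
  print(greeting)                   // a statement
  #else
  let greeting = "release build"
  print(greeting)
  #endif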
The "common return type" optimization that occurs at
constraint-generation time was overly complicated and wrong in a
couple of places. Fix up these issues:
* Don't cache information about this optimization in the decls
themselves; it ignores visibility as well as invalidation that
occurs in script mode and playgrounds. Just do the work again.
* Don't map type into context; we can bail out when there are type
parameters present
* Skip the "self" parameters of operators in types (since this only
ever occurs for operators).
Longer term, this optimization should move into the solver itself. But
for now, at least it's cleaner/more correct.
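As a rough sketch of what this kind of optimization looks for (the
overload set below is invented): when every overload of the referenced
name produces the same result type, the call's result type can be
pinned down early:

  func describe(_ x: Int) -> String { return "Int: \(x)" }
  func describe(_ x: Double) -> String { return "Double: \(x)" }

  // Every 'describe' overload returns String, so the result type of
  // the call is known before overload resolution completes.
  let text = describe(42)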
An early optimization in constraint generation attempted to simplify
type construction (e.g., X(...)) when the type in question has no
failable initializers. However, the optimization didn't appropriately
clear out the cached bit when new information became available (e.g.,
new conformances, new extensions), and didn't seem to help anything
performance-wise (type-checking times didn't increase at all when I
turned this off).
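For reference, the distinction the optimization tried to exploit,
sketched with invented types:

  struct Exact {
    init(value: Int) {}             // no failable initializers
  }

  struct Checked {
    init?(value: Int) {             // failable initializer
      if value < 0 { return nil }
    }
  }

  let a = Exact(value: 1)           // 'a' has type Exact
  let b = Checked(value: 1)         // 'b' has type Checked?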
Fixes rdar://problem/30588177.
This isn't an inherent limitation of the language---in fact, it would
be a problem for library evolution if you had to know a superclass's
full vtable contents to generate the vtable for a subclass. However,
that's exactly where we are today, and that's not going to change for
Swift 4.
One small hole in the Swift 3 / Swift 4 story.
More rdar://problem/31878396
As such, we no longer insert two placeholders for initializers that
need two vtable slots; instead we record that in the
MissingMemberDecl. I can see MissingMemberDecl growing into something
we'd actually show to users, and something that could be used for
other kinds of declarations that don't have vtable entries, but for
now I'm not going to worry about any of that.
This lets us serialize that decision, which means we can conceivably
/change/ the decision in later versions of the compiler without
breaking existing code. More immediately, it's groundwork that will
eventually allow us to drop decls from the AST without affecting
vtable layout.
This isn't actually a great answer; what we really want is for SIL
vtables to be serialized consistently and treated as the point of
truth. But that would be more change than we're comfortable taking in
the Swift 4 timeframe.
First part of rdar://problem/31878396.
* Allow CodingKey conformance to be automatically derived for enums
  with no associated values which either have no raw type or have
  a raw type of String or Int.
* Allow Encodable and Decodable conformance to be automatically derived
for classes and structs with Encodable/Decodable properties
* Add initial unit tests for verifying derived conformance
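A small sketch of the declarations this enables (type and property
names invented):

  // CodingKey derived for an enum with a String raw type and no
  // associated values:
  enum Key: String, CodingKey {
    case id, name
  }

  // Encodable/Decodable derived because every stored property is
  // itself Codable:
  struct Person: Codable {
    var id: Int
    var name: String
  }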
(which can happen if an imported class has un-importable initializers)
Our initializer model guarantees that it's safe to inherit convenience
initializers when a subclass has implemented all designated
initializers, since each convenience initializer will be implemented
by calling one of the designated initializers. If one of the
designated initializers /can't/ be implemented in Swift, however,
then inheriting the convenience initializer would not be safe.
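For reference, the safe case looks like this (class names invented):

  class Base {
    init(value: Int) {}                          // designated

    convenience init() { self.init(value: 0) }   // delegates to a
                                                 // designated init
  }

  class Derived: Base {
    // Derived implements all of Base's designated initializers...
    override init(value: Int) { super.init(value: value) }
  }

  // ...so it automatically inherits 'convenience init()':
  let d = Derived()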
This is potentially a source-breaking change, so the importer will
only actually record that it failed to import something when
compiling in Swift 4 mode.
rdar://problem/31563662
* Use the presence of an argument type to check for associated values
hasOnlyCasesWithoutAssociatedValues returns true for any serialized
enum declaration whether or not it has cases. This never really came
up because it's mostly relevant to Sema's proto-deriving mechanism. Fix
this by using the presence of the case's argument type instead.
* Separate checks for presence of cases and enum simplicity
Necessary because the old behavior was an artifact of the
implementation.
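For reference, the distinction being computed, sketched with invented
enums:

  enum Flat {               // only cases without associated values
    case red, green, blue
  }

  enum Wrapped {
    case empty
    case value(Int)         // argument type present => associated value
  }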
This commit does a few things:
1. It uses SwitchEnumBuilder so we are not re-inventing any wheels.
2. Instead of hacking around not putting in a destroy for .none on the
failure path, just *do the right thing*: recognize that we have a
binary-case enum and, in that case, emit code for the other case rather
than using a default case (meaning no cleanup is needed for .none).
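At the source level, the emitted pattern corresponds roughly to
handling both cases of a two-case enum explicitly instead of falling
back to a default (sketch with invented names):

  func unwrap(_ value: Int?) -> Int {
    switch value {
    case .some(let payload):
      return payload
    // .none handled explicitly instead of via 'default', so no
    // cleanup is left pending for it:
    case .none:
      return 0
    }
  }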
rdar://31145255