Since we only call one parsing function (i.e. parseAbstractFunctionBody,
parseDecl, or parseStmtOrExpr), the parser stops at the end of the node,
so there is no need to constrain the Lexer by setting an ArtificialEOF.
To minimize the parsing range, modify the Parser to *not* parse the body
if the completion happens in the signature.
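A hedged illustration of the scenario, using the #^...^# completion-marker
convention from the compiler's code-completion tests (the function and the
performExpensiveSetup() call are invented for the example):

    // Completion is requested inside the signature, so with this change the
    // parser does not descend into the body while serving the completion.
    func configure(timeout: Int, handler: #^COMPLETE^#) {
        performExpensiveSetup()   // hypothetical call; skipped during completion
    }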
* [TypeChecker] Enclosing the protocol stubs note within editor mode
* [test] Removing note from test where there is no -diagnostics-editor-mode flag
* Formatting modified code
* [tests] Fixing tests under validation-tests
Patch up all the places that are making a syntactic judgement about the
isInvalid() bit in a ValueDecl. They may continue to use that query,
but most guard themselves on whether the interface type has been set.
This is an amalgam of simplifications to the way VarDecls are checked
and assigned interface types.
First, remove TypeCheckPattern's ability to assign the interface and
contextual types for a given var decl. Instead, replace it with the
notion of a "naming pattern". This is the pattern that semantically
binds a given VarDecl into scope, and whose type will be used to compute
the interface type. Note that not all VarDecls have a naming pattern
because they may not be canonical.
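A hedged Swift-source illustration of both notions (user code, not compiler
code; the enum and names are invented):

    // The pattern '(x, y)' is what introduces these VarDecls, and their
    // interface types now fall out of type-checking that pattern rather than
    // being assigned to each VarDecl separately by TypeCheckPattern.
    let (x, y): (Int, String) = (1, "one")

    // Multiple patterns in one case item declare 'v' more than once; only one
    // of those VarDecls is canonical, which is why not every VarDecl carries
    // its own naming pattern.
    enum Value { case a(Int), b(Int) }
    switch Value.a(1) {
    case .a(let v), .b(let v):
        print(v)
    }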
Second, remove VarDecl's separate contextual type member, and force the
contextual type to be computed the way it always was: by mapping the
interface type into the parent decl context.
Third, introduce a catch-all diagnostic to properly handle the change in
the way that circularity checking occurs. This is also motivated by
TypeCheckPattern not being principled about which parts of the AST it
chooses to invalidate, especially the parent pattern and naming patterns
for a given VarDecl. Once VarDecls are invalidated along with their
parent patterns, a large amount of this diagnostic churn can disappear.
Unfortunately, if this isn't here, we will fail to catch a number of
obviously circular cases and fail to emit a diagnostic.
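For reference, a hedged example of the kind of obviously circular binding the
catch-all diagnostic is meant to keep diagnosing (intentionally invalid code):

    let x = x   // the initializer refers to the variable being declared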
- Utilize ignoreToken() to skip tokens while keeping them in the syntax tree
- SyntaxParsedResult now holds ParsedRawSyntaxNode and ParserStatus
- Simplify migration support for the 'TypeName[]' type (see the sketch below)
- Use builder for generic argument clause parsing
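A hedged sketch of the legacy spelling that the 'TypeName[]' migration support
handles (the variable name is invented):

    // var scores: Int[] = []   // legacy array-type spelling, diagnosed with a
    //                          // fix-it rewriting it to the current syntax:
    var scores: [Int] = []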
Instead of creating the AST directly in the parser (and libSyntax or
SwiftSyntax via SyntaxParsingContext), make the Parser explicitly create
a tree of ParsedSyntaxNodes. Their OpaqueSyntaxNodes can be either
libSyntax or SwiftSyntax. If the AST is needed, it can be generated from
the libSyntax tree.
Under non-editor mode, the fix-it for inserting protocol stubs is attached to a note
pointing at the missing protocol member declaration, which may live in a different file
from the conforming type, leading to the behavior in rdar://51534405. This change checks
whether the fix-it lands in a separate file and, if so, issues another note to carry it.
rdar://51534405
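A hedged sketch of the cross-file scenario (file names and declarations are
invented); the requirement lives in one file and the conforming type in
another, so the new note carries the stub fix-it near the conformance:

    // Greeter.swift
    protocol Greeter {
        func greet() -> String
    }

    // ConsoleGreeter.swift
    struct ConsoleGreeter: Greeter {
        // While the requirement is missing, the conformance error points here,
        // but under non-editor mode the stub fix-it used to hang only off a
        // note back in Greeter.swift; this change adds a note in this file.
        func greet() -> String { "hello" }   // the stub the fix-it would insert
    }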
- When parsing a type or extension declaration, attempt to parse a function or property declaration when encountering an identifier, an operator, or a paren (for tuple declarations).
- Produce a diagnostic with a fix-it suggesting insertion of the needed keyword.
- Recover as if the declaration with the missing keyword were a function/property declaration.
Resolves https://bugs.swift.org/browse/SR-10477
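A hedged illustration of the recovery (the member is invented): a member that
begins with a bare identifier is now parsed as a function declaration, with a
fix-it inserting the missing keyword:

    struct Vector2D {
        var x = 0.0, y = 0.0

        // length() -> Double { (x * x + y * y).squareRoot() }
        //    ^ missing 'func': the parser offers a fix-it inserting the keyword
        //      and recovers as if the declaration read:
        func length() -> Double { (x * x + y * y).squareRoot() }
    }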
* Revert "Merge pull request #23791 from compnerd/you-know-nothing-clang"
This reverts commit 5150981150, reversing
changes made to 8fc305c03e.
* Revert "Merge pull request #23780 from compnerd/math-is-terrible"
This reverts commit 2d7fedd25f, reversing
changes made to 0205150b8f.
* Revert "Merge pull request #23140 from stephentyrone/mafs"
This reverts commit 777750dc51, reversing
changes made to 0c8920e747.
This commit implements SE-0246 by adding conformance to Real for the Float, CGFloat, Double, and Float80 types, implemented in terms of the system's C math library, existing standard library functionality, or LLVM intrinsics. It includes basic test coverage for these new functions, and deprecates and obsoletes *some* existing functionality in the Platform overlay.

We still need to decide how to handle the remaining "tgmath" functions, because obsoleting them is technically a source-breaking change: users who write unqualified names like "exp(1)" are fine, but users who wrote qualified names like "Darwin.exp(1)" would break.
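A hedged sketch of the generic code this enables, using the proposal's
static-function spellings; it assumes the Real protocol is in scope (provided
in-tree by this change, and by 'import Numerics' in later Swift Numerics
toolchains):

    func sigmoid<T: Real>(_ x: T) -> T {
        1 / (1 + T.exp(-x))
    }

    let a = sigmoid(0.25)         // uses Double's new conformance to Real
    let b = Float.cos(.pi / 3)    // backed by the C library or an LLVM intrinsic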
The unresolved identifier 'esp' used to have '_exp' as a typo-correction option. That's no longer the case, and the type checker now singles out 'test' as the unique correction.
We still had unavailable versions of these for floating-point types
only. We shouldn't need to keep these around, and can instead just
emit a helpful diagnostic for anyone who attempts to use them.
Unfortunately I don't see any way for the diagnostic to produce an
actual fix-it, so it just suggests '+= 1' or '-= 1' without actually
producing a fix.
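For illustration, the rewrite the diagnostic suggests (with no attached
fix-it):

    var progress = 0.0
    // progress++    // rejected; the diagnostic suggests the next line instead
    progress += 1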
For any call applied to a name, i.e. name(), the diagnostic will use 'function'.
For calls to anonymous functions, i.e. { ... }(), the diagnostic will use 'closure'.
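A hedged illustration of the two shapes the wording distinguishes (the names
are invented; the comments note which noun a diagnostic about the call would
use):

    func makeValue() -> Int { 1 }

    let a = makeValue()   // call applied to a name: wording says 'function'
    let b = { 42 }()      // call to an anonymous function: wording says 'closure'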
The amp_prefix token is currently tolerated in any unary expression
context and then diagnosed later by Sema. This patch changes parsing to
only accept tok::amp_prefix in its allowed position: parameter lists.
This also fixes two "compiler crasher" tests.
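A hedged illustration of the positions involved (the function is invented):

    func increment(_ value: inout Int) { value += 1 }

    var counter = 0
    increment(&counter)    // '&' in its allowed position
    // let p = &counter    // '&' in an arbitrary unary-expression position is
    //                     // now rejected by the parser instead of later by Sema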
Also remove the word “contextual” from the diagnostic that rewrites
array literals into dictionary literals, and scale back the scope of
the diagnostic. This check was catching and mis-diagnosing too many
errors that would be better handled by invalid-conversion diagnostics.
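A hedged example of the pattern the (now narrower) diagnostic targets; the
names are invented:

    // let lookup: [String: Int] = ["one", 1, "two", 2]
    //     ^ previously this could be caught here and rewritten toward:
    let lookup: [String: Int] = ["one": 1, "two": 2]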
the item list if we find it (while still erroring that we expected a brace on the first bad token). This improves recovery in general, and in SR-5943 in particular.
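A hedged sketch of the recovery scenario (the function is invented):

    // func greet(name: String)      // <- '{' missing: still an error at the
    //     print("Hello, \(name)")   //    first bad token, but the item list
    // }                             //    is now parsed for better recovery
    func greet(name: String) {
        print("Hello, \(name)")
    }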