Otherwise the "nonisolated nonsending by default" mode blows up as
distributed thunk signatures dont match expectations.
This undoes the fix from https://github.com/swiftlang/swift/pull/83940
and applies the fix on the synthesis side of the distributed thunks,
such that they are @concurrent always -- which keeps their old semantics
basically, regardless of what "default" mode we have.
The new NonisolatedNonsendingByDefault upcoming feature breaks remote
calls in distributed actors, because the expected isolation doesn't
match and the runtime swift_distributed_execute_target_resume will
crash.
This is a short-term fix to unblock adopters; preferably we should mark
the thunks as nonisolated(nonsending), though that seems to be more
involved.
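As a rough sketch (the thunk signature shown in the comments is illustrative, not the exact synthesized code), the idea is that remote calls keep their pre-feature behavior:
```
import Distributed

distributed actor Greeter {
    typealias ActorSystem = LocalTestingDistributedActorSystem

    distributed func hello() -> String { "hi" }
}

// Conceptually, the compiler-synthesized distributed thunk for `hello()`
// now always behaves as if it were declared
//
//     @concurrent func hello() async throws -> String
//
// so it keeps the pre-feature semantics even under
// -enable-upcoming-feature NonisolatedNonsendingByDefault.
```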
resolves rdar://159247975
Disabling a few tests. The distributed tests are failing for the same
reason they fail on Linux: the rpaths and library search paths are
mucked up. Fixing that shouldn't be too hard, but they should be enabled on
both platforms at once.
CollectiveTransformers doesn't work because it imports Darwin directly.
There is a note that is several years old saying that we should port the
test to the other platforms, but that is beyond the scope of this PR at
the moment.
Or rather, on the simulator. But there's no need to run it there to begin
with.
We'll be getting failures like:
```
line 2: .../llvm-macosx-x86_64/./bin/split-file: No such file or directory
```
if we try to run it there.
resolves rdar://155987313
The previous message was just suggesting @unchecked Sendable, but instead
we should be suggesting adding 'final' to the class. We also don't
outright suggest using @unchecked Sendable -- following the precedent of
https://github.com/swiftlang/swift/pull/81738.
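A minimal illustration of the case (the class and its members are made up, and the messages in the comments are paraphrased):
```
public class Point: Sendable {  // error: non-final class 'Point' cannot conform to 'Sendable'
    public let x: Int
    public let y: Int

    public init(x: Int, y: Int) {
        self.x = x
        self.y = y
    }
}
// The note accompanying this error now suggests marking 'Point' as 'final'
// rather than immediately steering users towards '@unchecked Sendable'.
```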
Resolves rdar://155790695
The reason I am doing this is that we have gotten reports about certain test
cases where we are emitting errors about self being captured in isolated
closures, with an invalid source location. The reason this happened is that
the decl returned by getIsolationCrossing did not have a SourceLoc, since self
was being used implicitly.
In this commit I fix that issue by using SIL-level information instead of AST-level
information. This guarantees that we get an appropriate SourceLoc. As an
additional benefit, this fixed some existing errors where, due to some sort of bug
in the AST, some error messages said a value was nonisolated when it was actually
actor-isolated.
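A hypothetical reproducer (the compiler is expected to reject this code; the point is where the diagnostic lands):
```
actor Counter {
    var value = 0

    func schedule() {
        Task.detached { [self] in
            // `self` is only used implicitly, through `value`; the decl that
            // getIsolationCrossing returned for this isolation crossing had no
            // SourceLoc, so the resulting error pointed at an invalid location.
            print(value)
        }
    }
}
```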
rdar://151955519
We had a number of problems that showed up either only with or only without
library evolution when protocols were involved, so in order to increase test
coverage, run a few of the crucial tests in both modes.
This actually manifested as a pointer authentication crash, but the real reason
is that we messed up the order of elements in the witness table.
If we skip the accessor like this, the types we sign/auth with no
longer align, which manifests as a crash.
There is no real reason to skip this entry, so we just bring it back and
avoid making it special in any way.
This unlocks a few tests, and also corrects any distributed+protocol
use where a distributed var requirement was _followed by_ other
requirements.
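A minimal sketch of the protocol shape that used to break (names are illustrative):
```
import Distributed

protocol Greeter: DistributedActor where ActorSystem == LocalTestingDistributedActorSystem {
    distributed var name: String { get }   // this entry used to be skipped...
    distributed func greet() -> String     // ...misaligning every entry that follows
}
```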
resolves rdar://125628060
In this case, what is happening is that in SILGen, we insert implicit
DistributedActor.asLocalActor calls to convert a distributed actor to its local
any Actor typed form. The intention is that the actor parameter and result are
considered the same... but there is nothing at the SIL level to enforce that. In
this commit, I change ActorInstance (the utility that defines actor identity at
a value level) to look through such a call.
I implemented this by just recognizing the decl directly. We already do this in
parts of SILGen, so I don't really see a problem with doing this. It also
provides a nice benefit: we do not have to modify SILFunctionType to
represent this or put an @_semantics attribute on the getter.
NOTE: Generally, Sema prevents us from mixing together different actors. In this
case, Sema does not help us since this call is inserted implicitly by the
distributed actor implementation in SILGen. So this is not a problem in general.
rdar://152436817
... in distributed_actor_accessor_thunks_32bit.swift
Spotted a typo while skimming through the test, so let's fix it. It
should be CHECK-SAME and not CHECK_SAME, which would just do nothing.
When issuing warnings about an import not needing to be public, we did
not account for the fact that the Distributed module MUST be imported when a
distributed actor is declared. This also means that a public
distributed actor is effectively a public use of the DistributedActor
protocol.
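Roughly, with access-level imports enabled, a module like the following should not be told to downgrade the import (a minimal sketch; names are made up):
```
public import Distributed   // previously warned about as not needed in public declarations

public distributed actor Greeter {
    public typealias ActorSystem = LocalTestingDistributedActorSystem
}

// Declaring a public distributed actor is itself a public use of the
// DistributedActor protocol, so the public import really is required.
```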
resolves rdar://152129980
Replaces the generic `expression is 'async' but is not marked with 'await'`
diagnostic with a tailored one for cases where an actor-isolated value is
accessed outside of its actor without the `await` keyword.
This makes the diagnostics for async and sync contexts consistent,
and actually identifies the problem instead of simply pointing out
the solution.
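An illustrative example of the case that now gets the tailored diagnostic (the comments paraphrase the messages):
```
actor Counter {
    var value = 0
}

func read(_ counter: Counter) async {
    // before: "expression is 'async' but is not marked with 'await'"
    // after:  the error explains that 'value' is actor-isolated and must be
    //         accessed with 'await' from outside the actor
    let v = counter.value
    _ = v
}
```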
Resolves: rdar://151720646
Both the syntax and relative order of the LLVM `nocapture` parameter
attribute changed upstream in 29441e4f5fa5f5c7709f7cf180815ba97f611297.
To reduce conflicts with rebranch, adjust FileCheck patterns to expect
both syntaxes and orders anywhere the presence of the attribute is not
critical to the test. These changes are temporary and will be cleaned
up once rebranch is merged into main.
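For illustration only -- the function name is made up, and the `captures(none)` spelling and its placement are my assumptions about the new upstream syntax -- a relaxed FileCheck pattern in a Swift IRGen test might look like:
```
// Old pattern, matching only the previous form:
// CHECK: define {{.*}} @callee(ptr nocapture readonly %0)
//
// Relaxed pattern, accepting either spelling and order where the attribute
// itself is incidental to the test:
// CHECK: define {{.*}} @callee(ptr {{(nocapture readonly|readonly captures\(none\))}} %0)
```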
This attribute was introduced in
7eca38ce76d5d1915f4ab7e665964062c0b37697 (llvm-project).
Match it using a wildcard regex, since it is not relevant to these
tests.
This is intended to reduce future conflicts with rebranch.
We had fixed this bug in https://github.com/swiftlang/swift/pull/79381
but did not realize the same problem existed for parameters as well.
This corrects the swift_func_getParameterTypeInfo implementation, and also removes
the entire "unsafe" method, since we no longer use it anywhere.
Resolves rdar://146679254
This corrects how we were dealing with dispatch thunks -- mostly by
removing a lot of special casing that doesn't seem necessary, and
instead correcting and emitting all the necessary information into the TBD.
This builds on https://github.com/swiftlang/swift/pull/74935 by further refining how we fixed that issue, and adds more regression tests. It also removes a load of special casing of distributed thunks in library evolution mode, which is great.
Resolves and adds a regression test for rdar://145292018
This is also a more proper fix for an issue that was previously resolved in a not-great way, which caused other issues:
- resolves rdar://128284016
- resolves rdar://128310903
* [Distributed] Accessor must be available cross-module in resilient mode
This is an important fix for libraries using @Resolvable in resilient
libraries. Without the fix we're missing an accessor, and some remote
calls made through resolvable protocols will fail. This manifests as a
"missing accessor" error thrown by the executeDistributedTarget function.
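A rough sketch of the affected setup (the protocol and names are illustrative):
```
// In a module built with -enable-library-evolution:
import Distributed

@Resolvable
protocol Greeter: DistributedActor where ActorSystem == LocalTestingDistributedActorSystem {
    distributed func greet() -> String
}

// Remote calls made through the macro-generated stub end up in
// executeDistributedTarget; without the cross-module accessor they fail
// with a "missing accessor" error.
```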
resolves rdar://148224780
* Disable test on Windows since %env is not supported
* [Distributed] Don't emit TBD records for distributed thunks either
This resolves pedantic "all" TBD validation issues, i.e. we don't emit
unexpected records anymore. We did before, because we only checked for
is_distributed; we also want to skip entries for is_distributed_thunk.
resolves rdar://128284016
This issue manifested as a crash when running the Archive workflow in Xcode:
we would be missing the Distributed module types and proceed to run into a
null pointer when faced with code like this:
```
public class TestViewModel {
    public init() {}

    public func onReturn() {
        print("on return executed!")
    }
}
```
where the name matched one of the ad-hoc requirements, but we'd get null
from the protocol lookup, since this code does not import the Distributed
module.
resolves rdar://148327936
This is a narrow fix; we are going to work on fixing this properly
and allowing both devirtualization and specialization for distributed
requirement witnesses.
Anything that uses an ad-hoc serialization requirement scheme cannot
be devirtualized, because that would result in losing the ad-hoc conformance
in the new substitution map.
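A sketch of the kind of call that stays virtual for now (names are illustrative):
```
import Distributed

protocol Greeter: DistributedActor where ActorSystem == LocalTestingDistributedActorSystem {
    distributed func greet(name: String) -> String
}

distributed actor EnglishGreeter: Greeter {
    typealias ActorSystem = LocalTestingDistributedActorSystem

    distributed func greet(name: String) -> String { "Hello, \(name)!" }
}

func greetThrough(_ greeter: some Greeter) async throws {
    // This witness call is deliberately left virtual (neither devirtualized nor
    // specialized) for now, so the ad-hoc serialization-requirement conformance
    // is not dropped from the substitution map.
    _ = try await greeter.greet(name: "Caplin")
}
```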
Resolves: https://github.com/swiftlang/swift/issues/79318
Resolves: rdar://146101172
A protocol conformance can be ill-formed due to isolation mismatches
between witnesses and requirements, or with associated conformances.
Previously, such failures would be emitted as a number of separate
errors (downgraded to warnings in Swift 5), one for each witness and
potentially an extra one for associated conformances. The result was a
potential flood of diagnostics that was hard to sort through.
Collect all of the isolation-related problems for a given conformance
together and produce a single error (downgraded to a warning when
appropriate) that describes the overall issue. That error will have up
to three notes suggesting specific courses of action:
* Isolating the conformance (when the experimental feature is enabled)
* Marking the witnesses as 'nonisolated' where needed
*
The diagnostic also has notes to point out the witnesses/associated
conformances that have isolation problems. There is a new educational
note that also describes these options.
We give the same treatment to missing 'distributed' on witnesses to a
distributed protocol.
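An illustrative shape of the problem that is now reported as one grouped error instead of one error per witness (names are made up):
```
protocol P {
    func f()
    func g()
}

@MainActor
class C: P {
    func f() {}   // MainActor-isolated witness for a nonisolated requirement
    func g() {}   // previously each such witness produced its own separate error
}
```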
When diagnosing an isolation mismatch between a requirement and witness,
we would produce notes on the requirement itself suggesting the addition of
`async`. This is almost never what you want to do, and is often so far
away from the actual conforming type as to be useless. Remove this note,
and the non-function fallback that just points at the requirement, because
they are unhelpful.
This is staging for a rework of the way we deal with conformance-level
actor isolation problems.