Just making PartitionUtils.h a little easier to walk through by moving more of
the impl into the .cpp file. This reduces the header from ~1500 lines to ~950
lines, which is more manageable. This is especially important since I am going
to be adding IsolationHistory to the header file, which will expand it even
further.
Specifically, I added a named version of the diagnostic:
func testSimpleTransferLet() {
let k = Klass()
- transferArg(k) // expected-warning {{binding of non-Sendable type 'Klass' accessed after being transferred; later accesses could race}}
+ transferArg(k) // expected-warning {{transferring 'k' may cause a race}}
+ // expected-note @-1 {{'k' used after being passed as a transferring parameter}}
useValue(k) // expected-note {{use here could race}}
}
and I also cleaned up the typed version of the diagnostic that is used
if we fail to find a name:
func testSimpleTransferLet() {
let k = Klass()
- transferArg(k) // expected-warning {{binding of non-Sendable type 'Klass' accessed after being transferred; later accesses could race}}
+ transferArg(k) // expected-warning {{value of non-Sendable type 'Klass' accessed after being transferred; later accesses could race}}
useValue(k) // expected-note {{use here could race}}
}
This is the second-to-last part of a larger effort to rework all of
the region-based diagnostics to first try to use names and only fall back
to the old typed diagnostics when we fail to look up a name (which should
be pretty rare, but is always possible).
At some point, if I feel confident enough in the name lookup code, I will
most likely get rid of the typed diagnostic code and just emit a
"compiler doesn't understand" error. The user will still not be able to ship
the code, but will also be told to file a bug so that we can fix the name
inference.
[region-isolation] Clean up the use-after-transfer error to use the dynamic isolation information of the transferred operand value in its diagnostic message.
As an example of the change:
- // expected-note @-1 {{'x' is transferred from nonisolated caller to main actor-isolated callee. Later uses in caller could race with potential uses in callee}}
+ // expected-note @-1 {{transferring disconnected 'x' to main actor-isolated callee could cause races in between callee main actor-isolated and local nonisolated uses}}
Part of the reason I am doing this is that I am going to ensure that we
handle a bunch more cases, and I wanted to fix this diagnostic before I added
more incarnations of it to the tests.
I am making this specific API since I am going to make it so that
SILIsolationInfo::get(SILInstruction *) can infer isolation info from self even
for functions that are not apply isolation crossing points. For example, in the
following, we need to understand that test is main actor-isolated and we
shouldn't emit an error:
```swift
@MainActor func test(_ x: NonSendable) {}

@OtherActor func doSomething() {
  let x = NonSendable()
  Task.init { @MainActor in print(x) }
  test(x)
}
```
Long term, I would like to get region analysis and transfer non sendable out of
the business of directly interpreting the AST... but if we have to do it now, I
would rather we do it through a helper struct. At least the helper struct can be
modified later to work with additional SIL concurrency support when it is added.
I need to start tracking the dynamic IsolationRegionInfo for the transferring
operand so I can ignore uses that are part of the same
IsolationRegionInfo. IsolationRegionInfo doesn't fit into a pointer, so, just to
keep things the same, I am going to allocate it and store a pointer to it.
This is an initial staging commit that tests out the bump-pointer allocation
without expanding the type yet.
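To make the idea concrete, here is a minimal Swift sketch (not the compiler's
actual C++ code; the stand-in type, the allocator, and all names are made up
for illustration) of bump-allocating a value that is too large to pack into a
pointer-sized slot, so that only a pointer to it needs to be stored:
```swift
struct RegionInfoStandIn {        // hypothetical stand-in for IsolationRegionInfo
  var kind: UInt8
  var isolatedValue: UInt64       // together these exceed what fits in one pointer
}

final class BumpAllocator {
  private var slabs: [UnsafeMutableRawPointer] = []
  private var offset = 0
  private let slabSize = 4096

  /// Copies `value` into bump-allocated storage and returns a stable pointer to it.
  func allocate<T>(_ value: T) -> UnsafePointer<T> {
    let size = MemoryLayout<T>.stride
    let alignment = MemoryLayout<T>.alignment
    // Round the bump offset up to the required alignment.
    offset = (offset + alignment - 1) & ~(alignment - 1)
    // Start a new slab if the current one cannot hold the value.
    if slabs.isEmpty || offset + size > slabSize {
      slabs.append(.allocate(byteCount: slabSize, alignment: alignment))
      offset = 0
    }
    let typed = (slabs.last! + offset).bindMemory(to: T.self, capacity: 1)
    typed.initialize(to: value)
    offset += size
    return UnsafePointer(typed)
  }

  deinit { slabs.forEach { $0.deallocate() } }
}

// Usage: only the returned pointer needs to live in the pointer-sized field.
let allocator = BumpAllocator()
let info = allocator.allocate(RegionInfoStandIn(kind: 1, isolatedValue: 42))
print(info.pointee.kind)
```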
Change FieldSensitive's enum representation to allow distinguishing
among enum elements with associated values. Consider
unchecked_take_enum_data_addr to consume all fields other than the one
taken.
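As an illustrative-only source-level sketch (the enum below is hypothetical;
the change itself is in SIL's field-sensitive modeling): the payload-carrying
cases of an enum overlap in storage, so when SIL takes the payload address of
one case, the fields belonging to the other payload cases must be treated as
consumed as well.
```swift
enum Message {
  case text(String)
  case numbers([Int])
}

func takeText(_ message: consuming Message) -> String? {
  switch message {
  case .text(let payload):
    // Taking the String payload corresponds at the SIL level to
    // unchecked_take_enum_data_addr of this case; the storage the `.numbers`
    // payload would occupy in the same value is considered consumed too.
    return payload
  case .numbers:
    return nil
  }
}
```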
rdar://125113258
* Let customBits and lastInitializedBitfieldID share a single uint64_t. This increases the number of available bits in SILNode and Operand from 8 to 20. It also simplifies the Operand class because PointerIntPairs are no longer needed to store the operand pointer fields.
* Make the "deleted" flag a separate bool field in SILNode instead of encoding it in the sign of lastInitializedBitfieldID. Another simplification.
* Enable important invariant checks in release builds as well by using `require` instead of `assert`. Not catching such errors in release builds would be a disaster.
* Let the Swift optimization passes use all the available bits instead of only a fixed number: 8 (SILNode) and 16 (SILBasicBlock).
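Here is a minimal sketch of the packing idea, written in Swift for illustration
only (the real layout lives in the C++ SILNode/Operand classes; the 20-bit
split point and the names are assumptions): the pass-visible custom bits and
the last-initialized bitfield ID occupy disjoint ranges of one 64-bit word.
```swift
struct PackedNodeBits {
  private var storage: UInt64 = 0
  private static let customBitsWidth: UInt64 = 20
  private static let customBitsMask: UInt64 = (1 << customBitsWidth) - 1

  /// Bits available to optimization passes (low 20 bits).
  var customBits: UInt64 {
    get { storage & Self.customBitsMask }
    set { storage = (storage & ~Self.customBitsMask) | (newValue & Self.customBitsMask) }
  }

  /// ID of the bitfield that last initialized `customBits` (remaining high
  /// bits), used to detect stale bits left behind by a previous pass.
  var lastInitializedBitfieldID: UInt64 {
    get { storage >> Self.customBitsWidth }
    set { storage = (storage & Self.customBitsMask) | (newValue << Self.customBitsWidth) }
  }
}
```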
Enable KeyPath/AnyKeyPath/PartialKeyPath/WritableKeyPath in Embedded Swift, but
for compile-time use only:
- Add keypath optimizations into the mandatory optimizations pipeline
- Allow keypath optimizations to look through begin_borrow, to make them work
even in OSSA.
- If a use of a KeyPath doesn't optimize away, diagnose it in PerformanceDiagnostics
- Make UnsafePointer.pointer(to:) transparent to allow the keypath optimization
to happen in the callers of UnsafePointer.pointer(to:).
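As a hypothetical example of the kind of compile-time-only key path use this is
meant to support in Embedded Swift (the types and functions below are made up):
after inlining and the mandatory key path optimizations, the key path literal
should fold into a direct stored-property access, and a diagnostic is only
emitted if it does not.
```swift
struct Telemetry {
  var speed: Float
  var altitude: Float
}

// Generic over a key path; usable in Embedded Swift as long as every call
// site lets the mandatory optimizations specialize and fold the key path away.
@inline(__always)
func read<Value>(_ telemetry: Telemetry, _ path: KeyPath<Telemetry, Value>) -> Value {
  telemetry[keyPath: path]
}

func currentSpeed(_ telemetry: Telemetry) -> Float {
  // Expected to compile down to a direct access of `telemetry.speed`; if a
  // runtime KeyPath object survived, PerformanceDiagnostics would complain.
  read(telemetry, \.speed)
}
```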
An instruction can consume multiple (discontiguous) fields. Use a
SmallBitVector to track the fields consumed by an instruction rather
than a TypeTreeLeafRange.
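To make the representation change concrete, here is a small Swift sketch (a
stand-in for the C++ SmallBitVector; the names are illustrative and a fixed
64-bit mask is used only to keep the sketch short): a bit per leaf field can
record a discontiguous set such as {0, 2, 5}, which a single contiguous range
cannot express.
```swift
struct ConsumedFieldSet {
  private var bits: UInt64 = 0   // a real bit vector would grow as needed

  mutating func consume(field index: Int) { bits |= 1 << UInt64(index) }
  func consumes(field index: Int) -> Bool { bits & (1 << UInt64(index)) != 0 }
}

var consumed = ConsumedFieldSet()
consumed.consume(field: 0)
consumed.consume(field: 2)
consumed.consume(field: 5)          // discontiguous with the earlier fields
print(consumed.consumes(field: 1))  // false: field 1 was never consumed
```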
rdar://125103951
Previously, whenever an instruction was recorded as a final consume, a
new entry was added to finalBlockConsumes. Now, the bits newly consumed by
the instruction are instead added to the preexisting SmallBitVector, if there
is one.
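A minimal Swift sketch of that bookkeeping change (the map and the names are
illustrative, not the actual finalBlockConsumes data structure): new consumed
bits are OR'ed into the entry already recorded for an instruction instead of
appending a second entry for the same instruction.
```swift
final class Instruction {}   // stand-in for a SIL instruction

struct FinalConsumes {
  // One bit set of consumed fields per instruction, rather than one new
  // entry per (instruction, bits) pair.
  private var entries: [ObjectIdentifier: UInt64] = [:]

  mutating func recordConsume(of inst: Instruction, bits: UInt64) {
    // Merge into the preexisting bits, if any; otherwise start from zero.
    entries[ObjectIdentifier(inst), default: 0] |= bits
  }

  func consumedBits(for inst: Instruction) -> UInt64 {
    entries[ObjectIdentifier(inst)] ?? 0
  }
}
```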