Now that the SwiftStdlib 5.9 macro has been defined, this test's failure
in stdlib debug builds is exposed. The failure occurs because the back
deployment fallback
`$ss9TaskLocalC13withValueImpl_9operation4file4lineqd__xn_qd__yYaKXESSSutYaKlFTwB`
is now being called.
In unoptimized stdlib builds, the version of that function that is
deserialized into the test case module contains an unoptimized copy of
the argument `%1` into a temporary (`%10`):
```
%10 = alloc_stack $Value // users: %14, %13, %11
copy_addr %1 to [init] %10 : $*Value // id: %11
// function_ref swift_task_localValuePush
%12 = function_ref @swift_task_localValuePush : $@convention(thin) <τ_0_0> (Builtin.RawPointer, @in τ_0_0) -> () // user: %13
%13 = apply %12<Value>(%9, %10) : $@convention(thin) <τ_0_0> (Builtin.RawPointer, @in τ_0_0) -> ()
dealloc_stack %10 : $*Value // id: %14
```
This is a problem because `swift_task_localValuePush` allocates on the
async stack (see rdar://107275872), but that fact isn't encoded in SIL
(the fix for which is tracked by rdar://108260399), so the
`alloc_stack`/`dealloc_stack` pair surrounding that call results in a
violation of stack discipline:
```
push // alloc_stack
push // swift_task_localValuePush
pop // dealloc_stack -- oops
```
In optimized stdlib builds, the copy has been optimized away by the time
the function is deserialized into the test case module:
```
bb0(%0 : $*R, %1 : $*Value, %2 : @guaranteed $@noescape @async @callee_guaranteed @substituted <τ_0_0> () -> (@out τ_0_0, @error any Error) for <R>, %3 : @guaranteed $String, %4 : $UInt, %5 : @guaranteed $TaskLocal<Value>):
// function_ref _checkIllegalTaskLocalBindingWithinWithTaskGroup(file:line:)
%6 = function_ref @$ss039_checkIllegalTaskLocalBindingWithinWithC5Group4file4lineySS_SutF : $@convention(thin) (@guaranteed String, UInt) -> () // user: %7
%7 = apply %6(%3, %4) : $@convention(thin) (@guaranteed String, UInt) -> ()
// function_ref TaskLocal.key.getter
%8 = function_ref @$ss9TaskLocalC3keyBpvg : $@convention(method) <τ_0_0 where τ_0_0 : Sendable> (@guaranteed TaskLocal<τ_0_0>) -> Builtin.RawPointer // user: %9
%9 = apply %8<Value>(%5) : $@convention(method) <τ_0_0 where τ_0_0 : Sendable> (@guaranteed TaskLocal<τ_0_0>) -> Builtin.RawPointer // user: %11
// function_ref swift_task_localValuePush
%10 = function_ref @swift_task_localValuePush : $@convention(thin) <τ_0_0> (Builtin.RawPointer, @in τ_0_0) -> () // user: %11
%11 = apply %10<Value>(%9, %1) : $@convention(thin) <τ_0_0> (Builtin.RawPointer, @in τ_0_0) -> ()
```
The argument `%1` is forwarded into the apply of `swift_task_localValuePush`.
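At the source level, the difference between the two builds comes down to
parameter conventions. A minimal standalone sketch (with hypothetical
`sink` and `use` functions, not stdlib code):
```
// `sink` consumes its argument, as swift_task_localValuePush's @in
// parameter does; `use` only borrows its argument (the default
// convention, guaranteed in the SIL above).
func sink<T>(_ value: __owned T) {}

func use<T>(_ value: T) {
  // Passing a borrowed value to a consuming parameter requires a copy.
  // Unoptimized builds materialize it as an explicit alloc_stack
  // temporary (the `%10` above); the optimizer instead arranges for an
  // owned value to be forwarded directly, which is why the copy is gone
  // in the optimized SIL.
  sink(value)
}
```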
rdar://112898559
Because `_taskLocalValuePush` and `_taskLocalValuePop` can result in calls
to `swift_task_alloc` and `swift_task_dealloc` respectively, and because
the compiler hasn't been taught about that (e.g. in
`SILInstruction::isAllocatingStack`,
`SILInstruction::isDeallocatingStack`, etc.), calling them (push and pop)
from a function which uses the stack for dynamically sized allocations
can result in violations of stack discipline of the form
```
swift_task_alloc // allocates %ptr_1
copy_value_witness // copies into %ptr_1
swift_task_localValuePush // calls swift_task_alloc and allocates %ptr_2
swift_task_dealloc // deallocates %ptr_1
swift_task_localValuePop // calls swift_task_dealloc and deallocates %ptr_2
```
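The source-level shape that produces such a trace looks roughly like the
following sketch, where `taskLocalPush` and `taskLocalPop` are
hypothetical stand-ins for the runtime entry points:
```
func taskLocalPush<Value>(key: UnsafeRawPointer, value: __owned Value) {
  // calls swift_task_alloc internally (allocates %ptr_2)
}
func taskLocalPop() {
  // calls swift_task_dealloc internally (deallocates %ptr_2)
}

func bind<Value>(_ value: Value, key: UnsafeRawPointer) async {
  // In an unoptimized async build, the copy of `value` demanded by the
  // __owned parameter lives in a swift_task_alloc'd temporary (%ptr_1)
  // that is deallocated as soon as the call returns -- after the push has
  // already allocated %ptr_2, and before the pop deallocates it.
  taskLocalPush(key: key, value: value)
  defer { taskLocalPop() }
}
```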
Avoid the problem by not allocating dynamically sized stack space in the
function which calls `_taskLocalValuePush` and `_taskLocalValuePop`.
Split the calls to those functions out into a separate `withValueImpl`
function which takes its argument `__owned`. Call that function from
`withValue`, ensuring that the necessary copy (to account for the fact
that `withValue` takes its argument `__guaranteed` but
`_taskLocalValuePush` takes its argument `__owned`) and the associated
stack traffic occur in `withValue`.
Still, allow `withValueImpl` to be inlined. The stack nesting will be
preserved across it.
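A minimal, self-contained sketch of that split, with `Local`,
`pushValue`, and `popValue` as hypothetical stand-ins for `TaskLocal` and
the runtime push/pop (the real declarations also thread key, file, and
line arguments):
```
final class Local<Value: Sendable> {
  func pushValue(_ value: __owned Value) { /* swift_task_alloc inside */ }
  func popValue() { /* swift_task_dealloc inside */ }

  // Takes its argument with the default (guaranteed) convention, so the
  // copy needed to satisfy withValueImpl's __owned parameter -- and the
  // associated stack traffic -- happens here, away from the push/pop.
  func withValue<R>(
    _ value: Value, operation: () async throws -> R
  ) async rethrows -> R {
    try await withValueImpl(value, operation: operation)
  }

  // Takes its argument __owned and forwards it into the push, so this
  // function needs no dynamically sized temporary of its own. (In the
  // stdlib it is inlinable; inlining remains safe because stack nesting
  // is preserved across its body.)
  func withValueImpl<R>(
    _ value: __owned Value, operation: () async throws -> R
  ) async rethrows -> R {
    pushValue(value)
    defer { popValue() }
    return try await operation()
  }
}
```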
rdar://107275872