When building a static library with debug information, do not create a
dSYM generation job as it cannot be executed on a non-image target.
This is important for the case where a single invocation performs both the
compile and the link.
This was tracked down in conjunction with @gottesmm.
The frontend supports this via new options -index-unit-output-path and
-index-unit-output-path-filelist that mirror -o and -output-filelist. These are
intended to allow sharing index data across builds in separate directories (so
different -o values) that are otherwise equivalent as far as the index data is
concerned (e.g. an ASAN build and a non-ASAN build) by supplying the same
-index-unit-output-path for both.
This change updates the driver to add these new options to the frontend
invocation:
1) when a new "index-unit-output-path" entry is specified for one or more
input files in the -output-file-map JSON, or
2) when -index-file is specified and the new -index-unit-output-path driver
option is passed.
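For illustration only, here is a minimal sketch of case 1): an output file map
whose entries carry the new "index-unit-output-path" key. All paths and file
names are hypothetical; the matching non-ASAN build would use the same
"index-unit-output-path" values with different "object" paths.

```swift
import Foundation

// Hypothetical output file map for the ASAN build: each input keeps its own
// object path but shares the index unit output path with the non-ASAN build.
let outputFileMap: [String: [String: String]] = [
    "Sources/App/main.swift": [
        "object": "/build/asan/main.o",
        "index-unit-output-path": "/index-units/main.o",
    ],
    "Sources/App/util.swift": [
        "object": "/build/asan/util.o",
        "index-unit-output-path": "/index-units/util.o",
    ],
]

// Serialize the map so it can be handed to the driver via -output-file-map.
let data = try JSONSerialization.data(
    withJSONObject: outputFileMap,
    options: [.prettyPrinted, .sortedKeys]
)
try data.write(to: URL(fileURLWithPath: "/build/asan/output-file-map.json"))
```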
Resolves rdar://problem/74816412
In the legacy driver, these flags will merely be propagated to the
frontends to indicate that they should disable serialization of
incremental information in Swift module files.
In the new driver, these flags control whether the Swift driver performs
an incremental build that is aware of metadata embedded in the module.
Kudos to David for coming up with our new marketing name: Incremental
Imports.
rdar://74363450
Add a new swift-frontend driver option that extracts the APIs in a Swift
module and prints them in JSON format. This is to allow tooling to understand
and process Swift APIs without needing to be a Swift compiler or to
understand the Swift module format or AST.
Unlike the frontend, the driver doesn't account for any VFS overlays set up
by the -vfsoverlay option when processing its input files. It also errors
if any of those input files don't exist on the file system. This makes it
impossible to use a file that only exists in the VFS as input to the driver,
even though the same file would be handled without issue by the frontend.
If the file doesn't exist even after accounting for the VFS, the frontend
emits a missing-file diagnostic, so this change just suppresses the existence
check for inputs when a -vfsoverlay option is present.
Resolves rdar://problem/72485443
We're going to play a dirty, dirty trick - but it'll make our users'
lives better in the end so stick with me here.
In order to build up an incremental compilation, we need two sources of
dependency information:
1) "Priors" - Swiftdeps with dependency information from the past
build(s)
2) "Posteriors" - Swiftdeps with dependencies from after we rebuild the
file or module or whatever
With normal Swift files built in incremental mode, the priors are given by the
swiftdeps files that are generated for each Swift file and usually
placed in the build directory alongside the object files. Because we
have entries in the output file map, we can always know where these
swiftdeps files are. The priors are integrated by the driver and then
the build is scheduled. As the build runs and jobs complete, their
swiftdeps are reloaded and re-integrated. The resulting changes are then
traversed and more jobs are scheduled if necessary. These give us the
posteriors we desire.
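(For illustration, a hypothetical sketch of the kind of entry this relies on,
assuming the usual "swift-dependencies" key; the paths are made up.)

```swift
// One hypothetical output-file-map entry: the driver reads the priors from the
// "swift-dependencies" path before scheduling, and re-reads the same file (now
// holding the posteriors) after this input's job completes.
let entry: [String: [String: String]] = [
    "Sources/App/main.swift": [
        "object": "/build/main.o",
        "swift-dependencies": "/build/main.swiftdeps",
    ],
]
```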
A module flips this on its head. The swiftdeps information serialized
in a module functions as the *posterior* since the driver consuming the
module has no way of knowing how to rebuild the module, and because its
dependencies are, for all intents and purposes, fixed in time. The
missing piece of the puzzle is the priors. That is, we need some way of
knowing what the "past" interface of the module looked like so we can
compare it to the "present" interface. Moreover, we need to always know
where to look for these priors.
We solve this problem by serializing a file alongside the build record:
the "external" build record. This is given by a... creative encoding
of multiple source file dependency graphs into a single source file
dependency graph. The rough structure of this is:
SourceFile => interface <BUILD_RECORD>.external
| - Incremental External Dependency => interface <MODULE_1>.swiftmodule
| | - <dependency> ...
| | - <dependency> ...
| | - <dependency> ...
| - Incremental External Dependency => interface <MODULE_2>.swiftmodule
| | - <dependency> ...
| | - <dependency> ...
| - Incremental External Dependency => interface <MODULE_3>.swiftmodule
| - ...
Sorta like `cat`'ing a bunch of source file dependency graphs together, but
with incremental external dependency nodes acting as glue.
Now for the trick:
We have to unpack this structure and integrate it to get our priors.
This is easy. The tricky bit comes in `integrate` itself. Because the
top-level source file node points directly at the external build record,
not the original Swift modules that defined these dependency nodes, we
swap the key it wants to use (the external build record) for the
incremental external dependency acting as the "parent" of the dependency
node. We do this by following the arc we carefully laid down in the
structure above.
For rdar://69595010
Goes a long way towards rdar://48955139, rdar://64238133
Plumb the logic necessary to schedule merge-modules incrementally. This means that if any input jobs to merge-modules run, merge-modules is run. But if they are all skipped, merge-modules will be skipped as well.
This requires some light special-casing of the legacy driver's incremental job handling because it assumes in a few places that it can always extract a swiftdeps file. This invariant will be further broken when the precompile step for bridging headers is skipped as well.
rdar://65893400
With this option enabled, the dependency scanner gathers all import statements in source files of the main module (non-transitive) and outputs a list of imported modules.
This will be used by build systems and the swift-driver as a way to avoid redundant re-scanning in incremental contexts.
The driver can now schedule jobs which typecheck just-emitted module interfaces to ensure that they can be consumed later. This can be enabled manually by passing `-verify-emitted-module-interface` to the driver.
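For illustration, a minimal sketch (with hypothetical module and file names) of
a library-evolution build that opts into the new verification:

```swift
import Foundation

// Emit MyLib.swiftinterface and ask for an extra job that typechecks the
// just-emitted interface, per -verify-emitted-module-interface above.
let build = Process()
build.executableURL = URL(fileURLWithPath: "/usr/bin/swiftc")  // assumed toolchain location
build.arguments = [
    "Sources/MyLib/MyLib.swift",
    "-module-name", "MyLib",
    "-emit-module",
    "-enable-library-evolution",
    "-emit-module-interface-path", "MyLib.swiftinterface",
    "-verify-emitted-module-interface",
]
try build.run()
build.waitUntilExit()
```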
This commit adds LTO support for handling linker options and LLVM BC
emission. Even for ELF, swift-autolink-extract is unnecessary because
linker options are embedded in the LLVM BC content when LTO is enabled.
- deduplicate the logic to compute the resource folder
- install headers and module files in shared and static resource folders
- forward the -static flag when calling swiftc with -print-target-info
Previously the paths to covered files in the __LLVM_COV / __llvm_covmap
section were absolute. This made remote builds with coverage information
difficult because all machines would have to have the same build root.
This change uses the values for `-coverage-prefix-map` to remap files in
the coverage info to relative paths. These paths work correctly with
llvm-cov when it is run from the same source directory as the
compilation, or from a different directory using the `-path-equivalence`
argument.
This is analogous to this change in Clang: https://reviews.llvm.org/D81122
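For illustration, a hedged sketch of the remote-build workflow described above;
all paths are hypothetical:

```swift
import Foundation

// On the remote machine: compile with coverage instrumentation and strip the
// build root from the embedded file paths so the coverage data is relocatable.
let remoteRoot = "/remote/build/checkout"   // hypothetical remote source root
let compile = Process()
compile.executableURL = URL(fileURLWithPath: "/usr/bin/swiftc")  // assumed toolchain location
compile.arguments = [
    "-profile-generate", "-profile-coverage-mapping",
    "-coverage-prefix-map", "\(remoteRoot)=.",
    "\(remoteRoot)/Sources/main.swift",
    "-o", "main",
]
try compile.run()
compile.waitUntilExit()

// Locally, after producing a profdata file, llvm-cov can resolve the relative
// paths either by running from the source directory or with something like:
//   llvm-cov show ./main -instr-profile=default.profdata -path-equivalence=.,/Users/me/checkout
```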
This commit adds an -lto flag to the driver to enable LTO at the LLVM level.
When -lto=llvm is given, the compiler emits an LLVM bitcode file instead of an
object file and performs thin LTO using the libLTO.dylib plugin.
When -lto=llvm-full is given, full LTO is performed instead of thin LTO.
Implement a new "fast" dependency scanning option,
`-scan-dependencies`, in the Swift frontend that determines all
of the source file and module dependencies for a given set of
Swift sources. It covers four forms of modules:
1) Swift (serialized) module files, by reading the module header
2) Swift interface files, by parsing the source code to find imports
3) Swift source modules, by parsing the source code to find imports
4) Clang modules, using Clang's fast dependency scanning tool
A single `-scan-dependencies` operation maps out the full
dependency graph for the given Swift source files, including all
of the Swift and Clang modules that may need to be built, such
that all of the work can be scheduled up front by the Swift
driver or any other build system that understands this
option. The dependency graph is emitted as JSON, which can be
consumed by these other tools.
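For illustration, a minimal, hedged sketch of how an external tool might invoke
the scanner and consume the JSON, assuming the graph is written to standard
output when no output path is given; paths are hypothetical and no particular
schema is relied on:

```swift
import Foundation

// Ask the compiler to scan dependencies and capture the JSON it emits.
let scan = Process()
scan.executableURL = URL(fileURLWithPath: "/usr/bin/swiftc")  // assumed toolchain location
scan.arguments = ["-scan-dependencies", "Sources/App/main.swift"]

let output = Pipe()
scan.standardOutput = output
try scan.run()

// Read the dependency graph from stdout, then wait for the scan to finish.
let data = output.fileHandleForReading.readDataToEndOfFile()
scan.waitUntilExit()

if let graph = try JSONSerialization.jsonObject(with: data) as? [String: Any] {
    // A real client would walk the Swift and Clang module entries here and
    // schedule all of the discovered builds up front.
    print(graph.keys.sorted())
}
```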