# Spack FAQ, Common Issues, and Best Practices
## FAQ
Q. I used `spack cd [package_spec]` but it's telling me that I need to stage my package. I'm pretty sure it was already downloaded and staged.

A. There's a good chance that you need to provide a more constrained spec. Any spec you use without constraints (i.e., no compiler or architecture specified) will be concretized using defaults, which may not match the package you are currently building, so Spack will not be able to find it.
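For example (the spec values below are illustrative), compare an unconstrained spec with one that pins the version, compiler, and architecture you actually used for the staged build:

```console
# Unconstrained: concretized with defaults, which may not match your staged build
$ spack cd hdf5

# Constrained: matches the spec you actually staged and built
$ spack cd hdf5@1.10.5 %intel arch=cray-cnl6-haswell
```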
Q. I'm getting an `attempted static link of dynamic object` error for one of my builds.

A. Our current default is for packages to disable shared libraries; however, some packages have shared linking hardcoded into the package file. To fix this, add a `shared` variant to the package like so:
variant("shared", default=True, description="Build shared library")
and don't forget to add the appropriate configure or CMake arguments:
```python
def configure_args(self):
    if "+shared" in self.spec:
        return ["--enable-shared"]
    else:
        return ["--disable-shared"]
```
Q. My build failed, is there a log I can check?

A. Spack keeps logs for failed builds in the stage directory, which you can reach with `spack cd [package_spec]`. The two logs are `spack-build.out` and `spack-build.env`, which store the build output and the build environment, respectively. Additionally, logs for successful builds are kept in the hidden `.spack` folder located in the install path.
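A typical workflow (the package spec here is illustrative) looks like:

```console
# Jump to the stage directory of the failed build
$ spack cd hdf5 %intel

# Inspect the build output and the environment it was built with
$ less spack-build.out
$ less spack-build.env
```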
## Common Issues
Lots of packages have `--enable-shared` hardcoded into their `package.py` file. This will probably cause some headaches and slowdowns. Unfortunately, it is up to us to add the `shared` variant to each such package. It is a massive short-term loss but a long-term gain.
A `C compiler cannot make executable` error will occur if you're using the Intel compiler and are cross-compiling for a different architecture. The workaround is to edit the package file for whatever you're trying to install and add `--host=x86_64` to the configure arguments.
Example:
```python
class Foo(AutotoolsPackage):
    # Metadata, variants, dependencies, and patches all appear here
    # ...

    def configure_args(self):
        args = []
        # ... other configure args ...
        if "%intel" in self.spec:
            args.append("--host=x86_64")
        return args
```
or if it's a `Package` class:

```python
class Foo(Package):
    # Metadata ...

    def install(self, spec, prefix):
        config_args = []
        # add config_args depending on what spec is chosen...
        if "%intel" in spec:
            config_args.append("--host=x86_64")
        configure(*config_args)
```

Something along those lines, though your package may be different.
## Best Practices
- Determine whether your build will be dynamic or static. Once you know that, you can either have Spack use the Cray compiler wrappers for any module target, or, if you just want to build a basic package, target the login (aka front-end) nodes with a spec like `spack install abinit target=fe os=fe`. This will build a basic package without the Cray compiler wrappers.
- Document your spec and the steps taken for installation, including any changes to configuration files. Save the spec in the software owner file so that future users can simply copy and paste it.
- Enable shell support so that you can easily use commands like `spack cd [package spec]` (see the sketch after this list for how to enable it).
- You never have to use explicit paths with Spack. Here are some of the other cool things you can do with `spack cd`:
  - `spack cd -m`: Spack python module directory
  - `spack cd -r`: Spack install root
  - `spack cd -i`: install prefix
  - `spack cd -p`: `package.py` location
  - `spack cd -P`: top-level packages directory
  - `spack cd -s`: stage directory
  - `spack cd -b`: build directory
- Large installations are going to be difficult. The best thing to do is to use hashes for dependencies as much as possible. For example, `spack install hdf5+mpi ^/asdf`, where `/asdf` is the hash of `cray-mpich` or some other external dependency (the sketch after this list shows how to find a hash).
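A minimal sketch of the shell-support and dependency-hash workflow mentioned above, assuming a bash-like shell and a Spack checkout at `$SPACK_ROOT`; the package version, hash, and `spack find` output are illustrative and abbreviated:

```console
# Enable shell support (bash/zsh; use setup-env.csh for csh/tcsh)
$ source $SPACK_ROOT/share/spack/setup-env.sh

# List installed/external packages with their hashes (-l shows the short hash)
$ spack find -l cray-mpich
asdfxyz cray-mpich@7.7.0

# Pin the dependency by hash when installing
$ spack install hdf5+mpi ^/asdfxyz
```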
Before installing production software as swowner, consider using your own instance (forked from the Spack repository) to test some builds first. Once you think you have the packages sorted out, create a PR to nersc-spack to have your changes merged into the main branch, then install the package for production.

To re-emphasize: NO DEVELOPMENT WORK ON THE SWOWNER INSTANCE OF SPACK.
## Submitting a Pull Request
We will follow a Git Flow style of workflow.

To submit a pull request, create a new branch named `username/packagename`. Push your changes to that branch and open a PR by clicking the "Create pull request" button. When doing so, be sure to target the NERSC/spack repo and the nersc-spack main branch.
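A minimal sketch of that branch workflow (the username `jdoe` and the package are placeholders):

```console
# Create and switch to a branch named username/packagename
$ git checkout -b jdoe/hdf5

# Commit your package changes and push the branch
$ git add var/spack/repos/builtin/packages/hdf5/package.py
$ git commit -m "hdf5: add shared variant"
$ git push -u origin jdoe/hdf5

# Then open a pull request on GitHub targeting the nersc-spack main branch
```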
## CRAYPE_LINK_TYPE (EXPERIMENTAL)
You can set the `CRAYPE_LINK_TYPE` variable in `packages.yaml`. At the moment the defaults are:
```yaml
packages:
  all:
    variants: ~shared
    craype_link_type: "static"
```
You can use dynamic linking on a per-package basis, so if your package is troublesome and needs shared linking you can add this:
```yaml
packages:
  cmake:
    variants: +shared
    craype_link_type: "shared"
```
You can then refer to that installed package by its hash. Eventually this should be expressible as a spec on the command line. Unfortunately, this setting causes hash changes, which means packages will be rebuilt by Spack.
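For example (the hash is a placeholder and `foo` is a stand-in for your package), once the `+shared` cmake above is installed you can pin it as a dependency by hash:

```console
# Find the hash of the +shared cmake build, then depend on it explicitly
$ spack find -l cmake
$ spack install foo ^/asdf
```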