Telcon: 2025 06 11
Wednesday June 11th, 9am PT (UTC -7:00)
Attendees
- Peter Scheibel (host)
- Tammy Dahlgren
- Brad Richardson
- Diego Menéndez
- Mark Krentel
- Davide Del Vento
- Renjith
Agenda
This meeting is for general Q&A. There are no pre-planned topics (feel free to add one).
- Problems with compiler wrappers when building parallel NetCDF (Alan Sill; see the Slack help thread)
  - Peter: Usual starter questions: what Spack version are you on (`spack debug report`)? Also, what is the `parallel-netcdf` spec it is trying to build (e.g. version, variants)?
    - Current definition of `parallel-netcdf`: https://github.com/spack/spack-packages/blob/develop/repos/spack_repo/builtin/packages/parallel_netcdf/package.py
    - The `e4s` stack is building it: https://github.com/spack/spack-packages/blob/develop/.ci/gitlab/stacks/e4s/spack.yaml
    - From the thread, it looks like this is on Linux
    - It might be that the wrappers can normally succeed, but the wrapper set up in your case will not. I looked at the autoconf check at https://github.com/Parallel-NetCDF/PnetCDF/blame/master/configure.ac and it seems like it has the potential to work as long as the wrapper doesn't automatically do detection/redirection of source code
    - It looks like you are using `openmpi@5:` (at least when you install it with Spack; what is your system version?). The `e4s` stack uses `mpich`. (Answer: the problem exists even when everything is built by Spack, and it persists if the system OpenMPI 4.1.6 is used.)
  - Alan: things to try? I'm building `wrf`
    - Davide: build `wrf` serial
    - Peter: try building against `mpich` (see the sketch at the end of this item)
    - Either way, we agree a constraint should probably be added to `wrf`/`pnetcdf`
    - Tammy: spack stacks building `wrf`:
      - https://github.com/spack/spack-packages/blob/develop/.ci/gitlab/stacks/aws-pcluster-neoverse_v1/spack.yaml
      - https://github.com/spack/spack-packages/blob/develop/.ci/gitlab/stacks/aws-pcluster-x86_64_v4/spack.yaml (oneapi, `wrf@4 build_type=dm+sm`)
      - https://github.com/spack/spack-packages/blob/develop/.ci/gitlab/stacks/e4s-oneapi/spack.yaml
      - https://github.com/spack/spack-packages/blob/develop/.ci/gitlab/stacks/e4s/spack.yaml
  - Alan: this is in relation to the REPACSS project (building a mainstream resource with instrumentation of data centers)
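
  Pending that package constraint, a minimal sketch of Peter's "try building against mpich" suggestion, expressed as a requirement on the `mpi` virtual in the environment's `spack.yaml` (assumes a Spack version that supports requirements on virtuals; untested here):

  ```yaml
  # Force mpich as the MPI provider so parallel-netcdf is not
  # built against openmpi@5
  spack:
    specs:
      - wrf
    packages:
      mpi:
        require: ["mpich"]
  ```
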
- Brad Richardson: 0.22.1 env: discuss the various choices made for the env, the problems it has now, and what it would look like to update it to 1.0
  - Big picture question: how do you build a large set of packages for users?
    - Build a large set of packages
      - with different compilers
      - and expose them to end users, without them needing to know about Spack
      - and be able to reproduce the env
    - Say we want to upgrade the mpich version; then we create an entire reproduction of the env
    - Wanted to manage this with scripts, but the scripts seem to "fall over"
    - "What was set as an option because I wanted it, and what was set because that was the only way it worked?"
  - Davide: ideally, for the things that must be set, you encode them in the `package.py` and upstream it
  - Peter: you can separate this out into two include files (see the sketch below)
    - one is the things you want
    - one is the things you are mixing in to make things "just work"
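
    A minimal sketch of that split; the include file names are hypothetical:

    ```yaml
    # spack.yaml: pull settings from two include files so intent stays visible
    spack:
      include:
        - intentional.yaml    # choices we actually want (hypothetical name)
        - workarounds.yaml    # settings that exist only to make builds succeed (hypothetical name)
      specs:
        - mpich
    ```
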
  - Chat info/links:
    - Diego: https://spack-tutorial.readthedocs.io/en/latest/tutorial_stacks.html#setup-the-compiler
    - Diego: provided a quote from the Stacks tutorial
  - Currently:
    - multi-phase setup (i.e. multiple Spack environments)
      - 1st env builds compilers
      - next env is the base
        - imports just one version of the compiler
        - builds cmake, autotools, qt
    - turned off unify, try to reuse (see the sketch below)
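
    A minimal sketch of the base environment described above (the specs and compiler version are illustrative, not the actual config):

    ```yaml
    # "base" stage spack.yaml: one compiler, unify off, reuse on
    spack:
      specs:
        - cmake
        - qt
      packages:
        all:
          require: ["%gcc@12"]   # hypothetical single compiler version
      concretizer:
        unify: false
        reuse: true
    ```
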
  - Suggestions:
    - Could integrate requirements to control rebuilding of build deps
      - e.g. `packages:cmake:require:%gcc` (see the sketch below)
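
    The same suggestion spelled out as `packages.yaml`-style configuration (a sketch, untested against this stack):

    ```yaml
    # Pin a build dependency like cmake to one compiler so it is not
    # rebuilt for every compiler in the stack
    packages:
      cmake:
        require: ["%gcc"]
    ```
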
  - Brad: example: we added `ruby` to our stack, and a preference from ruby causes `julia` not to build
    - Davide: this is an example of the kind of fix that should be upstreamed via a package PR
  - Brad: there's an issue with incremental updates to environments
    - If I only reconcretize infrequently, then at that point I get many failures
    - Peter: what if we tried a background full concretization with each incremental update?
      - If the incremental one succeeds and the full one fails, that suggests a package change
      - (Note: I think it's the build that's failing, which means you'd have to be trying more things in the background, or maybe keeping the old env around and looking at what it concretized to and successfully built when you get a build failure in the new env)
- Diego: managing a production instance
  - how do I manage it? do I modify it directly?
  - Davide: I have one Spack instance, but users only see built packages when I add a module (see the sketch below)
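
    One way to express that workflow in `modules.yaml` (a sketch; the include list entries are hypothetical examples):

    ```yaml
    # Hide every package by default; a package only becomes visible to
    # users once it is added to the include list
    modules:
      default:
        tcl:
          exclude: ["@:"]      # an anonymous spec that matches everything
          include:
            - openmpi          # hypothetical exposed packages
            - hdf5
    ```
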
- Davide: ping on
  - https://github.com/spack/spack/issues/50628: `spack spec foo ^/hash` gets an internal concretizer error
  - https://github.com/spack/spack/pull/49950