Spack CPU build exago~mpi
#47
base: develop
Conversation
```yaml
# See #39 - minimal build useful for sanity
- exago@develop~mpi~ipopt~hiop~python~raja
# See #16 / #44 - +python~mpi catches mpi4py / python config
- exago@develop~mpi~ipopt~hiop+python~raja
```
The issue in `+mpi~python` should be equivalent to a build with `~mpi~python`, as far as I can tell. Either way, I think this is sufficient, and we need to ensure these builds pass in the future.
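For context, the two specs in the diff above would slot into the workflow's build matrix roughly like this. This is a sketch: the job name and surrounding layout are assumptions, only the two specs and the `SPACK_SPEC` variable appear in the PR itself.

```yaml
# Sketch of the reduced CI matrix (job name and structure are assumed;
# only the two specs come from this PR's diff)
jobs:
  spack_cpu_build:
    strategy:
      matrix:
        spack_spec:
          # See #39 - minimal build useful for sanity
          - exago@develop~mpi~ipopt~hiop~python~raja
          # See #16 / #44 - +python~mpi catches mpi4py / python config
          - exago@develop~mpi~ipopt~hiop+python~raja
```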
```diff
@@ -41,8 +38,10 @@ jobs:
         SPACK_SPEC: ${{ matrix.spack_spec }}
       run: |
         ls && pwd
-        . ./tpl/spack/share/spack/setup-env.sh
+        . /opt/spack/share/spack/setup-env.sh
```
This is more sustainable, and can be swapped out when we want to use ExaGO's spack.

It might just make more sense to have a `package.py` stored in the repo for ExaGO, since that is the main thing that is regularly modified, instead of relying on submodules. It would still be nice to pin a spack version and support that workflow, but making it an optional clone would be best.
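An in-repo `package.py` could be wired into the workflow with `spack repo add`, which registers a local package repository with the container-provided spack. A minimal sketch, assuming a hypothetical `spack-repo/` directory in the ExaGO checkout (the path and step name are assumptions, not part of this PR):

```yaml
# Hypothetical workflow step if ExaGO's package.py lived in-repo.
# Assumes spack-repo/ contains a repo.yaml plus packages/exago/package.py.
- name: Build with in-repo package
  env:
    SPACK_SPEC: ${{ matrix.spack_spec }}
  run: |
    . /opt/spack/share/spack/setup-env.sh
    spack repo add ./spack-repo   # register the local package repository
    spack install $SPACK_SPEC
```

The in-repo recipe would then shadow the builtin `exago` package without needing a spack submodule.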
The spack repo is incredibly active; I would be hesitant to pin a version of spack, because that's just another package we could get hung up on with a "blessed" version, or forget to upgrade that one clone...
To clarify, for 100% reproducible spack builds we need to pin the version in certain situations (such as profiling runs, or key performance runs).
For CI and automated builds, I do think that just using spack's develop like we do with this base image is the best way to go.
Is that in line with what you were thinking?
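If pinning is ever needed for those reproducibility-critical runs, one option is to check spack out at a fixed ref only in those jobs while CI stays on develop. A sketch using `actions/checkout` (the ref shown is hypothetical, not a version anyone in this thread proposed):

```yaml
# Sketch: pin spack only for reproducibility-critical jobs.
# The ref below is a hypothetical example, not a recommended version.
- name: Checkout pinned spack
  uses: actions/checkout@v3
  with:
    repository: spack/spack
    ref: v0.19.0      # hypothetical pinned tag for profiling/performance runs
    path: spack
```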
Marking as draft for now. While this PR would in theory solve all our build issues, the real problem (captured in #48) is that we don't build/test with MPI. For now, this will have to wait for 1.6.0.
This PR will first reproduce the failures (or lack thereof), and then we can add and merge in fixes.
Closes:
- `exago+python~mpi` fails to build #16
- `exago~mpi~python` fails to build #39