FindMPI now not working for Intel MPI #3
The generic MPI names were for profiling, but I want to handle this differently from how it was organised. I re-ordered the names to match the changes in the CMake FindMPI module. The previous FindMPI module wasn't correctly reporting the libraries for Fortran applications, so basic Fortran programs couldn't link successfully. No, I didn't check against Intel MPI; I didn't expect my changes would affect that determination. I will look at fixing this now.
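To make "reporting the libraries for Fortran applications" concrete, here is a minimal consumer sketch. The project and source file names are hypothetical; the per-language variables are the ones the pre-3.10 FindMPI module is expected to populate.

```cmake
# Minimal sketch of a Fortran consumer; mpi_hello and hello.f90 are hypothetical.
cmake_minimum_required(VERSION 3.4)
project(mpi_hello Fortran)

find_package(MPI REQUIRED)

add_executable(mpi_hello hello.f90)
# Pre-3.10 FindMPI reports per-language results in these variables; if
# MPI_Fortran_LIBRARIES is empty or wrong, a basic Fortran program fails to link.
target_include_directories(mpi_hello PRIVATE ${MPI_Fortran_INCLUDE_PATH})
target_link_libraries(mpi_hello ${MPI_Fortran_LIBRARIES})
set_target_properties(mpi_hello PROPERTIES
  COMPILE_FLAGS "${MPI_Fortran_COMPILE_FLAGS}"
  LINK_FLAGS    "${MPI_Fortran_LINK_FLAGS}")
```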
The GENERIC names were not for profiling. They were put in to separate out the names of the MPI compilers so that if you specified a particular MPI it would only look for those ones. This, together with the ordering, was specifically to ensure that if, say, Intel MPI was specified it would find the mpiicc compiler rather than mpicc first, which it would if you also had GNU installed.
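A minimal sketch of the mechanism being described, assuming illustrative variable names and name lists rather than the exact contents of the original FindMPI.cmake: the candidate wrapper names are partitioned per implementation, and the search list is narrowed when a particular MPI is requested.

```cmake
# Illustrative sketch only; variable names and name lists are assumptions.
set(_MPI_C_INTEL_NAMES   mpiicc)
set(_MPI_C_GENERIC_NAMES mpicc mpcc mpicc_r)

if(OPENCMISS_MPI STREQUAL "intel")
  # Only search the Intel wrapper, so a generic mpicc belonging to another
  # MPI (e.g. a GNU/MPICH install on the same machine) cannot be picked up.
  set(_MPI_C_SEARCH_NAMES ${_MPI_C_INTEL_NAMES})
else()
  # No particular MPI requested: search the specific names ahead of the generic ones.
  set(_MPI_C_SEARCH_NAMES ${_MPI_C_INTEL_NAMES} ${_MPI_C_GENERIC_NAMES})
endif()

find_program(MPI_C_COMPILER NAMES ${_MPI_C_SEARCH_NAMES})
```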
The build of OpenCMISS on Pan is now also failing, which is probably related:

"Hi Chris, I ran this test again and it failed in exactly the same place. It looks like it is failing during the build of the PaStiX dependency (build output attached). Do you know if anything has changed in the last couple of weeks that could be related to this failure? I noticed some changes in the manage repository but I don't know enough about the build system to tell if they are related. For reference we are installing using the setup repository (https://github.com/OpenCMISS/setup). Regards, Chris"

On 26/09/17 23:10, Chris Scott wrote:
slurm-configure.txt
I have set up a comparable system and I am looking into this issue (slowly).
Happy to run a test on Pan when it gets to that point, if that will help. For reference, to reproduce on Pan we are loading:
The build on Pan last night was successful, so some recent changes (since last week) must have fixed the problem we were having there.
CMake has added a reworked FindMPI module which I think will help us find the correct MPI for a given toolchain. One aspect of the updated module is that we may no longer have to specify an MPI, because the correct MPI will be found automatically. I have tested the module in a couple of different environments and the results have looked good. We will have to backport it for CMake 3.4. Upgrading to this module is going to have significant repercussions and cause some breakages in the short term while they are fixed up. I think we should upgrade to it and work through the breakages; if anyone disagrees, say so!
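A sketch of how such a backport could be wired in, assuming the reworked module is bundled under a CMakeModules/Backport directory (the directory name and the 3.10 version cut-off are assumptions):

```cmake
# Prefer a bundled copy of the reworked FindMPI on older CMake releases;
# newer releases that already ship it fall through to the stock module.
if(CMAKE_VERSION VERSION_LESS 3.10)
  list(INSERT CMAKE_MODULE_PATH 0 "${CMAKE_CURRENT_LIST_DIR}/CMakeModules/Backport")
endif()

find_package(MPI REQUIRED)
message(STATUS "MPI Fortran wrapper selected: ${MPI_Fortran_COMPILER}")
```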
We don't necessarily want to leave it to find any MPI. We need to be able to specify a particular MPI and have it find that MPI. If OPENCMISS_MPI is not set then it can find any one, but if it is set then it needs to be respected.
What scenarios are you thinking of? If you give them to me I can test them out and determine whether this is still an option to take us forward with finding the correct MPI.
We need to be able to specify a particular MPI to use (hardware support, profiling, etc.). The requirement has always been to have an option to specify a particular MPI via OPENCMISS_MPI and have the build system use that MPI, not just any old MPI it comes across. Thus if I pass -DOPENCMISS_MPI=intel it needs to use the Intel MPI and not, say, MPICH. If I don't specify OPENCMISS_MPI then the build system is free to find any MPI. This is why we have a custom FindMPI.cmake module.
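One hedged sketch of how that requirement could be enforced after detection, whichever FindMPI module is in use: validate the selected wrapper against the requested OPENCMISS_MPI value and fail early on a mismatch. The pattern match below is illustrative, not exhaustive, and only the Intel case is shown.

```cmake
# Illustrative post-detection sanity check; only OPENCMISS_MPI=intel is handled here.
find_package(MPI REQUIRED)

if(OPENCMISS_MPI STREQUAL "intel" AND NOT MPI_Fortran_COMPILER MATCHES "mpiif")
  message(FATAL_ERROR
    "OPENCMISS_MPI=intel was requested but FindMPI selected "
    "'${MPI_Fortran_COMPILER}'; refusing to mix MPI toolchains.")
endif()
```

Invoked with, e.g., `cmake -DOPENCMISS_MPI=intel ..`, a check like this would turn the silent gfortran/MPICH mismatch described below into an immediate configure-time error.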
@hsorby The FindMPI.cmake module was updated 13 days ago with the commit "Improve the MPI finding." The changes included removing the GENERIC MPI names and the ordering. Now the module does not find the Intel MPI first even when OPENCMISS_MPI=intel is specified. This results in a mismatch between the compiler and MPI toolchain, as gfortran with MPICH is picked up. What was the reasoning for these changes, and were they tested with Intel MPI?