Open MPI
We've recently built and installed Open MPI 1.8.3 on Hyades. Intel MPI is the default MPI on Hyades, and the intel_mpi module is loaded by default to set up the environment for it. To use Open MPI instead of Intel MPI, you must swap the intel_mpi module for the open_mpi module:
```
module swap intel_mpi open_mpi
```

To make this permanent, append the above line to your ~/.bashrc if you use Bash, or to your ~/.cshrc if you use C Shell.
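For example, a Bash user could append the swap to the end of ~/.bashrc (a C Shell user would put the same line in ~/.cshrc instead):

```
# Use Open MPI instead of the default Intel MPI in every new shell
module swap intel_mpi open_mpi
```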
Open MPI provides a set of mpicmds that support the compilation and execution of MPI programs over InfiniBand:
MPI Compiler Command | Supported Language(s) | Source File Suffix | Example |
---|---|---|---|
mpicc | C | .c | mpicc [compiler_options] prog.c |
mpiCC / mpicxx / mpic++ | C++ | .C, .cc, .cpp, .cxx | mpiCC [compiler_options] prog.cpp |
mpif77 | Fortran 77 | .f, .for, .ftn | mpif77 [compiler_options] prog.f |
mpif90 | Fortran 90 | .f90, .fpp | mpif90 [compiler_options] prog.f90 |
The mpicmds in the table above are just wrappers around the Intel compilers. They automatically link the Open MPI startup and message-passing libraries into the executables. For example, to compile mpi_hello.c, an MPI "Hello world" program written in C, run
```
mpicc -o mpi_hello.x mpi_hello.c
```
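For reference, here is a minimal sketch of what such a "Hello world" program could look like; the actual contents of mpi_hello.c are not reproduced in this wiki, so treat this as an illustration:

```c
/* mpi_hello.c -- illustrative MPI "Hello world" (not the exact file referenced above) */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size;

    MPI_Init(&argc, &argv);               /* initialize the MPI environment */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank); /* rank of this process */
    MPI_Comm_size(MPI_COMM_WORLD, &size); /* total number of MPI processes */

    printf("Hello world from rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}
```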
Assume that we've successfully compiled the sample MPI program mpi_hello.c as described above, and we want to run the executable mpi_hello.x on 64 cores. First make sure the executable resides in the Lustre scratch storage (in /pfs/$USER or one of its subdirectories); then create a PBS script named ompi.pbs in the same directory, with the following content:
```
#!/bin/bash
#PBS -N ompi
#PBS -l nodes=4:ppn=16
#PBS -l walltime=0:10:00

cd $PBS_O_WORKDIR
mpirun -n 64 ./mpi_hello.x
```
Annotations of ompi.pbs:
- #PBS -N ompi: the job name is ompi
- #PBS -l nodes=4:ppn=16: the job will run on 4 nodes (64 cores) in the default normal queue
- if we want to submit the job to the hyper queue instead, replace #PBS -l nodes=4:ppn=16 with the following 2 lines (a full sketch of this variant follows the list):
  - #PBS -q hyper
  - #PBS -l nodes=2:ppn=32
- cd $PBS_O_WORKDIR: required; PBS starts the script in your home directory on the executing compute node, so this command changes to the directory from which the job was submitted
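Putting the hyper-queue variation together, a modified ompi.pbs might look like the following sketch (same executable and walltime as above; 2 nodes with 32 cores each still give 64 MPI ranks):

```
#!/bin/bash
#PBS -N ompi
#PBS -q hyper
#PBS -l nodes=2:ppn=32
#PBS -l walltime=0:10:00

cd $PBS_O_WORKDIR
mpirun -n 64 ./mpi_hello.x
```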
Submit the job with:

```
qsub ompi.pbs
```
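Once the job is submitted, it can be monitored with the standard Torque commands, for example:

```
$ qstat -u $USER     # list your queued and running jobs
$ qstat -f <job_id>  # full details for one job, using the ID reported by qsub
```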
I performed the following steps to install Open MPI 1.8.3 on Hyades:
```
$ wget http://www.open-mpi.org/software/ompi/v1.8/downloads/openmpi-1.8.3.tar.bz2
$ tar xvfj openmpi-1.8.3.tar.bz2
$ cd openmpi-1.8.3
$ ./configure --prefix=/pfs/sw/mpi/intel/openmpi-1.8.3 \
    --with-verbs --with-tm=/opt/torque \
    CC=icc CXX=icpc F77=ifort FC=ifort
$ make all install
```
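As a quick sanity check after the install (a suggested step, not part of the original procedure), the new installation can be queried under its prefix: ompi_info reports the version and build configuration, and the wrapper compilers can show the underlying Intel compiler command line:

```
$ /pfs/sw/mpi/intel/openmpi-1.8.3/bin/ompi_info | head    # version and build summary
$ /pfs/sw/mpi/intel/openmpi-1.8.3/bin/mpicc -showme       # underlying icc command line
```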