Message Passing Interface (MPI)
Start with a simple MPI program (mpi_hello.c):
#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv) {
    // Initialize the MPI environment
    MPI_Init(NULL, NULL);

    // Get the number of processes
    int world_size;
    MPI_Comm_size(MPI_COMM_WORLD, &world_size);

    // Get the rank of the process
    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    // Get the name of the processor
    char processor_name[MPI_MAX_PROCESSOR_NAME];
    int name_len;
    MPI_Get_processor_name(processor_name, &name_len);

    // Print off a hello world message
    printf("Hello world from processor %s, rank %d out of %d processors\n",
           processor_name, world_rank, world_size);

    // Finalize the MPI environment.
    MPI_Finalize();
    return 0;
}
To compile, load an MPI module and use the mpicc wrapper:
module load mpi/openmpi-x86_64
which mpicc
/usr/lib64/openmpi/bin/mpicc -o mpi_hello mpi_hello.c
To run interactively with 4 processes:
mpirun -n 4 mpi_hello
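Beyond hello world, ranks exchange data with point-to-point calls. A minimal sketch (file name mpi_sendrecv.c is hypothetical) that sends one integer from rank 0 to rank 1:

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);

    int world_rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

    int number;
    if (world_rank == 0) {
        // Rank 0 sends the value 42 to rank 1 with message tag 0
        number = 42;
        MPI_Send(&number, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
    } else if (world_rank == 1) {
        // Rank 1 blocks until the matching message arrives
        MPI_Recv(&number, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                 MPI_STATUS_IGNORE);
        printf("Rank 1 received %d from rank 0\n", number);
    }

    MPI_Finalize();
    return 0;
}
```

Compile it the same way (mpicc -o mpi_sendrecv mpi_sendrecv.c) and run with at least two processes, e.g. mpirun -n 2 mpi_sendrecv.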
Note that there are several MPI modules installed on the system; you can list them with module avail:
module avail
------------------------------------------------------------------------------------ /work/app/modules/modules/all -------------------------------------------------------------------------------------
Anaconda3/5.3.0 LibTIFF/4.0.10-GCCcore-8.3.0 XZ/5.2.4-GCCcore-8.3.0 libjpeg-turbo/2.0.3-GCCcore-8.3.0
Autoconf/2.69-GCCcore-8.3.0 M4/1.4.18-GCCcore-8.3.0 binutils/2.32-GCCcore-8.3.0 libpciaccess/0.14-GCCcore-8.3.0
Automake/1.16.1-GCCcore-8.3.0 M4/1.4.18 (D) binutils/2.32 (D) libpng/1.6.37-GCCcore-8.3.0
Autotools/20180311-GCCcore-8.3.0 MPFR/4.0.2-GCCcore-8.3.0 bzip2/1.0.8-GCCcore-8.3.0 libreadline/8.0-GCCcore-8.3.0
Bison/3.3.2-GCCcore-8.3.0 NASM/2.14.02-GCCcore-8.3.0 cURL/7.66.0-GCCcore-8.3.0 libtool/2.4.6-GCCcore-8.3.0
Bison/3.3.2 (D) Ninja/1.9.0-GCCcore-8.3.0 expat/2.2.7-GCCcore-8.3.0 libxml2/2.9.9-GCCcore-8.3.0
CMake/3.15.3-GCCcore-8.3.0 OpenBLAS/0.3.7-GCC-8.3.0 flex/2.6.4-GCCcore-8.3.0 libyaml/0.2.2-GCCcore-8.3.0
CUDA/10.1.243-GCC-8.3.0 OpenMPI/4.0.1-GCC-8.3.0-2.32 flex/2.6.4 (D) ncurses/6.0
DB/18.1.32-GCCcore-8.3.0 Perl/5.30.0-GCCcore-8.3.0 freetype/2.10.1-GCCcore-8.3.0 ncurses/6.1-GCCcore-8.3.0 (D)
EasyBuild/4.3.2 Pillow-SIMD/6.0.x.post0-GCCcore-8.3.0 gcccuda/2019b numactl/2.0.12-GCCcore-8.3.0
Eigen/3.3.7 Pillow/6.2.1-GCCcore-8.3.0 gettext/0.19.8.1 protobuf/3.10.0-GCCcore-8.3.0
GCC/4.8.1 PyYAML/5.1.2-GCCcore-8.3.0 help2man/1.47.4 pybind11/2.4.3-GCCcore-8.3.0-Python-3.7.4
GCC/8.3.0 Python/2.7.16-GCCcore-8.3.0 help2man/1.47.8-GCCcore-8.3.0 (D) xorg-macros/1.19.2-GCCcore-8.3.0
GCC/8.3.0-2.32 (D) Python/3.7.4-GCCcore-8.3.0 (D) hwloc/2.0.3-GCCcore-8.3.0 zlib/1.2.11-GCCcore-8.3.0
GCCcore/8.3.0 SQLite/3.29.0-GCCcore-8.3.0 hypothesis/4.44.2-GCCcore-8.3.0-Python-3.7.4 zlib/1.2.11 (D)
GMP/6.1.2-GCCcore-8.3.0 Tcl/8.6.9-GCCcore-8.3.0 libffi/3.2.1-GCCcore-8.3.0
------------------------------------------------------------------------------------------- /etc/modulefiles -------------------------------------------------------------------------------------------
mpi/mpich-x86_64 mpi/mpich-3.0-x86_64 mpi/mpich-3.2-x86_64 mpi/openmpi-x86_64 (L) mpi/openmpi3-x86_64 (D)
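The EasyBuild tree above also provides an Open MPI toolchain, so as an alternative to the system module the same program can be built against it (module name taken from the listing above):

```shell
# Start from a clean environment so toolchains do not mix
module purge
# EasyBuild Open MPI from the listing above
module load OpenMPI/4.0.1-GCC-8.3.0-2.32
# Confirm which compiler wrapper is now on the PATH
which mpicc
mpicc -o mpi_hello mpi_hello.c
```

Mixing modules from the two trees in one job can pick up mismatched libraries, so purge before switching.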
Example of a submission script:
#!/bin/bash
#
#SBATCH --qos=cu_hpc
#SBATCH --partition=cpugpu
#SBATCH --job-name=example6
#SBATCH --output=example6_log.txt
#SBATCH --ntasks=28
#SBATCH --ntasks-per-node=4
#SBATCH --mem-per-cpu=1G
#SBATCH --time=00:10:00
module purge
module load mpi/openmpi-x86_64
srun ./mpi_hello
Example of the output:
==========================================
SLURM_JOB_ID = 82971
SLURM_NODELIST = gpu-1-[01-07]
==========================================
Hello world from processor gpu-1-01, rank 0 out of 1 processors
Hello world from processor gpu-1-01, rank 0 out of 1 processors
Hello world from processor gpu-1-01, rank 0 out of 1 processors
Hello world from processor gpu-1-01, rank 0 out of 1 processors
Hello world from processor gpu-1-02, rank 0 out of 1 processors
Hello world from processor gpu-1-03, rank 0 out of 1 processors
Hello world from processor gpu-1-02, rank 0 out of 1 processors
Hello world from processor gpu-1-02, rank 0 out of 1 processors
Hello world from processor gpu-1-02, rank 0 out of 1 processors
Hello world from processor gpu-1-05, rank 0 out of 1 processors
Hello world from processor gpu-1-03, rank 0 out of 1 processors
Hello world from processor gpu-1-03, rank 0 out of 1 processors
Hello world from processor gpu-1-03, rank 0 out of 1 processors
Hello world from processor gpu-1-07, rank 0 out of 1 processors
Hello world from processor gpu-1-05, rank 0 out of 1 processors
Hello world from processor gpu-1-05, rank 0 out of 1 processors
Hello world from processor gpu-1-05, rank 0 out of 1 processors
Hello world from processor gpu-1-04, rank 0 out of 1 processors
Hello world from processor gpu-1-07, rank 0 out of 1 processors
Hello world from processor gpu-1-07, rank 0 out of 1 processors
Hello world from processor gpu-1-07, rank 0 out of 1 processors
Hello world from processor gpu-1-04, rank 0 out of 1 processors
Hello world from processor gpu-1-04, rank 0 out of 1 processors
Hello world from processor gpu-1-04, rank 0 out of 1 processors
Hello world from processor gpu-1-06, rank 0 out of 1 processors
Hello world from processor gpu-1-06, rank 0 out of 1 processors
Hello world from processor gpu-1-06, rank 0 out of 1 processors
Hello world from processor gpu-1-06, rank 0 out of 1 processors
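Notice that every task above reports "rank 0 out of 1 processors": each of the 28 tasks started as an independent MPI singleton, because srun did not pass process-management (PMI) information to Open MPI. With a properly integrated launch the lines would read ranks 0 through 27 out of 28. Two common fixes are sketched below; the exact --mpi values available depend on how Slurm was built on this cluster, so check srun --mpi=list first:

```shell
# Option 1: tell srun which PMI plugin to use
# (run srun --mpi=list to see which plugins this Slurm supports)
srun --mpi=pmix ./mpi_hello    # or --mpi=pmi2, depending on the build

# Option 2: let Open MPI's own launcher read the Slurm allocation
mpirun ./mpi_hello
```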