
How to launch MPI jobs under the Slurm job manager ~~Tightly coupling Slurm and Intel MPI~~

Posted at 2022-02-23

Summary of results

With this feature, the Hydra process manager automatically detects the job scheduler by checking specific environment variables. These variables are used to determine the number of allocated nodes, the nodes themselves, and the number of processes per task.
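
For reference, the sketch below prints some of the Slurm environment variables involved. This is a minimal check assuming a standard Slurm setup; the exact set of variables Hydra consults is not spelled out here, but these are typical ones describing an allocation.

command
# Run inside an sbatch job or an salloc session to see what a launcher
# can learn about the allocation from the environment.
echo "SLURM_JOB_ID            = ${SLURM_JOB_ID:-<not set>}"
echo "SLURM_JOB_NODELIST      = ${SLURM_JOB_NODELIST:-<not set>}"
echo "SLURM_NNODES            = ${SLURM_NNODES:-<not set>}"
echo "SLURM_NTASKS            = ${SLURM_NTASKS:-<not set>}"
echo "SLURM_JOB_CPUS_PER_NODE = ${SLURM_JOB_CPUS_PER_NODE:-<not set>}"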

Starting with the conclusion...

With Intel MPI as of 2022, there are the following ways to put it under Slurm control.
Note, however, that with Option 1 and Option 2 the PMI side is handled by Intel MPI's own Hydra, while Option 3 moves away from Intel MPI's launcher and uses the PMI library you specify.

  • Option 1) Set export I_MPI_PIN_RESPECT_CPUSET=0 inside the run script and use mpirun or mpiexec.
    • From the results of case2-1, 2-2, and 2-3 below
  • Option 2) Set Option 1's export I_MPI_PIN_RESPECT_CPUSET=0 and, as a precaution, export I_MPI_HYDRA_BOOTSTRAP=slurm in the run script.
    • From the results of case4-1, 4-2, and 4-3 below
  • Option 3) Set export I_MPI_PMI_LIBRARY=/lib64/libpmi.so and run with srun.
    • From the results of case6-1 and 6-2 below

Examples of run commands

All of the following yield the same execution result.

How to run outside of Slurm control (the usual way)

org
mpirun -n ${SLURM_NTASKS} -machinefile ${NODEFILE} <mpi app>
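
Here ${NODEFILE} is a plain machinefile with one host name per MPI rank. Under Slurm it can be generated inside the job, as is done in the run script later in this article:

command
# Build a machinefile from the Slurm allocation: one hostname per task,
# sorted so that ranks on the same node are listed together.
NODEFILE="hosts"
srun -n ${SLURM_NTASKS} hostname | sort > ${NODEFILE}
cat ${NODEFILE}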

When running under Slurm control, this can be simplified as follows

Intel MPI automatically configures the core count and the list of machines from Slurm's environment variables. (Option 1)

command
export I_MPI_PIN_RESPECT_CPUSET=0
mpirun <mpi app> # or mpiexec <mpi app>

The following is a way to state explicitly that Slurm is being used

command
export I_MPI_HYDRA_BOOTSTRAP=slurm
mpirun <mpi app> # or mpiexec <mpi app>

The following combines the two settings above just to be safe (Option 2)

command
export I_MPI_PIN_RESPECT_CPUSET=0
export I_MPI_HYDRA_BOOTSTRAP=slurm
mpirun <mpi app> # or mpiexec <mpi app>

The following is the method recommended in the official Slurm documentation (Option 3)

command
export I_MPI_PMI_LIBRARY=/lib64/libpmi.so
srun <mpi app>
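
Note that /lib64/libpmi.so is simply where Slurm's PMI library happens to live on this system; on other clusters it may sit under the Slurm installation prefix instead. It can be located as in the "Checking the PMI library path" section below:

command
# Find Slurm's PMI library; point I_MPI_PMI_LIBRARY at the path found here.
ldconfig -p | grep pmi
ls -l /lib64/ | grep pmi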

Test details

  • Using the code below, compile it with Intel MPI and run the tests.
  • How to read case1
    • This is the baseline run; none of the relevant environment variables are set.
    • I_MPI_HYDRA_BOOTSTRAP=slurm shows up only for mpirun. Even though the host information is given explicitly with -machinefile, the Slurm information is still picked up.
    • In 1-2, 1-3, and 1-4 as well, Intel MPI somehow reads the Slurm information, which differs from the behavior described in the manual.
  • How to read case2
    • I_MPI_HYDRA_BOOTSTRAP=slurm shows up only for mpirun.
    • Everything works as expected.
  • How to read case3
    • I_MPI_HYDRA_BOOTSTRAP=slurm is set in every case.
    • Everything works as expected.
  • How to read case4
    • I_MPI_HYDRA_BOOTSTRAP=slurm is set in every case.
    • Everything works as expected.
  • How to read case5
    • The output shows "PMI server not found", so it is not working correctly.
    • Every process prints "rank 0 out of 1 processors", meaning the processes run independently of each other.
  • How to read case6
    • The output shows "PMI calls are forwarded to /lib64/libpmi.so", so the PMI library is specified correctly.
    • Everything works as expected.

Compiling Hello MPI

Create the following file as mpi_test.c and compile it.

mpi_test.c
#include <mpi.h>
#include <stdio.h>
int main(int argc, char** argv) {
  // Initialize the MPI environment. The two arguments to MPI Init are not
  // currently used by MPI implementations, but are there in case future
  // implementations might need the arguments.
  MPI_Init(NULL, NULL);

  // Get the number of processes
  int world_size;
  MPI_Comm_size(MPI_COMM_WORLD, &world_size);

  // Get the rank of the process
  int world_rank;
  MPI_Comm_rank(MPI_COMM_WORLD, &world_rank);

  // Get the name of the processor
  char processor_name[MPI_MAX_PROCESSOR_NAME];
  int name_len;
  MPI_Get_processor_name(processor_name, &name_len);

  // Print off a hello world message
  printf("Hello world from processor %s, rank %d out of %d processors\n",
         processor_name, world_rank, world_size);

  // Finalize the MPI environment. No more MPI calls can be made after this
  MPI_Finalize();
}

How to compile. Replace the PATH with the one for your environment.

command
source /utils/opt/intel/oneapi/setvars.sh intel64
which mpicc
mpicc mpi_test.c -lm -o _mpi_test
:: initializing oneAPI environment ...
   bash: BASH_VERSION = 4.2.46(2)-release
:: clck -- latest
:: compiler -- latest
:: debugger -- latest
:: dev-utilities -- latest
:: inspector -- latest
:: itac -- latest
:: mpi -- latest
:: tbb -- latest
:: oneAPI environment initialized ::
 
/utils/opt/intel/oneapi/mpi/2021.2.0/bin/mpicc

Checking the PMI library path

ldconfig -p | grep pmi
ls -l  /lib64/ | grep pmi
	librpmio.so.3 (libc6,x86-64) => /lib64/librpmio.so.3
	libpmi2.so.0 (libc6,x86-64) => /lib64/libpmi2.so.0
	libpmi2.so (libc6,x86-64) => /lib64/libpmi2.so
	libpmi.so.0 (libc6,x86-64) => /lib64/libpmi.so.0
	libpmi.so (libc6,x86-64) => /lib64/libpmi.so
	libnetsnmpmibs.so.31 (libc6,x86-64) => /lib64/libnetsnmpmibs.so.31
lrwxrwxrwx   1 root root          24 Jul 22  2021 libnetsnmpmibs.so.31 -> libnetsnmpmibs.so.31.0.2
-rwxr-xr-x   1 root root     1636656 Jan 23  2021 libnetsnmpmibs.so.31.0.2
lrwxrwxrwx   1 root root          16 Jul 22  2021 libpmi2.so -> libpmi2.so.0.0.0
lrwxrwxrwx   1 root root          16 Jul 22  2021 libpmi2.so.0 -> libpmi2.so.0.0.0
-rwxr-xr-x   1 root root      240032 Apr 20  2021 libpmi2.so.0.0.0
lrwxrwxrwx   1 root root          15 Jul 22  2021 libpmi.so -> libpmi.so.0.0.0
lrwxrwxrwx   1 root root          15 Jul 22  2021 libpmi.so.0 -> libpmi.so.0.0.0
-rwxr-xr-x   1 root root      230152 Apr 20  2021 libpmi.so.0.0.0
lrwxrwxrwx   1 root root          17 Jul 22  2021 librpmio.so.3 -> librpmio.so.3.2.2
-rwxr-xr-x   1 root root      178928 Oct  1  2020 librpmio.so.3.2.2

Creating the run script for the Slurm sbatch command

Create the following file as intel_mpi_test.sh.

intel_mpi_test.sh
#!/bin/bash
#SBATCH -p AXXE-L_G3A
#SBATCH --job-name intel_mpi_test
#SBATCH --nodes=4
#SBATCH --ntasks=8
#SBATCH -o %x.%J.out
#SBATCH --comment "zombie workaround"

source /utils/opt/intel/oneapi/setvars.sh intel64

echo "---- machinefile ----"
NODEFILE="hosts"
srun -n ${SLURM_NTASKS} hostname | sort > ${NODEFILE}
cat ${NODEFILE}

cat <<ETX

---- INPUT ENVIRONMENT VARIABLES ----
NTASKS="${SLURM_NTASKS}"
CORES_PER_NODE="${SLURM_JOB_CPUS_PER_NODE}"
SLURM_PROCID="${SLURM_PROCID}"
ETX


export I_MPI_FABRICS=shm:ofi
export FI_PROVIDER=sockets
export I_MPI_DEBUG=10
 
###########################################
echo -e "\n\n---- case1-1 (mpirun) ----\n"
mpirun -n ${SLURM_NTASKS} -machinefile ${NODEFILE} ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case1-2 (mpirun) ----\n"
mpirun ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case1-3 (mpiexec) ----\n"
mpiexec ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case1-4 (mpiexec) ----\n"
mpiexec.hydra ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

###########################################
echo -e "\n\n---- case2-1 (mpirun) ----\n"
#export I_MPI_PROCESS_MANAGER="mpd"
#export I_MPI_PIN_MODE="mpd"
export I_MPI_PIN_RESPECT_CPUSET=0
mpirun -n ${SLURM_NTASKS} ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case2-2 (mpirun) ----\n"
mpirun ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case2-3 (mpiexec) ----\n"
mpiexec ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case2-4 (mpiexec) ----\n"
mpiexec.hydra ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

unset I_MPI_PIN_RESPECT_CPUSET
#unset I_MPI_PROCESS_MANAGER
#unset I_MPI_PIN_MODE

###########################################
echo -e "\n\n---- case3-1 ----\n"
export I_MPI_HYDRA_BOOTSTRAP=slurm
mpirun -n ${SLURM_NTASKS} ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case3-2 ----\n"
mpirun ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case3-3 ----\n"
mpiexec ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case3-4 ----\n"
mpiexec.hydra ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort
unset I_MPI_HYDRA_BOOTSTRAP

###########################################
export I_MPI_PIN_RESPECT_CPUSET=0
export I_MPI_HYDRA_BOOTSTRAP=slurm

echo -e "\n\n---- case4-1 ----\n"
mpirun ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case4-2 ----\n"
mpiexec ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

echo -e "\n\n---- case4-3 ----\n"
mpiexec.hydra ./_mpi_test  | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

unset I_MPI_HYDRA_BOOTSTRAP
unset I_MPI_PIN_RESPECT_CPUSET

###########################################
echo -e "\n\n---- case5 ----\n"
srun -n ${SLURM_NTASKS} ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

###########################################
echo -e "\n\n---- case6-1 ----\n"
export I_MPI_PMI_LIBRARY=/lib64/libpmi.so
srun -n ${SLURM_NTASKS} ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

unset I_MPI_PMI_LIBRARY

echo -e "\n\n---- case6-2 ----\n"
export I_MPI_PMI_LIBRARY=/lib64/libpmi.so
srun ./_mpi_test | grep -e "I_MPI_" -e "Hello" -e "pmi" | sort

unset I_MPI_PMI_LIBRARY
###########################################


Test execution and results
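
The run script was submitted as a batch job. A typical submission and monitoring sequence looks like the following (the log file name %x.%J.out comes from the -o option in the script):

command
# Submit the run script to Slurm and check the job status.
sbatch intel_mpi_test.sh
squeue -u ${USER}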

------------stdout/err---------------
 
:: initializing oneAPI environment ...
   slurm_script: BASH_VERSION = 4.2.46(2)-release
:: clck -- latest
:: compiler -- latest
:: debugger -- latest
:: dev-utilities -- latest
:: inspector -- latest
:: itac -- latest
:: mpi -- latest
:: tbb -- latest
:: oneAPI environment initialized ::
 
---- machinefile ----
kix11-g3-cn001
kix11-g3-cn001
kix11-g3-cn002
kix11-g3-cn002
kix11-g3-cn003
kix11-g3-cn003
kix11-g3-cn004
kix11-g3-cn004

---- INPUT ENVIRONMENT VARIABLES ----
NTASKS="8"
CORES_PER_NODE="40(x4)"
SLURM_PROCID="0"


---- case1-1 (mpirun) ----

[0] MPI startup(): 3       50960    kix11-g3-cn002  {20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39}Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case1-2 (mpirun) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case1-3 (mpiexec) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case1-4 (mpiexec) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-1 (mpirun) ----

[0] Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwlocHello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-2 (mpirun) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-3 (mpiexec) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-4 (mpiexec) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-1 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-2 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-3 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-4 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case4-1 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case4-2 ----

[0] MPI startup(): 1       227815   kix11-g3-cn001  {20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39}Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case4-3 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case5 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn001, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn002, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn002, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn003, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn003, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn004, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn004, rank 0 out of 1 processors
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.


---- case6-1 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_PMI_LIBRARY=/lib64/libpmi.so
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so


---- case6-2 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_PMI_LIBRARY=/lib64/libpmi.so
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so
MPIR_pmi_virtualization(): MPI startup(): PMI calls are forwarded to /lib64/libpmi.so

Appendix: notes from when errors occurred

JOB_ID: 8929
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:00      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:05      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:10      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:15      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:20      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:25      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:30      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:35      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:40      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:45      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:50      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       0:55      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       1:00      4 kix11-g3-cn[001-004]
              8929 AXXE-L_G3 intel_mp xdu-0590  R       1:05      4 kix11-g3-cn[001-004]
------------stdout/err---------------
:: WARNING: setvars.sh has already been run. Skipping re-execution.
   To force a re-execution of setvars.sh, use the '--force' option.
   Using '--force' can result in excessive use of your environment variables.
---- machinefile ----

---- INPUT ENVIRONMENT VARIABLES ----
NTASKS="8"
CORES_PER_NODE="40(x4)"
SLURM_PROCID="0"
kix11-g3-cn001
kix11-g3-cn001
kix11-g3-cn002
kix11-g3-cn002
kix11-g3-cn003
kix11-g3-cn003
kix11-g3-cn004
kix11-g3-cn004


---- case1-1 (mpirun) ----

[0] MPI startup(): 5       132992   kix11-g3-cn003  {20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39}Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case1-2 (mpirun) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case1-3 (mpiexec) ----

[mpiexec@kix11-g3-cn001] check_exit_codes (../../../../../src/pm/i_hydra/libhydra/demux/hydra_demux_poll.c:117): unable to run bstrap_proxy on kix11-g3-cn002 (pid 212527, exit code 65280)
[mpiexec@kix11-g3-cn001] poll_for_event (../../../../../src/pm/i_hydra/libhydra/demux/hydra_demux_poll.c:159): check exit codes error
[mpiexec@kix11-g3-cn001] HYD_dmx_poll_wait_for_proxy_event (../../../../../src/pm/i_hydra/libhydra/demux/hydra_demux_poll.c:212): poll for event error
[mpiexec@kix11-g3-cn001] HYD_bstrap_setup (../../../../../src/pm/i_hydra/libhydra/bstrap/src/intel/i_hydra_bstrap.c:1037): error waiting for event
[mpiexec@kix11-g3-cn001] HYD_print_bstrap_setup_error_message (../../../../../src/pm/i_hydra/mpiexec/intel/i_mpiexec.c:1015): error setting up the bootstrap proxies
[mpiexec@kix11-g3-cn001] Possible reasons:
[mpiexec@kix11-g3-cn001] 1. Host is unavailable. Please check that all hosts are available.
[mpiexec@kix11-g3-cn001] 2. Cannot launch hydra_bstrap_proxy or it crashed on one of the hosts. Make sure hydra_bstrap_proxy is available on all hosts and it has right permissions.
[mpiexec@kix11-g3-cn001] 3. Ssh bootstrap cannot launch processes on remote host. Make sure that passwordless ssh connection is established across compute hosts.
[mpiexec@kix11-g3-cn001]    You may try using -bootstrap option to select alternative launcher.


---- case2-1 (mpirun) ----

[0] MPI startup(): 5       133146   kix11-g3-cn003  {20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39}Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-2 (mpirun) ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case2-3 (mpiexec) ----

[0] MPI startup(): Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_PIN_RESPECT_CPUSET=0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-1 ----

[0] MPI startup(): Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-2 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_MPIRUN=mpirun
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case3-3 ----

[0] MPI startup(): Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_HYDRA_BOOTSTRAP=slurm
[0] MPI startup(): I_MPI_HYDRA_TOPOLIB=hwloc
[0] MPI startup(): I_MPI_INTERNAL_MEM_POLICY=default
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case4 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn001, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn002, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn002, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn003, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn003, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn004, rank 0 out of 1 processors
Hello world from processor kix11-g3-cn004, rank 0 out of 1 processors
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.
MPI startup(): PMI server not found. Please set I_MPI_PMI_LIBRARY variable if it is not a singleton case.


---- case5-1 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_PMI_LIBRARY=/lib64/libpmi.so
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors


---- case5-2 ----

[0] MPI startup(): I_MPI_DEBUG=10
[0] MPI startup(): I_MPI_FABRICS=shm:ofi
[0] MPI startup(): I_MPI_PMI_LIBRARY=/lib64/libpmi.so
[0] MPI startup(): I_MPI_ROOT=/utils/opt/intel/oneapi/mpi/2021.2.0
Hello world from processor kix11-g3-cn001, rank 0 out of 8 processors
Hello world from processor kix11-g3-cn001, rank 1 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 2 out of 8 processors
Hello world from processor kix11-g3-cn002, rank 3 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 4 out of 8 processors
Hello world from processor kix11-g3-cn003, rank 5 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 6 out of 8 processors
Hello world from processor kix11-g3-cn004, rank 7 out of 8 processors
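
For reference, the case1-3 failure above appears to be Hydra falling back to its ssh bootstrap and failing to reach the other nodes. As the error message itself suggests, the bootstrap launcher can also be selected explicitly on the command line, which should have the same effect as setting I_MPI_HYDRA_BOOTSTRAP=slurm:

command
# Explicitly select the Slurm bootstrap launcher instead of ssh.
mpiexec -bootstrap slurm ./_mpi_test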
