[Users] MPI killing issue
Karima Shahzad
karimashahzad at gmail.com
Tue Dec 22 10:35:06 CST 2020
Hello there,
I am running a simple binary black hole simulation from the parameter files in
Cactus/par/arXiv-1111.3344, and I have already simulated the high-resolution case.
Since a lower-resolution run should be less costly than a higher-resolution one, I am
puzzled that the low-resolution run keeps giving me errors (the kernel also dies for
the medium-resolution run). I have tried almost every way of adjusting the par file to
the capacity of my local machine, but without success. The error is below, and the par
file together with the *.err and *.out files it generates is attached.

Note: there is no storage issue at all.

Any help would be highly appreciated.
Warning: Too many threads per process specified: specified
num-threads=2 (ppn-used is 2)
Warning: Total number of threads and number of threads per process are
inconsistent: procs=1, num-threads=2 (procs*num-smt must be an integer
multiple of num-threads)
Warning: Total number of threads and number of cores per node are
inconsistent: procs=1, ppn-used=2 (procs must be an integer multiple
of ppn-used)
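(My reading of these warnings is that procs, num-threads, and ppn-used have to divide
evenly into one another. On my 2-core laptop I would expect a consistent combination to
look roughly like the command below; the simfactory flags here are my guess at what
applies, not copied from my actual setup:

./simfactory/bin/sim create-submit bbhL \
    --parfile=par/arXiv-1111.3344/BBHLowRes.par \
    --procs=2 --num-threads=2 --ppn=2
)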
+ set -e
+ cd /home/karima/simulations/bbhL/output-0000-active
+ echo Checking:
+ pwd
+ hostname
+ date
+ echo Environment:
+ export CACTUS_NUM_PROCS=1
+ export CACTUS_NUM_THREADS=2
+ export GMON_OUT_PREFIX=gmon.out
+ export OMP_NUM_THREADS=2
+ env
+ sort
+ echo Starting:
+ date +%s
+ export CACTUS_STARTTIME=1608653732
+ [ 1 = 1 ]
+ [ 0 -eq 0 ]
+ /home/karima/simulations/bbhL/SIMFACTORY/exe/cactus_sim -L 3
/home/karima/simulations/bbhL/output-0000/BBHLowRes.par
WARNING level 0 from host karima-Latitude-E5470 process 0
in thorn cactus, file BBHLowRes.par:1:
-> ERROR IN PARAMETER FILE:Parse Error
Expected one of the following characters: ':'
CarpetMask::excluded_surface_factor[1] = 1.0
CarpetMask::excluded_surface [2] = 2
CarpetMask::excluded_surface_factor[2] = 1.0
Contents successfully written to
/home/karima/Cactus/repos/simfactory2/etc/defs.local.ini
^
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
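For comparison, below is what I understand the CarpetMask block should look like when
it parses cleanly; the index/value pairs are just the usual pattern from the example
par files, not checked against my own file, but maybe it helps pin down where my
BBHLowRes.par got mangled:

CarpetMask::excluded_surface       [0] = 0
CarpetMask::excluded_surface_factor[0] = 1.0
CarpetMask::excluded_surface       [1] = 1
CarpetMask::excluded_surface_factor[1] = 1.0
CarpetMask::excluded_surface       [2] = 2
CarpetMask::excluded_surface_factor[2] = 1.0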
Thanks in advance.

Best,
Karima S
Attachments:
- BBHLowRes.par: http://lists.einsteintoolkit.org/pipermail/users/attachments/20201222/268449e7/attachment-0003.obj
- bbhL.err: http://lists.einsteintoolkit.org/pipermail/users/attachments/20201222/268449e7/attachment-0004.obj
- bbhL.out: http://lists.einsteintoolkit.org/pipermail/users/attachments/20201222/268449e7/attachment-0005.obj