[Users] Issue with compiling ET on cluster

Roland Haas rhaas at illinois.edu
Tue Nov 15 07:14:35 CST 2022


Hello Shamim Haque,

Not all errors in config.log indicate an actual failure; some just show
up because a configure test fails and Cactus then falls back to a
workaround.

Unfortunately the Intel compilers have both a minimum and a maximum g++
(and STL) version that they can work with. The old Intel 2013 compiler
may well not support g++ 7.0.

While I could not find the corresponding information for Intel 2013
(Intel hides this quite well), there is this Stack Overflow post:

https://stackoverflow.com/questions/29371301/intel-c-compiler-what-is-highest-gcc-version-compatibility#

which reports that the *newest* version of gcc supported by icpc 15.0.1
(released 2014-10-23) is gcc 4.9, so the even older Intel 2013 compiler
is not going to support anything newer.

For example, this release notes document

https://www.cism.ucl.ac.be/Services/Formations/ICS/ics_2013.0.028/composer_xe_2013.1.117/Documentation/en_US/Release_Notes_C_2013_L_EN.pdf

which is for one of the Intel 2013 releases, states:

--8<--
Support for C++11 features in gcc 4.6 and 4.7 headers
--8<--

so quite likely you need a newer Intel compiler than Intel 2013 (which
is, after all, 9 years old by now).
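
By the way, you can also ask the compiler itself which gcc it is set up
to use: the Intel compilers define the usual __GNUC__ macros to match
the g++ headers and STL they pick up. Something along these lines (just
a sketch; load the Intel module and the gcc module you intend to
combine first) should print the gcc version icpc believes it is
compatible with:

--8<--
icpc -E -dM -x c++ /dev/null | grep -E '__GNUC__|__GNUC_MINOR__'
# e.g. "#define __GNUC__ 4" and "#define __GNUC_MINOR__ 9" would mean
# that icpc is using the gcc 4.9 headers and STL
--8<--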

From the 2017 release notes

https://www.intel.com/content/www/us/en/developer/articles/release-notes/intel-c-compiler-170-for-linux-release-notes-for-intel-parallel-studio-xe-2017.html#sysreq

it would seem that the Intel 2017 compiler does support gcc up to version 6:

--8<--
Linux Developer tools component installed, including gcc, g++ and
related tools

    gcc versions 4.3 - 6 supported
--8<--
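
If you do end up with a newer Intel compiler plus a matching gcc
module, the kind of option-list fragment I have in mind is something
along these lines (a sketch only; the paths are placeholders for
wherever the gcc installation lives on your cluster):

--8<--
CC  = icc
CXX = icpc
CFLAGS   = -g -gcc-name=/path/to/gcc-6/bin/gcc
CXXFLAGS = -g -std=c++11 -gxx-name=/path/to/gcc-6/bin/g++
--8<--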

Yours,
Roland

> A small correction to my previous email: the config log file for Part A is
> the one attached here, which is from the compilation attempt with Intel 2013
> + gcc-7.3.0. The one in my previous mail is from the compilation attempt
> with Intel 2013 + gcc-4.9.3.
> 
> However, the errors in the config files look identical.
> 
> Regards
> Shamim Haque
> Senior Research Fellow (SRF)
> Department of Physics
> IISER Bhopal
> 
> On Tue, Nov 15, 2022 at 2:30 AM Shamim Haque 1910511 <shamims at iiserb.ac.in>
> wrote:
> 
> > Hello Roland,
> >
> > Thanks for the comments on the issues. I have attached the config.log file
> > for Part A, and I'll see what can be done about Part B.
> >
> > For Part C, I found a LAPACK library on our HPC, and the compilation
> > completed without that warning. The HelloWorld example also ran
> > correctly.
> >
> > Currently, with this ET, I am facing an issue when executing the gallery
> > BNSM simulation on 2 nodes. The command line was as follows:
> >
> > ./simfactory/bin/sim create-submit nsns_p32_t24_8 --procs=32 --ppn=16
> > --num-threads=1 --ppn-used=16 --num-smt=1 --parfile=par/nsnstohmns1.par
> > --walltime=25:00:00
> >
> > In the out file, the error is:
> >
> > + mpiexec -n 32 -npernode 16
> >   /home2/mallick/simulations2/nsns_p32_t24_11/SIMFACTORY/exe/cactus_sim
> >   -L 3 /home2/mallick/simulations2/nsns_p32_t24_11/output-0000/nsnstohmns1.par
> > --------------------------------------------------------------------------
> > Your job has requested more processes than the ppr for
> > this topology can support:
> >
> >   App: /home2/mallick/simulations2/nsns_p32_t24_11/SIMFACTORY/exe/cactus_sim
> >   Number of procs:  32
> >   PPR: 16:node
> >
> > Please revise the conflict and try again.
> > --------------------------------------------------------------------------
> > Simfactory Done at date: Mon 14 Nov 2022 07:11:57 PM IST
> >
> > The simulations seem to work fine with 1 node, that is, with --procs set
> > to 16, 8, or 4 while keeping --ppn at 16. I assume the mpiexec command in
> > the runscript needs to be updated so that the MPI library knows how to use
> > more than 1 node.
> >
> > I need some help here; I do not know how to proceed at this point. I am
> > attaching the run, ini, cfg, and sub scripts used for this ET, and the
> > output file for reference. Thanks in advance for all the help.
> >
> > Regards
> > Shamim Haque
> > Senior Research Fellow (SRF)
> > Department of Physics
> > IISER Bhopal
> >
> >
> > On Mon, Nov 14, 2022 at 8:22 PM Roland Haas <rhaas at illinois.edu> wrote:
> >
> >> Hello Shamim Haque,
> >>
> >> > *Part A:*
> >> > Keeping the compiler at Intel 2013, I tried to add -gcc_name and
> >> > -gxx_name to CFLAGS and CXXFLAGS, pointing to gcc/7.3.0. The error while
> >> > compiling is the following:
> >> >
> >> > checking for C++ lambda expressions... yes
> >> > checking for C++ range-based for statements... no
> >> > Cactus requires a C++11 compiler -- check your C++ compiler and C++ compiler flags
> >> > Error reconfiguring sim-config
> >> > make: *** [sim-config] Error 2
> >>
> >> This still sounds like a compiler incompatibility. Somehow Cactus does
> >> not detect support for range-based for statements, which, however, Intel
> >> 2013 claims to support (see the link below; search for "range-based", it
> >> is item N2930):
> >>
> >>
> >> https://www.intel.com/content/www/us/en/developer/articles/technical/c0x-features-supported-by-intel-c-compiler.html
> >>
> >> What does the file config.log (the one autoconf points you to for
> >> detailed error messages) contain? It is usually located at something like
> >> configs/sim/config-data/config.log . The options used for CXXFLAGS (if
> >> those are the ones used) look fine to me.
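> >>
> >> To narrow this down, you can also try the failing feature test by hand,
> >> outside of Cactus, with the same compiler and flags. A minimal sketch
> >> (not the exact test that Cactus runs, just the same idea):
> >>
> >> --8<--
> >> cat > conftest.cc <<'EOF'
> >> #include <vector>
> >> int main() {
> >>   std::vector<int> v{1, 2, 3};
> >>   int sum = 0;
> >>   for (int x : v) sum += x;   // range-based for, a C++11 feature
> >>   return sum == 6 ? 0 : 1;
> >> }
> >> EOF
> >> icpc -std=c++11 conftest.cc -o conftest && ./conftest && echo "range-based for works"
> >> --8<--
> >>
> >> If that already fails, the error it prints should be the same one that
> >> is hidden in config.log.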
> >>
> >> > *PART B:*
> >>
> >> > Error: Product support for your (Comp-CL) license has expired.
> >> > License file(s) used were (in this order):
> >> >     1.  Trusted Storage
> >> >     2.
> >>
> >> Yes, this is a license issue. Note that you may still require -gxx-name
> >> options even for newer Intel compilers (they may default to the system
> >> g++ and system STL otherwise).
> >>
> >> > *Part C:*
> >> > I compiled another ET successfully using the modules gcc-7.3.0,
> >> > openmpi-3.1.4, FFTW3/3.3.3, gsl/1.16, openssl/1.1.1a, zlib/1.2.8,
> >> > cmake/3.15.4, libjpeg/1.2.1, HDF5/1.8.10,  openmpi/3.1.4.
> >> >
> >> > But there seems to be a repetitive warning while building ET, which I am
> >> > not sure I should be worried about:
> >> > /usr/bin/ld: warning: libgfortran.so.3, needed by
> >> > /usr/lib64/atlas/liblapack.so, may conflict with libgfortran.so.4
> >>
> >> Basically: /usr/lib64/atlas/liblapack.so (the system ATLAS library) has
> >> been compiled with a version of gfortran much older than the one you
> >> are using. This can fail, in particular when strings are passed to
> >> Fortran code.
> >>
> >> > Please let me know if I should consider changing something to get rid of
> >> > this warning. I have attached the ini file (kanad_et8.ini), cfg file
> >> > (kanad_et8.cfg), and full terminal output (out_et8.txt) for reference.
> >>
> >> You may need to set:
> >>
> >> LAPACK_DIR = BUILD
> >> BLAS_DIR = BUILD
> >>
> >> to force the Einstein Toolkit to build its own (slow, but we do not rely
> >> on BLAS / LAPACK for speed) versions of LAPACK and BLAS (or you can try
> >> using OpenBLAS, which is faster, but as said, the speed of those two is
> >> not really relevant for typical ET simulations).
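> >>
> >> If you do want to try OpenBLAS instead, something along these lines in
> >> the option list should work (again a sketch; the path is a placeholder
> >> for wherever OpenBLAS is installed on your cluster):
> >>
> >> --8<--
> >> BLAS_DIR    = /path/to/openblas
> >> BLAS_LIBS   = openblas
> >> LAPACK_DIR  = /path/to/openblas
> >> LAPACK_LIBS = openblas
> >> --8<--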
> >>
> >> > Now, having got this compiled successfully, should I continue to pursue
> >> > compiling ET with Intel compilers? I am still not sure whether this ET
> >> > (with gcc-7.3) will run into any errors in the future, as I aim to work
> >> > on binary neutron star merger simulations. Please let me know your
> >> > thoughts on this.
> >>
> >> Historically we did see slightly faster code with the Intel compiler. I
> >> suspect that similar speeds can be reached using the GNU compilers by
> >> now, though, if one sets -ffast-math and similar options (which Intel
> >> defaults to) in CFLAGS and CXXFLAGS. (Fortran already allows some of
> >> those optimizations by the rules of the language, so this does not make
> >> quite as much of a difference for Fortran code.)
> >>
> >> See e.g. https://gcc.gnu.org/wiki/FloatingPointMath for an explanation
> >> of the compromises this involves.
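> >>
> >> As a rough starting point, a gcc option-list fragment along these lines
> >> is what I mean (a sketch; adjust to taste and to what your cluster's gcc
> >> supports; -fno-finite-math-only keeps Inf/NaN handling, which things
> >> like NaN checking rely on):
> >>
> >> --8<--
> >> CFLAGS   = -g -O2 -march=native -ffast-math -fno-finite-math-only
> >> CXXFLAGS = -g -O2 -march=native -std=c++11 -ffast-math -fno-finite-math-only
> >> --8<--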
> >>
> >> Yours,
> >> Roland
> >>
> >> --
> >> My email is as private as my paper mail. I therefore support encrypting
> >> and signing email messages. Get my PGP key from http://pgp.mit.edu .
> >>
> >

-- 
My email is as private as my paper mail. I therefore support encrypting
and signing email messages. Get my PGP key from http://pgp.mit.edu .

