[Users] mpi.h

Joel Giedt giedtj at rpi.edu
Wed Mar 21 09:56:17 CDT 2018


Hi, my personal experience with MPI is that if you use the MPI wrapper version of the compiler (mpicc, mpicxx, etc.), it automatically finds the related header files.  Could it be something as simple as setting CC or CXX to the correct wrapper?  -J
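
For example, in your option list you could try something like the following (a hypothetical sketch; the exact wrapper names and paths depend on how OpenMPI was installed on your cluster):

CC  = /share/apps/openmpi-1.6.5/bin/mpicc
CXX = /share/apps/openmpi-1.6.5/bin/mpicxx
F90 = /share/apps/openmpi-1.6.5/bin/mpif90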

-----Original Message-----
From: users-bounces at einsteintoolkit.org [mailto:users-bounces at einsteintoolkit.org] On Behalf Of Roland Haas
Sent: Wednesday, March 21, 2018 10:53 AM
To: David Garrison
Cc: Users at einsteintoolkit.org
Subject: Re: [Users] mpi.h

Hello DG,

Hmm, that all looks fine to me. Judging from the content of configs/sim/bindings/Configuration/Capabilities/make.MPI.defn, the build is using a system-installed MPI (openmpi-1.6.5 in /share/apps), which makes sense given that you pass MPI_DIR=/share/apps/openmpi-1.6.5 in your config-info (which is basically a copy of your option list). So MPI_DIR=BUILD was not used in the end (through an oversight or something else, I guess).
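
A quick way to double-check which MPI options were actually recorded is, e.g.,

grep MPI configs/sim/config-info

(adjust the configuration name if yours is not called "sim").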

Since you are passing an explicit

MPI_INC_DIRS=/share/apps/openmpi-1.6.5/include

I would definitely check myself whether there is indeed a file

/share/apps/openmpi-1.6.5/include/mpi.h

and that this file is readable by your user account (e.g. does less /share/apps/openmpi-1.6.5/include/mpi.h work?).

This is just in case /share/apps/openmpi-1.6.5 was, e.g., retired by the sysadmins.
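
In other words, a quick sanity check from the shell would be:

ls -l /share/apps/openmpi-1.6.5/include/mpi.h
less /share/apps/openmpi-1.6.5/include/mpi.h

If either of these fails, then the compiler cannot find mpi.h there either.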

Yours,
Roland

> Hello,
> 
> Please see below.  I’m using UH’s Maxwell cluster and don’t use simfactory.
> 
> Thanks,
> 
> -DG
> 
> > On Mar 20, 2018, at 11:47 AM, Roland Haas <roland.haas at physics.gatech.edu> wrote:
> > 
> > Hello DG,
> > 
> > it could be that something failed while MPI was being compiled. Would you mind attaching the following files from inside your Cactus tree, please?
> > 
> > configs/sim/bindings/Configuration/Capabilities/make.MPI.defn
> 
> INC_DIRS +=  $(MPI_INC_DIRS)
> INC_DIRS_F +=  $(MPI_LIB_DIRS)
> MPI_BUILD       = 
> MPI_INSTALL_DIR = 
> HWLOC_DIR       = 
> CCTK_MPI     = 1
> MPI_DIR      = /share/apps/openmpi-1.6.5
> MPI_INC_DIRS = /share/apps/openmpi-1.6.5/include
> MPI_LIB_DIRS = /share/apps/openmpi-1.6.5/lib
> MPI_LIBS     = mpi mpi_cxx  open-rte open-pal dl nsl util
> HAVE_CAPABILITY_MPI = 1
> 
> > configs/sim/bindings/Configuration/Capabilities/make.HDF5.defn
> 
> include /home/dgarriso/Cactus/configs/fixedmhd/bindings/Configuration/Capabilities/make.MPI.defn
> include /home/dgarriso/Cactus/configs/fixedmhd/bindings/Configuration/Capabilities/make.ZLIB.defn
> INC_DIRS +=  $(HDF5_INC_DIRS)
> INC_DIRS_F +=  $(HDF5_INC_DIRS)
> HDF5_BUILD          = 
> HDF5_ENABLE_CXX     = no
> HDF5_ENABLE_FORTRAN = yes
> LIBSZ_DIR           = 
> LIBZ_DIR            = 
> HDF5_INSTALL_DIR    = 
> HDF5_DIR            = /share/apps/hdf5-1.8.12
> HDF5_ENABLE_CXX     = no
> HDF5_ENABLE_FORTRAN = yes
> HDF5_INC_DIRS       = /share/apps/hdf5-1.8.12/lib /share/apps/hdf5-1.8.12/include  /share/apps/openmpi-1.6.5/include 
> HDF5_LIB_DIRS       = /share/apps/hdf5-1.8.12/lib   /share/apps/openmpi-1.6.5/lib 
> HDF5_LIBS           =  hdf5hl_fortran hdf5_fortran hdf5_hl hdf5 z mpi mpi_cxx  open-rte open-pal dl nsl util m z
> HAVE_CAPABILITY_HDF5 = 1
> > 
> > configs/sim/bindings/Configuration/Capabilities/make.HDF5.deps
> 
> include /home/dgarriso/Cactus/configs/fixedmhd/bindings/Configuration/Capabilities/make.MPI.deps
> include /home/dgarriso/Cactus/configs/fixedmhd/bindings/Configuration/Capabilities/make.ZLIB.deps
> 
> > 
> > as well as
> > 
> > configs/sim/config-info (as well as the option list that you used)?  
> 
> # CONFIGURATION  : fixedmhd
> # CONFIG-DATE    : Mon Mar 19 01:07:33 2018 (GMT)
> # CONFIG-HOST    : cusco.hpcc.uh.edu
> # CONFIG-STATUS  : 0
> # CONFIG-OPTIONS :
> CC=icc
> CPPFLAGS=-DCCTK_DISABLE_OMP_COLLAPSE
> CXX=icpc
> CXX_OPTIMISE_FLAGS=-Os -ip -pthread -no-prec-div
> C_OPTIMISE_FLAGS=-Os -ip -pthread -no-prec-div
> F77=ifort
> F77_OPTIMISE_FLAGS=-Os -ip -no-prec-div
> F90=ifort
> F90_OPTIMISE_FLAGS=-Os -ip -no-prec-div
> FFLAGS=-O3 -fast -Mipa=fast,inline -tp=x64
> FFTW=yes
> FFTW_DIR=/home/dgarriso/fftw3
> FFTW_LIBS=fftw3_threads fftw3_mpi $(MPI_LIBS) fftw3 m 
> FPPFLAGS=-DCCTK_DISABLE_OMP_COLLAPSE
> HDF5_DIR=/share/apps/hdf5-1.8.12
> LAPACK=yes
> LAPACK_DIR=/home/dgarriso/ifort64/lib
> LAPACK_LIBS=acml ifcoremt_pic imf irc svml
> MPI_DIR=/share/apps/openmpi-1.6.5
> MPI_INC_DIRS=/share/apps/openmpi-1.6.5/include
> MPI_LIBS=mpi mpi_cxx  open-rte open-pal dl nsl util 
> MPI_LIB_DIRS=/share/apps/openmpi-1.6.5/lib
> OPTIMISE=yes
> PTHREADS=no
> VERBOSE=yes
> 
> > 
> > If possible also the full output of:
> > 
> > export VERBOSE=yes
> > simfactory/bin/sim build 2>&1 | tee make.log
> > 
> > or
> > 
> > export VERBOSE=yes
> > make sim 2>&1 | tee make.log
> > 
> > (depending on whether you used simfactory or not), please?
> > 
> > Yours,
> > Roland
> >   
> >> Hello,
> >> 
> >> Every time I try to compile my code I get the following error.  
> >> 
> >> COMPILING arrangements/CactusPUGHIO/IOHDF5Util/src/Startup.c
> >> /share/apps/hdf5-1.8.12/include/H5public.h(61): catastrophic error: cannot open source file "mpi.h"
> >>  #   include <mpi.h>
> >> 
> >> This is new as of the most recent versions of Cactus.  Previous compilations of the same code on the same cluster did not result in this error.  Any help is appreciated.
> >> 
> >> I used the config option: MPI_DIR = BUILD
> >> 
> >> -DG
> >> 
> >> Sent from my iPhone
> >> _______________________________________________
> >> Users mailing list
> >> Users at einsteintoolkit.org
> >> http://lists.einsteintoolkit.org/mailman/listinfo/users
> > 
> > 
> > Yours,
> > Roland
> > 
> > 
> > --
> > My email is as private as my paper mail. I therefore support 
> > encrypting and signing email messages. Get my PGP key from http://pgp.mit.edu .
> 



--
My email is as private as my paper mail. I therefore support encrypting and signing email messages. Get my PGP key from http://pgp.mit.edu .


