[Users] Error in exe/cactus_sim
Roland Haas
rhaas at illinois.edu
Mon Apr 27 07:58:14 CDT 2020
Hello Rishank Diwan,
glad to have been able to help.
Yours,
Roland
> Hello Roland and Ian,
>
> I removed mpich, mpich-doc, libmpich-dev, libmpich12 and the other
> dependencies as you suggested, and generated an updated version of the
> "mpi" file I sent you earlier.
>
> ii  compiz                        1:0.9.13.1+18.04.20180302-0ubuntu1  all    OpenGL window and compositing manager
> ii  compiz-core                   1:0.9.13.1+18.04.20180302-0ubuntu1  amd64  OpenGL window and compositing manager
> ii  compiz-gnome                  1:0.9.13.1+18.04.20180302-0ubuntu1  amd64  OpenGL window and compositing manager - GNOME window decorator
> ii  compiz-plugins-default:amd64  1:0.9.13.1+18.04.20180302-0ubuntu1  amd64  OpenGL window and compositing manager - default plugins
> ii  libcompizconfig0:amd64        1:0.9.13.1+18.04.20180302-0ubuntu1  amd64  Settings library for plugins - OpenCompositing Project
> ii  libdecoration0:amd64          1:0.9.13.1+18.04.20180302-0ubuntu1  amd64  Compiz window decoration library
> ii  libexempi3:amd64              2.4.5-2                             amd64  library to parse XMP metadata (Library)
> ii  libhdf5-openmpi-100:amd64     1.10.0-patch1+docs-4                amd64  Hierarchical Data Format 5 (HDF5) - runtime files - OpenMPI version
> ii  libhdf5-openmpi-dev           1.10.0-patch1+docs-4                amd64  Hierarchical Data Format 5 (HDF5) - development files - OpenMPI version
> ii  libopenmpi-dev                2.1.1-8                             amd64  high performance message passing library -- header files
> ii  libopenmpi2:amd64             2.1.1-8                             amd64  high performance message passing library -- shared library
> ii  make                          4.1-9.1ubuntu1                      amd64  utility for directing compilation
> ii  openmpi-bin                   2.1.1-8                             amd64  high performance message passing library -- binaries
> ii  openmpi-common                2.1.1-8                             all    high performance message passing library -- common files
>
> I don't see any mpich dependencies now. I also tried
> installing the Einstein Toolkit again, and it is now working fine.
>
> Thank you very much for your help.
>
> Yours Sincerely,
> Rishank Diwan
>
>
> On Sun, 26 Apr 2020 at 21:24, Roland Haas <rhaas at illinois.edu> wrote:
>
> > Hello Rishank Diwan,
> >
> > looking at your file "mpi", there are still bits and pieces of mpich
> > installed. Namely:
> >
> > ii  mpich-doc         3.3~a2-4  all    Documentation for MPICH
> > ii  mpich             3.3~a2-4  amd64  Implementation of the MPI Message Passing Interface standard
> > ii  libmpich-dev      3.3~a2-4  amd64  Development files for MPICH
> > ii  libmpich12:amd64  3.3~a2-4  amd64  Shared libraries for MPICH
> >
> > as well as a lot of MPICH-using libraries:
> >
> > ii  gromacs-mpich                 2018.1-1              amd64  Molecular dynamics sim, binaries for MPICH parallelization
> > ii  libhdf5-mpich-100:amd64       1.10.0-patch1+docs-4  amd64  Hierarchical Data Format 5 (HDF5) - runtime files - MPICH2 version
> > ii  libhdf5-mpich-dev             1.10.0-patch1+docs-4  amd64  Hierarchical Data Format 5 (HDF5) - development files - MPICH version
> > ii  libmeep-mpich2-8              1.3-4build3           amd64  library for using parallel (OpenMPI) version of meep
> > ii  libmeep-mpich2-dev            1.3-4build3           amd64  development library for using parallel (OpenMPI) version of meep
> > ii  libmpich12:amd64              3.3~a2-4              amd64  Shared libraries for MPICH
> > ii  libscalapack-mpich-dev        2.0.2-4               amd64  Scalable Linear Algebra Package - Dev. files for MPICH
> > ii  libscalapack-mpich2.0         2.0.2-4               amd64  Scalable Linear Algebra Package - Shared libs. for MPICH
> > ii  libtachyon-mpich-0:amd64      0.99~b6+dsx-8         amd64  Parallel/Multiprocessor Ray Tracing Library - runtime - MPICH flavour
> > ii  libtachyon-mpich-0-dev:amd64  0.99~b6+dsx-8         amd64  Parallel/Multiprocessor Ray Tracing Library - development - MPICH flavour
> > ii  meep-mpich2                   1.3-4build3           amd64  software package for FDTD simulation, parallel (OpenMPI) version
> > ii  netpipe-mpich2                3.7.2-7.4build2       amd64  Network performance tool using MPICH2 MPI
> > ii  yorick-mpy-mpich2             2.2.04+dfsg1-9        amd64  Message Passing Yorick (MPICH2 build)
> >
> > You will have to remove *all* the mpich pieces, namely mpich, mpich-doc,
> > libmpich-dev and libmpich12, to be sure to avoid any strange interference.
> > Note that this will also remove the MPICH-using libraries listed above.
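> >
> > As an untested sketch (assuming Ubuntu's apt; the package names are the
> > ones from the list above), something along these lines should remove them
> > in one go:
> >
> >   sudo apt-get remove --purge mpich mpich-doc libmpich-dev libmpich12
> >   sudo apt-get autoremove
> >
> > apt will list the MPICH-using packages it is going to remove as well and
> > ask for confirmation before actually removing anything.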
> >
> > Yours,
> > Roland
> >
> > > Hello Roland and Ian,
> > >
> > > I am attaching the build.log file. I have also tried running the
> > > simulation after removing MPICH. Along with this I am also attaching the
> > > file obtained from the "dpkg --list | grep -vi compile | grep -i mpi"
> > > command, to compare with the previous one. I also made changes to PATH
> > > and LD_LIBRARY_PATH. The current paths are as follows:
> > >
> > > LD_LIBRARY_PATH=:/usr/lib/x86_64-linux-gnu/openmpi/lib/
> > >
> > >
> > > PATH=/home/rishank/miniconda2/bin:/home/rishank/anaconda2/bin:/home/rishank/anaconda2/condabin:/home/rishank/bin:/home/rishank/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/bin
> > >
> > > and the previous ones were:
> > >
> > > LD_LIBRARY_PATH=:/home/rishank/.openmpi/lib/:/home/rishank/.openmpi/lib/
> > >
> > >
> > > PATH=/home/rishank/miniconda2/bin:/home/rishank/anaconda2/bin:/home/rishank/anaconda2/condabin:/home/rishank/bin:/home/rishank/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/rishank/.openmpi/bin:/home/rishank/.openmpi/bin
> > >
> > > Yours Sincerely,
> > > Rishank Diwan
> > >
> > >
> > >
> > >
> > > On Thu, 23 Apr 2020 at 00:19, Roland Haas <rhaas at illinois.edu> wrote:
> > >
> > > > Hello Rishank,
> > > >
> > > > > It's possible that you might have to uninstall mpich, but I'm not
> > > > > sure. There *shouldn't* be a fundamental reason why you can't have
> > > > > both installed and just use one of them, but there might be technical
> > > > > reasons why it doesn't work.
> > > > It is possible to have both installed at the same time. However, things
> > > > then become fragile. Ubuntu will declare one of them the "default" using
> > > > the "alternatives" system (which is why eg /usr/bin/mpirun and
> > > > /usr/bin/mpicc point to files in /etc/alternatives/). This however does
> > > > not cover things like eg hdf5, which comes in flavor packages
> > > > hdf5-openmpi, hdf5-mpich and hdf5-serial, each of which provides its own
> > > > library libhdf5_{openmpi,mpich,serial}.so. Then there is always the
> > > > possibility of something being wrong in the package setup.
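> > > >
> > > > As a quick (untested) check of which stack the alternatives system
> > > > currently points at, something like
> > > >
> > > >   update-alternatives --display mpi
> > > >   update-alternatives --display mpirun
> > > >
> > > > should show whether the mpicc and mpirun symlinks resolve to the
> > > > OpenMPI or the MPICH binaries.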
> > > >
> > > > Thus having multiple MPI stacks installed at the same time, while
> > > > possible, is outside of the realm of cases easily supported by our
> > > > automated setup and would likely require a hand-crafted option list
> > > > similar to the ones used on clusters. Eg setting
> > > >
> > > > MPI_DIR = /usr/lib/x86_64-linux-gnu/openmpi
> > > > MPI_INC_DIRS = /usr/lib/x86_64-linux-gnu/openmpi/include
> > > > MPI_LIB_DIRS = /usr/lib/x86_64-linux-gnu/openmpi/lib
> > > > MPI_LIBS = mpi
> > > >
> > > > *may* work (though see my note above about libraries that can use MPI).
> > > > Note that this still assumes that the mpirun found in $PATH is the one
> > > > that matches the MPI stack being used, ie OpenMPI's, so it may require
> > > > changes to the run script as well.
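> > > >
> > > > A quick (untested) sanity check for this is
> > > >
> > > >   which mpirun && mpirun --version
> > > >
> > > > which should report an "Open MPI" version string rather than an
> > > > MPICH/HYDRA one if the option list above is to be used unchanged.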
> > > >
> > > > Thus the simplest solution is likely to make sure only one MPI stack
> > > > (OpenMPI or MPICH) is installed and uninstall the other one.
> > > >
> > > > Yours,
> > > > Roland
> > > >
> > > > --
> > > > My email is as private as my paper mail. I therefore support encrypting
> > > > and signing email messages. Get my PGP key from http://pgp.mit.edu .
> > > >
> >
> >
> > --
> > My email is as private as my paper mail. I therefore support encrypting
> > and signing email messages. Get my PGP key from http://pgp.mit.edu .
> >
--
My email is as private as my paper mail. I therefore support encrypting
and signing email messages. Get my PGP key from http://pgp.mit.edu .