<div dir="ltr">Hello Roland and Ian,<div><br></div><div>I removed the mpich-doc mpich mpich-doc libmpich-dev libmpich12 and other dependencies as you suggested and got the updated version of mpi file I sent you.</div><div><br></div><div>ii compiz 1:0.9.13.1+18.04.20180302-0ubuntu1 all OpenGL window and compositing manager<br>ii compiz-core 1:0.9.13.1+18.04.20180302-0ubuntu1 amd64 OpenGL window and compositing manager<br>ii compiz-gnome 1:0.9.13.1+18.04.20180302-0ubuntu1 amd64 OpenGL window and compositing manager - GNOME window decorator<br>ii compiz-plugins-default:amd64 1:0.9.13.1+18.04.20180302-0ubuntu1 amd64 OpenGL window and compositing manager - default plugins<br>ii libcompizconfig0:amd64 1:0.9.13.1+18.04.20180302-0ubuntu1 amd64 Settings library for plugins - OpenCompositing Project<br>ii libdecoration0:amd64 1:0.9.13.1+18.04.20180302-0ubuntu1 amd64 Compiz window decoration library<br>ii libexempi3:amd64 2.4.5-2 amd64 library to parse XMP metadata (Library)<br>ii libhdf5-openmpi-100:amd64 1.10.0-patch1+docs-4 amd64 Hierarchical Data Format 5 (HDF5) - runtime files - OpenMPI version<br>ii libhdf5-openmpi-dev 1.10.0-patch1+docs-4 amd64 Hierarchical Data Format 5 (HDF5) - development files - OpenMPI version<br>ii libopenmpi-dev 2.1.1-8 amd64 high performance message passing library -- header files<br>ii libopenmpi2:amd64 2.1.1-8 amd64 high performance message passing library -- shared library<br>ii make 4.1-9.1ubuntu1 amd64 utility for directing compilation<br>ii openmpi-bin 2.1.1-8 amd64 high performance message passing library -- binaries<br>ii openmpi-common 2.1.1-8 all high performance message passing library -- common files<br></div><div><br></div><div>I don't see any mpich dependencies now. I also tried installing einsteintoolkit again and it happens to be working fine.</div><div><br></div><div>Thank you very much for your help.</div><div><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div dir="ltr"><div style="color:rgb(136,136,136);font-family:arial"><font color="#444444"><br></font></div><div style="color:rgb(136,136,136);font-family:arial"><font color="#444444">Yours Sincerely,</font></div><div style="color:rgb(136,136,136);font-family:arial"><font color="#444444">Rishank Diwan</font></div><div style="color:rgb(136,136,136);font-family:arial"><br></div></div></div></div></div></div></div></div></div></div></div></div></div></div><img src="https://my-email-signature.link/signature.gif?u=544763&e=94523375&v=04a6aa07517f9a510132ee66b4b1c8fa08ff84806b42fe4d44b25e2880ffb6bb" style="width:2px;max-height:0;overflow:hidden"><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, 26 Apr 2020 at 21:24, Roland Haas <<a href="mailto:rhaas@illinois.edu">rhaas@illinois.edu</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hello Rishank Diwan,<br>
<br>
looking at your file "mpi", there are still bits and pieces of mpich<br>
installed, namely:<br>
<br>
ii mpich-doc 3.3~a2-4 all Documentation for MPICH<br>
ii mpich 3.3~a2-4 amd64 Implementation of the MPI Message Passing Interface standard<br>
ii libmpich-dev 3.3~a2-4 amd64 Development files for MPICH<br>
ii libmpich12:amd64 3.3~a2-4 amd64 Shared libraries for MPICH<br>
<br>
as well as a lot of libraries that use MPICH:<br>
<br>
ii gromacs-mpich 2018.1-1 amd64 Molecular dynamics sim, binaries for MPICH parallelization<br>
ii libhdf5-mpich-100:amd64 1.10.0-patch1+docs-4 amd64 Hierarchical Data Format 5 (HDF5) - runtime files - MPICH2 version<br>
ii libhdf5-mpich-dev 1.10.0-patch1+docs-4 amd64 Hierarchical Data Format 5 (HDF5) - development files - MPICH version<br>
ii libmeep-mpich2-8 1.3-4build3 amd64 library for using parallel (OpenMPI) version of meep<br>
ii libmeep-mpich2-dev 1.3-4build3 amd64 development library for using parallel (OpenMPI) version of meep<br>
ii libscalapack-mpich-dev 2.0.2-4 amd64 Scalable Linear Algebra Package - Dev. files for MPICH<br>
ii libscalapack-mpich2.0 2.0.2-4 amd64 Scalable Linear Algebra Package - Shared libs. for MPICH<br>
ii libtachyon-mpich-0:amd64 0.99~b6+dsx-8 amd64 Parallel/Multiprocessor Ray Tracing Library - runtime - MPICH flavour<br>
ii libtachyon-mpich-0-dev:amd64 0.99~b6+dsx-8 amd64 Parallel/Multiprocessor Ray Tracing Library - development - MPICH flavour<br>
ii meep-mpich2 1.3-4build3 amd64 software package for FDTD simulation, parallel (OpenMPI) version<br>
ii netpipe-mpich2 3.7.2-7.4build2 amd64 Network performance tool using MPICH2 MPI<br>
ii yorick-mpy-mpich2 2.2.04+dfsg1-9 amd64 Message Passing Yorick (MPICH2 build)<br>
<br>
You will have to remove *all* the mpich pieces, namely mpich,<br>
mpich-doc, libmpich-dev and libmpich12, to be sure to avoid any strange<br>
interference. Note that this will also remove the libraries that use mpich.<br>
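<br>
On Ubuntu that removal would look roughly like the following (a minimal<br>
sketch, assuming apt is the package manager in use; apt will list every<br>
dependent package it is about to remove and ask for confirmation first):<br>
<br>
# remove all MPICH pieces; apt also removes the libraries built against them<br>
sudo apt-get remove mpich mpich-doc libmpich-dev libmpich12<br>
# optionally clean up dependencies that are no longer needed<br>
sudo apt-get autoremove<br>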
<br>
Yours,<br>
Roland<br>
<br>
> Hello Roland and Ian,<br>
> <br>
> I am attaching the build.log file. I have also tried running the simulation<br>
> after removing MPICH. With this I am also attaching the file obtained from the<br>
> "dpkg --list | grep -vi compile | grep -i mpi" command, to compare with the<br>
> previous one. I also made changes to PATH and LD_LIBRARY_PATH. The<br>
> current paths are as follows:<br>
> <br>
> LD_LIBRARY_PATH=:/usr/lib/x86_64-linux-gnu/openmpi/lib/<br>
> <br>
> PATH=/home/rishank/miniconda2/bin:/home/rishank/anaconda2/bin:/home/rishank/anaconda2/condabin:/home/rishank/bin:/home/rishank/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/bin<br>
> <br>
> and the previous one were:<br>
> <br>
> LD_LIBRARY_PATH=:/home/rishank/.openmpi/lib/:/home/rishank/.openmpi/lib/<br>
> <br>
> PATH=/home/rishank/miniconda2/bin:/home/rishank/anaconda2/bin:/home/rishank/anaconda2/condabin:/home/rishank/bin:/home/rishank/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/home/rishank/.openmpi/bin:/home/rishank/.openmpi/bin<br>
> <br>
> Yours Sincerely,<br>
> Rishank Diwan<br>
> <br>
> <br>
> <br>
> <br>
> On Thu, 23 Apr 2020 at 00:19, Roland Haas <<a href="mailto:rhaas@illinois.edu" target="_blank">rhaas@illinois.edu</a>> wrote:<br>
> <br>
> > Hello Rishank,<br>
> ><br>
> > > It's possible that you might have to uninstall mpich, but I'm not<br>
> > > sure. There *shouldn't* be a fundamental reason why you can't have<br>
> > > both installed and just use one of them, but there might be technical<br>
> > > reasons why it doesn't work.<br>
> > It is possible to have both installed at the same time. However, things<br>
> > then become fragile. Ubuntu will declare one of them the "default" using<br>
> > the "alternatives" system (which is why e.g. /usr/bin/mpirun and<br>
> > /usr/bin/mpicc point to files in /etc/alternatives/). This however does<br>
> > not cover things like e.g. hdf5, which comes in flavor packages<br>
> > hdf5-openmpi, hdf5-mpich and hdf5-serial, each of which has its own<br>
> > library libhdf5_{openmpi,mpich,serial}.so. Then there is always the<br>
> > possibility of something being wrong in the package setup.<br>
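> ><br>
> > As a quick check, something like the following should show which MPI<br>
> > stack the alternatives symlinks currently point to (a sketch assuming<br>
> > the Debian/Ubuntu alternative names "mpi" and "mpirun"):<br>
> ><br>
> > # show which implementation /etc/alternatives/mpi and mpirun resolve to<br>
> > update-alternatives --display mpi<br>
> > update-alternatives --display mpirun<br>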
> ><br>
> > Thus having multiple MPI stacks installed at the same time, while<br>
> > possible, is outside of the realm of cases easily supported by our<br>
> > automated setup and would likely require a hand-crafted option list<br>
> > similar to the ones used on clusters. E.g. setting<br>
> ><br>
> > MPI_DIR = /usr/lib/x86_64-linux-gnu/openmpi<br>
> > MPI_INC_DIRS = /usr/lib/x86_64-linux-gnu/openmpi/include<br>
> > MPI_LIB_DIRS = /usr/lib/x86_64-linux-gnu/openmpi/lib<br>
> > MPI_LIBS = mpi<br>
> ><br>
> > *may* work (though see my note above about libraries that can use MPI).<br>
> > Note that this still assumes that the mpirun found in $PATH is the one<br>
> > that matches the MPI stack in use, i.e. OpenMPI's, so it may require changes<br>
> > to the run script as well.<br>
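> ><br>
> > A quick way to verify that (a minimal sketch; OpenMPI's mpirun reports<br>
> > itself as "Open MPI", while MPICH's reports a HYDRA build) would be:<br>
> ><br>
> > # confirm that the mpirun first in $PATH belongs to the OpenMPI stack<br>
> > which mpirun<br>
> > mpirun --version<br>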
> ><br>
> > Thus the simplest solution is likely to make sure only one MPI stack<br>
> > (OpenMPI or MPICH) is installed and uninstall the other one.<br>
> ><br>
> > Yours,<br>
> > Roland<br>
> ><br>
> > --<br>
> > My email is as private as my paper mail. I therefore support encrypting<br>
> > and signing email messages. Get my PGP key from <a href="http://pgp.mit.edu" rel="noreferrer" target="_blank">http://pgp.mit.edu</a> .<br>
> ><br>
<br>
<br>
-- <br>
My email is as private as my paper mail. I therefore support encrypting<br>
and signing email messages. Get my PGP key from <a href="http://pgp.mit.edu" rel="noreferrer" target="_blank">http://pgp.mit.edu</a> .<br>
</blockquote></div>