[Users] MPI error in tutorial for new users
Praveer Krishna
max.praveer at gmail.com
Sat Mar 28 15:14:02 CDT 2020
Dear Dr. Roland,
Thank you for your detailed response. I really appreciate the time and
thought you must've put into it.
I have followed your suggestion of removing the current Cactus build and
compiling it again, but received yet another error. The problem seemed to be
a mix-up of multiple MPI implementations, just as you had suggested. I
couldn't finesse my way out of it, so I re-installed Ubuntu instead
(deleting everything in the process) and began from scratch. I installed
only those packages mentioned in the Jupyter tutorial, and the code
compiled successfully this time.
Thank you once again for your helpful advice.
Regards,
Praveer Krishna
On Fri, Mar 27, 2020 at 8:42 PM Roland Haas <rhaas at illinois.edu> wrote:
> Hello Praveer Krishna,
>
> Since mpirun was not found initially, it would seem that you had no MPI
> stack installed when compiling Cactus.
>
> Cactus is best used with an MPI stack installed beforehand. The new
> user tutorial
>
>
> https://nbviewer.jupyter.org/github/nds-org/jupyter-et/blob/master/CactusTutorial.ipynb
>
> contains instructions on how to install a set of packages that simplify
> building Cactus, including MPI. Did you follow those instructions, or did
> you start from a vanilla, freshly installed OS? Also, which OS are you
> using (OSX, Linux, Linux-Subsystem-on-Windows)?
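>
> In case you are on Debian/Ubuntu (an assumption on my part), the OpenMPI
> packages from the distribution can be installed before the first build
> with, for example:
>
>   sudo apt-get install openmpi-bin libopenmpi-dev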
>
> The easiest way to get a working copy of Cactus, if you did not install
> an MPI stack before compiling Cactus, is:
>
> * install an MPI stack (OpenMPI is fine)
> * remove the current Cactus build using "rm -rf configs/sim"
> * build once more
>
> The rebuild above is the recommended fix; a rough sketch of the commands
> follows.
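>
> Something like the following, assuming you build through simfactory as
> in the tutorial (the thornlist name is only a placeholder; use the one
> you originally built with):
>
>   # remove the configuration that was built without an MPI stack
>   rm -rf configs/sim
>   # rebuild so that the new OpenMPI installation is picked up
>   ./simfactory/bin/sim build --thornlist <your-thornlist>.th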
>
> If you would like to avoid recompiling (and risk triggering other
> issues instead), here's an alternative: when Cactus detected that no MPI
> stack was installed, it built its own copy of OpenMPI. It will
> have copied "mpirun" to "exe/sim/mpirun" and you can use that by making
> sure that the full path to the directory "exe/sim" appears in your PATH
> before you start the simulation. Using an mpirun from a different MPI
> stack than the one used to compile Cactus is what gives the error that
> Erik describes in the email you found.
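>
> For example (replace /path/to/Cactus below with the actual location of
> your Cactus checkout; that path is just a placeholder):
>
>   export PATH=/path/to/Cactus/exe/sim:$PATH
>   which mpirun    # should now report .../exe/sim/mpirun
>
> and then start the simulation from that same shell.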
>
> Yours,
> Roland
>
> > Dear ET Users,
> >
> > I'm a beginner attempting to follow the Jupyter tutorial for new users.
> > I have been able to successfully run the helloworld.par and tov_ET.par
> > files with procs set to 1, but I'm having issues when I try to use
> > multiple processors on my laptop.
> > I initially got this error <https://imgur.com/a/cuLLp79> saying "mpirun:
> > not found", following which I installed Open MPI on my system. The error
> > persisted, so I removed Open MPI and installed MPICH instead. This
> > seemingly fixed the issue, but I received another error
> > <https://imgur.com/gdXSqCj> saying "CACTUS_NUM_PROCS is set to 4 but
> > there are 1 MPI processes".
> >
> > I found one
> > <https://www.mail-archive.com/search?l=users@einsteintoolkit.org&q=subject:%22%5C%5BUsers%5C%5D+problem+on+running+bbh+expample%22&o=newest&f=1>
> > or two
> > <https://www.mail-archive.com/users@einsteintoolkit.org/msg01591.html>
> > previous queries regarding this exact error, but the suggestions have not
> > worked for me so far. Having never worked with MPI before, I am not sure
> > what else I should try.
> >
> > I'd be grateful for a few words of advice on this issue from anyone in
> > the community.
> > Thank you in advance for your time.
> >
> > Regards,
> > Praveer Krishna
>
>
> --
> My email is as private as my paper mail. I therefore support encrypting
> and signing email messages. Get my PGP key from http://pgp.mit.edu .
>