[Users] Meeting Minutes

Erik Schnetter schnetter at cct.lsu.edu
Mon Jun 3 19:28:21 CDT 2013


It seems, then, that it would be a good idea to update the Stampede parameter
file in Simfactory to use mvapich instead of impi -- do you agree?
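As a rough sketch, the change under discussion would amount to pointing the
Stampede machine entry in the Simfactory machine database at an mvapich-based
option list instead of the Intel MPI one (the file path and option-list names
below are assumptions for illustration, not the actual entries):

```ini
; simfactory/mdb/machines/stampede.ini (sketch; names are assumptions)
[stampede]
; was: optionlist = stampede-impi.cfg
optionlist = stampede-mvapich2.cfg
```

Existing configurations built against Intel MPI would need to be reconfigured
and rebuilt after such a change, since the MPI library is linked into the
executable.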


On Mon, Jun 3, 2013 at 12:38 PM, Ian Hinder <ian.hinder at aei.mpg.de> wrote:

> On 3 Jun 2013, at 18:02, Roland Haas <roland.haas at physics.gatech.edu>
> wrote:
> Present: Erik, Peter, Ian, Yosef, Matt, Roland
> automated tests:
> * want to run tests periodically on production machines rather than just
> in the Ubuntu VM, to test compilers/MPI implementations other than
> gcc/OpenMPI
> * Ian has a setup for Datura under development; other machines will be
> added once the system is stable. This method runs Jenkins on the head node.
> * add a short (20 min or so) simulation to the tests to check more complex
> setups
> ET workshop:
> * David Rideout offered to give a presentation (private email to Erik,
> hope I did not get this wrong)
> * goals:
> ** ES wants to improve Carpet scalability to 10^4-10^5 processors
> ** (known) bottlenecks: the bboxset class, and any lower-dimensional I/O,
> which it seems serializes
> * PLEASE think of topics for the workshop and post them to the mailing list
> occasional hangs in MPI reported by Yosef:
> * hangs occur with or without the parity thorn
> * hangs occur on RIT cluster (with parity thorn, mvapich)
> * hangs occur on Stampede (no parity thorn, IMPI)
> * Ian saw codes hang in ibrun when MPI versions did not match; this does
> not seem to be the case here
> This refers to using a slightly different version of Intel MPI at build
> time from the one used at runtime (because simfactory didn't load the
> module at the time, so it was using whatever was in the environment; this
> has now been fixed). In the end, this was not the cause: the hangs
> persisted after the fix. To eliminate the hangs (which happened
> immediately at startup), I switched from Intel MPI to mvapich on Stampede.
> I have also experienced hangs during a simulation with Intel MPI on
> SuperMUC; there, switching to IBM MPI fixed it.
> --
> Ian Hinder
> http://numrel.aei.mpg.de/people/hinder
> _______________________________________________
> Users mailing list
> Users at einsteintoolkit.org
> http://lists.einsteintoolkit.org/mailman/listinfo/users

Erik Schnetter <schnetter at cct.lsu.edu>