[Users] Meeting Minutes
Roland Haas
roland.haas at physics.gatech.edu
Mon Jan 20 13:52:33 CST 2014
Present: Frank Loeffler, Roland Haas, Stefan Ruehe, Erik Schnetter, Ian
Hinder, Joshua Faber, Matt Kinsey, Yosef Zlochower, James Healy
Stampede slowdown in Noether release:
* Ian suggests using mvapich2 rather than Intel MPI (already in
simfactory/trunk)
* Roland suggests disabling LoopControl
* Erik suggests looking at Cactus' timer output
* Jim will test performance with these options
* Jim had tested old version with current setup and got good performance
* Jim will open a trac ticket to focus discussion
Using SPH (or other particle codes) with Cactus:
* some interest in SPH in Cactus; each participant presented what they
had done in the past and what they would like to do; Erik gave a very
short description of the SPH method
* Erik implemented a unigrid SPH type code in Cactus
** particles are tied to 3D grid cells to find neighbours
** uses 1d Cactus arrays for particle storage
** not yet parallelized
** would need to add code to exchange particles as particles move
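The cell-binning neighbour search described above can be sketched as follows; this is a minimal serial illustration with invented names, not the actual Cactus code:

```python
# Minimal sketch of a cell-linked-list neighbour search as used in
# unigrid SPH codes: particles are binned into cells of size h (the
# smoothing length), so neighbour candidates for a particle are only
# the 27 cells around it. All names here are illustrative.
from collections import defaultdict
from itertools import product

def build_cells(positions, h):
    """Map each cell index (i,j,k) to the particle ids it contains."""
    cells = defaultdict(list)
    for pid, (x, y, z) in enumerate(positions):
        cells[(int(x // h), int(y // h), int(z // h))].append(pid)
    return cells

def neighbours(pid, positions, cells, h):
    """Return ids of particles within distance h of particle pid."""
    x, y, z = positions[pid]
    ci, cj, ck = int(x // h), int(y // h), int(z // h)
    result = []
    for di, dj, dk in product((-1, 0, 1), repeat=3):
        for q in cells.get((ci + di, cj + dj, ck + dk), ()):
            if q == pid:
                continue
            qx, qy, qz = positions[q]
            if (x - qx) ** 2 + (y - qy) ** 2 + (z - qz) ** 2 <= h * h:
                result.append(q)
    return result
```

Binning keeps the neighbour search roughly O(N) for uniform particle distributions, since each query inspects only 27 cells instead of all N particles.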
* Josh has a working MPI parallelized SPH code
** uses grid to compute gravity forces
** Newtonian version is available for download
** Josh has a fixed-background GR version
** each process keeps the full particle position list in memory
** neighbour list however is distributed across processes
** parallelized around particles, no attempt to keep particles local to grid
* Matt has code that evolves many geodesics
** own classes to handle and store particles
** keeps particles on process that owns grid under particle, otherwise
interpolation is too costly
** currently scales to billions of particles
** uses Boost serialization to move particles after each step
** works with AMR grid
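Keeping each particle on the process that owns the grid beneath it amounts to re-bucketing particles by owner rank after every step. A schematic serial sketch (invented names; a simple slab decomposition stands in for the real AMR layout, and in practice the buckets would be exchanged over MPI, e.g. with an alltoall):

```python
# Schematic of owner-based particle redistribution after a time step:
# each rank owns a slab [rank*L/nprocs, (rank+1)*L/nprocs) in x, and
# particles that drifted out of the local slab are grouped by their new
# owner. In an MPI code these buckets would then be communicated
# (Matt's code uses Boost serialization for the transfer). All names
# here are hypothetical.
def owner_rank(x, domain_length, nprocs):
    """Rank owning the slab that contains x-coordinate x."""
    slab = domain_length / nprocs
    return min(int(x // slab), nprocs - 1)

def bucket_by_owner(particles, domain_length, nprocs):
    """Group particles (dicts with an 'x' entry) by destination rank."""
    buckets = {r: [] for r in range(nprocs)}
    for p in particles:
        buckets[owner_rank(p["x"], domain_length, nprocs)].append(p)
    return buckets
```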
* Ian has similar code to evolve geodesics
** wants to evolve geodesics, do raytracing
** have geodesic integrator using MoL and mesh refinement
** tested with relatively small number of geodesics (everything on one
process)
* Stefan has worked on a fixed-spacetime SPH code for his Diplom thesis
** would like to couple to a GR code
** to be used for stellar disruption
** more discussion via email
** invited to join the ET phone calls (they are all public)
* note that geodesics and SPH evolution likely have different load
characteristics since geodesics do not require gradients between
particles. For geodesics the time-consuming factor is sending
interpolated data over the network.
* suggestion to implement a "particle" datatype in Cactus so that the
infrastructure knows about it and can offer specialized functions to
handle them
* technically, code is needed to increase the size of grid arrays
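As a rough picture of what a Cactus-aware particle datatype might expose (purely illustrative; no such type exists in Cactus, and every name here is invented):

```python
# Purely illustrative sketch of a "particle" container the Cactus
# infrastructure could know about, so that migration between processes
# and storage growth (cf. resizing grid arrays) could be handled by
# the framework rather than by each thorn. None of this is real
# Cactus API.
from dataclasses import dataclass, field

@dataclass
class Particle:
    pos: tuple                                  # (x, y, z) position
    vel: tuple                                  # (vx, vy, vz) velocity
    data: dict = field(default_factory=dict)    # per-particle payload

@dataclass
class ParticleSet:
    particles: list = field(default_factory=list)

    def add(self, p):
        self.particles.append(p)

    def grow(self, extra):
        """Enlarge storage by `extra` default particles."""
        self.particles.extend(
            Particle((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))
            for _ in range(extra))
```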
Using the Intel 14 compiler:
* Frank and Roland tried on Stampede but failed to compile the full
configuration; the compilation failed with a segfault
* Ian succeeded in compiling the bare (or less than full) ET
* Erik suggests reducing debug information and/or the optimization level
Elliptic solver in ET:
* the only one currently known to work with mesh refinement is Eloisa
Bentivegna's elliptic solver, available on Bitbucket:
https://bitbucket.org/eloisa/cosmology/overview
* tested for cosmology (it was written for black hole lattices) and for
binary black holes (not its primary use case)
simfactory 3:
* Erik, Ian, Barry working on new simfactory implementation
* designing interfaces and methods
* should be usable by non-Cactus tools, offer a direct interface from
Python, e.g. "from simfactory import mdb"
* currently not yet feature complete
* still important architectural decisions to make
* Erik will consult with Ian and send out url of repository
Using simfactory (2) to manage external libraries:
* want to have simfactory build external libraries so that they can be
used by non-Cactus projects as well
* the ExternalLibraries would revert to the CactusExternal scheme
* need to handle multiple different configurations built at the same
time on the same machine
* possible presentation next Monday on current status
Misc:
* Yosef to open a ticket about being unable to checkpoint on Stampede
Yours,
Roland