[Users] Information about grid decomposition
schnetter at cct.lsu.edu
Thu Nov 10 08:46:05 CST 2016
This information is available from Carpet. Carpet maintains a "grid
hierarchy" gh; if you use plain AMR, there will be a single gh that you can
look up in Carpet, probably as "Carpet::vhh.at(0)" (this is C++ code). This
grid hierarchy contains the information you want: you can look up which
regions exist and to which process each is assigned, and you can also look
up on which region (and thus on which process) a particular point is located.
Having said this -- are you sure that this will save a noticeable amount of
time compared to letting Carpet handle the interpolation? Did you benchmark
your current implementation? If you leave out the interpolation in your
current implementation, how much time do you save?
I'd be happy to discuss further. A good venue would be, e.g., one of the
Einstein Toolkit telecons that we have on Mondays. (I might not attend next
Monday because I'll be at SC16.)
On Thu, Nov 10, 2016 at 8:54 AM, <breu at th.physik.uni-frankfurt.de> wrote:
> Dear users of the Einstein Toolkit,
> I have a question: I would like to integrate the geodesic equations for a
> large number of particles in parallel (e.g. one million). To save
> computational time, I want to make sure that the arrays containing the
> particle data are handled by the same process that takes care of the
> corresponding grid patch on which the particles are moving, and only
> communicate particle data if a particle moves to another grid patch.
> Since I only want a process to communicate with the processes that contain
> the adjacent grid patches, can I access information on which process owns
> which grid patch, and if so, how?
> So far I have only found functions that return the upper and lower bounds
> for the local process, but each process would need to know where all the
> rest of the grid is.
> Kind regards,
> Cosima Breu
> Users mailing list
> Users at einsteintoolkit.org
Erik Schnetter <schnetter at cct.lsu.edu>