[Users] Merging carpet hdf5 data without creating additional files

Barry Wardell barry.wardell at gmail.com
Tue May 17 14:24:05 CDT 2016


On Tue, May 17, 2016 at 12:52 PM, Michael Clark <michael.clark at gatech.edu>
wrote:

> I'm curious if there is a means to do this differently: for instance, if
> one could load the separate .h5 files in python as some collection of data
> objects and then combine those objects, no separate and combined .h5 file
> would need to be produced.  Then, if some means existed to create a
> visualization from that data object (e.g. through yt), the need for the
> combined .h5 file would be totally obsolete.
>
> I suspect this must already be possible in some manner, perhaps using
> other programs.  I'd be curious to hear how, for instance, SimulationTools
> handles this problem.
>

This is exactly what SimulationTools does. It abstracts away the fact that
there are potentially many datasets scattered across many HDF5 files (e.g.
*.file_N.h5), which are themselves spread across multiple restarts. You just ask it to read a
specified grid function from a simulation at a specified iteration and it
works out the details. It achieves this by scanning the list of datasets
across all files in a simulation and determining which datasets must be
loaded from which files for a given grid function, iteration, refinement
level, etc. It then loads the data from those datasets and merges the
components together into a single object (which we call a DataRegion) that
represents the data for that grid function at that iteration.
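To make the scanning step concrete, here is a minimal Python sketch of how one
could select the relevant datasets. It assumes Carpet's usual dataset naming
convention (`VAR it=I tl=T [m=M] rl=R [c=C]`); the actual file reading (e.g.
with h5py) and the merging into a single object are omitted. This is an
illustration of the idea, not SimulationTools' implementation.

```python
import re

# Carpet names each dataset like "WAVETOY::phi it=256 tl=0 rl=2 c=5";
# parse the variable name, iteration, time level, map, refinement
# level, and component out of that string.
DATASET_RE = re.compile(
    r"^(?P<var>\S+) it=(?P<it>\d+) tl=(?P<tl>\d+)"
    r"(?: m=(?P<m>\d+))? rl=(?P<rl>\d+)(?: c=(?P<c>\d+))?$"
)

def select_datasets(names, var, iteration, level):
    """Given the dataset names collected from all *.file_N.h5 files,
    return those belonging to one grid function at one iteration and
    refinement level, across all components."""
    selected = []
    for name in names:
        m = DATASET_RE.match(name)
        if (m and m.group("var") == var
                and int(m.group("it")) == iteration
                and int(m.group("rl")) == level):
            selected.append(name)
    return selected
```

In practice one would build `names` once per simulation by walking every file
with h5py, then read and stitch together only the components that
`select_datasets` returns, without ever writing a combined .h5 file.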

Barry