<div dir="ltr">On Tue, May 17, 2016 at 12:52 PM, Michael Clark <span dir="ltr"><<a href="mailto:michael.clark@gatech.edu" target="_blank">michael.clark@gatech.edu</a>></span> wrote:<br><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div><div><div>I'm curious if there is a means to do this differently: for instance, if one could load the separate .h5 files in python as some collection of data objects and then combine those objects, no separate and combined .h5 file would need to be produced. Then, if some means existed to create a visualization from that data object (e.g. through yt), the need for the combined .h5 file would be totally obsolete.<br></div></div><br></div>I suspect this must already be possible in some manner, perhaps using other programs. I'd be curious to hear how, for instance, SimulationTools handles this problem.<br></div>
</blockquote></div></div><div class="gmail_extra"><br></div><div class="gmail_extra">This is exactly what SimulationTools does. It abstracts away the fact that there are potentially many datasets scattered across many HDF5 files (e.g. *.file_N.h5) and across multiple restarts. You just ask it to read a specified grid function from a simulation at a specified iteration, and it works out the details. It achieves this by scanning the list of datasets across all files in a simulation and determining which datasets must be loaded from which files for a given grid function, iteration, refinement level, etc. It then loads the data from those datasets and merges the components together into a single object (which we call a DataRegion) that represents the data for that grid function at that iteration.</div><div class="gmail_extra"><br></div><div class="gmail_extra">Barry</div></div>
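<div class="gmail_extra"><br></div><div class="gmail_extra">The scan/select/merge procedure described above can be sketched in Python along the following lines. This is only an illustration, not SimulationTools code: it assumes Carpet-style dataset names of the form "GROUP::var it=I tl=T rl=R c=C", and it uses an in-memory catalog (dataset name mapped to filename, grid offset, and data) as a stand-in for what a real implementation would build by scanning the *.file_N.h5 files with a library such as h5py. The merge step works on 1-D components for brevity.</div>

```python
import re

# Assumed Carpet-style dataset name: "GROUP::var it=<it> tl=<tl> rl=<rl> c=<c>"
NAME_RE = re.compile(
    r"(?P<var>\S+) it=(?P<it>\d+) tl=(?P<tl>\d+) rl=(?P<rl>\d+) c=(?P<c>\d+)$"
)

def parse_name(name):
    """Extract (variable, iteration, reflevel, component) from a dataset name."""
    m = NAME_RE.match(name)
    if m is None:
        return None
    return (m.group("var"), int(m.group("it")),
            int(m.group("rl")), int(m.group("c")))

def select_components(catalog, var, it, rl):
    """Scan the catalog of all datasets (across all files) and pick out the
    components of `var` at iteration `it` on refinement level `rl`."""
    out = []
    for name, (fname, offset, data) in catalog.items():
        parsed = parse_name(name)
        if parsed and parsed[0] == var and parsed[1] == it and parsed[2] == rl:
            out.append((offset, data))
    return out

def merge_components(components):
    """Merge 1-D components (offset, data) into one array covering their
    bounding box -- a toy analogue of a DataRegion."""
    lo = min(off for off, _ in components)
    hi = max(off + len(d) for off, d in components)
    merged = [None] * (hi - lo)
    for off, data in components:
        merged[off - lo:off - lo + len(data)] = data
    return lo, merged

# Hypothetical catalog: one grid function split into two components that
# live in two different .file_N.h5 files, plus a later iteration.
catalog = {
    "WAVETOY::phi it=0 tl=0 rl=0 c=0": ("phi.file_0.h5", 0, [1, 2, 3]),
    "WAVETOY::phi it=0 tl=0 rl=0 c=1": ("phi.file_1.h5", 3, [4, 5, 6]),
    "WAVETOY::phi it=8 tl=0 rl=0 c=0": ("phi.file_0.h5", 0, [7, 8, 9]),
}

comps = select_components(catalog, "WAVETOY::phi", 0, 0)
origin, region = merge_components(comps)
print(origin, region)  # -> 0 [1, 2, 3, 4, 5, 6]
```

<div class="gmail_extra">The point of the catalog indirection is the same as in the description above: the caller never sees which file a component came from; it only asks for a grid function at an iteration and refinement level and gets back one merged object.</div>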