[Users] HDF5 3D output separate time files
    Ian Hinder
    ian.hinder at aei.mpg.de
    Thu Jun 22 13:29:42 CDT 2017

On 20 Jun 2017, at 15:04, Hayley Macpherson <hayley.macpherson at monash.edu> wrote:
> Hello,
> I am using the Einstein Toolkit for cosmological simulations, and would like to ask a question regarding my HDF5 3D output. 
> Currently I am outputting all 3D snapshots to one HDF5 file (the default). I would like to have the output as separate files for each time snapshot, to make copying and visualisation in post-processing easier.
> 
> I found the following parameter in IOUtil, and changed it accordingly for a simulation:
> IO::out_timesteps_per_file = 1
> However, this made no difference to my output; I still had multiple time snapshots in one file.
> 
> Is there another way to separate my 3D HDF5 output into one file per snapshot? 
> Any help anyone can offer would be much appreciated! 
Hi Hayley,
Looking at the code, I don't immediately see why it's not working.  If you look in the file
arrangements/Carpet/CarpetIOHDF5/src/CarpetIOHDF5.cc
in the function OutputVarAs, around line 895, there is the code that handles writing a fixed number of timesteps per file:
  if (out_timesteps_per_file > 0) {
    // Round down to nearest multiple of out_timesteps_per_file
    int const iter =
        cctk_iteration / out_timesteps_per_file * out_timesteps_per_file;
    char buffer[32];
    snprintf(buffer, sizeof(buffer), ".iter_%d", iter);
    filename.append(buffer);
  }
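If that branch is taken, each HDF5 file name should pick up an ".iter_<N>" suffix before the ".h5" extension.  As a rough sketch (the output directory, frequency and variable list here are just placeholders, and I'm assuming the standard IOUtil/IOHDF5 parameter names), a parameter file asking for one timestep per file would look something like

  IO::out_dir                = "output"
  IO::out_timesteps_per_file = 1
  IOHDF5::out_every          = 256
  IOHDF5::out_vars           = "ADMBase::metric"

and if the suffix code runs you would then expect files such as <name>.iter_0.h5, <name>.iter_256.h5, ... rather than a single <name>.h5.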
Maybe you can add some debugging code to work out whether this code is being executed or not, and why it's not doing the right thing.
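For example, something like the following (just a sketch of the same block with a print added; I'm assuming filename is a std::string at this point, and using the standard Cactus CCTK_VInfo routine and CCTK_THORNSTRING macro) would show whether the branch is reached and which suffix it builds:

  if (out_timesteps_per_file > 0) {
    // Round down to nearest multiple of out_timesteps_per_file
    int const iter =
        cctk_iteration / out_timesteps_per_file * out_timesteps_per_file;
    char buffer[32];
    snprintf(buffer, sizeof(buffer), ".iter_%d", iter);
    // Temporary debugging output: report the suffix and the file it is appended to
    CCTK_VInfo(CCTK_THORNSTRING, "out_timesteps_per_file=%d: appending '%s' to '%s'",
               int(out_timesteps_per_file), buffer, filename.c_str());
    filename.append(buffer);
  }

If the INFO message never appears in the standard output, the branch is not being taken at all; if it does appear, it will show which file name the suffix is being added to.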
Do you get a new file with the ".iter_..." suffix?  Are you perhaps looking at an old file without this suffix which was left over from earlier runs?
-- 
Ian Hinder
http://members.aei.mpg.de/ianhin