[Users] Patch for 3D hdf5 output files

Francesco Zappa francesco.zappa at uni-jena.de
Wed Sep 8 08:01:27 CDT 2021


Hello everybody,

I am Francesco Zappa, a PhD student at the University of Jena. I am using a Cactus-based code and I am running simulations on the Hawk cluster. Unfortunately, Hawk's recent policy imposes severe restrictions on the number of files that can be produced at the same time. The problem is that running several MPI processes generates many HDF5 output files, and I would like to have them packed somehow. I have tried to use the option


CarpetIOHDF5::one_file_per_proc = "yes"


which works fine for 2D data files, but it does not seem to work for 3D data files (which have the form <variable>.file_<process>.h5).


I am aware that a patch exists for the Carpet thorn to pack these files together somehow. Could you please help me with this issue?

Best regards,


Francesco Zappa

-------------------------------------

Friedrich-Schiller-Universität Jena
Theoretisch-Physikalisches Institut
Fröbelstieg 1, Office 219
Phone: 0049-3641-9-47133
D-07743 Jena

