[ET Trac] [Einstein Toolkit] #1282: Enable HDF5 compression by default in Carpet

Einstein Toolkit trac-noreply at einsteintoolkit.org
Sun Mar 22 08:32:55 CDT 2015


#1282: Enable HDF5 compression by default in Carpet
--------------------------+-------------------------------------------------
  Reporter:  hinder       |       Owner:  eschnett
      Type:  enhancement  |      Status:  new     
  Priority:  major        |   Milestone:          
 Component:  Carpet       |     Version:          
Resolution:               |    Keywords:          
--------------------------+-------------------------------------------------

Comment (by rhaas):

 My best guess is that compression may be beneficial for 3D datasets but
 not for 2D ones (since compression implies HDF5 chunking). This is based
 on data from Philipp's large MHD run, where compressing the 3D Bcons data
 reduces file size at about the 10% level, while compressing the 2D data
 from the same files *increases* the 2D file size by about 11%. Since there
 is a lot of turbulence in these simulations, the data compress poorly; on
 the other hand, the files also use "large" patch sizes, so compression can
 be effective and the chunking overhead stays small.
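 The "compression implies chunking" point can be seen directly with a small
 sketch (using h5py for brevity rather than Carpet's C++ HDF5 calls; the
 underlying library behaves the same way): asking for gzip compression
 forces the dataset into chunked storage, and each chunk carries per-chunk
 B-tree/index overhead, which is what hurts small 2D slices.

 ```python
 # Sketch: HDF5 compression requires chunked storage (h5py convenience API).
 import h5py
 import numpy as np

 data = np.random.rand(64, 64)  # stand-in for a small 2D output slice

 with h5py.File("demo.h5", "w") as f:
     # Default dataset: contiguous layout, no chunking.
     plain = f.create_dataset("plain", data=data)
     # Requesting gzip compression silently switches to a chunked layout.
     packed = f.create_dataset("packed", data=data,
                               compression="gzip", compression_opts=1)
     plain_chunks = plain.chunks    # None -> contiguous
     packed_chunks = packed.chunks  # a tuple -> chunked storage

 print("plain layout chunks:", plain_chunks)
 print("packed layout chunks:", packed_chunks)
 ```

 For incompressible (e.g. turbulent) data, the gzip stream plus the chunk
 index can end up larger than the contiguous original, matching the ~11%
 growth seen for the 2D output above.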

 So compression seems beneficial for checkpoints and 3D output, but not
 for 2D output. Assuming that the 3D output is much larger than the 2D
 output, even for a single 3D file, enabling compression is probably an
 overall win. We may want to introduce a set of out3d_compression_level
 etc. parameters, though, and use the current compression_level parameter
 only for out_vars-type output (which is 3D) and for checkpointing (which
 uses the same routines).
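 In a parameter file, the proposed split might look roughly like this
 (a hypothetical sketch: out2D_compression_level is an assumed new
 parameter name following the suggestion above, not an existing one, and
 the thorn prefix is assumed to be IOHDF5):

 ```
 # 3D out_vars output and checkpoints: keep compressing
 IOHDF5::compression_level       = 9
 # 2D slices: hypothetical new parameter, left uncompressed
 IOHDF5::out2D_compression_level = 0
 ```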

-- 
Ticket URL: <https://trac.einsteintoolkit.org/ticket/1282#comment:4>
Einstein Toolkit <http://einsteintoolkit.org>

