[ET Trac] [Einstein Toolkit] #1874: Activate HDF5 shuffle filter
Einstein Toolkit
trac-noreply at einsteintoolkit.org
Wed Apr 27 03:18:47 CDT 2016
#1874: Activate HDF5 shuffle filter
------------------------------------+---------------------------------------
Reporter: eschnett | Owner:
Type: enhancement | Status: reviewed_ok
Priority: unset | Milestone:
Component: EinsteinToolkit thorn | Version: development version
Resolution: | Keywords:
------------------------------------+---------------------------------------
Changes (by hinder):
* status: review => reviewed_ok
Comment:
Replying to [comment:3 eschnett]:
> I think this issue is unrelated; to resolve it, you would need to
> disable compression. There is a parameter for this. There is also
> {{{h5repack}}}, which can uncompress HDF5 files; you could run it
> before recovering.
Yes, I just felt like ranting, and this seemed like a good opportunity :)
I now run without compression all the time. Uncompressing with
h5repack first is an interesting idea. We could also patch the
decompression filter to use a saner memory-allocation strategy. It
might also be useful to be able to enable compression for all HDF5
files except checkpoints, which are the only ones that Cactus reads
back in.
The shuffle filter looks like a good idea; sorry I didn't look at it in
time. Do you know of any expected or measured performance issues? The
document at https://www.hdfgroup.org/HDF5/doc_resource/H5Shuffle_Perf.pdf
describes the filter and reports performance tests and compression-ratio
improvements; based on it, the filter looks fine to use. Once we have
standardised Cactus benchmarks, we can include one for this. Consider
this a retroactive "reviewed OK" (the pull request was merged two days
ago).
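For reference, the order of the calls matters: the shuffle filter must
be added to the property list before deflate so that the bytes are
reordered before gzip sees them. A minimal standalone sketch (the file
and dataset names are arbitrary):
{{{
#include <hdf5.h>

int main(void)
{
  hsize_t dims[2]  = {128, 128};
  hsize_t chunk[2] = {32, 32};
  static double data[128][128];  /* zero-initialised sample data */

  hid_t file  = H5Fcreate("shuffle_test.h5", H5F_ACC_TRUNC,
                          H5P_DEFAULT, H5P_DEFAULT);
  hid_t space = H5Screate_simple(2, dims, NULL);

  hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
  H5Pset_chunk(dcpl, 2, chunk);
  H5Pset_shuffle(dcpl);     /* must precede deflate in the filter pipeline */
  H5Pset_deflate(dcpl, 1);  /* light gzip compression */

  hid_t dset = H5Dcreate2(file, "grid_function", H5T_NATIVE_DOUBLE, space,
                          H5P_DEFAULT, dcpl, H5P_DEFAULT);
  H5Dwrite(dset, H5T_NATIVE_DOUBLE, H5S_ALL, H5S_ALL, H5P_DEFAULT, data);

  H5Pclose(dcpl);
  H5Dclose(dset);
  H5Sclose(space);
  H5Fclose(file);
  return 0;
}
}}}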
--
Ticket URL: <https://trac.einsteintoolkit.org/ticket/1874#comment:5>
Einstein Toolkit <http://einsteintoolkit.org>