[Users] Merging carpet hdf5 data without creating additional files
Frank Loeffler
knarf at cct.lsu.edu
Tue May 17 12:33:22 CDT 2016
On Tue, May 17, 2016 at 12:19:09PM -0500, Frank Loeffler wrote:
> Try the attached script. It uses python to get a list of datasets from
> however many hdf5 files you specify and creates a new (small) hdf5 file
> that contains only links to the original files. This can be used to,
> e.g., only specify a single file in VisIt.
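(For reference, the core of such a script can be sketched with h5py. This is a reconstruction, not the actual attachment; the function and variable names are made up.)

```python
# Sketch: build one small HDF5 file whose entries are external links
# pointing at the top-level objects of the given source files, so a
# viewer such as VisIt only needs to be handed this single file.
# Requires h5py.
import h5py

def link_datasets(out_name, in_names):
    """Create out_name containing external links to all top-level
    groups/datasets of the input HDF5 files."""
    with h5py.File(out_name, "w") as out:
        for fname in in_names:
            with h5py.File(fname, "r") as src:
                for name in src:          # iterate top-level objects
                    if name not in out:   # first file wins on name clashes
                        out[name] = h5py.ExternalLink(fname, "/" + name)
```

Usage would be something like `link_datasets("all.h5", ["out.file_0.h5", "out.file_1.h5"])`; note that HDF5 resolves relative link filenames relative to the location of the linking file, so either keep the files together or pass absolute paths.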
For some reason, I was thinking of VisIt when I read your email. If you
want to process your data with Python anyway, then an intermediate
file like this doesn't quite make sense. I'd suggest something like
what David suggested. The Parma group has a similar Python framework,
and if I remember correctly, so does Wolfgang Kastaun.
The Parma group's framework can be found here:
https://einstein.pr.infn.it/svn/numrel/pub/PyCactus/
(anonymous / anon)
Among other things, it 'recombines' checkpoints and contains some
post-processing routines, e.g., for GW extraction. Besides the obvious
HDF5, it also reads (some of) the ASCII formats. It is intended to be
used as a utility library for other, more specific post-processing
scripts, e.g., plots for papers etc.
Frank