[Users] simulationfactory issue?
Erik Schnetter
schnetter at cct.lsu.edu
Sat Oct 16 14:48:51 CDT 2010
Richard
A Cactus configuration is defined by its thorn list, i.e., by the set
of thorns it includes. You can compile more thorns into a
configuration than you activate at run time. I usually have a single,
large configuration containing many thorns, and activate only a few of
them at run time. I use the default name "sim" for this configuration.
(I also have a variant "sim-debug" with debugging enabled, which runs
at reduced speed but contains more run-time checks to catch coding
errors.)
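For example, such a pair of configurations could be built roughly like
this (a sketch; the thorn list path is taken from the build example
quoted below, and the exact behaviour of --debug may differ):

./simfactory/sim build sim --thornlist=manifest/einsteintoolkit.th
./simfactory/sim build sim-debug --thornlist=manifest/einsteintoolkit.th --debug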
Others prefer to have smaller configurations, and create a new
configuration for each project or simulation. I can see that this is a
good idea, since re-building my large configuration from scratch can
take some time.
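A minimal sketch of that per-project workflow, with a hypothetical
project name and thorn list path:

./simfactory/sim build myproject --thornlist=thornlists/myproject.th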
No, there is no consistency between configurations on different
machines. We enforce consistency between source trees, but not between
configurations. However, I think enforcing it would be a good idea. We
could, e.g., derive the configuration name from the thorn list and
replicate thorn lists to remote systems -- this would (in a way) ensure
consistency. Of course, you would then still need to ensure that a
configuration is re-built whenever its thorn list or the source tree
changes, which we don't yet do automatically.
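To sketch the naming idea (hypothetical -- SimFactory does not do this
today): derive the configuration name from a hash of the thorn list, so
that identical thorn lists yield identical configuration names on every
machine:

name="sim-$(md5sum manifest/einsteintoolkit.th | cut -c1-8)"
./simfactory/sim build "$name" --thornlist=manifest/einsteintoolkit.th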
We don't have submit scripts for Condor's Parallel Universe. Can you
give us a pointer to the details of this system?
-erik
On Fri, Oct 15, 2010 at 4:21 PM, Richard O'Shaughnessy
<oshaughn at gravity.phys.uwm.edu> wrote:
> Hi Erik,
> Thanks -- I didn't realize I was accidentally (and consistently!) adding
> static_tov to the build command.
> How does the build system work? It's obvious now that I can (for example)
> build wavetoy, static_tov, and ks-mclachlan at once, and run instances of
> each independently. But what about maintaining the source trees -- do I
> need to rebuild each configuration by name? Can I check whether there are
> updates to a particular configuration's source tree? Is there any
> consistency enforced between configurations used on a local host and on a
> remote build (i.e., if I want to be sure I use the same code tag on each
> of many target clusters)?
> -Richard
> PS: On a related note, are there submission scripts for Condor's Parallel
> Universe?
> On Oct 15, 2010, at 3:58 PM, Erik Schnetter wrote:
>
> Richard
>
> You created a configuration with the non-default name "static_tov"
> (the default name would be "sim"). Therefore you need to specify this
> configuration name when you create and submit a simulation:
>
> ./simfactory/sim create-submit static_tov --configuration=static_tov --parfile=...
>
> You will then have a configuration and a simulation with the same
> name; this does not matter. I usually have a single configuration
> "sim", and use this configuration for all my simulations.
>
> -erik
>
> On Fri, Oct 15, 2010 at 1:16 PM, Richard O'Shaughnessy
> <oshaughn at gravity.phys.uwm.edu> wrote:
>
> Hi Erik,
>
> After a seemingly successful compile, I tried a simple single-CPU run
> (using generic.sh submission) on my cluster head node. I believe this
> installation is the release version; the only changes have been to the
> optionlist and udb.pm. A similar configuration works on other machines
> (i.e., my laptop, albeit with svn rather than release versions), but for
> some reason not here. It's not creating the simulations directory at all.
> Thoughts?
>
> --- command-line
>
> [oshaughn at hydra Cactus]$ ./simfactory/sim create-submit static_tov --parfile=par/static_tov.par --procs=1 --walltime=8:0:0
> Simulation Factory:
> Configuration name(s) not specified -- using default configuration "sim"
> Uncaught exception from user code:
> Configuration "sim" contains no executable at ./simfactory/sim line 5216.
>  at ./simfactory/sim line 5216
>  main::get_executable() called at ./simfactory/sim line 1883
>  main::command_create('static_tov') called at ./simfactory/sim line 2955
>  main::command_create_submit('static_tov') called at ./simfactory/sim line 452
>
> Richard
>
> The error message "no executable" indicates that your build didn't
> complete. There are probably problems with your compiler or linker
> options.
>
> - All Cactus executables are stored in Cactus's "exe" directory.
>   What does "ls exe" say?
> - Did you specify a different name for your configuration while
>   building? If so, you need to use the --configuration=... option when
>   submitting the simulation.
> - If you use the --debug or --profile flag while building, you also
>   need to specify it while submitting a simulation, since you'll need
>   to use the debugging or profiling executable.
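>
> For example (a sketch; "mysim" and the parameter file path are only
> illustrative), a debug build and a matching submission might look like:
>
> ./simfactory/sim build sim --thornlist=manifest/einsteintoolkit.th --debug
> ./simfactory/sim create-submit mysim --configuration=sim --debug --parfile=par/static_tov.par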
>
> Ok. I'm simply trying to follow the Einstein Toolkit new-user
> instructions on a new machine, as a test case, with one CPU.
>
> 1) Executables are made:
>
> [oshaughn at hydra Cactus]$ ls exe
> cactus_static_tov static_tov
>
> 2) I just changed udb.pm (see original email). The build command (which
> says it completed successfully) is:
>
> [oshaughn at hydra Cactus]$ ./simfactory/sim build static_tov --thornlist=manifest/einsteintoolkit.th
>
> 3) I didn't specify any debugging options.
>
> -erik
>
> --
> Erik Schnetter <schnetter at cct.lsu.edu> http://www.cct.lsu.edu/~eschnett/
>
> Richard O'Shaughnessy oshaughn at gravity.phys.uwm.edu
> 462 Physics Building Phone: 414 229 6674
> Center for Gravitation and Cosmology
> University of Wisconsin, Milwaukee 53211
>
--
Erik Schnetter <schnetter at cct.lsu.edu> http://www.cct.lsu.edu/~eschnett/