[Commits] [svn:einsteintoolkit] Paper_EinsteinToolkit_2010/ (Rev. 251)
knarf at cct.lsu.edu
Fri Feb 17 11:41:49 CST 2012
User: knarf
Date: 2012/02/17 11:41 AM
Modified:
/
ET.tex
Log:
first set of changes due to referee reports
File Changes:
Directory: /
============
File [modified]: ET.tex
Delta lines: +44 -33
===================================================================
--- ET.tex 2011-12-09 17:54:19 UTC (rev 250)
+++ ET.tex 2012-02-17 17:41:49 UTC (rev 251)
@@ -181,21 +181,22 @@
These breakthroughs had direct relevance
to astrophysics, and enabled exciting new results on recoil velocities
from BH-BH mergers~(e.g.,
-\cite{Baker:2006vn,Campanelli:2007ew,HolleyBockelmann:2007eh,
+\cite{Gonzalez:2006md,Baker:2006vn,Gonzalez:2007hi,Campanelli:2007ew,HolleyBockelmann:2007eh,
Pollney:2007ss,Lousto:2007db,Lousto:2008dn} and references therein),
post-Newtonian (PN) and numerical waveform comparisons and waveform
template generation~(e.g.,~\cite{Baker:2006ha,
- Husa:2007rh,Baumgarte:2006en,Buonanno:2006ui,
+ Husa:2007rh,Baumgarte:2006en,Buonanno:2006ui,Hannam:2007ik,Boyle:2007ft,
Hannam:2007wf,Gopakumar:2007vh, Campanelli:2008nk, Buonanno:2007pf,
Ajith:2007kx} and references
therein), comparisons between numerical
-waveforms~\cite{Baker:2006yw,Baker:2007fb}, determination of the spin of
+waveforms~\cite{Baker:2006yw,Baker:2007fb,Hannam:2009hh}, determination of the spin of
the remnant BH formed in BH-BH mergers~(e.g.,~\cite{Campanelli:2006uy,
Campanelli:2006fg,Campanelli:2006fy,
Herrmann:2007ex,Rezzolla:2007rz,Berti:2007nw} and references therein),
and studies of eccentric BH-BH binaries
\cite{Pretorius:2007jn,Sperhake:2007gu,Hinder:2007qu,Grigsby:2007fu,
-Pfeiffer:2007yz,Stephens:2011as}.
+Pfeiffer:2007yz,Stephens:2011as}. Also, see~\cite{Centrella:2010mx} for a review
+on black hole binaries, gravitational waves and numerical relativity.
Meanwhile, general relativistic magneto-hydrodynamics (GRMHD)
on fixed background spacetimes has been successful in multi-dimensional
@@ -226,7 +227,7 @@
Shibata:2006ks,Shibata:2007zm,Shibata:2009cn,
Shibata:2010zz,Stephens:2011as,Yamamoto:2008js} binaries (for reviews, see also~\cite{Faber:2009zz,Duez:2009yz}), allowing for further investigations into the former
and the first full GR simulations of the latter. All recent results use
-either the general harmonic formalism or the
+either the generalized harmonic formalism or the
BSSN formalism in the ``moving puncture'' gauge. Nearly all include some form of adaptive mesh
refinement, since unigrid models cannot produce accurate long-term evolutions
without requiring exorbitant computational resources, though some BH-NS simulations have been performed with a pseudospectral code \cite{Duez:2008rb,Duez:2009yy,Foucart:2010eq,Foucart:2011mz}. Many groups' codes
@@ -251,8 +252,8 @@
aim of providing a computational core that can enable new science,
broaden the community, facilitate collaborative and interdisciplinary research, promote software reuse and take
advantage of emerging petascale computers and advanced cyberinfrastructure:
-the Cactus computational toolkit~\cite{Cactuscode:web}.
-Although the development of Cactus was driven directly from the numerical relativity community, it was developed in collaboration
+the {\tt Cactus} computational toolkit~\cite{Cactuscode:web}.
+Although the development of {\tt Cactus} was driven directly from the numerical relativity community, it was developed in collaboration
with computer scientists and other computational fields to facilitate the incorporation of innovations in computational
theory and technology.
@@ -265,7 +266,7 @@
the {\tt Cactus} computational toolkit was not felt to fit within its rapidly
expanding scope. This triggered the creation of the Einstein
Toolkit~\cite{EinsteinToolkit:web}. Large parts of the Einstein toolkit
-presently do make use of the {\tt Cactus} toolkit, but this is not an
+currently do make use of the {\tt Cactus} toolkit, but this is not a
requirement, and other contributions are welcome, encouraged and have been
accepted in the past.
@@ -275,7 +276,7 @@
While the aforementioned studies collectively represent
breakthrough simulations that have significantly advanced the modeling of
-relativistic astrophysical systems, all simulations are presently
+relativistic astrophysical systems, all simulations are currently
missing one or more critical physical ingredients and are lacking the
numerical precision to accurately and realistically model the
large-scale and small-scale dynamics of their target systems simultaneously.
@@ -341,7 +342,7 @@
a high software quality level. Every substantial change or addition to
the toolkit must be reviewed by another Einstein Toolkit maintainer,
and is generally open for discussion on the users mailing list. This convention,
-though not being technically enforced, works well in practice and promotes
+though not being strictly enforced, works well in practice and promotes
active development.
\section{Core Technologies}
@@ -351,8 +352,8 @@
starting with code generation all the way to archiving of simulation
results: (i) the {\tt Cactus} framework ``flesh'' provides the underlying
infrastructure to build complex simulation codes out of independently
-developed modules and facilities communication between these modules. (ii) the
-adaptive mesh refinement driver, {\tt Carpet}, is build on top of {\tt Cactus}
+developed modules and facilitates communication between these modules. (ii) the
+adaptive mesh refinement driver, {\tt Carpet}, is built on top of {\tt Cactus}
and provides problem independent adaptive mesh refinement support for
simulations that need to resolve physics on length scales differing by many
orders of magnitude, while relieving the scientist of the need to worry about
@@ -367,8 +368,8 @@
The {\tt Cactus}
Framework~\cite{Cactuscode:web,Goodale:2002a,CactusUsersGuide:web} is
an open source, modular, portable programming environment for
-collaborative HPC computing primarily developed at Louisiana State University,
-which originated at the Albert Einstein Institute and also has roots
+collaborative HPC computing primarily developed at Louisiana State University.
+{\tt Cactus} originated at the Albert Einstein Institute and also has roots
at the National Center for Supercomputing Applications~(see, e.g.,~\cite{Anninos:1995am,Anninos:1996ai,Seidel:1999ey} for historical reviews).
The {\tt Cactus} computational toolkit consists of general modules which provide
parallel drivers, coordinates, boundary conditions, interpolators,
@@ -423,7 +424,7 @@
The Einstein Toolkit offers two drivers, \codename{PUGH} and
{\tt Carpet}. \codename{PUGH} provides domains consisting of a uniform
grid with Cartesian topology, and is highly scalable (up to more than
-130,000~\cite{Cactuscode:BlueGene:web}.)
+130,000 processors~\cite{Cactuscode:BlueGene:web}).
{\tt Carpet}~\cite{Schnetter:2003rb, Schnetter:2006pg,
CarpetCode:web} provides multi-block methods and adaptive mesh
refinement (AMR\@). Multi-block methods cover the domain with a set of
@@ -444,6 +445,13 @@
simulation are re-evaluated, and the grid hierarchy is updated; this
step is called \emph{regridding}.
+In most simulations using {\tt Carpet}, the sizes of refinement levels are
+prescribed by the user, while their locations are adapted according
+to the locations of interesting features such as black holes or stars.
+In addition, refinement levels are often activated or disabled depending
+on simulation dynamics, such as the formation of a common horizon, or
+an increasing stellar density due to collapse.
+
Since a finer grid spacing also requires smaller time steps for
hyperbolic problems, the finer grids perform multiple time steps for
each coarse grid time step, leading to a recursive time evolution
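The recursive subcycling described here (finer grids take several smaller time steps per coarse step) can be sketched as follows. This is a hypothetical minimal model, not {\tt Carpet}'s actual C++ implementation, which also handles prolongation and restriction of data between levels:

```python
# Illustrative sketch of Berger-Oliger-style recursive subcycling, as used
# by mesh-refinement drivers such as Carpet: each finer refinement level
# takes `refine_factor` steps for every step of the level above it.
# (Hypothetical minimal model; real drivers also prolong/restrict data.)

def evolve(levels, level, dt, step_fn, refine_factor=2):
    """Advance `level` and, recursively, all finer levels by dt."""
    step_fn(levels[level], dt)          # advance this refinement level
    if level + 1 < len(levels):
        fine_dt = dt / refine_factor
        for _ in range(refine_factor):  # finer grid: smaller dt, more steps
            evolve(levels, level + 1, fine_dt, step_fn, refine_factor)

# Count how many steps each of three levels takes per coarse step:
counts = [0, 0, 0]
def count_step(level_id, dt):
    counts[level_id] += 1

evolve([0, 1, 2], 0, 1.0, count_step)
print(counts)  # -> [1, 2, 4]
```

With a refinement factor of two, each level performs twice as many steps as the one above it, which is exactly the recursion pattern the text describes.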
@@ -496,7 +504,7 @@
\end{figure}
Figure \ref{fig:weak-scaling} shows a weak scaling test of \texttt{Carpet},
where \texttt{McLachlan} (see section \ref{sec:Kevol} below) solves
-the Einstein equations on a grid structure with
+the Einstein equations evolving a Minkowski spacetime on a grid structure with
nine levels of mesh refinement. This demonstrates excellent
scalability up to more than ten thousand cores. In production
simulations, smaller and more complex grid structures, serial
@@ -676,8 +684,8 @@
tensor.
Relativistic spacetime evolution methods used within the {\tt Cactus} framework employ
different formalisms to accomplish this goal, but essentially all are based on the $3+1$ ADM
-construction~\cite{Arnowitt:1962hi}, which makes it the
-natural choice of a common foundation for exchange data between
+construction~\cite{Arnowitt:1962hi,York:1979sg}, which makes it the
+natural choice of a common foundation for exchanging data between
modules using different formalisms. In the $3+1$ approach, 4-dimensional spacetime is foliated into sequences of spacelike
3-dimensional hypersurfaces (slices) connected by timelike normal vectors. The $3+1$ split introduces 4
gauge degrees of freedom: the lapse function $\alpha$ that describes
@@ -687,7 +695,7 @@
that describes how spatial coordinates change from one slice to the
next.
- According to the ADM formulation, the spacetime metric is assumed to take the form
+Within the ADM formulation, the spacetime metric is written in the form
\begin{equation}
ds^2=g_{\mu\nu}dx^\mu dx^\nu\equiv (-\alpha^2+\beta_i\beta^i)dt^2+2\beta_i dt~dx^i+\gamma_{ij} dx^idx^j,\label{eq:adm}
\end{equation}
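The line element above fixes the 4-metric completely in terms of the $3+1$ variables. A small sketch (illustrative only, using NumPy; the function name is hypothetical) assembles $g_{\mu\nu}$ from $(\alpha, \beta^i, \gamma_{ij})$ and checks the Minkowski limit:

```python
import numpy as np

# Sketch: assemble the 4-metric g_{mu nu} of the ADM line element from the
# 3+1 variables (alpha, beta^i, gamma_ij).  The covariant shift is
# beta_i = gamma_ij beta^j, so g_tt = -alpha^2 + beta_i beta^i and
# g_ti = beta_i, with gamma_ij as the spatial block.
def four_metric(alpha, beta_up, gamma):
    beta_lo = gamma @ beta_up                  # lower the shift index
    g = np.empty((4, 4))
    g[0, 0] = -alpha**2 + beta_lo @ beta_up    # g_tt
    g[0, 1:] = g[1:, 0] = beta_lo              # g_ti = g_it
    g[1:, 1:] = gamma                          # spatial 3-metric block
    return g

# Sanity check: alpha = 1, beta^i = 0, gamma_ij = delta_ij reproduces
# the Minkowski metric diag(-1, 1, 1, 1).
g = four_metric(1.0, np.zeros(3), np.eye(3))
print(np.allclose(g, np.diag([-1.0, 1.0, 1.0, 1.0])))  # -> True
```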
@@ -723,9 +731,11 @@
(the lapse and shift) are set by the parameters
{\tt initial\_dtlapse} and
{\tt initial\_dtshift}, respectively.
-By default, {\tt ADMBase} initializes the 3-metric and extrinsic
-curvature to Minkowski (i.e., $\gamma_{ij}=\delta_{ij}$, the Kronecker delta, and $K_{ij}=0$), the shift to zero, and the lapse to unity. Initial data thorns
-override these defaults by extending the parameters.
+By default, {\tt ADMBase} initializes the $3+1$ variables (3-metric,
+ extrinsic curvature, lapse, and shift) to instantaneously Minkowski in a standard
+ Cartesian coordinate system: $\gamma_{ij} = \delta_{ij}$ (the Kronecker delta),
+ $K_{ij} = 0$, $\alpha = 1$, $\beta^i = 0$.
+Initial data thorns override these defaults by extending the parameters.
Analogous to specifying initial data, evolution methods are chosen by
the parameters {\tt evolution\_method} (3-metric and extrinsic curvature),
@@ -799,7 +809,7 @@
\item Applying boundary conditions
\end{itemize}
-Through these, the initiation of the primitive variables, methods to recover the conservative
+Through these, the initialization of the primitive variables, methods to recover the conservative
variables, and basic atmosphere handling can be implemented in different thorns while allowing
a central access point for analysis thorns.
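The relation between primitive and conserved hydrodynamics variables that this interface mediates can be illustrated in the special-relativistic limit. This is a sketch under simplifying assumptions (flat space, ideal-gas equation of state, $G = c = 1$); the function and variable names are hypothetical, and the GRMHD thorns generalize this map with metric factors:

```python
import math

# Sketch of the primitive -> conserved map for a special-relativistic
# ideal fluid in Valencia-type variables: given rest-mass density rho,
# velocity v^i, and specific internal energy eps, compute the conserved
# density D, momentum S_i, and energy tau.  (Illustrative only.)
def prim_to_cons(rho, vel, eps, gamma_eos=5.0 / 3.0):
    p = (gamma_eos - 1.0) * rho * eps          # ideal-gas EOS
    v2 = sum(v * v for v in vel)
    W = 1.0 / math.sqrt(1.0 - v2)              # Lorentz factor
    h = 1.0 + eps + p / rho                    # specific enthalpy
    D = rho * W                                # conserved rest-mass density
    S = [rho * h * W * W * v for v in vel]     # conserved momentum density
    tau = rho * h * W * W - p - D              # conserved energy minus D
    return D, S, tau

# At zero velocity the conserved quantities reduce to D = rho, S = 0,
# tau = rho * eps:
D, S, tau = prim_to_cons(1.0, [0.0, 0.0, 0.0], 0.1)
print(D, S, tau)
```

Recovering the primitives from the conserved variables requires inverting this map numerically, which is why the toolkit routes it through a central interface.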
@@ -851,7 +861,7 @@
libraries. One example is the \codename{TwoPunctures}
module~\cite{Ansorg:2004ds} --- commonly
used in numerical relativity to generate BH-BH binary initial data --- which makes
-use of the GNU Scientific Library [GSL;~\cite{GSL:web,Galassi:2009}].
+use of the GNU Scientific Library (GSL)~\cite{GSL:web,Galassi:2009}.
Several modules have also been implemented to read in data files generated by
the {\tt Lorene} code~\cite{Lorene:web,Gourgoulhon:2000nn}.
@@ -938,7 +948,7 @@
where $m_1,~m_2$ and $r_1,~r_2$ are the mass of and distance to each BH, respectively, and $\Psi$ is defined by the equation itself, the Hamiltonian
constraint may be written as
\begin{equation}
-\Delta u +\left[\frac{1}{8}\Psi^7K^{ij}K_{ij}\right](1+\Psi u)^{-7}\label{eq:twopunc_u}
+\Delta u +\left[\frac{1}{8}\Psi^7K^{ij}K_{ij}\right](1+\Psi u)^{-7}=0\label{eq:twopunc_u}
\end{equation}
subject to the boundary condition $u\rightarrow 1$ as $r\rightarrow\infty$. In Cartesian
coordinates, the function $u$ is infinitely differentiable everywhere except the
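A quick numerical illustration of the structure of this equation: the Brill-Lindquist part $\Psi = 1 + m_1/(2r_1) + m_2/(2r_2)$ is harmonic away from the punctures, so for vanishing extrinsic curvature the source term disappears and the correction $u$ is trivial. The sketch below (illustrative only; puncture masses and positions are arbitrary choices) checks $\Delta\Psi \approx 0$ at an off-puncture point with a second-order finite-difference Laplacian:

```python
import numpy as np

# Brill-Lindquist conformal factor for two punctures of masses m1, m2
# at positions c1, c2 (arbitrary illustrative values).
m1, m2 = 0.5, 0.5
c1, c2 = np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])

def psi(p):
    r1 = np.linalg.norm(p - c1)
    r2 = np.linalg.norm(p - c2)
    return 1.0 + m1 / (2 * r1) + m2 / (2 * r2)

# Second-order central-difference Laplacian at a point away from
# both punctures: each m/(2r) term is harmonic there, so Delta Psi ~ 0.
p0 = np.array([0.3, 0.4, 0.5])
h = 1e-3
lap = sum((psi(p0 + h * e) - 2 * psi(p0) + psi(p0 - h * e)) / h**2
          for e in np.eye(3))
print(abs(lap) < 1e-4)  # -> True: Psi is harmonic off the punctures
```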
@@ -969,15 +979,15 @@
\subsubsection{Lorene-based binary data}
-The ET contains three routines that can read in publicly available data generated
+The Einstein Toolkit contains three routines that can read in publicly available data generated
by the {\tt Lorene} code~\cite{Lorene:web,Gourgoulhon:2000nn}, though it does not
currently include the capability of generating such data from scratch. For a
number of reasons, such functionality is not truly required; in particular,
{\tt Lorene} is a serial code and to call it as
-an ET initial data generator saves no time. Also, it is not guaranteed to be convergent for
+an Einstein Toolkit initial data generator saves no time. Also, it is not guaranteed to be convergent for
an arbitrary set of parameters; thus the initial data routine itself may never
finish its iterative steps. Instead, recommended practice is to let {\tt Lorene} output
-data into files, and then read those into ET at the beginning of a run.
+data into files, and then read those into the Einstein Toolkit at the beginning of a run.
{\tt Lorene} uses a multigrid spectral approach to solve the conformal thin-sandwich
equations for binary initial configurations~\cite{York:1998hy} and a single-grid
@@ -987,7 +997,8 @@
Matter source terms are ideal for this split, since they are compactly supported,
while extrinsic curvature source terms are spatially extended but with sufficiently
rapid falloff at large radii to yield convergent solutions. Around each object,
-a set of nested spheroidal sub-domains (see figure~\ref{fig:Lorene_coordinates}) is constructed to extending through all
+a set of nested spheroidal sub-domains (see figure~\ref{fig:Lorene_coordinates}) is
+constructed to extend through all
of space, with the outermost domain incorporating a compactification to allow
it to extend to spatial infinity. Within each of the nested sub-domains,
fields are decomposed into Chebyshev modes radially and into spherical harmonics
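The payoff of the radial Chebyshev decomposition is spectral (exponential) convergence for smooth fields within each sub-domain. A short sketch of the idea, using NumPy's Chebyshev routines on an illustrative smooth radial profile (this is not {\tt Lorene}'s actual machinery):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Sketch of the spectral idea behind Lorene's radial decomposition:
# within one sub-domain (mapped to [-1, 1]), a smooth field expanded in
# Chebyshev modes is reproduced to near machine precision with few modes.
r = np.cos(np.pi * np.arange(33) / 32)       # Chebyshev-Gauss-Lobatto nodes
field = np.exp(-r**2)                        # a smooth "radial" profile
coeffs = C.chebfit(r, field, deg=16)         # Chebyshev coefficients
recon = C.chebval(r, coeffs)                 # reconstruct from the modes

print(np.max(np.abs(recon - field)) < 1e-8)  # -> True: spectral accuracy
```

A finite-difference representation of the same profile would need far more points to reach comparable accuracy, which is why spectral initial-data solvers can afford compactified outer domains extending to spatial infinity.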
@@ -1047,7 +1058,7 @@
subject to the boundary condition that in the exterior,
\begin{eqnarray}
\bar{r} &=& \dfrac{1}{2}\left(\sqrt{r^2-2Mr}+r -M\right)\nonumber \\
-r&=&\bar{r}\left(1+\dfrac{M}{2\bar{r}}\right)^2 \ .
+r&=&\bar{r}\left(1+\dfrac{M}{2\bar{r}}\right)^2 \ ,
\end{eqnarray}
handling with some care the potentially singular terms that appear at the origin.
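The two relations quoted above are mutual inverses in the exterior region $r > 2M$, which a few lines of arithmetic confirm (illustrative sketch; function names are hypothetical):

```python
import math

# Verify that the isotropic radius rbar(r) and the areal radius r(rbar)
# from the boundary-condition equations invert each other for r > 2M.
def rbar_of_r(r, M):
    return 0.5 * (math.sqrt(r * r - 2.0 * M * r) + r - M)

def r_of_rbar(rbar, M):
    return rbar * (1.0 + M / (2.0 * rbar)) ** 2

M, r = 1.0, 4.0                      # an exterior point, r > 2M
rbar = rbar_of_r(r, M)
print(abs(r_of_rbar(rbar, M) - r) < 1e-12)  # -> True: round trip recovers r
```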
@@ -1807,7 +1818,7 @@
surface integrals at infinity or volume integrals over entire
hypersurfaces give a measure of the total energy and angular momentum
in the spacetime. The module \codename{ML\_ADMQuantities} of the
-McLachlan code~\cite{McLachlan:web} uses the latter method, creating
+{\tt McLachlan} code~\cite{McLachlan:web} uses the latter method, creating
gridfunctions containing the integrand of the volume
integrals~\cite{Yo:2002bm}:
\begin{eqnarray}
@@ -2126,7 +2137,7 @@
tracking objects as they move through the domain. One can also add or
remove stacks if, for instance, the number of objects changes. Full AMR based on
a local error estimate is supported by \codename{Carpet}, but the
-Einstein Toolkit does not presently provide a suitable regridding thorn
+Einstein Toolkit does not currently provide a suitable regridding thorn
to create
such a grid. If initial
conditions are constructed outside of {\tt Carpet} (which is often the
@@ -2832,7 +2843,7 @@
available~(see, e.g.,~\cite{Palenzuela:2008sf,DelZanna:2007pk}),
but have yet to be implemented widely in many branches of numerical relativity.
- Most presently published 3D GR(M)HD simulations, with the
+ Most currently published 3D GR(M)HD simulations, with the
exception of recent work on massive star collapse
(see, e.g.,~\cite{Ott:2006eu}) and binary mergers
(see, e.g.,~\cite{Sekiguchi:2011zd}),