[Commits] [svn:einsteintoolkit] Paper_EinsteinToolkit_2010/ (Rev. 2)

gallen at cct.lsu.edu
Fri Apr 23 01:49:32 CDT 2010


User: gallen
Date: 2010/04/23 01:49 AM

Modified:
 /
  ET.tex

Log:
 Trying to add some text to kickstart this paper

File Changes:

Directory: /
============

File [modified]: ET.tex
Delta lines: +190 -5
===================================================================
--- ET.tex	2010-04-09 11:11:30 UTC (rev 1)
+++ ET.tex	2010-04-23 06:49:32 UTC (rev 2)
@@ -25,6 +25,16 @@
 \usepackage[all]{hypcap}
 \urlstyle{rm}
 
+
+\def\grb#1{gamma-ray burst#1 (GRB#1)\gdef\grb{GRB}}
+\def\gw#1{gravitational wave#1 (GW#1)\gdef\gw{GW}}
+\def\bh#1{black hole#1 (BH#1)\gdef\bh{BH}}
+\def\ns#1{neutron star#1 (NS#1)\gdef\ns{NS}}
+\def\gt#1{Georgia Tech#1 (GaTech#1)\gdef\gt{GaTech}}
+\newcommand{\codename}[1]{\texttt{#1}}
+
+
+
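The acronym macros above redefine themselves after first use, so each term is spelled out once and abbreviated thereafter. A minimal illustration (the sentences are invented, not from the paper):

```latex
% First use expands the full phrase, then \gdef replaces the macro:
We study \bh{} mergers.   % -> "We study black hole (BH) mergers."
% Subsequent uses produce only the abbreviation:
Each \bh{} spins.         % -> "Each BH spins."
% The argument carries an optional plural suffix; after first use,
% \bh no longer takes an argument, so {s} simply follows it:
Two \bh{s} collide.       % -> "Two BHs collide."
```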
 \newcommand{\todo}[1]{{\color{blue}$\blacksquare$~\textsf{[TODO: #1]}}}
 
 % Don't use tt font for urls
@@ -78,34 +88,209 @@
 
 Scientific progress in the field of numerical relativity has always been closely tied with the availability and ease-of-use of enabling software and computational infrastructure. This document describes the Einstein Toolkit. 
 
+This is a particularly exciting time for numerical relativity and
+relativistic astrophysics. Recent computational
+breakthroughs~\cite{Pretorius:2005gq, Campanelli:2005dd, Baker:2005vv}
+have transformed the field, making it possible to
+effectively evolve the Einstein field equations for coalescing
+black hole binaries and other systems containing moving black holes.
+Following, and in part parallel to, these breakthroughs there have
+been many advancements with direct relevance to astrophysics, such as
+exciting new results on recoil velocities from BH-BH mergers~(e.g.,
+\cite{Baker:2006vn,Campanelli:2007ew,HolleyBockelmann:2007eh,
+  Pollney:2007ss,Lousto:2007db,Lousto:2008dn} and references therein),
+post-Newtonian (PN) and numerical waveform comparisons and waveform
+template generation~(e.g., \cite{Baker:2006ha,
+  Husa:2007rh,Baumgarte:2006en,Buonanno:2006ui,
+  Hannam:2007wf,Gopakumar:2007vh, Campanelli:2008nk, Buonanno:2007pf, 
+  Ajith:2007kx} and references
+therein), comparisons between numerical
+waveforms~\cite{Baker:2006yw,Baker:2007fb}, determining the spin of
+the remnant BH~(e.g., \cite{Campanelli:2006uy,
+  Campanelli:2006fg,Campanelli:2006fy,
+Herrmann:2007ex,Rezzolla:2007rz,Berti:2007nw} and references therein), 
+and eccentric black hole binaries
+\cite{Pretorius:2007jn,Sperhake:2007gu,Hinder:2007qu,Grigsby:2007fu,
+Pfeiffer:2007yz}. 
 
+It is remarkable that many of the successful techniques used to
+evolve binary BHs have proven equally applicable to merging BH-NS and
+NS-NS binaries, allowing for the first full GR simulations of
+these systems (e.g., 
+\cite{Shibata:2006bs,Shibata:2006ks,Loeffler06a,Shibata:2007zm,
+Etienne:2007jg,Baiotti:2008ra,Duez:2008rb}).
+
+
+GRMHD on fixed background spacetimes has been carried out in multi-dimensional
+settings, focusing on BH accretion processes and relativistic jet
+production and evolution, and has yielded results since the mid 1990s
+(e.g., \cite{font:08}). On the other hand, GRMHD coupled with
+curvature evolution, crucial for modeling large-scale bulk
+dynamics in compact binary or single-star collapse scenarios, has
+started to produce astrophysically interesting results only in the
+past $\sim 3-5$ years, enabled primarily by the availability of long-term
+stable curvature evolution systems as well as improved GRMHD
+algorithms \cite{font:08}.
+
+The first full GR simulations of merging NS-NS and/or BH-NS binaries
+have recently been carried out by groups at the University of Tokyo
+(Shibata~et~al., GRHD and GRMHD, e.g.,
+\cite{Shibata:2006bs,Shibata:2007zm,Shibata:2003ga,Shibata:2005ss,Shibata:2006nm,Yamamoto:2008js}), at UIUC (Shapiro and
+collaborators, including co-PI Faber, e.g.,
+\cite{Etienne:2007jg,Liu:2008xy}), LSU (Lehner et al.,
+\cite{Anderson:2007kz,Anderson:2008zp}), Caltech-Cornell (Duez et
+al.~\cite{Duez:2008rb}), and at the AEI~(Rezzolla and collaborators,
+\cite{Baiotti:2008ra,loeffler:06}).  The first simulations in 3D GRHD
+of massive star collapse to protoneutron stars have been carried out
+by co-PI Ott and collaborators \cite{ott:07prl,ott:07cqg} and by
+Shibata~et~al.~\cite{shibata:053D}. The collapse of rotating,
+hypermassive NSs to \bh{s} has been studied in 2D and 3D GR by
+Shibata~et~al.~(GRHD/GRMHD, Tokyo, e.g.,
+\cite{shibata:06,shibata:00}), Shapiro, Duez, and collaborators (GRMHD,
+UIUC, e.g., \cite{Duez:2005sf,duez:06}), and by Baiotti, Rezzolla~et~al.
+(GRHD, AEI/LSU, e.g., \cite{Baiotti04,baiotti:05,
+baiotti:07}). Simulations of nonaxisymmetric instabilities in
+rapidly rotating polytropic \ns{} models have been carried out
+in full GR by Shibata~et~al.~\cite{shibata:00} and
+Baiotti~et~al.~\cite{baiotti:07,manca:07}.
+
+ 
+
 \section{Requirements}
 
 \subsection{Scientific}
 
+While the above list of studies represents an ensemble of breakthrough
+simulations that have significantly advanced the modeling of
+relativistic astrophysical systems, all simulations are presently
+missing one or multiple physical ingredients or are lacking the
+numerical precision to accurately and realistically model the
+large-scale and small-scale dynamics of their target systems.
 
+
+{\bf MHD}. Many studies, in particular those concerned with
+  massive star collapse, NS-NS or BH-NS binaries, and rotational
+  nonaxisymmetric instabilities, are still performed in pure GRHD\@.
+  Without doubt, these systems must be simulated with GRMHD to capture
+  the effects of magnetic fields that in many cases will
+  alter the simulation outcome on a qualitative level and may be 
+  the driving mechanisms behind much of the observable EM signature
+  from GRBs (e.g., \cite{wb:06}) 
+  and magneto-rotationally exploding core-collapse supernovae (e.g.,
+  \cite{Burrows:2007yx}). In addition, all simulations that have
+  taken into account magnetic fields are still limited to the
+  ideal MHD approximation (perfect conductivity). 
+  Non-ideal GRMHD schemes are just becoming 
+  available~(e.g., \cite{Palenzuela:2008sf,DelZanna:2007pk}).
+
+
+ {\bf Equation of state (EOS), microphysics, and radiation
+  transport}. All presently published 3D GR(M)HD simulations, with the
+  sole exception of the work by co-PI Ott on massive star collapse,
+  relied on a simple zero-temperature polytropic description of
+  \ns{} stellar structure and followed the dynamical evolution
+  with simple $\Gamma$-law type EOS\@. Such EOS are computationally
+  efficient, but are not necessarily a good description for matter in
+  relativistic astrophysical systems. The inclusion of 
+  finite-temperature EOS, derived from the microphysical descriptions of
+  high-density matter, will lead to qualitatively different and much
+  more astrophysically reliable results (see, e.g.,
+  \cite{ott:07prl}). In addition, most GR(M)HD studies
+  neglect the transport of neutrinos and photons
+  and their interactions with matter. Neutrinos in
+  particular play a crucial role in core-collapse supernovae and in
+  the cooling of NS-NS merger remnants and must not be left out when
+attempting to accurately model such events.  Only a few studies have
+incorporated neutrino and/or photon transport and interactions, in
+approximate ways \cite{ott:07prl,Farris:2008fe}.
+
+ {\bf High-order schemes and AMR\@}. Numerical accuracy is a
+  central issue in long-term GR(M)HD simulations and must be addressed
+  by a combination of (1) adaptive mesh refinement (AMR), concentrating
+  grid points in regions where the most resolution is needed, and (2)
+  high-order numerical techniques. Presently, a large number of GRMHD
+  calculations are still performed without AMR and, due to limited
+  computational resources, may be underresolving the dynamics
+  in the systems under investigation.  In addition, most GRMHD
+  schemes, while implementing high-resolution shock-capturing methods,
+  are still limited to 2nd-order numerical accuracy while performing
+  curvature evolution typically with 4th-order accuracy. Higher order
+  GRMHD schemes are in use in fixed-background simulations (e.g.,
+  \cite{Tchekhovskoy:2007zn}), but still await implementation in 
+  fully dynamical simulations. AMR codes, e.g., the Carpet driver,
+  are available, and the coupling of existing and future GRMHD codes
+  with AMR must be facilitated.
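The AMR strategy described above, concentrating grid points where the solution varies most rapidly, can be illustrated with a toy refinement criterion. The following Python sketch is purely illustrative (the function name and threshold are invented; Carpet itself refines user-specified regions rather than applying this rule), flagging 1D cells by local gradient:

```python
import numpy as np

def flag_for_refinement(u, dx, threshold):
    """Flag cells whose local gradient magnitude exceeds a threshold.

    A toy 1D criterion only: production AMR drivers such as Carpet
    refine user-selected regions of the grid hierarchy; this merely
    illustrates focusing resolution where the solution varies fastest.
    """
    grad = np.abs(np.gradient(u, dx))
    return grad > threshold

# A steep feature near x = 0 should be flagged; the flat tails should not.
x = np.linspace(-5.0, 5.0, 201)
u = np.tanh(5.0 * x)          # steep transition at the origin
flags = flag_for_refinement(u, x[1] - x[0], threshold=1.0)
```

In a real AMR cycle the flagged region would then be covered by a finer grid patch, and the criterion re-evaluated as the solution evolves.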
+
+
 \subsection{Academic and Social}
 
+\begin{itemize}
 
+\item A primary concern for research groups is securing reliable funding to support graduate students and postdoctoral researchers.
+
+\end{itemize}
+
 \section{Design and Strategy for the Einstein Toolkit}
 
-Overall design, strategy.
+\todo{Overall design, strategy.}
 
-Mention: Web pages, open SVN, mail lists.
+The mechanisms for the development and support of the Einstein Toolkit are designed to be open, transparent, and community-driven. All the source code, documentation, and tools included in the Einstein Toolkit are distributed under open source licenses. The Einstein Toolkit maintains an openly accessible SVN repository ({\tt svn.einsteintoolkit.org}) which contains the software supported by the Einstein Toolkit Maintainers, the toolkit web pages, and documentation. An open wiki for documentation ({\tt docs.einsteintoolkit.org}) has been established, to which the community can contribute either anonymously or with personal authentication. All discussion about the toolkit takes place on an open mailing list ({\tt users at einsteintoolkit.org}). The regular weekly meetings of the Einstein Toolkit Maintainers are open, the community is invited to participate, and minutes are recorded on the wiki. The Einstein Toolkit blog requires users to first request a login, but they can then post freely; any user can comment on existing blog entries.
 
 \section{Core Technologies}
 
 \subsection{Cactus Framework}
 
-\paragraph{CactusEinstein}
+The \textbf{Cactus
+  Framework}~\cite{CS_cactus_web,CS_Goodale02a,CS_cactususersguide} is
+an open source, modular, portable programming environment for
+collaborative high-performance computing, primarily developed at
+LSU\@. Cactus provides a generic parallel computational toolkit with
+modules providing parallel drivers, coordinates, boundary conditions,
+interpolators, reduction operators, and efficient I/O in different
+data formats. Generic interfaces make it possible to incorporate
+external packages and improved modules, which then become immediately
+available to all users.  Cactus is involved in the NSF Blue Waters
+consortium for petascale computing, has funding from NSF SDCI (PI
+Schnetter) to develop new application-level tools for performance and
+correctness, and holds the current NSF XiRel award for scalable AMR\@.
+\S\ref{broader} describes the growing impact of Cactus in other
+application domains.
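Cactus modules (``thorns'') declare their interfaces and scheduling in plain-text CCL configuration files rather than in code, which is what makes the generic interfaces mentioned above possible. A minimal, hypothetical sketch (thorn and routine names are invented for illustration):

```
# interface.ccl -- what the thorn provides and depends on
implements: wavetoy
inherits: grid

# schedule.ccl -- when the thorn's routines run
schedule WaveToy_Evolve at CCTK_EVOL
{
  LANG: C
} "Evolve the scalar wave equation"
```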
 
-\paragraph{Whisky Code}
 
+\subsection{Carpet}
+
+While Cactus is distributed with a structured-mesh unigrid MPI
+parallel driver (\codename{PUGH}\footnote{Recent results with a
+  numerical relativity benchmark and PUGH show excellent scaling to
+  130,000 BG/P cores.}), most GR work is now carried out
+with the adaptive-mesh refinement (AMR) driver
+\textbf{Carpet}\footnote{Other mesh refinement libraries that have
+  been integrated with Cactus include \codename{PAGH}
+  (\codename{GrACE}~\cite{CS_grace_web}), \codename{ParCa}
+  (\codename{PARAMESH}~\cite{CS_PARAMESH_web}), and \codename{Taka}
+  (\codename{SAMRAI}~\cite{CS_SAMRAIweb}).}~\cite{CS_Schnetter-etal-03b,
+  CS_carpet_web}, whose development is overseen by Schnetter.
+Carpet provides parallel (MPI and OpenMP) AMR capabilities
+on block-structured grids to applications, handling memory management,
+parallelization, and I/O, while providing an interface for user
+application code to select and change the desired grid hierarchy.  In
+addition, Carpet provides a multi-block infrastructure, allowing the
+coupling of multiple logically Cartesian but otherwise general grid
+blocks.  Carpet is open source and is openly developed, with the main
+development located at LSU (co-PI Schnetter) and contributions from
+the AEI and others.
+
+
+
+\subsection{CactusEinstein} The Cactus Framework was developed by the numerical relativity community, and although it is a general component framework that supports different application domains, its core user group has remained the numerical relativity community. The Cactus Team has traditionally developed and supported a set of core modules for numerical relativity as part of the {\tt CactusEinstein} arrangement. Over the last few years, however, the relevance of many of these modules has declined, and more and more of the basic infrastructure for numerical relativity has been provided by open modules developed and distributed by research groups in the community.
+
+\subsection{Whisky Code}
+
 \subsection{Simulation Factory}
 
 \subsection{Kranc}
 
-\subsection{Carpet}
 
 
 


