[ET Trac] #2639: CarpetLib internal error with PreSync and analysis thorns
Samuel Cupp
trac-noreply at einsteintoolkit.org
Fri Sep 16 20:18:09 CDT 2022
#2639: CarpetLib internal error with PreSync and analysis thorns
Reporter: Samuel Cupp
Status: new
Milestone:
Version: development version
Type: bug
Priority: major
Component: Carpet
I’ve found a strange error involving variables from analysis thorns with multiple timelevels (such as WeylScal4 and ML_ADMConstraints). The error is
```
WARNING level 0 from host r1i6n25.ib0.sawtooth.inl.gov process 0
in thorn CarpetLib, file /home/cuppsamu/toolkit/Cactus/arrangements/Carpet/CarpetLib/src/gdata.cc:315:
-> Internal error: extrapolation in time. variable=ML_ADMCONSTRAINTS::H time=0.040625000000000001 times=[0.018749999999999999,0,-0.018749999999999999]
cactus_intel19: /home/cuppsamu/toolkit/Cactus/arrangements/Carpet/Carpet/src/helpers.cc:275: int Carpet::Abort(const _cGH *, int): Assertion `0' failed.
Rank 0 with PID 185127 received signal 6
```
This error comes from repos/carpet/CarpetLib/src/gdata.cc:
```c++
void gdata::find_source_timelevel(vector<CCTK_REAL> const &times,
                                  CCTK_REAL const time, int const order_time,
                                  operator_type const op, int &timelevel0,
                                  int &ntimelevels) const {
  // Ensure that the times are consistent
  assert(times.size() > 0);
  assert(order_time >= 0);

  CCTK_REAL const eps = 1.0e-12;
  CCTK_REAL const min_time = *min_element(times.begin(), times.end());
  CCTK_REAL const max_time = *max_element(times.begin(), times.end());
  // TODO: Use a real delta-time from somewhere instead of 1.0
  CCTK_REAL const some_time = std::fabs(min_time) + std::fabs(max_time) + 1.0;
  if (op != op_copy) {
    if (time < min_time - eps * some_time or
        time > max_time + eps * some_time) {
      ostringstream buf;
      buf << setprecision(17) << "Internal error: extrapolation in time.";
      if (varindex >= 0) {
        char *const fullname = CCTK_FullName(varindex);
        buf << " variable=" << fullname;
        ::free(fullname);
      }
      buf << " time=" << time << " times=" << times;
      CCTK_ERROR(buf.str().c_str());
    }
  }
```
I attached the backtrace, as well as the thornlist and parfile I am using.
If I comment out ML_ADMConstraints, I get the same error for WeylScal4. If I also comment out WeylScal4, the simulation proceeds until I hit an error related to the read/writes I’m working on. Since this happens in CycleTimeLevels → SyncProlongateGroups, it should be related to how we’re handling the timelevels for these variables. My understanding of why these analysis variables need multiple timelevels is that the IO thorns may try to output at an iteration where the coarsest level isn’t available, so Carpet has to interpolate in time to get the values. However, these variables are only for analysis, so they are only ever written and never read (except by IO). Since the error only happens if I turn on PreSync (Cactus::presync_mode = "mixed-error"), I suspect this is somehow connected to how we automate the handling of ghost zones/outer boundaries (no syncs are automatically triggered, since there are no READs for these variables), but I don’t have a clear picture of how that could cause this.
More confusingly, if I just add ML_ADMConstraints to magnetizedTOV-Baikal.par in Baikal, it doesn’t error out. I don’t have a guess for what causes the behavior, or what triggers it in this parfile.
attachment: backtrace.1.txt (https://api.bitbucket.org/2.0/repositories/einsteintoolkit/tickets/issues/2639/attachments/backtrace.1.txt)
attachment: bbh.par (https://api.bitbucket.org/2.0/repositories/einsteintoolkit/tickets/issues/2639/attachments/bbh.par)
attachment: thornlist.th (https://api.bitbucket.org/2.0/repositories/einsteintoolkit/tickets/issues/2639/attachments/thornlist.th)
--
Ticket URL: https://bitbucket.org/einsteintoolkit/tickets/issues/2639/carpetlib-internal-error-with-presync-and