[Users] using WeylScal4 and Multipole for WF output -- scheduling issues

Ian Hinder ian.hinder at aei.mpg.de
Fri Oct 22 12:06:03 CDT 2010


On 22 Oct 2010, at 15:32, Bernard Kelly wrote:

> Hi Ian.
>
> Thanks for looking at this so carefully.
>
> On 10/22/10 3:55 AM, Ian Hinder wrote:
>> Could you post the schedule output?  According to the copy of the
>> code that I have from the release version, Multipole_Calc is called in
>> ANALYSIS and WeylScal4_Calculate is called in MoL_PseudoEvolution,
>> which is called in EVOL (and some other places).  The calc_np is a
>> relic from when both were called in ANALYSIS.
> Aha. Yes, I changed that back (i.e., put WeylScal4_Calculate back  
> into ANALYSIS), because I wanted both in the same schedule bin.

Scheduling in Cactus, especially with mesh refinement, is very  
complicated.  The schedule for WeylScal4 has been carefully chosen to  
work (by me, Roland and Erik) - if you change it, you're on your own :)

> Why did that change?

The difference is related to prolongation of mesh refinement  
boundaries.  Psi4 is computed using spatial derivatives, and hence  
needs the boundaries of the grid to be filled in via prolongation from  
the next coarser grid.  Functions scheduled in ANALYSIS cannot have  
time interpolation, due to the time at which ANALYSIS is scheduled  
relative to the Berger-Oliger time stepping.  In order that all points  
in the grid have sensible values, analysis quantities which take  
spatial derivatives should be scheduled in EVOL so that time  
interpolation onto their refinement boundaries is possible.  It is  
also necessary to recompute these quantities after regridding and in  
several other places.  To make this convenient, a new schedule group,
MoL_PseudoEvolution, was introduced in EVOL in which such quantities  
should be computed.  Variables set in this group should have three  
timelevels, just like evolved variables, since time interpolation  
requires this.

Erik: is this description accurate?
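As a concrete illustration, a schedule.ccl sketch of such a "pseudo-evolved" analysis quantity might look like the following. This is a sketch only: the group name Psi4_group and routine name Calc_Psi4 are made up for illustration, not copied from WeylScal4's actual schedule.ccl.

```
# Three timelevels, like an evolved variable, so that Carpet can
# time-interpolate onto mesh refinement boundaries:
STORAGE: Psi4_group[3]

# Compute in MoL_PseudoEvolution (inside EVOL) rather than in ANALYSIS,
# so that prolongation can fill the refinement boundaries:
schedule Calc_Psi4 in MoL_PseudoEvolution
{
  LANG: C
} "Compute Psi4 after each MoL step"
```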

> Are variables in MoL_PseudoEvolution available for ANALYSIS  
> automatically (i.e. will they retain their stored values)?

This depends on how you schedule storage for the variables.  If you  
request storage unconditionally, then yes.  In fact, since output of  
the variables happens after ANALYSIS, the storage had better be  
permanent; otherwise you won't be able to output them.
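In schedule.ccl terms the difference looks roughly like this (a sketch; the group and routine names are illustrative, not from WeylScal4):

```
# Permanent (unconditional) storage, declared at top level of
# schedule.ccl: the variables persist for the whole run and can
# still be output after ANALYSIS.
STORAGE: Psi4_group[3]

# By contrast, a STORAGE line inside a schedule block grants storage
# only around that one routine, so the values are gone by output time:
schedule Calc_Psi4 at ANALYSIS
{
  LANG: C
  STORAGE: Psi4_group[3]
} "Compute Psi4"
```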

> I think part of my problem is that I haven't caught up with the  
> modern set of Cactus bins -- it used to be much simpler.

We used to not have mesh refinement :)

> I'm attaching my local copy of WeylScal4's schedule.ccl, as well as  
> the overall scheduling header for the run. The ANALYSIS-bin lines I  
> didn't understand are 415-417 & 620-622.

> I have noted that though the routine "Multipole_Calc" should happen  
> after "calc_np" according to its schedule.ccl, it appears -before-  
> GROUP WeylScal4_Calculate in the run's initial printing of scheduled  
> routines.


WeylScal4_Calculate is no longer scheduled "as calc_np" as it used to  
be, so there is no dependency information between it and Multipole.   
In the unmodified version, the dependency was automatically enforced  
because WeylScal4 was scheduled in MoL_PseudoEvolution and Multipole  
was scheduled in ANALYSIS.
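If you do put both routines in the same bin, the ordering has to be stated explicitly with an "after" clause, roughly like this (a sketch, not Multipole's actual schedule.ccl):

```
# Without such a clause, Cactus is free to order the two routines
# arbitrarily within ANALYSIS:
schedule Multipole_Calc at ANALYSIS after WeylScal4_Calculate
{
  LANG: C
} "Decompose Psi4 into spherical harmonic modes"
```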

> I've restored the original schedule.ccl now, have recompiled, and am  
> now rerunning. I'll let you know if this helps.

>> I tried to run your parameter file to test, but unfortunately I have
>> some work-in-progress hacks to my local copy of WeylScal4, so it
>> didn't run.  Nevertheless, I think I know what the problem is.  The
>> spatial triad used by WeylScal4 is a coordinate triad (d/dph, d/dr,
>> d/dth), suitably orthonormalised.  This triad is singular on the z
>> axis,
>> and the way it is computed there leads to a NaN.  When you compute
>> spherical harmonic modes, points on the z axis (th = 0) are  
>> multiplied
>> by sin(th) = 0, so the value on the axis is never used, but
>> numerically the NaN still causes problems.
>
> I had a different Carpet grid structure before, and then the same  
> extraction didn't give NaNs. I wonder why?

When Multipole calls the interpolator to find the values of Psi4 on  
the extraction spheres, Carpet uses the finest grid that the point  
exists on.  This could mean the last points in a buffer zone, and if  
you haven't filled these correctly with prolongation, these will be  
poisoned (or undefined, if you're not using poisoning).  So if your  
extraction spheres intersect your refinement boundaries (very  
difficult to avoid in 3D), you might pick up poison from them.
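To catch this kind of problem, you can turn on poisoning in the parameter file, roughly as follows (a sketch; I'm quoting the parameter names from memory, so check them against Carpet's and CarpetLib's param.ccl):

```
# Fill newly allocated / unset grid points with a marker value:
CarpetLib::poison_new_memory = yes
CarpetLib::poison_value      = 114

# Have Carpet scan for the marker value after scheduled routines:
Carpet::check_for_poison = yes
```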

In my opinion, the interpolator should not give points from buffer  
zones, and instead use the coarser grid.  I think this might be the  
case in the new (Mercurial) version of Carpet.  Erik: is that the case?

> I don't think quasi-Kinnersley will be a panacea; theta = 0,pi is  
> still a problem unless you take special precautions. Personally, I  
> think the most "correct" thing to do is your option 2 -- use an open  
> integration formula that avoids evaluation on the poles.

Right, or compute sin(th) Psi4, taking the appropriate limit in the  
neighbourhood of the axis.  But I was more interested in computing the  
gravitational waves themselves, not the mode decomposition.  At Scri+,  
Psi4 is supposed to represent the waves, a physical observable, right?

>> Other comments about your parameter file:
>> * You have asked for the mode decomposition of Psi4r and Psi4i
>> individually, and you have not set the spin weight parameter, so it
>> will default to 0, which is not what you want.  Try
>>
>> Multipole::variables = "WeylScal4::Psi4r{sw=-2 cmplx='WeylScal4::Psi4i' name='Psi4'}"
>>
>> instead.  This tells Multipole to use spin weight -2 for Psi4, and to
>> treat Psi4r and Psi4i as the real and imaginary parts of a complex
>> number.  This means you don't have to recombine the modes of the real
>> and imaginary parts by hand.  WeylScal4 and Multipole could be
>> modified to work with Cactus complex grid functions, but I don't have
>> any experience with using them, and Kranc does not currently support
>> them.
>
> Thanks; for some reason, I thought the grid function metadata had  
> the appropriate spin-weight baked in.

No - the spin weight is not stored in the grid function.  But that's a  
good idea!

-- 
Ian Hinder
ian.hinder at aei.mpg.de


