[Users] coupling Llama to an evolution code

Miguel Zilhão miguel.zilhao.nogueira at tecnico.ulisboa.pt
Wed May 8 18:08:51 CDT 2019

hi again,

i have a follow-up question regarding this. i'm following Roland's implementation of the WaveToy 
code with Llama, and i'm running into the following issue.

when i inherit from the Coordinates thorn, the function MultiPatch_GetDomainSpecification becomes 
aliased, and this becomes a problem if i want to use the same thorn *without* Llama. in other 
words, when adding

  inherits: Coordinates

to a thorn's interface.ccl file, one then needs to activate the Coordinates thorn in the parfile 
upon running the code, whether or not one wants to use Llama. but then, if multipatch is not used 
(i.e., with Carpet::domain_from_coordbase = yes), the following error occurs:

   void Carpet::get_domain_specification(const cGH*, int, const ivect&,
   CarpetLib::rvect&, CarpetLib::rvect&, CarpetLib::rvect&): Assertion `not
   CCTK_IsFunctionAliased("MultiPatch_GetDomainSpecification")' failed.

is there a simple way of having a Llama-aware thorn which can also run without multipatch if so desired?
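
for reference, the kind of pattern i have in mind (just a sketch, untested -- the exact prototype 
should be copied from Llama's Coordinates interface.ccl, i'm writing it from memory here) would be 
to declare the function as optional with USES FUNCTION instead of inheriting Coordinates, and then 
guard any calls at runtime with CCTK_IsFunctionAliased:

   # interface.ccl sketch: declare the multipatch function as optional
   # instead of "inherits: Coordinates", so the thorn can build and run
   # whether or not Llama's Coordinates thorn is active
   CCTK_INT FUNCTION MultiPatch_GetDomainSpecification \
       (CCTK_INT IN map, \
        CCTK_INT IN size, \
        CCTK_REAL OUT ARRAY physical_min, \
        CCTK_REAL OUT ARRAY physical_max, \
        CCTK_REAL OUT ARRAY interior_min, \
        CCTK_REAL OUT ARRAY interior_max, \
        CCTK_REAL OUT ARRAY exterior_min, \
        CCTK_REAL OUT ARRAY exterior_max, \
        CCTK_REAL OUT ARRAY spacing)
   USES FUNCTION MultiPatch_GetDomainSpecification

in the C code one would then branch on CCTK_IsFunctionAliased("MultiPatch_GetDomainSpecification") 
and fall back to the CoordBase quantities when it returns false. whether that alone removes the 
need to activate Coordinates in single-patch runs i'm not sure -- which is really my question.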

i've found a previous discussion of a similar issue 
when using CTGamma, where the suggestion was to activate the thorn CTGamma/CartesianCoordinates when 
not using multipatch. i'm guessing that this thorn provides all the grid functions that Coordinates 
would otherwise provide.

is this then the only solution, i.e., creating a helper thorn with a "trivial" Coordinates implementation?
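
just to make concrete what i mean, the two run configurations would look something like this 
(parfile sketch; thorn lists abbreviated, and the CartesianCoordinates name is taken from the 
CTGamma discussion rather than from anything i've tested):

   # (a) multipatch run with Llama active:
   ActiveThorns = "... Coordinates Interpolate2 ..."
   Carpet::domain_from_multipatch = yes

   # (b) single-patch run with a trivial Cartesian "Coordinates" implementation:
   ActiveThorns = "... CartesianCoordinates ..."
   Carpet::domain_from_coordbase = yes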


On 22/04/19 21:45, Miguel Zilhão wrote:
> thanks Roland!
> this should be enough to get me started. i'll report back if i run into any difficulty.
> cheers,
> Miguel
> On 22/04/19 13:32, Haas, Roland wrote:
>> Hello Miguel,
>> I gave a tutorial on this (for a WaveToy code) at the NCSA ET meeting:
>> https://drive.google.com/open?id=0B4gNfWainf-5dGcxQzNuOUtEUFk
>> The code is (likely, given its name) in the "rhaas/llama" branch of
>> the cactusexamples repo:
>> cd repos/cactusexamples
>> git checkout rhaas/llama
>> should get it for you.
>> Yours,
>> Roland
>>> hi all,
>>> i have a few evolution codes that i would like to make Llama-aware. one of them would be the
>>> LeanBSSNMoL thorn, that was included in the latest ET release.
>>> is there a canonical procedure to do this, or any documentation that i should follow? i understand
>>> that the main thing to change are the finite differencing operations... is there a standard way of
>>> performing this change? or anything else i should be aware of?
>>> thanks,
>>> Miguel
>>> _______________________________________________
>>> Users mailing list
>>> Users at einsteintoolkit.org
>>> http://lists.einsteintoolkit.org/mailman/listinfo/users
