Two couplings to one interface reloaded

The topic is related to the thread Two couplings to one interface (mixed dimensional coupling), but it will be a bit more extensive. I also drew nicer pictures this time!

Warning: This will be a long post, but with some pictures (I tried to stick with the preCICE color scheme :wink: ).

Short summary

I need to couple a quantity that exists on two parallel meshes to a third mesh. From my investigations I see that it would be beneficial to map the quantities to the third mesh, sum them, and then carry out the acceleration method on the sum. I am convinced that preCICE has all the capabilities for that implemented, but I am not sure whether I can access them as a user or have to work around that.

When running the solver in serial, I can add the quantities within the solver that has the two meshes. However, it becomes complicated for parallel simulations, as I cannot expect all data that needs to be added to exist within the same process.

Problem description

This is the current situation:

[Figure: initial situation]

I have two solvers, Solver A and Solver B, where the domain of Solver A surrounds the domain of Solver B. Now I want to run a coupled situation where data between the meshes has to be exchanged.

The current coupling situation looks like this:

  • Solver A computes a deformation of the top and the bottom (these are in general not the same). The deformations are communicated to Solver B.
  • Solver B computes a pressure that is communicated to Solver A.

Solver B is not interested in the separate deformations d_top and d_bottom, but rather in their sum d_top+d_bottom. It also turns out that the coupled situation is more stable if we exchange d_top+d_bottom and run the acceleration on the combined quantity. So we came up with the current workaround, which looks (roughly) like this:

We ignore the bottom mesh and compute d_top+d_bottom in Solver A and afterwards hand it over to preCICE. Data that needs to be available at meshABottom is copied there within Solver A if necessary.
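To make this workaround more concrete, here is a rough sketch of what it looks like inside Solver A when running in serial. I use the v2 Python API for illustration; the mesh and data names (meshATop, DisplacementSum), the coordinates, and the solver step are placeholders, and the checkpointing needed for implicit coupling is omitted:

import numpy as np
import precice

# Serial run: one process holds both the top and the bottom part of the interface.
rank, size = 0, 1
interface = precice.Interface("SolverA", "precice-config.xml", rank, size)

# Only meshATop is registered with preCICE; meshABottom is ignored on purpose.
n_vertices, dim = 4, 2
top_coordinates = np.zeros((n_vertices, dim))  # placeholder for the real interface coordinates
mesh_id = interface.get_mesh_id("meshATop")
vertex_ids = interface.set_mesh_vertices(mesh_id, top_coordinates)
sum_id = interface.get_data_id("DisplacementSum", mesh_id)

dt = interface.initialize()
while interface.is_coupling_ongoing():
    # placeholders: d_top and d_bottom would come from the actual solver step
    d_top = np.zeros(n_vertices)
    d_bottom = np.zeros(n_vertices)
    # form the sum locally and hand only the combined quantity to preCICE
    interface.write_block_scalar_data(sum_id, vertex_ids, d_top + d_bottom)
    dt = interface.advance(dt)
interface.finalize()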

This works fine as long as we do not run any of the participants in parallel. If we run, for example, Solver A in parallel, the following situation is very common:

The two processes of Solver A do not have matching parts of meshATop and meshABottom, and therefore I cannot compute d_top+d_bottom locally in Solver A.

Solution ideas

The idea to solve it would be roughly the following:

[Figure: potential solution for the parallel case]

  • I hand the top and bottom mesh to preCICE as two different meshes with d_top and d_bottom defined on them.
  • Use the information preCICE already has to map them onto one mesh. This could be meshB or a mesh within Solver A.
  • Do the mathematical operation I want to do, i.e., compute d_top+d_bottom.
  • Do the acceleration on the combined quantity d_top+d_bottom.

Implementing this directly in preCICE would be nice, and it is also on my agenda, but this needs some time and thought. At the moment I need a quick solution.

Realization of solution

I am not sure if any of these ideas are feasible/legal in preCICE. Note that the code is currently using preCICE 1.6. These are the ideas that came up so far:

Option 1: Map from Solver A to Solver A

I could make sure that each process of Solver A has matching parts of meshATop and meshABottom defined. In a first step, the processes write the data they have to their meshes via preCICE. Afterwards, they read from the local mesh using preCICE. All data should then be available locally, such that the computed d_top+d_bottom is written to Solver B and the acceleration is executed on d_top+d_bottom.

In order for the data to be mapped I guess I would have to introduce a fake time window and call advance twice.
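Not sure whether this is actually legal in preCICE, but roughly, for one coupling step inside one process of Solver A, I imagine the calls to look like the sketch below (again v2 Python API for illustration; all mesh and data names, in particular DisplacementBottomMapped, are made up, and d_top_local/d_bottom_local and the coordinates stand for the locally available solver data):

# interface created as in the serial sketch above, but with the actual MPI rank and size
top_id = interface.get_mesh_id("meshATop")
bottom_id = interface.get_mesh_id("meshABottom")
top_vertices = interface.set_mesh_vertices(top_id, local_top_coords)
bottom_vertices = interface.set_mesh_vertices(bottom_id, local_bottom_coords)

write_top = interface.get_data_id("DisplacementTop", top_id)
write_bottom = interface.get_data_id("DisplacementBottom", bottom_id)
read_bottom_on_top = interface.get_data_id("DisplacementBottomMapped", top_id)
sum_id = interface.get_data_id("DisplacementSum", top_id)

dt = interface.initialize()

# 1) write the locally available pieces to their respective meshes
interface.write_block_scalar_data(write_top, top_vertices, d_top_local)
interface.write_block_scalar_data(write_bottom, bottom_vertices, d_bottom_local)

# 2) "fake" time window: advance once so that preCICE maps the bottom data onto meshATop
dt = interface.advance(dt)

# 3) read the mapped bottom displacement at the top vertices, form the sum, and write it for Solver B
d_bottom_mapped = interface.read_block_scalar_data(read_bottom_on_top, top_vertices)
interface.write_block_scalar_data(sum_id, top_vertices, d_top_local + d_bottom_mapped)

# the second advance is the actual coupling step in which d_top+d_bottom goes to Solver B
dt = interface.advance(dt)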

Option 2: Map from Solver A to Solver B then send it back and forth

Solver A writes its local deformations d_top and d_bottom to Solver B via preCICE. No acceleration is used for this exchange. Solver B gets matching pairs of d_top and d_bottom and computes d_top+d_bottom. Solver B sends d_top+d_bottom back to Solver A. Solver A sends d_top+d_bottom back to Solver B, and the acceleration is carried out on the combined quantity. Alternatively, Solver B could send the combined quantity d_top+d_bottom to itself (if that is allowed).

In order for the data to be mapped I guess I would have to introduce a fake time window and call advance twice.

Option 3: Force matching partitioning

In case of a parallel simulation, make sure that the domain of Solver A is always partitioned such that each partition contains matching parts of meshATop and meshABottom.

I am not sure if that is possible. It might be a hard restriction on the possible partitions.

Option 4: Communicate data within Solver A without preCICE

This avoids a lot of problems with the coupling, but it is hard for me to implement. It also feels unnecessary, since preCICE has all the information that would be needed to do that.

If anything is unclear, please let me know and I will draw some more figures and/or come up with more explanations. :wink:


This is an excellent presentation of the topic, with really good figures!

As a quick answer for a quick solution, I think that this is something that the actions feature can do (please correct me if I am wrong). I don’t have enough experience with it to help, but maybe the limited documentation we currently have helps.

Thanks for the quick response! The action interface does indeed look interesting. I am not sure whether it is possible to have two source datasets; all the examples, and also the action:python example, have only one dataset as input.

The summation action sounds like it could be the right thing, but it is not clear what it sums up. It is mentioned in the documentation as a pre-implemented action, but it does not appear in the XML reference. Is this a mistake, or is it no longer part of preCICE?

Edit: Regarding summing up data from multiple input datasets: could I have a chain of actions that write the source data into the target data one after another?

Edit 2: I also realized that I run into the problem that I technically want to map from two meshes onto one, which was not well defined according to uekerman’s post. I am not sure if I can easily work around this using actions. I gave it a quick try, but have failed so far. :frowning:

Sorry for the late reply. This is really a tricky case. I am afraid there is no simple solution, though there should be one. Let me think a bit more about this; I think I understand your case.

The summation action already helps quite a bit. The feature will be released with v2.1 at the end of July 2020. It’s already implemented and merged into develop.

Issue: https://github.com/precice/precice/issues/699
PR: https://github.com/precice/precice/pull/707
Example integration test with corresponding config.
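Roughly, on the participant that receives the mapped data, such an action could look like the following fragment; the participant, mesh, and data names here are just placeholders, and the exact subtags and timing value should be checked against the linked integration test and the XML reference:

<participant name="SolverB">
  <use-mesh name="meshB" provide="yes"/>
  <!-- sums the already-mapped source data into the target data on meshB -->
  <action:summation timing="read-mapping-post" mesh="meshB">
    <source-data name="DisplacementTop"/>
    <source-data name="DisplacementBottom"/>
    <target-data name="DisplacementSum"/>
  </action:summation>
  <!-- mappings, read-data, and further tags omitted -->
</participant>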

An ugly workaround could be to treat “A” as two participants: “A_bottom” and “A_top”. Solver A would then, for example, use the C++ API to create two SolverInterfaces.

Btw, the problem with the documentation is indeed that we don’t have versioned documentation yet. That’s why the feature is listed in the wiki, but not in the XML reference. With the new user documentation this should get better. See also: Restructuring precice.org and unifying the documentation.

No worries! Thanks for the suggestions. I will try them and see if that helps.

Is the develop branch safe for testing, such that I could compile preCICE from source, or should I wait for the release of preCICE v2.1?

It is safe enough to try it and feedback on pre-released states is always welcome!

Thanks, I will try the suggestions and report back. I hope that I will have enough time for that this week. :grinning:

I finally had the time to do some testing. So far I made the following observations:

Observation 1: Multi-coupling configuration and acceleration needs three data vectors?!

When I was setting up the preCICE configuration (XML file), I set the case to coupling-scheme:multi. I would have two solvers, but Solver A would create two SolverInterface instances and thus act as two independent solvers, so preCICE would “think” that there are three solvers in total. During testing I ran into the following error message:

ERROR: For multi coupling, the number of coupling data vectors has to be at least 3, not: 1. Please check the <data .../> subtags in your <acceleration:.../> and make sure that you have at least 3.

I had not set everything up yet, thus I only had 1 coupling data vector. However, it raises the question whether my initial goal can be achieved with this approach at all. I want the acceleration scheme to act only on pressure and d_sum (= d_top + d_bottom). The error message sounds like I would still have to do the acceleration on pressure, d_top, and d_bottom. In that case, I don’t think I gain anything in terms of coupling stability if the acceleration scheme still works on all three quantities.

Observation 2: Creating two instances of SolverInterface in one solver (in Python) fails

I continued with my implementation and adjusted the solvers. Solver A now creates two interfaces:

import precice

# Solver A registers itself with preCICE twice, once per participant name
solver_name_top = preciceSolverName + "Top"
interface_top = precice.Interface(solver_name_top, preciceConfigFileName, rank, size)

solver_name_bottom = preciceSolverName + "Bottom"
print("Trying to initialize {}".format(solver_name_bottom))
interface_bottom = precice.Interface(solver_name_bottom, preciceConfigFileName, rank, size)

When starting the solvers, I would see that they hang while trying to initiate the communication between the solvers (initialize: Setting up master communication to coupling partner/s). Starting the simulation with a debug build of preCICE results in the following assertion being triggered:

ASSERTION FAILED
  Location:          static void precice::utils::Parallel::initializeManagedMPI(int*, char***)
  File:              /home/jaustar/software/compilescripts/precice/precice-82cf90a/src/utils/Parallel.cpp:196
  Rank:              0
  Failed expression: !_isInitialized
  Argument 0: A managed MPI session already exists.

I have omitted the stack trace that follows. If I interpret this correctly, it means that I cannot create two instances of SolverInterface. Is this caused by the Python bindings, or is it a general preCICE limitation?

I am thankful for any hints on how to solve this!

Concerning Obs 1

I turned the error message for multi coupling into a warning. Still, I think that this constraint makes sense in general: every coupled participant needs at least one output in the multi coupling, such that the quasi-Newton can get a grasp on its behavior. Of course, in your case you actually only have two solvers.
But … you also have three data vectors in your case (in the latest config you sent me), and I guess this makes sense for you as well:

<acceleration:IQN-IMVJ>
  <data name="DisplacementTop" mesh="FractureMeshTop"/> 
  <data name="DisplacementBottom" mesh="FractureMeshBottom"/> 
  <data name="Pressure" mesh="HDFlowMesh"/> 
  <initial-relaxation value="0.001"/>
  <imvj-restart-mode type="RS-SVD" truncation-threshold="0.01" chunk-size="8" />
  <max-used-iterations value="100"/>
  <time-windows-reused value="10"/>
  <filter limit="1e-4" type="QR2"/>
  <preconditioner type="residual-sum"/> 
</acceleration:IQN-IMVJ>

Concerning Obs 2

Yes, this could be a deal breaker for now. It has nothing to do with the Python bindings, but is a general limitation of preCICE: we still have some static state, something we want to fix in the long run, see for example https://github.com/precice/precice/issues/385. I also opened an issue for your case: https://github.com/precice/precice/issues/828. But it is not something we can fix right now, I am afraid :cry:

Maybe we have to find another solution after all. Fixing observation 2 would still lead to observation 1, which is what I want to avoid. I already have the solvers up and running using a standard implicit coupling with two solvers and the three data sets being exchanged. The addition can simply be done in Solver B, where the mapped data is available, so I don’t need the summation action in that case. So even if I could create two SolverInterface instances, I would (more or less) end up with a coupling similar to the one I already have.

I can see that it makes sense for the multi-coupling to require three (or more) data vectors. Maybe I need something in between the multi-coupling and the “normal” implicit coupling.
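For reference, the exchange part of the coupling I already have looks roughly like the following; the participant names are placeholders, and the time window settings, convergence measures, and the IQN-IMVJ block quoted above are omitted:

<coupling-scheme:parallel-implicit>
  <participants first="SolverA" second="SolverB"/>
  <exchange data="DisplacementTop" mesh="FractureMeshTop" from="SolverA" to="SolverB"/>
  <exchange data="DisplacementBottom" mesh="FractureMeshBottom" from="SolverA" to="SolverB"/>
  <exchange data="Pressure" mesh="HDFlowMesh" from="SolverB" to="SolverA"/>
</coupling-scheme:parallel-implicit>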

We are working on a proper solution in https://github.com/precice/precice/pull/849.

That sounds good. :slight_smile:
