I have a preCICE adapter for an in-house fluid solver that I am verifying with a variation of the partitioned pipe tutorial. My case is inviscid flow of air at Mach 0.1 entering from the left into quiescent air. My coupled run arrives at the correct steady-state solution, but the transient behavior differs from the uncoupled case in which the two pipes form a single grid. For example, the horizontal velocity at time 0.1 seconds (vertical black lines mark the coupling interface’s location):
Presumably the difference arises from how I implemented the coupling interface in the adapter. In the tutorial, one solver writes pressure and the other writes density and velocity. This makes sense for the slow flow in the tutorial and might work for my problem, but the intended use for this solver involves compressible flow, with supersonic regions possibly crossing the coupling interface. The solver is an upwind code, and I am implementing the interface boundary condition in that framework so that it will work in all the flow regimes we need. At this point I run into problems.
The code implements boundary conditions by setting values in ghost cells. Analogous to the single-grid case, a reasonable implementation of the interface would be to have each solver write the density, velocity, and pressure from its interior side of the coupling interface, read the corresponding values written by the other solver, set the ghost cell states to the read values, and proceed as usual. I have found that this does not work. Data from preCICE is only read once per coupling iteration, so within one iteration the ghost states would be constant, which is not analogous to the single-grid case and leads to a crash. To get the picture above, I instead used the state read from preCICE to set a farfield-like boundary condition at the coupling interface for each solver. This approach does allow the ghost values to vary within a coupling iteration but, clearly, not in a completely correct way.
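To make the control-flow issue concrete, here is a minimal mock of the ghost-cell coupling just described. The preCICE write/read calls are replaced by a stand-in `exchange` dict, and the "Newton update" is a placeholder; all names are illustrative, not the actual adapter's API. The point it shows: the read happens once per coupling iteration, outside the inner iteration loop, so the ghost state is frozen while the interior evolves.

```python
# Mock of the ghost-cell interface coupling (illustrative only, no real
# preCICE calls). The read sits outside the Newton loop, so the ghost
# state cannot track the neighbor's updates within one coupling iteration.
import numpy as np

exchange = {}  # stand-in for writing/reading data through preCICE

def coupling_iteration(side, interior, n_newton=5):
    # write rho, u, p from the interior cell adjacent to the interface
    exchange[side] = interior.copy()
    # read what the other side wrote -- ONCE per coupling iteration
    other = "right" if side == "left" else "left"
    ghost = exchange.get(other, interior).copy()  # fall back to own state
                                                  # before the first exchange
    ghost_history = []
    for _ in range(n_newton):
        ghost_history.append(ghost.copy())    # ghost never updated in here
        interior = 0.5 * (interior + ghost)   # placeholder "Newton update"
    return interior, ghost_history

state_left = np.array([1.2, 34.0, 101325.0])   # rho, u, p (illustrative)
state_right = np.array([1.2, 0.0, 101325.0])
_, hist = coupling_iteration("left", state_left)
_, hist = coupling_iteration("right", state_right)
# every entry identical: the ghost cannot follow the neighbor's iterations
print(all(np.array_equal(h, hist[0]) for h in hist))
```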
Has anyone else encountered this problem and how did you solve it?
This does not sound so simple. Why do you have varying ghost states during a coupling time step? Do you use subcycling within the individual solvers, or does the ghost state change due to the iterative solver within a single time step? If the latter applies, would you expect the data to vary within each iteration?
Hi @DavidSCN. In my test case there is one solver time step per time window, so no subcycling. My solver obtains the solution at the end of a time step by Newton iterations. In the single-grid case, each Newton iteration involves all the cells and updates all the cells. So any cell will see its state change and the states in its neighbors change multiple times during a time step.
In the partitioned case, the ghost cells on the coupling interface are analogous to the neighbor cells (on the other side of the interface) in the single-grid case. So to mimic the single-grid solution process, the ghost states would have to change with each Newton iteration, because they are identified with neighbor cells whose states change. But those neighbors belong to the other solver, so their values have to be communicated through preCICE and are therefore only read once per time step, not once per Newton iteration as they would need to be. I am trying to mimic the single-grid case as closely as possible so that the partitioned case will be time-accurate.
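A toy problem (not my actual solver, just an illustration of the argument above): a 1D Laplace problem split into two subdomains, solved with Jacobi sweeps standing in for the inner iterations. If the interface values are refreshed every sweep, the split iteration reproduces the single-grid iteration exactly; if they are frozen and exchanged only once, each subdomain converges to the wrong problem.

```python
# 1D Laplace toy: u'' = 0 on (0,1), u(0)=0, u(1)=1, exact solution u(x)=x.
# Compares single-grid Jacobi with a two-subdomain split whose ghost values
# refresh either every sweep or only once per "data exchange".
import numpy as np

N = 9                              # interior points at x = 0.1 ... 0.9
x = np.linspace(0.1, 0.9, N)
exact = x.copy()

def jacobi_sweep(u, left_bc, right_bc):
    """One sweep of u_i = (u_{i-1} + u_{i+1}) / 2 with given end values."""
    padded = np.concatenate(([left_bc], u, [right_bc]))
    return 0.5 * (padded[:-2] + padded[2:])

# Single-grid reference: every sweep sees fresh neighbor values everywhere.
u_mono = np.zeros(N)
for _ in range(200):
    u_mono = jacobi_sweep(u_mono, 0.0, 1.0)

def partitioned(n_exchanges, sweeps_per_exchange):
    """Split at the middle; ghost values refresh only at each exchange."""
    uL, uR = np.zeros(4), np.zeros(5)      # points 1..4 and 5..9
    ghostL = ghostR = 0.0
    for _ in range(n_exchanges):
        ghostL, ghostR = uR[0], uL[-1]     # "read" once per exchange
        for _ in range(sweeps_per_exchange):
            uL = jacobi_sweep(uL, 0.0, ghostL)   # ghost frozen in here
            uR = jacobi_sweep(uR, ghostR, 1.0)
    return np.concatenate((uL, uR))

u_frozen = partitioned(n_exchanges=1, sweeps_per_exchange=200)
u_fresh = partitioned(n_exchanges=200, sweeps_per_exchange=1)

print(np.abs(u_mono - exact).max())    # small: converged to u = x
print(np.abs(u_frozen - exact).max())  # large: left half stuck near zero
print(np.abs(u_fresh - exact).max())   # identical to the single-grid sweeps
```

With one exchange the frozen ghosts pin the interface at its initial value, so the left subdomain converges to zero; with an exchange every sweep the two halves are, sweep for sweep, the same iteration as the single grid.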
I have so far found two approaches that somewhat qualify as solutions. The first is to fix bugs, in particular to make sure that restoring a checkpoint does not interfere with the previous-time-step data that the Newton iteration needs to solve the correct problem. With that fixed, and using the standard preCICE control flow of read → solve → write → restore or advance, I get a much better but not quite perfect solution:
The interface still causes some inaccuracy, the cause of which I am still investigating.
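In skeleton form, the fix to the first approach amounts to making the checkpoint cover the previous-time-step solution as well, so that restoring it for a repeated coupling iteration hands the Newton solver the same problem each time. The sketch below mocks the standard implicit control flow (no real preCICE calls; the checkpoint conditions correspond to preCICE's write/read-checkpoint requirements):

```python
# Mock of the read -> solve -> write -> restore-or-advance control flow.
# The checkpoint holds u_old (previous-time-step data) alongside u, so every
# retry of a time window solves the same Newton problem.
import copy

class State:
    def __init__(self):
        self.u = 0.0       # current solution
        self.u_old = 0.0   # previous-time-step data used by the Newton RHS

state = State()
windows, iters_per_window = 3, 2   # pretend preCICE needs 2 iterations/window

log = []
for w in range(windows):
    checkpoint = copy.deepcopy(state)          # write checkpoint
    for it in range(iters_per_window):
        if it > 0:
            state = copy.deepcopy(checkpoint)  # read (restore) checkpoint
        # read coupling data ... (once per coupling iteration)
        log.append(state.u_old)                # same u_old for every retry
        state.u = state.u_old + 1.0            # placeholder "solve"
        # write coupling data ... advance()
    state.u_old = state.u                      # window accepted: advance

print(log)  # u_old is constant within each window: [0, 0, 1, 1, 2, 2]
```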
The second approach is to use preCICE only to communicate data and to alter the loop structure (again after fixing the bugs). Instead of running an inner loop of multiple Newton iterations inside each outer time-step loop, with data exchanged once per outer loop and preCICE controlling the outer loop, perform only one Newton iteration per coupling iteration and do not overwrite the solution with the checkpoint. This has the effect of performing Newton iterations until preCICE deems the solution converged, but crucially the data exchange happens once per Newton iteration, which was the original goal.
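Sketched as a mock (again, no real preCICE calls): preCICE's coupling-iteration loop is reused as the Newton loop, the checkpoint is never copied back over the solution, and the exchange therefore happens on every Newton iteration.

```python
# Mock of the second approach: one Newton iteration per coupling iteration,
# no checkpoint restore, data exchanged every iteration. The convergence
# check stands in for preCICE's convergence measure on the interface data.

def newton_update(u, target):
    return u + 0.5 * (target - u)    # placeholder contraction toward target

u, target = 0.0, 8.0
exchanges = newton_iters = 0
converged = False
while not converged:                 # preCICE's coupling-iteration loop
    # read coupling data from the other participant (every iteration)
    exchanges += 1
    u = newton_update(u, target)     # exactly ONE Newton iteration
    newton_iters += 1
    # write coupling data, advance(); do NOT restore the checkpoint
    converged = abs(u - target) < 1e-6   # stand-in convergence measure

print(exchanges == newton_iters)     # one exchange per Newton iteration
```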
The downside to this approach is that one cannot use preCICE’s acceleration features, but in my case the second approach runs about 4x faster than the first anyway and is considerably more accurate:
That’s indeed problematic and not the original purpose of preCICE. Usually, you implement boundary conditions and exchange them once per time step. The accuracy is supposed to originate from the implicit coupling, but I guess you already know this.
Depending on your time discretization you might also need to restore data from previous time steps.
I think there are multiple possible workarounds. I have not yet tried anything similar, but you could try to define the Newton iterations as explicit coupling in your config and add an additional implicit coupling for the outer loop. You might need to register different data sets in the preCICE config in order to handle things properly in the adapter. As I said, I cannot guarantee that this works, but it might be worth a try. In general, the much more native approach would be to exchange data only once per time step.
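For reference, the "native" setup mentioned above — plain implicit coupling with one exchange per time window and accuracy coming from the coupling iterations — would look roughly like this in the preCICE config. Participant, mesh, and data names here are placeholders, and the exact element syntax should be checked against your preCICE version:

```xml
<coupling-scheme:serial-implicit>
  <participants first="FluidLeft" second="FluidRight" />
  <time-window-size value="1e-3" />
  <max-iterations value="50" />
  <exchange data="State-Left"  mesh="Interface-Mesh" from="FluidLeft"  to="FluidRight" />
  <exchange data="State-Right" mesh="Interface-Mesh" from="FluidRight" to="FluidLeft" />
  <relative-convergence-measure limit="1e-6" data="State-Left" mesh="Interface-Mesh" />
</coupling-scheme:serial-implicit>
```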
Just out of curiosity: what absolute/relative convergence criteria do you use for the Newton solver, for any iterative solvers inside it, and for the preCICE iterations? I can see that splitting the domain in two means you solve a slightly different problem, and that this affects the overall solution and also the Newton solver. Nevertheless, I would not expect the Newton steps and solutions to change too much as long as the tolerances are tight enough.
It would also be easier if you could plot the velocity over a line for comparison, or actually compute some error. At least for me it is hard to see what deviations you see and how big they are.