2D-3D Coupling OpenFOAM-DUNE

Hello preCICE-Users,

I get the same error as Alphaoo1 posted above, so I thought replying here would be the best option.
I am trying to couple a solid solver written with DUNE to OpenFOAM. The 2D case works fine without further problems, but switching to 3D results in the errors mentioned above, with the same behavior for all coupling methods.
I already checked the meshes of the fluid and the solid, and both look fine to me (see picture). The coupling-interface coordinates passed to preCICE are also correct.

I use preCICE version 2.1 and the fluid setup of the 3D fenics-openfoam test case
(the one with the flap in the channel).

Does anyone have an idea how to solve this problem, or at least where it could
come from?

Best regards,

Hi Max,

If OpenFOAM fails with a floating point error, that probably just means that the coupled simulation crashed. It’s the symptom, not the problem.
When does it crash? Which preCICE config do you use? What do the convergence and iteration files of preCICE say?

My best guess right now would be that something with the 2D-3D mapping does not work. Your case is quasi-2D, but you simulate it in 3D. Does a real 3D case work? Something like https://github.com/precice/tutorials/tree/master/FSI/3D_Tube/OpenFOAM-CalculiX ?

Btw, great that you use preCICE with DUNE. Please keep us posted on how that goes. As far as I know, nobody has done that before. Any contributions in that direction would be very much appreciated.



I basically use exactly the same precice-config as in the 3D fenics FSI tutorial with the flap. I can use either nearest-neighbor or RBF mapping, but in both cases the simulation crashes within roughly 1-5 time windows after the start. I begin with the fluid part, and the first mapping to the structure side works fine; it then crashes while mapping from structure to fluid. The same behavior occurs for both explicit and implicit coupling.

For me, trying to simulate the flap in 3D (or quasi-2D) was the most sensible step after the 2D case was working. So I don’t know whether a real 3D case would work; I haven’t tested one.


@MaxFirmbach could you please post the preCICE-related log from both participants? I assume that you only get this error with RBF mapping?

We have seen such problems with CalculiX lately, but maybe there is something that we need to look deeper into here.

Hey Makis,

unfortunately it fails with both mappings.
I hope the attached log files are the ones you need: the convergence files and the debug files
that are written/configured via the precice-config.xml file.

I also thought that preCICE does a 2D-3D mapping even for the 2D case, since OpenFOAM always uses a 3D mesh, and in the quasi-2D case the coupling is done on surfaces, so this is a 3D coupling… Or am I misunderstanding something there?

precice-FluidSolver-iterations.log (69 Bytes)
precice-FluidSolver-iterations.log (69 Bytes)
precice-StructureSolver-convergence.log (2.2 KB)
debug_Solid.log (253.3 KB)
debug_Fluid.log (9.7 KB)

What is your preCICE config?
Could you also please upload precice-StructureSolver-iterations.log?
Your coupling convergence measures seem rather tight. Does the coupling get more robust if you loosen them?

No, you are right. In the 2D-3D case, the OpenFOAM adapter does the mapping between 2D and 3D. If you do 3D-3D for a quasi-2D case, however, things can go wrong in the span-wise direction. Which coupling mesh (which coordinates in z direction) do you use in DUNE then? Try to export all coupling meshes as VTK.
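For reference, exporting the coupling meshes can be configured directly in precice-config.xml. A sketch for preCICE 2.x; the participant name and output directory are placeholders that have to match your config:

```xml
<participant name="FluidSolver">
  <!-- existing use-mesh / mapping tags stay as they are -->
  <!-- writes every mesh this participant knows as VTK, once per time window -->
  <export:vtk directory="precice-exports" />
</participant>
```

The exported files can then be inspected in ParaView to check the z coordinates of both coupling meshes.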

It seems like the Fluid-iteration.log and Structure-iteration.log files are the same.

I use gmsh to create the mesh as a .msh file. I can load it with DUNE and loop over the mesh entities, e.g. the vertices. Each vertex has three coordinates, where the z coordinate corresponds to the span-wise direction. I then pick all vertices belonging to the coupling interface and write them into the array that preCICE needs for the setup.
So the coupling mesh should consist of the vertices on the surface of my flap, as in the picture above, where red and grey share the same interface.
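In code, that step looks roughly like the following sketch. The interface predicate and all names are illustrative (in the real case the vertices come from iterating over the DUNE grid entities); only the commented-out line is the actual preCICE 2.x C++ API call:

```cpp
#include <array>
#include <vector>

// Collect the coupling-interface vertices into the flat coordinate
// array that preCICE expects: [x0, y0, z0, x1, y1, z1, ...].
std::vector<double> packInterfaceVertices(
    const std::vector<std::array<double, 3>>& vertices,
    bool (*onInterface)(const std::array<double, 3>&)) {
  std::vector<double> coords;
  for (const auto& v : vertices) {
    if (onInterface(v)) {
      coords.insert(coords.end(), v.begin(), v.end());
    }
  }
  // Afterwards one would register the vertices with preCICE, e.g.:
  //   interface.setMeshVertices(meshID, coords.size() / 3,
  //                             coords.data(), vertexIDs.data());
  return coords;
}
```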

precice-config.xml (3.5 KB)
precice-StructureSolver-iterations.log (69 Bytes)

You currently use no acceleration. For a strong physical coupling this is known to diverge. What happens if you use the acceleration you currently have commented out? Then the structure iterations file will also contain more information than the fluid iterations file, namely information about the acceleration, which only the second participant of the coupling scheme (StructureSolver in your case) knows.
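For reference, a typical quasi-Newton acceleration block for such an FSI case looks roughly like this in preCICE 2.x (the data and mesh names are placeholders and have to match your config; the numerical values are common starting points, not tuned for your case):

```xml
<acceleration:IQN-ILS>
  <data name="Displacement" mesh="StructureMesh" />
  <preconditioner type="residual-sum" />
  <filter type="QR1" limit="1e-6" />
  <initial-relaxation value="0.1" />
  <max-used-iterations value="50" />
  <time-windows-reused value="10" />
</acceleration:IQN-ILS>
```

This goes inside the implicit coupling-scheme block of precice-config.xml.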

There could be a problem with how you currently treat the z coordinates. OpenFOAM provides forces at the cell centers, which you try to read at the cell vertices of DUNE. In between, you use a nearest-neighbor mapping. For each cell, both DUNE vertices have the same distance from the OpenFOAM cell center, so which side the force gets mapped to is arbitrary. You could circumvent this problem by e.g.

<mapping:rbf-thin-plate-splines z-dead="true" direction="write" 
  from="Fluid-Mesh-Faces" to="StructureMesh" constraint="conservative" timing="initial"/>

z-dead="true" means that the z coordinate is ignored in the mapping, so the mapping problem is reduced to a 2D problem.
Was that understandable?


Yes, that was understandable. I tried to run the simulation with the
recommended changes and it behaves a bit better.
Still, the simulation crashes. I’ll attach the structure iterations file
with the acceleration information.

Thanks for the help so far.

precice-StructureSolver-iterations.log (146 Bytes)
precice-StructureSolver-convergence.log (767 Bytes)

I also tried to play with different parameters. Perhaps the most interesting observation is that when I lower the fluid density, the simulation reaches a time of around 2 s before the floating point exception occurs.
During the simulation the flap also bends backwards, which looks physically correct. Why it breaks seemingly at random is still strange, because the simulation shows no oscillations or strange mesh movement.

Lowering the fluid density decreases the added-mass effect / instability; see e.g. Causin et al. or van Brummelen. So it’s not unexpected that the coupling becomes more stable.

So, it might be just a coupling instability, no bug.

For this case, which breaks after two seconds could you please provide:

  • your preCICE config,
  • the iterations and convergence files,
  • the tolerance and convergence measures, which you use within your solvers, and
  • a screenshot of your setup just before diverging (ideally also showing the mesh).


Ok, that could be the reason. I don’t have all the files for the 2.00 s run, but the uploaded files should show the same behavior for a run up to 0.68 s.

On the structure side I use an RK45 scheme with adaptive time stepping (here I’m not sure about all the tolerance values).
On the fluid side, in OpenFOAM, I use something like this:

solver PCG;
preconditioner DIC;
tolerance 1e-8;
relTol 1.0e-3;

solver smoothSolver;
smoother symGaussSeidel;
tolerance 1e-6;
relTol 1e-4;
minIter 2;
These should hopefully be the most important tolerances on the fluid side.

Picture of the fluid and solid mesh before the crash; the fluid density is 1e-2, so the deformation is quite small:

And all the config, iteration and convergence files:
precice-StructureSolver-iterations.log (1.7 KB)
precice-StructureSolver-convergence.log (9.2 KB)
precice-FluidSolver-iterations.log (1.1 KB)
precice-config.xml (3.2 KB)

Your case converges quite nicely.
There are still a few parameters you could tune, but that’s apparently not the problem here. Just before the crash, the last force residual is 0. How does your case actually crash? Could you please also upload the terminal output of both solvers?

So the fluid solver crashes with a floating point exception, and as a consequence the solid solver ends communication without an error.

The fluid solver output:

Courant Number mean: 1.24619 max: 12.2085
Time = 0.68

smoothSolver:  Solving for cellDisplacementy, Initial residual = 0.0181902, Final residual = 9.9823e-07, No Iterations 182
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: User provided function() line 0 in  unknown file  
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

The output from the iteration before:

Courant Number mean: 1.43696 max: 5.28101
Time = 0.67

smoothSolver:  Solving for cellDisplacementx, Initial residual = 0.019039, Final residual = 9.99036e-07, No Iterations 203
smoothSolver:  Solving for cellDisplacementy, Initial residual = 0.0149607, Final residual = 9.98013e-07, No Iterations 137
DICPCG:  Solving for pcorr, Initial residual = 1, Final residual = 9.56222e-09, No Iterations 102
DICPCG:  Solving for pcorr, Initial residual = 0.000108822, Final residual = 7.19479e-09, No Iterations 79
time step continuity errors : sum local = 5.22968e-13, global = 1.52252e-14, cumulative = 1.246e-08
smoothSolver:  Solving for Uy, Initial residual = 0.00662969, Final residual = 1.52762e-07, No Iterations 4
DICPCG:  Solving for p, Initial residual = 0.579885, Final residual = 0.000408769, No Iterations 73
DICPCG:  Solving for p, Initial residual = 6.08125e-05, Final residual = 4.35898e-08, No Iterations 67
time step continuity errors : sum local = 7.24805e-08, global = -3.5555e-10, cumulative = 1.21044e-08
DICPCG:  Solving for p, Initial residual = 0.0166688, Final residual = 1.2164e-05, No Iterations 69
DICPCG:  Solving for p, Initial residual = 2.15173e-05, Final residual = 1.95785e-08, No Iterations 45
time step continuity errors : sum local = 5.17453e-08, global = -1.18473e-08, cumulative = 2.57179e-10
DICPCG:  Solving for p, Initial residual = 0.00910024, Final residual = 8.23503e-06, No Iterations 69
DICPCG:  Solving for p, Initial residual = 1.34088e-05, Final residual = 1.11752e-08, No Iterations 65
time step continuity errors : sum local = 3.63172e-08, global = -1.63888e-09, cumulative = -1.3817e-09
DICPCG:  Solving for p, Initial residual = 0.00615916, Final residual = 5.598e-06, No Iterations 69
DICPCG:  Solving for p, Initial residual = 8.67066e-06, Final residual = 9.91412e-09, No Iterations 60
time step continuity errors : sum local = 3.60104e-08, global = -2.50135e-09, cumulative = -3.88305e-09
ExecutionTime = 127.99 s  ClockTime = 174 s

I guess the Courant number just before the crash is quite high compared to the mean value and to the one from the iteration before.

In the beginning it also solves for Ux with the smooth solver, but that entry is missing at some point as well.

Is there any other warning or error further up in the fluid output?
Do you run with preCICE in debug mode? If not, that could be an option to get more insight into the problem.
I see that you don’t use z-dead="true" in your config. Did you try this?

The Courant number is suspicious, but that could be just the symptom, not the original problem.

So the missing velocity component in the fluid solver output seems to be related to OpenFOAM’s “empty” boundary condition. I switched to wall boundary conditions for the front and back patches, and now it works for different fluid densities without the floating point exception.

I guess the error is really related to OpenFOAM’s quasi-2D calculation when the front and back boundary conditions are set to “empty”, combined with the full 3D calculation of my solid solver, where the z components should really be nonexistent or manually set to 0. If OpenFOAM does a full 3D simulation, the 3D solid response seems to work as intended.
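For reference, the switch amounts to something like the following (the patch name frontAndBack and the concrete field boundary conditions are illustrative, not taken from the actual case):

```
// constant/polyMesh/boundary: change the patch type from empty to wall

// 0/U
frontAndBack
{
    type            noSlip;
}

// 0/p
frontAndBack
{
    type            zeroGradient;
}
```

With wall patches, OpenFOAM solves all three velocity components, so the z components of the coupling data are actually computed rather than implied by the quasi-2D assumption.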

I’m pretty sure that’s the reason for the behavior of the simulation, so I’ll mark this as the solution
and try to carry on with different boundary conditions, or at least make sure that no quasi-2D
calculation is happening.

Thanks again for the help so far.