Implicit Coupling With OpenFOAM

Hi everyone,

I would like to understand the coupling behaviour of the OpenFOAM adapter in implicit coupling mode.
I've coupled an FSI simulation in parallel-explicit mode and would now like to switch to parallel-implicit mode.
When I switch to parallel-implicit mode,

Click here to view the XML code
   <time-window-size value="1e-4" />
   <max-time value="0.25" />
   <max-iterations value="100"/>
   <min-iteration-convergence-measure min-iterations="2" data="Force_Data_Right" mesh="Solid-Mesh-Right" />
   <participants first="Fluid" second="Solid" />
   <exchange data="Force_Data_Right" mesh="Solid-Mesh-Right" from="Fluid" to="Solid" />
   <exchange data="Force_Data_Left" mesh="Solid-Mesh-Left" from="Fluid" to="Solid" />
   <exchange data="Displacement_Data_Right" mesh="Solid-Mesh-Right" from="Solid" to="Fluid" />
   <exchange data="Displacement_Data_Left" mesh="Solid-Mesh-Left" from="Solid" to="Fluid" />

OpenFOAM crashes.

Click here to view the error message
[scw-a5ymyf:10418] *** Process received signal ***
[scw-a5ymyf:10418] Signal: Segmentation fault (11)
[scw-a5ymyf:10418] Signal code:  (-6)
[scw-a5ymyf:10418] Failing at address: 0x3e9000028b2
[scw-a5ymyf:10418] [ 0] /lib/x86_64-linux-gnu/[0x7f66d5cbd3c0]
[scw-a5ymyf:10418] [ 1] /lib/x86_64-linux-gnu/[0x7f66d5cbd24b]
[scw-a5ymyf:10418] [ 2] /lib/x86_64-linux-gnu/[0x7f66d5cbd3c0]
[scw-a5ymyf:10418] [ 3] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d6cd1c23]
[scw-a5ymyf:10418] [ 4] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d6d0afa9]
[scw-a5ymyf:10418] [ 5] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d6d0b712]
[scw-a5ymyf:10418] [ 6] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d8d30359]
[scw-a5ymyf:10418] [ 7] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d8c6d9b1]
[scw-a5ymyf:10418] [ 8] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d8c4eaed]
[scw-a5ymyf:10418] [ 9] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d6f7fe3f]
[scw-a5ymyf:10418] [10] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/[0x7f66d6cb23d3]
[scw-a5ymyf:10418] [11] overPimpleDyMFoam(+0x37da5)[0x55b1c74f2da5]
[scw-a5ymyf:10418] [12] /lib/x86_64-linux-gnu/[0x7f66d5adb0b3]
[scw-a5ymyf:10418] [13] overPimpleDyMFoam(+0x3e48e)[0x55b1c74f948e]
[scw-a5ymyf:10418] *** End of error message ***
[scw-a5ymyf:10413] [13] overPimpleDyMFoam(+0x3e48e)[0x55b677e8748e]
[scw-a5ymyf:10413] *** End of error message ***

As already suggested on Gitter, I've tried several acceleration methods, but they only helped partially (the simulation still crashes after at most 4 iterations).

I've found that the same issue occurs even without MBDyn (using just OpenFOAM and the preCICE Python interface).
A working explicit case (together with a config file for the implicit setup) can be found here.

Maybe you have an idea why this failure is occurring :slight_smile:

Hi @JulianSchl ,

could you post a log of the OpenFOAM run as well? I tried running your case, but I get

--> FOAM FATAL IO ERROR: (openfoam-2012 patch=210618)
Unknown solver type solidBodyDisplacementLaplacianMultiZone

Hi @JulianSchl

It would also help to see the preCICE config with the IQN acceleration that you tried, as well as the preCICE convergence file.
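For illustration, an IQN-ILS acceleration block inside the coupling scheme could look like the following sketch. The data and mesh names are taken from your config above; all numeric values are just common starting points, not tuned recommendations:

```xml
<acceleration:IQN-ILS>
    <!-- in a parallel scheme, typically accelerate data from both directions -->
    <data name="Displacement_Data_Right" mesh="Solid-Mesh-Right" />
    <data name="Force_Data_Right" mesh="Solid-Mesh-Right" />
    <initial-relaxation value="0.1" />
    <max-used-iterations value="50" />
    <time-windows-reused value="10" />
    <filter type="QR2" limit="1e-2" />
</acceleration:IQN-ILS>
```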

Maybe the last note on this documentation page helps

I could imagine that something about how you treat the rigid-body displacement is not yet covered by the checkpointing.


Just for completeness: the OpenFOAM adapter does not even know whether it is working in a serial-implicit or a parallel-implicit coupling scheme. It only knows whether the scheme is explicit or implicit, so that it can enable checkpointing.
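To illustrate: in the preCICE config, the two variants differ only in the name of the coupling-scheme tag, which the adapter never sees. A minimal sketch:

```xml
<!-- only the tag name changes between serial-implicit and parallel-implicit;
     the child elements (time window, exchanges, convergence measures, ...) stay the same -->
<coupling-scheme:parallel-implicit>
    <!-- same content as in the serial-implicit variant -->
</coupling-scheme:parallel-implicit>
```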

I would imagine the same, and I would guess that the simulation is already unstable in the serial-implicit scheme, just not unstable enough to crash.

Hi @DavidSCN, @uekerman and @Makis,

thank you for your messages and sorry for my late reply :slight_smile:

Sorry, I forgot to add the custom lib here. You can compile it with wmake; then it should run as expected.
In addition I’ve added the log file here: overPimpleDyMFoam.log (126.6 KB)
The convergence log can be found here: precice-Solid-convergence.log (86 Bytes)

The config file is located in the repository linked above.

But why would the simulation then crash in the 4th time window and not in the 1st?

The same configuration in the parallel-explicit coupling configuration runs >3000 time windows without issues. Could this happen in an unstable scheme?

Could you try using a proper convergence measure and upload the resulting convergence file? It would be interesting to see whether your case actually converges.
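For example, the min-iteration measure could be replaced by relative convergence measures on the exchanged data (names taken from the config above; the limit of 1e-4 is just a typical starting value, not a recommendation for this case):

```xml
<relative-convergence-measure limit="1e-4" data="Displacement_Data_Right" mesh="Solid-Mesh-Right" />
<relative-convergence-measure limit="1e-4" data="Force_Data_Right" mesh="Solid-Mesh-Right" />
```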

Well, it can take some time until the instability builds up. I would assume that your case also diverges in the first time window once you use a strict convergence measure.

My best guess is still the checkpointing.