Implicit Coupling With OpenFOAM

Hi everyone,

I would like to understand the coupling behaviour of the OpenFOAM adapter in implicit coupling mode.
I’ve coupled an FSI simulation in parallel-explicit mode and now I’d like to switch to the parallel-implicit mode.
When switching to the parallel-implicit mode with the coupling scheme below,

Click here to view the XML code
<coupling-scheme:parallel-implicit>
   <time-window-size value="1e-4" />
   <max-time value="0.25" />
   <max-iterations value="100"/>
      
   <min-iteration-convergence-measure min-iterations="2" data="Force_Data_Right" mesh="Solid-Mesh-Right" />
   <participants first="Fluid" second="Solid" />
   <exchange data="Force_Data_Right" mesh="Solid-Mesh-Right" from="Fluid" to="Solid" />
   <exchange data="Force_Data_Left" mesh="Solid-Mesh-Left" from="Fluid" to="Solid" />
   <exchange data="Displacement_Data_Right" mesh="Solid-Mesh-Right" from="Solid" to="Fluid" />
   <exchange data="Displacement_Data_Left" mesh="Solid-Mesh-Left" from="Solid" to="Fluid" />
</coupling-scheme:parallel-implicit>

OpenFOAM crashes.

Click here to view the error message
[scw-a5ymyf:10418] *** Process received signal ***
[scw-a5ymyf:10418] Signal: Segmentation fault (11)
[scw-a5ymyf:10418] Signal code:  (-6)
[scw-a5ymyf:10418] Failing at address: 0x3e9000028b2
[scw-a5ymyf:10418] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x153c0)[0x7f66d5cbd3c0]
[scw-a5ymyf:10418] [ 1] /lib/x86_64-linux-gnu/libpthread.so.0(raise+0xcb)[0x7f66d5cbd24b]
[scw-a5ymyf:10418] [ 2] /lib/x86_64-linux-gnu/libpthread.so.0(+0x153c0)[0x7f66d5cbd3c0]
[scw-a5ymyf:10418] [ 3] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/liboverset.so(_ZNK4Foam20dynamicOversetFvMesh16addInterpolationIdEEvRNS_8fvMatrixIT_EERKNS_5FieldIdEE+0x7b3)[0x7f66d6cd1c23]
[scw-a5ymyf:10418] [ 4] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/liboverset.so(_ZNK4Foam20dynamicOversetFvMesh5solveIdEENS_17SolverPerformanceIT_EERNS_8fvMatrixIS3_EERKNS_10dictionaryE+0x299)[0x7f66d6d0afa9]
[scw-a5ymyf:10418] [ 5] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/liboverset.so(_ZNK4Foam20dynamicOversetFvMesh5solveERNS_8fvMatrixIdEERKNS_10dictionaryE+0x22)[0x7f66d6d0b712]
[scw-a5ymyf:10418] [ 6] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam16patchDistMethods7Poisson7correctERNS_14GeometricFieldIdNS_12fvPatchFieldENS_7volMeshEEERNS2_INS_6VectorIdEES3_S4_EE+0x159)[0x7f66d8d30359]
[scw-a5ymyf:10418] [ 7] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam10meshObject10movePointsINS_6fvMeshEEEvRNS_14objectRegistryE+0x121)[0x7f66d8c6d9b1]
[scw-a5ymyf:10418] [ 8] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/libfiniteVolume.so(_ZN4Foam6fvMesh10movePointsERKNS_5FieldINS_6VectorIdEEEE+0x38d)[0x7f66d8c4eaed]
[scw-a5ymyf:10418] [ 9] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/libdynamicFvMesh.so(_ZN4Foam29dynamicMotionSolverListFvMesh6updateEv+0x31f)[0x7f66d6f7fe3f]
[scw-a5ymyf:10418] [10] /usr/lib/openfoam/openfoam2012/platforms/linux64GccDPInt32Opt/lib/liboverset.so(_ZN4Foam20dynamicOversetFvMesh6updateEv+0x13)[0x7f66d6cb23d3]
[scw-a5ymyf:10418] [11] overPimpleDyMFoam(+0x37da5)[0x55b1c74f2da5]
[scw-a5ymyf:10418] [12] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f66d5adb0b3]
[scw-a5ymyf:10418] [13] overPimpleDyMFoam(+0x3e48e)[0x55b1c74f948e]
[scw-a5ymyf:10418] *** End of error message ***
/lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3)[0x7f92a4d8f0b3]
[scw-a5ymyf:10413] [13] overPimpleDyMFoam(+0x3e48e)[0x55b677e8748e]
[scw-a5ymyf:10413] *** End of error message ***

As already suggested on Gitter, I’ve tried several acceleration methods, but they only helped partially (the simulation still crashes after at most 4 iterations).

I’ve found out that the same issue appears without MBDyn (using just OpenFOAM and the preCICE Python interface).
The working explicit case (together with a config file for the implicit setup) can be found here.

Maybe you have an idea why this failure is occurring :slight_smile:

Hi @JulianSchl ,

could you post a log of the OpenFOAM run as well? I tried running your case, but I get

--> FOAM FATAL IO ERROR: (openfoam-2012 patch=210618)
Unknown solver type solidBodyDisplacementLaplacianMultiZone

Hi @JulianSchl

What could also help would be the preCICE config with the IQN acceleration that you tried, as well as the preCICE convergence file.
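For reference, such an acceleration block inside the coupling scheme could look roughly like the following. This is only a sketch: it reuses the data and mesh names from the config above, and all parameter values are placeholders that would need tuning for your case.

<acceleration:IQN-ILS>
   <!-- in a parallel scheme, the coupling data of both directions is typically included -->
   <data name="Displacement_Data_Right" mesh="Solid-Mesh-Right" />
   <data name="Displacement_Data_Left" mesh="Solid-Mesh-Left" />
   <data name="Force_Data_Right" mesh="Solid-Mesh-Right" />
   <data name="Force_Data_Left" mesh="Solid-Mesh-Left" />
   <preconditioner type="residual-sum" />
   <filter type="QR2" limit="1e-3" />
   <initial-relaxation value="0.1" />
   <max-used-iterations value="50" />
   <time-windows-reused value="10" />
</acceleration:IQN-ILS>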

Maybe the last note on this documentation page helps

https://precice.org/couple-your-code-implicit-coupling.html

I could imagine that something in how you treat the rigid body displacement is not yet covered in the checkpointing.
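For illustration, the usual checkpointing pattern with the preCICE (v2) Python bindings looks roughly like the sketch below. The rigid-body state dictionary is made up for the example, but the point holds for whatever state your solver advances in time: it all has to be saved and restored together, otherwise each extra iteration of a time window moves the body a bit further.

import numpy as np
import precice

# Hypothetical solver coupled as "Solid"; mesh and data setup omitted.
interface = precice.Interface("Solid", "precice-config.xml", 0, 1)
dt = interface.initialize()

# Made-up rigid-body state that the solver integrates in time.
state = {"displacement": np.zeros(3), "velocity": np.zeros(3)}

while interface.is_coupling_ongoing():
    if interface.is_action_required(precice.action_write_iteration_checkpoint()):
        # Save *all* time-dependent state, including the rigid-body motion.
        checkpoint = {k: v.copy() for k, v in state.items()}
        interface.mark_action_fulfilled(precice.action_write_iteration_checkpoint())

    # ... read forces, advance the rigid-body equations by dt, write displacements ...

    dt = interface.advance(dt)

    if interface.is_action_required(precice.action_read_iteration_checkpoint()):
        # The time window has not converged yet: roll back and iterate again.
        state = {k: v.copy() for k, v in checkpoint.items()}
        interface.mark_action_fulfilled(precice.action_read_iteration_checkpoint())

interface.finalize()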


Just for completeness: the OpenFOAM adapter does not even know whether it is working in a serial-implicit or a parallel-implicit coupling scheme. It only knows whether it is working in an explicit or an implicit scheme, in order to enable checkpointing.

I would also imagine the same and I would guess that the simulation is already unstable in the serial-implicit scheme, but not unstable enough to crash.

Hi @DavidSCN, @uekerman and @Makis,

thank you for your messages and sorry for my late reply :slight_smile:

Sorry, I forgot to add the custom lib here. You can compile it with wmake; then it should run as expected.
In addition, I’ve added the log file here: overPimpleDyMFoam.log (126.6 KB)
The convergence log can be found here: precice-Solid-convergence.log (86 Bytes)

The config file is located in the repository linked above.

But why would the simulation then crash in the 4th time window and not in the 1st?

The same setup with the parallel-explicit coupling scheme runs >3000 time windows without issues. Could this happen if the scheme were unstable?

Could you try to use a proper convergence measure and upload the convergence file? It would be interesting to see whether your case actually converges.
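For example, something along these lines instead of (or in addition to) the pure min-iteration measure. The limits below are placeholders and the data and mesh names are taken from your config; adjust both to your setup.

<relative-convergence-measure limit="1e-4" data="Displacement_Data_Right" mesh="Solid-Mesh-Right" />
<relative-convergence-measure limit="1e-4" data="Displacement_Data_Left" mesh="Solid-Mesh-Left" />
<relative-convergence-measure limit="1e-3" data="Force_Data_Right" mesh="Solid-Mesh-Right" />
<relative-convergence-measure limit="1e-3" data="Force_Data_Left" mesh="Solid-Mesh-Left" />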

Well, it could take some time until it blows up. I would assume that your case also diverges in the first time window when you use a strict convergence measure.

My best guess is still the checkpointing.