Excessive Runtime in preCICE OpenFOAM–CalculiX Coupling

Hello,

I am currently using preCICE to couple OpenFOAM and CalculiX, but I am facing an issue where the simulation runtime is extremely long. I am using the implicit coupling scheme with parallel execution.

In my current case:

  • Each time step requires an average of 10 iterations
  • Time step size: 1e-9
  • End time: 4e-4
  • Each iteration takes about 13 seconds

Based on this, the simulation would require roughly 600 days, which is not acceptable for me.
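For reference, the 600-day figure follows directly from the numbers above:

```latex
\frac{4\times 10^{-4}}{10^{-9}}\ \text{steps} \times 10\ \text{iterations} \times 13\,\text{s} = 5.2\times 10^{7}\,\text{s} \approx 602\ \text{days}
```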

I am wondering if there are ways to reduce the number of time steps, the number of iterations per time step, or the time cost per iteration. My mesh sizes are not particularly large:

  • Flow field: ~60,000 cells
  • Solid domain: ~50,000 elements

I will attach my precice-config.xml file for reference.

In addition, I also have a conceptual question: Does the coupling require the time steps of OpenFOAM, preCICE, and CalculiX to be exactly the same?

Any advice or suggestions would be greatly appreciated!

Thank you in advance.
precice-config.xml (2.4 KB)

Hi,

Assuming that preCICE and the solvers are release builds, I recommend using the built-in performance analysis to figure out which parts of your simulation are the costly ones.
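If you are on preCICE v3, enabling the profiling looks roughly like this (a sketch, not your exact config; the `mode` value and the merge/analyze workflow are from the profiling documentation, and the participant name is a placeholder):

```xml
<precice-configuration>
  <!-- "all" records detailed events; "fundamental" is the cheaper default -->
  <profiling mode="all" />
  <!-- ... rest of your configuration unchanged ... -->
</precice-configuration>
```

After the run, the event files of all ranks can be merged and summarized per participant, e.g. with `precice-profiling merge` followed by `precice-profiling analyze Fluid`.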

An easy first suspect is the mapping method you use. The global RBF mapping with thin-plate splines works with dense matrices, which scales poorly with mesh size. I suggest switching to the partition-of-unity (PoU) RBF mappings.
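A PoU mapping in the config could look like the sketch below; the mesh names and the support radius are placeholders and need to be adapted to your case:

```xml
<!-- Sketch only: "Fluid-Mesh", "Solid-Mesh", and the support radius
     are placeholder values, not taken from your configuration. -->
<mapping:rbf-pum-direct direction="read" from="Fluid-Mesh" to="Solid-Mesh"
                        constraint="consistent">
  <!-- A compactly supported basis function keeps the local systems small -->
  <basis-function:compact-polynomial-c6 support-radius="0.05" />
</mapping:rbf-pum-direct>
```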

Following your suggestion, I used the rbf-pum-direct mapping and adopted the serial-explicit coupling scheme. However, I encountered two issues: the Courant number on the OpenFOAM side spikes at a certain time, and the CalculiX side reports a divergence problem.


(screenshot of the solver output attached)

If you need any specific file from my case setup, I would be happy to provide it.

Regarding your original post: which participant restricts the time step to as low as 1e-9? OpenFOAM and CalculiX do not need to use the same time step. preCICE offers time interpolation schemes for when the time steps of the two participants differ significantly. Take a look at the documentation on how to manage this in the precice-config.xml: Waveform iteration for time interpolation of coupling data | preCICE - The Coupling Library
It could be the case that your structural solver unnecessarily operates with an extremely low time step just because the fluid solver demands it.
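In the config, this boils down to choosing a coupling time window larger than the fluid solver's internal time step (subcycling) and requesting time interpolation on the exchanged data. A sketch, assuming preCICE v3 and placeholder names/values:

```xml
<!-- Sketch: participant/data names and the window size are placeholders.
     waveform-degree="2" requests second-order time interpolation. -->
<data:vector name="Displacement" waveform-degree="2" />

<coupling-scheme:serial-implicit>
  <participants first="Fluid" second="Solid" />
  <!-- The coupling window may be much larger than the solvers'
       internal time steps; preCICE then subcycles within it. -->
  <time-window-size value="1e-6" />
  <max-time value="4e-4" />
  <!-- exchanges, acceleration, and convergence measures as before -->
</coupling-scheme:serial-implicit>
```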

Actually, for FSI with implicit coupling, the CalculiX adapter does have issues with subcycling at the moment, and we are also investigating potential issues in the OpenFOAM adapter:

These typically originate from an incomplete or faulty checkpointing implementation.

We are actively looking into the respective bugs these days.