OpenFOAM/CalculiX FSI Simulation Crashes at ~5s - Help Needed!

Hello everyone,

I’m currently working on a Fluid-Structure Interaction (FSI) simulation coupling OpenFOAM and CalculiX using preCICE, running a relatively simple 2D test case. Unfortunately, I’ve encountered a critical issue: the simulation runs for approximately 5 seconds of physical time before CalculiX abruptly crashes with the following error:

    *** Process received signal ***
    Signal: Segmentation fault (11)
    Signal code: Address not mapped (1)
    Failing at address: 0x2b

Upon reviewing the logs from preCICE, CalculiX, and OpenFOAM, I haven’t identified any obvious convergence issues; everything appears to be proceeding normally up to that point. Visualizing the CalculiX results revealed that the crash coincides with the onset of localized structural distortion in a specific region.

I’ve spent considerable time trying to diagnose the root cause of this crash but haven’t been successful yet. I’m hoping the community can provide some insights! To assist with troubleshooting, I’ve attached the following files:
case.zip (1.1 MB)
last_window_calculix.log (7.3 KB)
last_window_openfoam.log (7.4 KB)
I’ve also included plots of the OpenFOAM solver performance below for reference:


Would you please take a look when you have a moment? Any suggestions on what might be causing this sudden crash and distortion, or pointers on where to look next, would be immensely appreciated! If any additional files or information would be helpful, please don’t hesitate to let me know.

Thank you very much for your time and expertise!

Best regards!

Hi again everyone,

Just wanted to provide an update on my issue after some further investigation over the past few days.

I noticed something potentially significant: in my parallel OpenFOAM setup, the domain decomposition splits the coupling interface between the fluid and structure domains. Crucially, the location where the initial structural distortion occurred (leading to the crash at ~4s) coincided exactly with one of these decomposition boundaries. You can see this alignment in the attached image below.

This made me suspect that the decomposition at the interface might be contributing to the instability. To test this, I modified my OpenFOAM decomposition to avoid splitting the coupling interface itself. The good news is that this change allowed the simulation to run significantly longer – reaching about 8.4 seconds of physical time.

However, the simulation ultimately failed again with a CalculiX error. Upon visualizing the final CalculiX results, I did not observe any obvious large-scale structural distortion like before. I examined the mesh carefully, zooming in on potential areas (especially near the original failure zone and interface), but couldn’t find any clear, severe distortion that would obviously cause a crash.

So, while avoiding interface splitting improved stability (doubling the runtime), it hasn’t fully resolved the crash, and the failure mode now seems different (no visible severe distortion). I’m quite puzzled by this outcome.

@fsimonis, @Makis, I sincerely apologize for the direct ping during what I’m sure are busy schedules. This stability problem has been a major roadblock in my project for quite some time now. Given your deep expertise with preCICE and FSI coupling challenges, would you possibly have a moment to glance at my case (details and files are in the original post above) or offer any insights? Are there specific settings, diagnostics, or areas I should focus my investigation on next? Any suggestions would be immensely valuable and greatly appreciated.

Thank you both, and everyone else, for your time and consideration!

Best regards

After further investigation, I now suspect that the decomposition boundaries on the coupling interface (mentioned in my last update) may not be the root cause of the crash. Immediately before the crash, CalculiX’s final calculation step shows an abrupt spike in structural forces, large enough to trigger divergence and terminate the simulation. This force spike likely explains the structural distortion observed earlier.

But why would structural forces suddenly surge at the coupling interface? Is this an issue with OpenFOAM, a problem with CalculiX, or a coupling setup problem? @Makis @fsimonis @uekerman My sincere apologies for the ping during your busy schedules. Your unparalleled expertise on CalculiX FSI issues in the forum inspired me to reach out. I would greatly appreciate it if you could review my settings and provide some suggestions to enhance the stability of the OpenFOAM-CalculiX coupling.

In recent days I have made a new discovery: checking the CalculiX logs, I found that the largest residual force in the last time window is clearly abnormal, and that the average force also increases suddenly. Logs for the last few time windows are attached here.

ccx.log (5.5 KB)

This made me suspect that it was caused by a sudden change in the pressure field calculated by OpenFOAM. I then reran the simulation with OpenFOAM writing output at every timestep. Sure enough, the pressure field at the last timestep is clearly abnormal (pressure plots for the last two timesteps are attached below).

But a new question arises: Why does such a sudden pressure change occur in OpenFOAM? @Makis @fsimonis @uekerman @DavidSCN Do you have any clues? Any suggestions would be greatly appreciated.

Progress Update:
To investigate why the pressure field in OpenFOAM suddenly changes at the last timestep, I used preCICE’s export functionality to write out the data of every iteration on the coupling interface from both solvers simultaneously. The analysis revealed some key findings.
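For reference, the export was enabled roughly as in the sketch below (assuming preCICE v3 syntax; older versions use export:vtk instead, and the exact attribute for writing every coupling iteration rather than only converged time windows should be checked in the XML reference of the installed preCICE version):

  <participant name="Fluid">
    ...
    <!-- Write the coupling meshes and the data carried on them to disk.
         A matching entry on the Solid participant gives the CalculiX-side
         view of the same interface data. -->
    <export:vtu directory="precice-exports" />
  </participant>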

Discovery at Critical Point:
I selected a point near the final distorted structural region and plotted its force history on the coupling interface. The results surprised me:

  • OpenFOAM-side forces show violent oscillations, with extreme spikes at the first iteration of every time window (Fig 1).

  • This indicates pressure spikes occur not just at the final crash, but repeatedly at every window start.

  • CalculiX forces also oscillate but with much smaller amplitude (Fig 1), and converge to the OpenFOAM data at the last iteration of each time window (see zoomed view in Fig 2).

Figure 1: OpenFOAM side in yellow; CalculiX side in green

Figure 2

Interpretation & New Questions:
This pattern likely relates to implicit time-stepping schemes, but I don’t understand why OpenFOAM pressure calculations oscillate so severely.
Critically, my original hypothesis collapses:

  • Fig 1 shows that the OpenFOAM forces at the crash time (★) are not the largest values seen over the run.

  • CalculiX standalone forces (Fig 3) show several large jumps, but the final step appears normal.
    What then causes CalculiX’s final crash?

Figure 3

@Makis @fsimonis @uekerman @DavidSCN Sincere apologies for re-pinging – this issue is critical for my research. Any insights would save me immensely.

Hi @wlzr

I am not an expert in OpenFOAM-CalculiX FSI coupling, but I have worked with CalculiX and its preCICE adapter for a bit. Are you still facing the same segmentation fault on the CalculiX side that you originally saw? And have you tried playing with the timestep and timestepping schemes on the CalculiX and OpenFOAM sides? What your investigations show looks like a stability issue.

Hi @IshaanDesai ,
Thank you so much for your reply! I’m still struggling with the segmentation fault in CalculiX – this is undoubtedly a stability issue, and it occurs abruptly. Despite extensive troubleshooting efforts, I haven’t yet identified the root cause.

Regarding your suggestion:

“Have you tried playing with the timestep and timestepping schemes on the CalculiX and OpenFOAM sides?”

Could you clarify specifically what adjustments you mean?

  • For OpenFOAM, does this refer to modifying fvSchemes/fvSolution (e.g., discretization schemes, solver tolerances)?

  • For CalculiX, should I adjust the time integration method (e.g., implicit vs explicit) or step size controls?

My complete case files (including all configurations) remain attached to the original post. If your schedule allows even a brief glance at my settings – especially any potential misconfigurations or stability improvements – I’d be deeply grateful for your insights.

Hi @wlzr,

thank you for all the updates; these are useful observations to have documented.

Two questions:

  1. Are you using subcycling? It is known that CalculiX / the CalculiX adapter has issues with subcycling: Checkpointing does not work for subcycling · Issue #9 · precice/calculix-adapter · GitHub
  2. What are you setting the end time in CalculiX to? It should be greater than or equal to the preCICE max-time (see the sketch below).
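
As a sketch of what I mean (illustrative values only; the CalculiX end time is the total step time set in the input deck, e.g. below the *DYNAMIC keyword):

    <!-- precice-config.xml, inside the coupling scheme:
         end time of the coupled simulation -->
    <max-time value="20" />
    <!-- The total time of the CalculiX step (e.g. on the data line below
         *DYNAMIC in the .inp file) should then be >= 20, so that CalculiX
         does not finish before preCICE does. -->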

We are currently looking into these (old) subcycling issues with implicit coupling, mainly focusing on OpenFOAM at the moment. Any input here is helpful.

Thank you so much for your response, and please accept my apologies for the delayed reply. I’d like to take this opportunity to clarify some of the points you raised and share a few recent observations.

In the earlier tests, sub-cycling was not employed. However, during this period I experimented with sub-cycling on both the OpenFOAM and CalculiX sides; in both cases, enabling sub-cycling led to an earlier termination of the simulation.

I initially suspected that the issue might not be strongly related to the maximum time setting in CalculiX, since the simulation tends to stop well before reaching that limit. Subsequent tests have confirmed that this parameter does not significantly affect the behavior.

That said, over the past few weeks, I’ve made some new observations and encountered further questions—I would greatly appreciate your insight on these.

In later tests, I enabled VTU output from both OpenFOAM and CalculiX and compared the force data at the coupling interface. A rather unusual behavior emerged. To help illustrate the process, I’m including a visualization of the coupling procedure below.

The setup includes:

  • A fluid domain (Fluid-Mesh)

  • An intermediate node set (Solid-Mesh, as seen on the fluid side) used for data transfer

  • A solid domain (Solid-Mesh, as seen on the solid side)

When comparing the force data among these three node sets, I noticed that the force values on the Solid-Mesh (solid side) differ from those on the other two sets at each iteration—particularly in the first iteration of every time window, where the discrepancy is especially pronounced. I’ve attached a representative output for your reference.

I’m quite puzzled as to why such a significant change occurs when the force is transferred from the Solid-Mesh (fluid side) to the Solid-Mesh (solid side) in CalculiX. I strongly suspect that this abrupt variation in force may be causing the divergence in the CalculiX calculation.

My questions are:

  • Is this type of behavior expected in the coupling process?

  • Are there any recommended methods to mitigate such sudden changes in force transfer?

@Makis @fsimonis @uekerman @DavidSCN @IshaanDesai My apologies for the direct mention. I was wondering if you might have any insights or suggestions on this matter. I would be truly grateful for any advice you could offer.

If there are differences on the same field (Force) and on the same mesh (Solid-Mesh) between the two participants (Fluid and Solid), something is terribly wrong. I have never seen this happening, and it would be a bug in the communication of preCICE, which is already covered by several tests. There are also regression tests with OpenFOAM and CalculiX.

Is there any possibility of mixing up result files between runs? Or are these the same results on both sides with a delay of an iteration?

Could you upload your iterations.log and convergence.log files?

I would consider it more likely that something is wrong with the force calculation/writing/reading in one of the two adapters.

I assume you are still using a conservative mapping.
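
For reference, a conservative force mapping on the fluid participant typically looks along these lines (just a sketch with preCICE v3 mesh tags; the nearest-neighbor method here is only a placeholder for whatever mapping method your case actually uses):

  <participant name="Fluid">
    <provide-mesh name="Fluid-Mesh" />
    <receive-mesh name="Solid-Mesh" from="Solid" />
    <write-data name="Force" mesh="Fluid-Mesh" />
    <!-- constraint="conservative" sums the nodal force contributions onto the
         target mesh, so the total force is preserved across the mapping -->
    <mapping:nearest-neighbor direction="write" from="Fluid-Mesh" to="Solid-Mesh"
                              constraint="conservative" />
  </participant>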

Thank you very much for taking the time to reply. I truly appreciate your input. In response to your feedback, I would like to provide the following additional information:

1. Regarding the force data mapping, I have been consistently using conservative mapping in all my simulations.

2. I am attaching the following files from one of my test cases for your reference:

3. I do not believe there is any confusion regarding the files, and I also consider mapping delay to be an unlikely cause. To help illustrate the situation, I have included force visualization plots for several selected time windows below.

(Attachment: 1125 00_00_00-00_04_42)

Please let me know if you need any further details or clarifications. I look forward to your thoughts.

This does look very strange. The IQN is also reaching the max-iterations near the end (as expected in such a divergent situation).

Could you also post your adapter configs? OpenFOAM: system/preciceDict, CalculiX: config.yml.

Certainly! Please find the files attached below.

config copy.yml (216 Bytes)

preciceDict.txt (499 Bytes)

If you require any additional documents or data, please feel free to let me know.
Thank you once again for your helpful response and support.

I see what’s happening now; it should be a numerical issue (the adapter config is correct).

Observation: The force values are starting to diverge on the Solid-Mesh (on the Solid side, but not on the Fluid side). After several iterations, the values on the Fluid side are affected as well.

The precice-config1.xml defines the following coupling scheme:

  <coupling-scheme:parallel-implicit>
    <time-window-size value="0.002" />
    <max-time value="20" />
    <participants first="Solid" second="Fluid"/>
    <exchange data="Force" mesh="Solid-Mesh" from="Fluid" to="Solid" />
    <exchange data="Displacement" mesh="Solid-Mesh" from="Solid" to="Fluid" />

    <max-iterations value="80" />
    <relative-convergence-measure limit="1e-5" data="Displacement" mesh="Solid-Mesh" />
    <relative-convergence-measure limit="1e-4" data="Force" mesh="Solid-Mesh" />
    
    <acceleration:IQN-ILS>
      <data name="Displacement" mesh="Solid-Mesh" />
      <data name="Force" mesh="Solid-Mesh" />
      <preconditioner type="residual-sum" />
      <filter type="QR2" limit="1e-2" />
      <initial-relaxation value="0.1" />
      <max-used-iterations value="100" />
      <time-windows-reused value="20" />
    </acceleration:IQN-ILS>     
  </coupling-scheme:parallel-implicit>

Notes:

  • The acceleration is computed by the second participant of a coupling scheme.
  • Displacement is post-processed on the Solid-Mesh.
  • Meshes are exchanged between participants at the end of a coupling time window, not during iterations.

So, this is then a common IQN tuning problem (which is complicated and needs some expertise with the specific application). Read more in the documentation of the acceleration:

I would probably start by switching to the QR3 filter.

edit (triggered by the Discourse AI): Of course, assuming that the fluid simulation computes the forces correctly, i.e., that all the model parameters are correct and reasonable.

Hi Makis,

Thank you very much for your professional and insightful response. After reading your suggestions, I promptly implemented the change, switching the IQN filtering method to QR3, with the other parameters configured as follows:

  <coupling-scheme:parallel-implicit>
    <time-window-size value="0.002" />
    <max-time value="20" />
    <participants first="Solid" second="Fluid"/>
    <exchange data="Force" mesh="Solid-Mesh" from="Fluid" to="Solid" />
    <exchange data="Displacement" mesh="Solid-Mesh" from="Solid" to="Fluid" />

    <max-iterations value="100" />
    <relative-convergence-measure limit="1e-5" data="Displacement" mesh="Solid-Mesh" />
    <relative-convergence-measure limit="5e-3" data="Force" mesh="Solid-Mesh" />   
    <acceleration:IQN-ILS>
      <data name="Displacement" mesh="Solid-Mesh" />
      <data name="Force" mesh="Solid-Mesh" />
      <preconditioner type="residual-sum" />
      <filter type="QR3" limit="1e-2" />
      <initial-relaxation value="0.1" />
      <max-used-iterations value="100" />
      <time-windows-reused value="15" />
    </acceleration:IQN-ILS>     
  </coupling-scheme:parallel-implicit>

The calculation required a significant amount of time, which is why I am only replying now. I truly appreciate your patience and hope you are still available to review my update.

Unfortunately, the simulation still did not complete successfully. It failed at approximately 7.2 seconds due to a crash in CalculiX. Upon examining the iterations.log and convergence.log files, the convergence behavior actually looks quite good this time in my experience, yet the calculation still did not succeed. I have attached both log files for your reference.

precice-Solid-convergence copy.log (3.2 MB)

precice-Solid-iterations copy.log (197.5 KB)

In this run, during the final time window, CalculiX reported an error after just one iteration. This is a recurring issue I have observed across multiple cases: in the first iteration of a time window, the force transferred to CalculiX is often abnormally large and irregular. I have tried to investigate the root cause by reviewing the preCICE documentation and related literature, but so far I have not found a clear explanation.

Could you kindly clarify how the force is determined in CalculiX at the beginning of each time window? Any further guidance on this would be greatly appreciated.

Thank you once again for your time and support.

Hi @wlzr,

sorry for the late response here. Do you have any new data in the meantime?

Not sure if we discussed this already, but: is the fluid solver subcycling? This would be a more complicated case, and I would start with a fixed time step size.
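
For a first test, a non-subcycling setup could look like the sketch below (illustrative values only; the point is simply that the fluid solver's time step equals the coupling window size):

  <coupling-scheme:parallel-implicit>
    <!-- fixed coupling window size -->
    <time-window-size value="0.002" />
    ...
  </coupling-scheme:parallel-implicit>
  <!-- In addition, set deltaT in OpenFOAM's system/controlDict to the same
       0.002 and adjustTimeStep to no, so the fluid solver takes exactly one
       step per coupling window, i.e. no subcycling. -->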