Hello everyone,
I am opening this topic because I am struggling to restart my coupled FSI simulation using OpenFOAM and CalculiX. I am aware that restart-related topics have already been discussed and resolved in this forum, but after carefully reviewing them, I have not been able to find a solution to my specific issue…
In short, my simulation runs well when executed continuously without restart, but it takes several days to complete (this is a large-deformation FSI case). I therefore wanted to use a restart workflow in order to run the first phase with a coarser time step and then reduce the time step after restarting for the second phase.
I followed the instructions provided in the following link to perform a proper restart, and after several small tests (restarting after only a few steps), everything worked as expected:
https://pawel-lojek.medium.com/resuming-fsi-simulations-with-openfoam-calculix-896088861ae
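As I understand it, that workflow amounts to resuming both solvers from their saved state and letting the coupling re-initialize. For context, a minimal sketch of what my phase-2 solid input looks like (the step and time values below are placeholders, not my exact deck; it assumes the phase-1 input contained `*RESTART, WRITE` so a restart file exists):

```
** Phase-2 CalculiX input (sketch; step/time values are placeholders).
** Assumes the phase-1 input contained *RESTART, WRITE so that a
** restart state file exists to resume from.
*RESTART, READ
** New step with the reduced time step for phase 2
*STEP, NLGEOM, INC=1000000
*DYNAMIC
1.e-6, 1.
*END STEP
```

On the OpenFOAM side, `system/controlDict` uses `startFrom latestTime;` so the fluid fields resume from the last written time directory.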
Unfortunately, when I run a full simulation and apply the restart during the second phase, the interface forces suddenly blow up to extremely large values, and the simulation crashes. Below is an example of a typical log file:
Create time
Create mesh for time = 0.035
Selecting dynamicFvMesh dynamicMotionSolverFvMesh
STEP 2
Dynamic analysis was selected
Nonlinear material laws are taken into account
Newton-Raphson iterative procedure is active
Nonlinear geometric effects are taken into account
Selecting motion solver: displacementLaplacian
Applying motion to entire mesh
Selecting motion diffusion: quadratic
Selecting motion diffusion: inverseDistance
Selecting patchDistMethod meshWave
PIMPLE: Operating solver in PISO mode
Reading field p_rgh
Reading field U
Reading/calculating face flux field phi
Reading transportProperties
Selecting incompressible transport model Newtonian
Selecting incompressible transport model Newtonian
Selecting turbulence model type laminar
Selecting laminar stress model Stokes
Reading g
Reading hRef
Calculating field g.h
No MRF models present
No finite volume options present
Restarting alpha
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.21021e-06, No Iterations 521
DICPCG: Solving for pcorr, Initial residual = 0.00128464, Final residual = 9.98522e-06, No Iterations 79
time step continuity errors : sum local = 3.25606e-07, global = 6.33947e-09, cumulative = 2.90045e-05
Constructing face velocity Uf
Decascading the MPC's
Determining the structure of the matrix:
Using up to 48 cpu(s) for setting up the structure of the matrix.
number of equations
342216
number of nonzero lower triangular matrix elements
12229167
Starting FSI analysis via preCICE using the geometrically non-linear CalculiX solver...
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
Using up to 48 cpu(s) for the symmetric stiffness/mass contributions.
Factoring the system of equations using the symmetric pardiso solver
number of threads = 48
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
Setting up preCICE participant Solid, using config file: config.yml
---[precice] This is preCICE version 3.2.0
---[precice] Revision info: no-info [git failed to run]
---[precice] Build type: Release (without debug log)
---[precice] Working directory "/bigdata/eskandarilab/shared/OpenFOAM/PPV_Control_Test_7_5kPa_Pressure_10cmh20_restart_2_v2_interfoam/solid-calculix"
---[precice] Configuring preCICE with configuration "../precice-config.xml"
---[precice] I am participant "Solid"
Set ID Found
Read data 'Force' found.
Write data 'Displacement' found.
Adapter writing coupling data...
Writing DISPLACEMENTS coupling data.
---[precice] Setting up primary communication to coupling partner/s
Courant Number mean: 0.0319774 max: 0.174813
Starting time loop
---[preciceAdapter] Loaded the OpenFOAM-preCICE adapter - v1.3.1.
---[preciceAdapter] Reading preciceDict...
---[precice] This is preCICE version 3.2.0
---[precice] Revision info: no-info [git failed to run]
---[precice] Build type: Release (without debug log)
---[precice] Working directory "/bigdata/eskandarilab/shared/OpenFOAM/PPV_Control_Test_7_5kPa_Pressure_10cmh20_restart_2_v2_interfoam/fluid-openfoam"
---[precice] Configuring preCICE with configuration "../precice-config.xml"
---[precice] I am participant "Fluid"
---[precice] Connecting Primary rank to 3 Secondary ranks
---[precice] Setting up primary communication to coupling partner/s
---[precice] Primary ranks are connected
---[precice] Primary ranks are connected
---[precice] Setting up preliminary secondary communication to coupling partner/s
---[precice] Prepare partition for mesh Solid-Mesh
---[precice] Setting up preliminary secondary communication to coupling partner/s
---[precice] Prepare partition for mesh Fluid-Mesh
---[precice] Receive global mesh Solid-Mesh
---[precice] Gather mesh Solid-Mesh
---[precice] Send global mesh Solid-Mesh
---[precice] Setting up secondary communication to coupling partner/s
---[precice] Broadcast mesh Solid-Mesh
---[precice] Filter mesh Solid-Mesh by mappings
---[precice] Feedback distribution for mesh Solid-Mesh
---[precice] Setting up secondary communication to coupling partner/s
---[precice] Secondary ranks are connected
---[precice] Automatic RBF mapping alias from mesh "Fluid-Mesh" to mesh "Solid-Mesh" in "write" direction resolves to "partition-of-unity RBF".
---[precice] Computing "partition-of-unity RBF" mapping from mesh "Fluid-Mesh" to mesh "Solid-Mesh" in "write" direction.
---[precice] Secondary ranks are connected
---[precice] Mapping "Force" for t=0 from "Fluid-Mesh" to "Solid-Mesh"
---[precice] Automatic RBF mapping alias from mesh "Solid-Mesh" to mesh "Fluid-Mesh" in "read" direction resolves to "partition-of-unity RBF".
---[precice] Computing "partition-of-unity RBF" mapping from mesh "Solid-Mesh" to mesh "Fluid-Mesh" in "read" direction.
—[precice] e[0m iteration: 1 of 200 (min 1), time-window: 1 of 1000, time: 0, time-window-size: 5e-06, max-time-step-size: 5e-06, ongoing: yes, time-window-complete: no, write-initial-data write-iteration-checkpoint
Initializing coupling data
Adapter reading coupling data...
Reading FORCES coupling data.
Adjusting time step for transient step
precice_dt = 0.000005, ccx_dt = 0.000005 (dtheta = 0.000005, tper = 1.000000) -> dt = 0.000005 (dtheta = 0.000005)
Adapter reading coupling data...
Reading FORCES coupling data.
Adapter writing checkpoint...
---[precice] Mapping "Displacement" for t=0 from "Solid-Mesh" to "Fluid-Mesh"
---[precice] iteration: 1 of 200 (min 1), time-window: 1 of 1000, time: 0, time-window-size: 5e-06, max-time-step-size: 5e-06, ongoing: yes, time-window-complete: no, write-initial-data write-iteration-checkpoint
---[preciceAdapter] preCICE was configured and initialized
---[preciceAdapter] Setting the solver's endTime to infinity to prevent early exits. Only preCICE will control the simulation's endTime. Any functionObject's end() method will be triggered by the adapter. You may disable this behavior in the adapter's configuration.
Courant Number mean: 0.0319774 max: 0.174813
Interface Courant Number mean: 0.000151525 max: 0.0427162
Time = 0.035005
PIMPLE: iteration 1
DICPCG: Solving for cellDisplacementx, Initial residual = 0.000699, Final residual = 1.4975e-19, No Iterations 1
DICPCG: Solving for cellDisplacementy, Initial residual = 0.000276566, Final residual = 9.2845e-21, No Iterations 1
DICPCG: Solving for cellDisplacementz, Initial residual = 0.000281673, Final residual = 9.3851e-21, No Iterations 1
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 9.78625e-06, No Iterations 520
DICPCG: Solving for pcorr, Initial residual = 0.000425378, Final residual = 9.74963e-06, No Iterations 26
time step continuity errors : sum local = 1.51414e-06, global = 2.31912e-09, cumulative = 2.90069e-05
smoothSolver: Solving for alpha.water, Initial residual = 0.000364665, Final residual = 2.08147e-09, No Iterations 1
Phase-1 volume fraction = 0.0384998 Min(alpha.water) = -1.96007e-11 Max(alpha.water) = 1
MULES: Correcting alpha.water
MULES: Correcting alpha.water
Phase-1 volume fraction = 0.0384998 Min(alpha.water) = -2.21568e-10 Max(alpha.water) = 1
DILUPBiCGStab: Solving for Ux, Initial residual = 0.0133114, Final residual = 4.51679e-06, No Iterations 4
DILUPBiCGStab: Solving for Uy, Initial residual = 0.147151, Final residual = 4.28576e-06, No Iterations 5
DILUPBiCGStab: Solving for Uz, Initial residual = 0.144358, Final residual = 4.21873e-06, No Iterations 5
increment 1 attempt 1
increment size= 5.000000e-06
sum of previous increments=0.000000e+00
actual step time=5.000000e-06
actual total time=1.000005e+00
iteration 1
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
Using up to 48 cpu(s) for the symmetric stiffness/mass contributions.
Factoring the system of equations using the symmetric pardiso solver
number of threads = 48
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
average force= 0.000010
time avg. forc= 0.000010
largest residual force= 0.005030 in node 110086 and dof 1
largest increment of disp= 3.652511e-05
largest correction to disp= 3.652511e-05 in node 113964 and dof 3
DICPCG: Solving for p_rgh, Initial residual = 0.0110479, Final residual = 0.000534041, No Iterations 357
DICPCG: Solving for p_rgh, Initial residual = 0.000485026, Final residual = 2.37735e-05, No Iterations 201
time step continuity errors : sum local = 2.76436e-06, global = 6.01739e-08, cumulative = 2.9067e-05
DICPCG: Solving for p_rgh, Initial residual = 0.000531961, Final residual = 2.58215e-05, No Iterations 274
no convergence
iteration 2
Using up to 48 cpu(s) for the symmetric stiffness/mass contributions.
Factoring the system of equations using the symmetric pardiso solver
number of threads = 48
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
average force= 423542.375849
time avg. forc= 423542.375849
largest residual force= 115872721690.617142 in node 111910 and dof 1
largest increment of disp= 7.227784e-04
largest correction to disp= 7.227784e-04 in node 110563 and dof 3
DICPCG: Solving for p_rgh, Initial residual = 0.00022482, Final residual = 1.08727e-05, No Iterations 216
time step continuity errors : sum local = 8.05108e-07, global = -2.26584e-08, cumulative = 2.90444e-05
DICPCG: Solving for p_rgh, Initial residual = 0.000133118, Final residual = 6.63425e-06, No Iterations 60
DICPCG: Solving for p_rgh, Initial residual = 4.97278e-05, Final residual = 2.45716e-06, No Iterations 261
time step continuity errors : sum local = 1.77329e-07, global = 2.60301e-08, cumulative = 2.90704e-05
DICPCG: Solving for p_rgh, Initial residual = 4.21609e-05, Final residual = 2.08879e-06, No Iterations 330
no convergence
iteration 3
Using up to 48 cpu(s) for the symmetric stiffness/mass contributions.
Factoring the system of equations using the symmetric pardiso solver
number of threads = 48
Using up to 48 cpu(s) for the stress calculation.
Using up to 48 cpu(s) for the energy calculation.
average force= 29862423029549073370559207075328557056.000000
time avg. forc= 29862423029549073370559207075328557056.000000
largest residual force= 2061492857741142731376542643493492106461184.000000 in node 103405 and dof 2
largest increment of disp= 1.553917e+05
largest correction to disp= 1.553917e+05 in node 103432 and dof 3
malloc(): unsorted double linked list corrupted
[r33:240829] *** Process received signal ***
[r33:240829] Signal: Aborted (6)
[r33:240829] Signal code: (-6)
…
When I run each solver separately and restart them independently, everything works fine, which leads me to believe that the issue is related to the coupling rather than the individual solvers.
I have tried several modifications to my preCICE configuration, including switching from IQN-ILS to Aitken acceleration, reducing the initial relaxation factor, and using a very small time step. However, in all cases the forces still blow up after the restart and the simulation crashes. I am currently using the initialize="yes" option, as described in the documentation and in previous discussions on this forum.
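For reference, the Aitken variant I tried looks roughly like this in precice-config.xml (a sketch, not my exact file; the data, mesh, and participant names match the log above, and the relaxation value is a placeholder):

```xml
<!-- Sketch of the tested Aitken acceleration; the value is a placeholder -->
<acceleration:aitken>
  <data name="Displacement" mesh="Solid-Mesh"/>
  <initial-relaxation value="0.1"/>
</acceleration:aitken>

<!-- initialize="yes" on the exchanges, as suggested in the restart discussions -->
<exchange data="Force" mesh="Solid-Mesh" from="Fluid" to="Solid" initialize="yes"/>
<exchange data="Displacement" mesh="Solid-Mesh" from="Solid" to="Fluid" initialize="yes"/>
```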
I would like to ask whether I might be missing something, or whether there is a well-known or straightforward solution to this issue.
Thank you very much for your help and please let me know if you need more information!
Arif
My configuration:
- Fluid solver: OpenFOAM 2406 (interFoam and interIsoFoam, both tested)
- Solid solver: CalculiX 2.20
- Coupling library: preCICE 3.2.0
- preCICE config: precice-config.xml (2.3 KB)