Deadlock during finalization

Dear all,
I am using preCICE 1.5.2 on Ubuntu 18.04 to couple my fluid and structural solvers, and I have run into a strange problem that I don’t know how to solve. preCICE finishes the coupling and stops normally when max-time is 0.031 and timestep-length is 0.001. However, when max-time is 0.032, the terminal stops at t = 0.032 without any hint of an error. It just hangs, and nothing changes on the screen for a long time, as shown in the picture. I have tried several times and it fails at the same place every time.


Can anyone give me some advice?
Thanks!!
Jun

Welcome, @jun_leng!

I see that you are coupling OpenFOAM. Are you using our adapter or have you coupled a specific solver yourself? I assume something is going wrong in the order of calling preCICE. Can you describe your sequence?

Have you already seen our adapter example?

Hi~
I have seen the adapter examples. I am using my own OpenFOAM adapter because I need to add some code to pimpleFoam, and it is easier for me to call preCICE directly in that code. Currently, I am using the serial-explicit coupling scheme to couple OpenFOAM with MBDyn. The coupling went well until 0.031 s, but it stopped computing at t = 0.032 s.


And I noticed that the log file of MBDyn shows that it stopped at t = 0.029 s, while preCICE shows that it reached t = 0.031 s. I don’t know why this happens. I think something goes wrong when MBDyn receives the external forces.

And this is the config.xml file.

And I noticed that the log file of MBDyn shows that it stopped at t = 0.029 s, while preCICE shows that it reached t = 0.031 s.

By “stopped”, do you mean that it exited with an error?

Keep in mind that you are using a serial coupling, where the “first participant” is MBDyn and the “second” is OpenFOAM. I see this in your config:

<coupling-scheme:serial-explicit>
  <participants first="StructureSolver" second="FluidSolver">

So the first participant goes up to t = 0.029 s and crashes at the end of that step (has it already called advance() for t = 0.030 s?). Shortly afterwards (though strangely not at the very next timestep), the second participant hangs, while it should instead fail with an “End of file” error.

I suspect the following:

  • Something wrong in one of the two adapters (not sure what at the moment), especially the sequence of calls
  • A compatibility issue (do both solvers use the same MPI? I had strange hangs like this once and that was the cause)

Hi~
“Stopped” means that both terminals neither advanced nor exited, and the screen did not change, as the picture shows. The left one is OpenFOAM and the other one is MBDyn. No error or hint appeared on the screen.


From the right terminal (MBDyn) I notice that preCICE goes up to 0.032 s, but the log file of MBDyn shows that it stopped at 0.029 s.
I have changed the “first participant” to OpenFOAM, but the result did not change.
I am not using parallel computing. What do you mean by both solvers using the same MPI? Sorry, I am not familiar with MPI.

The following appeared when I pressed Ctrl+C to force the coupling to stop.

Could you switch on debugging output as suggested in the first example here? For this you need to build preCICE in Debug mode: cmake [...] -DCMAKE_BUILD_TYPE=Debug.
There is a synchronization step in finalize(): both solvers wait for each other there. What you describe sounds like a standard deadlock.
My guess is that one of the two adapters does not call finalize() at the right moment (i.e., as soon as isCouplingOngoing() returns false).
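To make the intended call order concrete, here is a minimal sketch assuming the preCICE 1.x API names (advance(), is_coupling_ongoing(), finalize()). The StubInterface below is NOT preCICE: it only simulates those three calls so that the sketch runs standalone.

```python
class StubInterface:
    """Stands in for precice.Interface so this sketch runs without preCICE."""

    def __init__(self, steps):
        self._remaining = steps
        self.finalized = False

    def is_coupling_ongoing(self):
        return self._remaining > 0

    def advance(self, dt):
        self._remaining -= 1
        return dt  # preCICE returns the maximum allowed next timestep length

    def finalize(self):
        self.finalized = True


def run_coupling(interface, dt):
    steps = 0
    # Loop exactly on is_coupling_ongoing(): a `while 1` here would call
    # advance() past the end of the simulation and hang the partner.
    while interface.is_coupling_ongoing():
        # ... read coupling data, solve one timestep, write coupling data ...
        dt = interface.advance(dt)
        steps += 1
    # Call finalize() as soon as the coupling is over: it contains a
    # synchronization point where both participants wait for each other.
    interface.finalize()
    return steps
```

If one participant keeps calling advance() while the other has already left the loop, each ends up waiting for a message the other will never send, which matches the hang described above.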

Dear Benjamin,

Thank you for your advice. I switched on the debugging output and, as you said, both solvers wait for each other. But I don’t know how to resolve the deadlock. Attached is precice.log; I am not sure whether I can upload files as a new user of Discourse, so part of the file is pasted here too. Hope to hear from you.

(0) 21:59:16 [impl::SolverInterfaceImpl]:390 in advance: it 1 of 1 | dt# 35 | t 0.034 of 0.5 | dt 0.001 | max dt 0.001 | ongoing yes | dt complete yes |
(0) 21:59:16 [impl::SolverInterfaceImpl]:1523 in handleExports: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1523 in handleExports: Leaving handleExports
(0) 21:59:16 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:16 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing
(0) 21:59:16 [impl::SolverInterfaceImpl]:483 in getDimensions: Entering operator()
Argument 0: _dimensions == 3
(0) 21:59:16 [impl::SolverInterfaceImpl]:483 in getDimensions: Leaving getDimensions
(0) 21:59:16 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Entering operator()
Argument 0: fromDataID == 3
Argument 1: size == 189
(0) 21:59:16 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Leaving writeBlockVectorData
(0) 21:59:16 [impl::SolverInterfaceImpl]:483 in getDimensions: Entering operator()
Argument 0: _dimensions == 3
(0) 21:59:16 [impl::SolverInterfaceImpl]:483 in getDimensions: Leaving getDimensions
(0) 21:59:16 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Entering operator()
Argument 0: toDataID == 2
Argument 1: size == 189
(0) 21:59:16 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Leaving readBlockVectorData
(0) 21:59:16 [impl::SolverInterfaceImpl]:321 in advance: Entering operator()
Argument 0: computedTimestepLength == 0.001
(0) 21:59:16 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Leaving mapWrittenData
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:16 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme
(0) 21:59:20 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:20 [impl::SolverInterfaceImpl]:1456 in mapReadData: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:1456 in mapReadData: Leaving mapReadData
(0) 21:59:20 [impl::SolverInterfaceImpl]:390 in advance: it 1 of 1 | dt# 36 | t 0.035 of 0.5 | dt 0.001 | max dt 0.001 | ongoing yes | dt complete yes |
(0) 21:59:20 [impl::SolverInterfaceImpl]:1523 in handleExports: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:1523 in handleExports: Leaving handleExports
(0) 21:59:20 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:20 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing

1523 in handleExports: Leaving handleExports
(0) 21:59:00 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:00 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:00 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing
(0) 21:59:05 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Entering operator()
Argument 0: toDataID == 1
Argument 1: size == 192
(0) 21:59:05 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Leaving readBlockVectorData
(0) 21:59:05 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Entering operator()
Argument 0: fromDataID == 0
Argument 1: size == 192
(0) 21:59:05 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Leaving writeBlockVectorData
(0) 21:59:05 [impl::SolverInterfaceImpl]:321 in advance: Entering operator()
Argument 0: computedTimestepLength == 0.001
(0) 21:59:05 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Entering operator()
(0) 21:59:05 [impl::SolverInterfaceImpl]:1435 in mapWrittenData: Map data "Forces" from mesh "FluidMesh"
(0) 21:59:05 [impl::SolverInterfaceImpl]:1437 in mapWrittenData: Map from dataID 0 to dataID: 2
(0) 21:59:05 [impl::SolverInterfaceImpl]:1439 in mapWrittenData: First mapped values =  53.7965  -12.909 0.297074  57.3109 -20.4121 0.469742  61.5482 -29.0491 0.668502  66.5021
(0) 21:59:05 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Leaving mapWrittenData
(0) 21:59:05 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:05 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:05 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme
(0) 21:59:05 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:05 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:05 [impl::SolverInterfaceImpl]:1456 in mapReadData: Entering operator()
(0) 21:59:05 [impl::SolverInterfaceImpl]:1487 in mapReadData: Map read data "Displacements" to mesh "FluidMesh"
(0) 21:59:05 [impl::SolverInterfaceImpl]:1489 in mapReadData: First mapped values =  4.22603e-18    0.0345102      1.06216  -0.00013702    0.0536917      2.05393 -0.000359231    0.0711244      3.04572 -0.000593126
(0) 21:59:05 [impl::SolverInterfaceImpl]:1456 in mapReadData: Leaving mapReadData
(0) 21:59:05 [impl::SolverInterfaceImpl]:390 in advance: it 1 of 1 | dt# 34 | t 0.033 of 0.5 | dt 0.001 | max dt 0.001 | ongoing yes | dt complete yes |
(0) 21:59:05 [impl::SolverInterfaceImpl]:1523 in handleExports: Entering operator()
(0) 21:59:05 [impl::SolverInterfaceImpl]:1523 in handleExports: Leaving handleExports
(0) 21:59:05 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:06 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:06 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing
(0) 21:59:10 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Entering operator()
Argument 0: toDataID == 1
Argument 1: size == 192
(0) 21:59:10 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Leaving readBlockVectorData
(0) 21:59:11 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Entering operator()
Argument 0: fromDataID == 0
Argument 1: size == 192
(0) 21:59:11 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Leaving writeBlockVectorData
(0) 21:59:11 [impl::SolverInterfaceImpl]:321 in advance: Entering operator()
Argument 0: computedTimestepLength == 0.001
(0) 21:59:11 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:1435 in mapWrittenData: Map data "Forces" from mesh "FluidMesh"
(0) 21:59:11 [impl::SolverInterfaceImpl]:1437 in mapWrittenData: Map from dataID 0 to dataID: 2
(0) 21:59:11 [impl::SolverInterfaceImpl]:1439 in mapWrittenData: First mapped values =   53.797 -12.9094 0.297083  57.3114 -20.4135 0.469773  61.5484 -29.0507  0.66854  66.5019
(0) 21:59:11 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Leaving mapWrittenData
(0) 21:59:11 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:11 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme
(0) 21:59:11 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:11 [impl::SolverInterfaceImpl]:1456 in mapReadData: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:1487 in mapReadData: Map read data "Displacements" to mesh "FluidMesh"
(0) 21:59:11 [impl::SolverInterfaceImpl]:1489 in mapReadData: First mapped values =  4.57025e-18    0.0373854      1.06209 -0.000103111    0.0585333      2.05382 -0.000295384    0.0778745      3.04558 -0.000507566
(0) 21:59:11 [impl::SolverInterfaceImpl]:1456 in mapReadData: Leaving mapReadData
(0) 21:59:11 [impl::SolverInterfaceImpl]:390 in advance: it 1 of 1 | dt# 35 | t 0.034 of 0.5 | dt 0.001 | max dt 0.001 | ongoing yes | dt complete yes |
(0) 21:59:11 [impl::SolverInterfaceImpl]:1523 in handleExports: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:1523 in handleExports: Leaving handleExports
(0) 21:59:11 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:11 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:11 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing
(0) 21:59:16 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Entering operator()
Argument 0: toDataID == 1
Argument 1: size == 192
(0) 21:59:16 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Leaving readBlockVectorData
(0) 21:59:16 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Entering operator()
Argument 0: fromDataID == 0
Argument 1: size == 192
(0) 21:59:16 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Leaving writeBlockVectorData
(0) 21:59:16 [impl::SolverInterfaceImpl]:321 in advance: Entering operator()
Argument 0: computedTimestepLength == 0.001
(0) 21:59:16 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1435 in mapWrittenData: Map data "Forces" from mesh "FluidMesh"
(0) 21:59:16 [impl::SolverInterfaceImpl]:1437 in mapWrittenData: Map from dataID 0 to dataID: 2
(0) 21:59:16 [impl::SolverInterfaceImpl]:1439 in mapWrittenData: First mapped values =  53.7974 -12.9093 0.321846  57.3119 -20.4139 0.508946  61.5485  -29.051  0.72428  66.5017
(0) 21:59:16 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Leaving mapWrittenData
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:16 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:16 [impl::SolverInterfaceImpl]:1456 in mapReadData: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1487 in mapReadData: Map read data "Displacements" to mesh "FluidMesh"
(0) 21:59:16 [impl::SolverInterfaceImpl]:1489 in mapReadData: First mapped values = -4.14861e-17    0.0373854      1.06209 -0.000105812    0.0585321      2.05382 -0.000300905    0.0778766      3.04558 -0.000515673
(0) 21:59:16 [impl::SolverInterfaceImpl]:1456 in mapReadData: Leaving mapReadData
(0) 21:59:16 [impl::SolverInterfaceImpl]:390 in advance: it 1 of 1 | dt# 36 | t 0.035 of 0.5 | dt 0.001 | max dt 0.001 | ongoing yes | dt complete yes |
(0) 21:59:16 [impl::SolverInterfaceImpl]:1523 in handleExports: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:1523 in handleExports: Leaving handleExports
(0) 21:59:16 [impl::SolverInterfaceImpl]:321 in advance: Leaving advance
(0) 21:59:16 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Entering operator()
(0) 21:59:16 [impl::SolverInterfaceImpl]:489 in isCouplingOngoing: Leaving isCouplingOngoing
(0) 21:59:20 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Entering operator()
Argument 0: toDataID == 1
Argument 1: size == 192
(0) 21:59:20 [impl::SolverInterfaceImpl]:1118 in readBlockVectorData: Leaving readBlockVectorData
(0) 21:59:20 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Entering operator()
Argument 0: fromDataID == 0
Argument 1: size == 192
(0) 21:59:20 [impl::SolverInterfaceImpl]:997 in writeBlockVectorData: Leaving writeBlockVectorData
(0) 21:59:20 [impl::SolverInterfaceImpl]:321 in advance: Entering operator()
Argument 0: computedTimestepLength == 0.001
(0) 21:59:20 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:1435 in mapWrittenData: Map data "Forces" from mesh "FluidMesh"
(0) 21:59:20 [impl::SolverInterfaceImpl]:1437 in mapWrittenData: Map from dataID 0 to dataID: 2
(0) 21:59:20 [impl::SolverInterfaceImpl]:1439 in mapWrittenData: First mapped values =   53.798 -12.9097 0.321856  57.3125 -20.4152 0.508979  61.5488 -29.0526  0.72432  66.5017
(0) 21:59:20 [impl::SolverInterfaceImpl]:1404 in mapWrittenData: Leaving mapWrittenData
(0) 21:59:20 [impl::SolverInterfaceImpl]:1512 in performDataActions: Entering operator()
(0) 21:59:20 [impl::SolverInterfaceImpl]:1512 in performDataActions: Leaving performDataActions
(0) 21:59:20 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme

precice.log (101 KB)

So, MBDyn waits at advance():

(0) 21:59:20 [impl::SolverInterfaceImpl]:373 in advance: Advancing coupling scheme

What happens on the OpenFOAM side?

Do you suspect that any part of your adapters’ code is not as it should be?

I simplified the adapter from https://github.com/precice/mbdyn-adapter and used it for serial-explicit coupling. I think something goes wrong when MBDyn is used for explicit coupling, but I am not sure.

Hi, thank you for your reply.

I briefly looked into your code and tried to compare it with the MBDyn adapter on GitHub.

An important difference is that while self.interface.is_coupling_ongoing() has been replaced by while 1, meaning that self.interface.advance() is always called, even after the end of the simulation. Why did you comment it out?

Also, what is this send() function you mention in the screenshot and where is it used? Why do you need both this and preCICE?

By the way, you don’t need to use screenshots for everything (especially not for error messages, which would then not be discoverable by other users with similar issues). You can always insert code blocks, for example:

while self.interface.is_coupling_ongoing():
    if self.interface.is_action_required(precice.action_write_iteration_checkpoint()):
        self.interface.fulfilled_action(precice.action_write_iteration_checkpoint())

Hi~

The same deadlock appeared when I used while self.interface.is_coupling_ongoing() before: the simulation stopped before the end time. So I think the bug is not there.

The send() function is part of the Python API of MBDyn; it passes the external forces to the peer and is also used in the MBDyn adapter on GitHub. It can be found in helper.py:

def solve(self, converged = True):
    # self.calcLoads()
    stop = self.nodal.send(converged)
    if stop:
        self.writeVTK('final')
        return True
    stop = self.nodal.recv()
    if stop:
        self.writeVTK('final')
        return True
    return False

So I followed it.
In this example, the external structural variant of the force element is used in MBDyn. It allows transmitting the motion of a set of structural nodes to an external software and receiving back the values of the corresponding set of forces and moments to be applied to the nodes. The solvers communicate through either UNIX or INET sockets using a native protocol.

Did you mean this function is in conflict with preCICE?

I must admit I have lost overview a bit.
Could you please upload your preCICE config and provide debug logs for both participants when using self.interface.is_coupling_ongoing() in the MBDyn adapter?

Did you mean this function is in conflict with preCICE?

No, I don’t expect any conflict here. But it could very well be that you introduce the deadlock this way. Is this mechanism used to communicate between the Python script and MBDyn? Or with which external software do you communicate?

The following are my preCICE config and debug logging.
debug.log (101.2 KB) precice-config.xml (1.8 KB)

As far as I know, two mechanisms in MBDyn are needed for the coupling. The first one (the functions send(), recv(), etc.) is used to communicate between the Python script and MBDyn (which is compiled C++ code). Additionally, the external force element is needed to communicate with an external software via files or sockets and to apply the forces to the structure.

I noticed that MBDyn creates the socket and the peer has to connect to it when the external force element is used. Will preCICE create another, separate socket when it is called by the two solvers?

Thank you.

Hi Jun,

I quickly read through your MBDyn adapter code and it seems to do exactly the same as my MBDyn adapter. I think the line self.mbd.solve(False) is not needed in your code with explicit coupling but it should not hurt either. I also tried the cavity case with explicit coupling and the time stayed in sync with OpenFOAM/preCICE.

Best, Mikko


Hi Mikko,

Thank you for your help. Your MBDyn adapter is very good and useful, and I made some changes based on it to couple my own structural models. The deadlock happened when the external force element was set to loose coupling in MBDyn. However, it coupled well with preCICE/OpenFOAM when the external force element was set to tight coupling, even though explicit coupling is used in preCICE. But loose should be chosen for explicit coupling, right? Will this affect the coupling results?

force: 1, external structural,
    socket,
    create, yes,
    path, "/tmp/mbdyn.sock",
    no signal,
    coupling,
    tight,

Another question: why do you call the solve() function twice in one timestep? Does that mean that send() and recv() are also called twice?

Hope to receive your reply.

Best,
Jun

Hi Jun,

Here is an old post by Pierangelo Masarati in MBDyn’s mailing list which explains how FSI with MBDyn works:

For each time step,

  • first MBDyn predicts the new solution,
  • then sends it to the peer solver, and waits for loads,
  • then solves with new loads
  • in case of loose coupling: for each time step, one info exchange at
    first iteration; then MBDyn iterates until convergence with frozen loads

  • in case of tight coupling: for each time step, information exchange
    at each iteration, until mutual convergence (if MBDyn converges, it
    continues calling the peer solver with frozen kinematics until the
    peer solver converges too; if peer solver converges, MBDyn continues
    iterating until convergence with frozen loads)

So, in the case of tight coupling, send(False) does one MBDyn iteration, and then the resulting displacements are passed to preCICE, which checks whether the convergence tolerances have been met. If not, we continue to the next iteration, read the new forces from preCICE, and call send(False) again. If the two solvers have converged, we call send(True), which moves to the next time step. For loose coupling, you can still use tight coupling in MBDyn, but then just call send(True) at each time step.
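The call pattern described above can be sketched as follows. The names mbdyn_send and precice_converged are hypothetical stand-ins for MBDyn's nodal.send() and for preCICE's implicit convergence check; this only illustrates the control flow, not the real APIs.

```python
def implicit_step(mbdyn_send, precice_converged, max_iterations=50):
    """One coupled time step with tight coupling in MBDyn:
    call send(False) each iteration, then send(True) once converged."""
    for iteration in range(1, max_iterations + 1):
        mbdyn_send(False)        # one MBDyn iteration, loads frozen
        if precice_converged():  # implicit coupling scheme converged?
            mbdyn_send(True)     # yes: let MBDyn move to the next time step
            return iteration
    raise RuntimeError("coupling iterations did not converge")


def explicit_step(mbdyn_send):
    """For explicit (loose) coupling there is one exchange per time step,
    so send(True) is called directly."""
    mbdyn_send(True)
```

With this structure, the explicit case is just the implicit case with the convergence loop collapsed to a single exchange.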

Best, Mikko


Hi Mikko,
Does the argument True in send(True) mean that both solvers have converged, or only MBDyn? Did you mean that MBDyn continues iterating until convergence with the frozen loads sent by preCICE when I use tight coupling in MBDyn, but I should just call send(True) for explicit coupling?