Large memory usage while running

Hi,

First of all, thank you for developing this amazing library! It’s quite something to be able to couple codes for FSI and have simulations running in a matter of weeks!

I am using preCICE to couple an in-house finite-volume immersed-boundary-method code called Lotus (written in Fortran) to CalculiX, to do some FSI of sheets/membranes. I have the following questions:

  1. When running the simulations, the memory usage gradually increases up to the point where all my available memory is used (this takes a while, but my simulations also run for quite a long time). This is not a problem with our in-house code, as running the rigid-body version of the problem results in constant memory usage, and the same is true for CalculiX on its own.

  2. Is there a way to have one of the participants stop the simulation internally, instead of only when the time reaches the maximum set in the precice-config.xml file? For now, I can exit the time integration loop, but the simulation then hangs at the finalize() call (I suppose this is because isCouplingOngoing() still returns true for the other participant?).
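For context, this is roughly what the time loop I mean looks like (a minimal sketch using the preCICE v2 C++ API; the participant/mesh names and the single dummy vertex are placeholders, not my actual setup):

```cpp
#include <precice/SolverInterface.hpp>

// Minimal sketch only: "Fluid", "Fluid-Mesh" and the single vertex are
// placeholders, not the actual Lotus configuration.
int main() {
  precice::SolverInterface precice("Fluid", "precice-config.xml", 0, 1);

  const int meshID = precice.getMeshID("Fluid-Mesh");
  double coords[3] = {0.0, 0.0, 0.0};
  precice.setMeshVertex(meshID, coords);

  double dt = precice.initialize();

  while (precice.isCouplingOngoing()) {
    // ... advance the fluid solver by dt and exchange coupling data ...
    dt = precice.advance(dt);
    // Breaking out of this loop early (e.g. on an internal convergence
    // criterion) leaves the other participant waiting: its
    // isCouplingOngoing() still returns true, so it keeps expecting data,
    // and this side's finalize() below does not complete cleanly.
  }

  precice.finalize();
  return 0;
}
```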

Here is my preCICE config file:
precice-config.xml (2.1 KB)

Thanks in advance

Marin Lauber

Could you provide some more information on which version numbers of preCICE and CalculiX you are using, and also whether you have built preCICE yourself and, if applicable, how (with which options)?

What amount of memory, how many mesh points, and what length of simulation are we talking about here?

Hi @marinlauber,

In the past, memory leaks have been identified when using CalculiX in a coupled simulation. Please see the relevant issue. It has been difficult to identify the source of the leaks, specifically whether they come from the adapter or from CalculiX itself. Can you provide more information on your case, as @ajaust has suggested? That would definitely give more clues.

Your understanding of finalize() and isCouplingOngoing() is correct. I am not sure I understand why you would want one solver to stop the simulation internally and let the other participant continue. Is there a use case for this? Or do you want both participants to stop abruptly?

@ajaust @IshaanDesai

I am using preCICE version 2.2.0 (from a package, not built from source; I am on Ubuntu 18.04) and CalculiX 2.16. I compiled the preCICE adapter as described in the GitHub repo.

The size of the simulation is not that big: I am using a rectilinear grid with 256x128x128 cells for the fluid, and the solid is a plate of 11x51 nodes (triangular elements). At the start of the simulation, I use around 5 GB of memory (it stays at this level if I only run the fluid simulation), but it grows to 44 GB towards the end. Simulations cover around 50 seconds of physical time.

Thanks for the answer; I am looking at the issues you mentioned on the CalculiX repo about memory leaks.

Finally, @IshaanDesai: I want to be able to stop the whole simulation from the fluid side because, for steady-state FSI, I don’t want to have to guess a simulation end time; I measure convergence from within the fluid solver and would stop the run from there. But that’s just to save me some time; I can work around it. I was just wondering if there is a stop switch available to participants.

Sorry for the late reply. I wonder whether you could identify which participant of your coupled simulation requires the vast amount of memory. Is it CalculiX or Lotus that uses most of the 44 GB?

Maybe we also have to check back with the preCICE developers on how to debug this. In the issue on GitHub I mentioned that one might want to trace the memory consumption over time instead of only looking for memory leaks. I could imagine that it is not a leak, but faulty logic that does not free memory when it could already be freed.

@marinlauber sorry for the late reply; I was quite busy over the past few days. You say explicitly that the total memory usage remains around 5 GB if you only run the fluid simulation. Do you mean that you run a single-physics test case for the fluid, with no coupling whatsoever? If that is the case, then CalculiX or the CalculiX adapter are strong contenders for the source of the memory accumulation.

But first, let us rule out other possible memory issues. You are performing a 3D coupling, so which mapping schemes are you using? And what type of data are you transferring across the coupling interface? If possible, could you please share your precice-config.xml?

Depending on the complexity of your case, it should be possible to narrow down where the memory leak originates.

Unfortunately, there is no such stop switch within preCICE. If one participant stops and exits within the time loop, then preCICE would also stop and end the other participant. However, sometimes preCICE may hang here, and the behavior is not well defined. If you really think such an internal stop switch would be useful, please open an issue in the preCICE repo, where the implementation aspects can be discussed in detail.

Thanks for your reply.

Yes, I run a single-physics test case with the exact same set-up as the coupled simulation. I can do the same with CalculiX only (single-physics) and the memory usage is stable. I am relatively confident the leak/accumulation is from the adapter itself.

I already supplied the precice-config.xml in the original post, but here it is again
precice-config.xml (2.1 KB)

I am transferring forces from the fluid to the solid mesh, and displacements from the solid to the fluid mesh. The forces are computed on the cell faces (triangular faces) and the displacements are supplied to the nodes of the triangles. I use nearest-neighbour mapping for the data transfer.
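In code, the exchange on the fluid side looks roughly like this (continuing the loop sketch earlier in the thread; the mesh and data names are assumptions for illustration, the real ones are in the precice-config.xml, and the setMeshVertices() setup that fills the vertex ID arrays is omitted):

```cpp
// Sketch only: "Fluid-Faces", "Fluid-Nodes", "Forces" and "Displacements"
// are illustrative names, not the actual configuration.
const int faceMeshID = precice.getMeshID("Fluid-Faces");   // triangle face centres
const int nodeMeshID = precice.getMeshID("Fluid-Nodes");   // triangle nodes
const int forceID    = precice.getDataID("Forces", faceMeshID);
const int dispID     = precice.getDataID("Displacements", nodeMeshID);

std::vector<double> forces(3 * nFaces);        // one 3D force per face centre
std::vector<double> displacements(3 * nNodes); // one 3D displacement per node

// inside the time loop:
precice.writeBlockVectorData(forceID, nFaces, faceIDs.data(), forces.data());
dt = precice.advance(dt);
precice.readBlockVectorData(dispID, nNodes, nodeIDs.data(), displacements.data());
```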

Okay then it is very likely that the memory leak is coming from the CalculiX-Adapter itself.

Sorry for requesting the config file twice; in any case, everything seems to be okay there.

Can you try to run your setup with uni-directional coupling, in two stages, to narrow down the leak? Meaning, in one simulation the coupling is configured to only transfer forces from the fluid to the solid mesh, and in the other simulation only displacements are transferred from the solid to the fluid. Of course, both of these cases will produce non-physical results, which might be problematic from a stability perspective. To get something running, you can define dummy values within a physical range for the forces and displacements (see the sketch below). If we see the memory leak in only one of these cases, we can narrow it down to the read or write functionality of the adapter.
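For example, in the forces-only stage the fluid participant could keep writing its real forces but skip the displacement read and prescribe a dummy value instead (a sketch continuing the API calls you posted above; the 1 mm amplitude is an arbitrary example):

```cpp
// Forces-only test: "Displacements" is not exchanged in this configuration,
// so there is no readBlockVectorData() call; prescribe a dummy deflection
// within a physical range instead (the 1 mm value below is arbitrary).
precice.writeBlockVectorData(forceID, nFaces, faceIDs.data(), forces.data());
dt = precice.advance(dt);

for (std::size_t i = 0; i < displacements.size(); ++i)
  displacements[i] = (i % 3 == 2) ? 1.0e-3 : 0.0; // dummy 1 mm z-deflection
```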

Please let us know what you find here; the memory leak in the CalculiX adapter is something we want to figure out and resolve.

Related discussion: CalculiX adapter memory just keeps growing


@marinlauber we recently released adapter v2.19.0, which also fixed a few memory leaks. Could you please check whether this improves the situation for you and report back?

We are also looking for a case to reproduce the issue, so if you manage to make a tutorial case trigger this, it would help a lot.

More in the related issue: Check for memory leaks · Issue #10 · precice/calculix-adapter · GitHub

Hi @Makis!

Sorry for the late reply.

So in the meantime I re-installed everything, and also installed some packages from source, and that made the problem disappear. This was using ccx_2.16.

I have since updated Ubuntu to 20.04 and installed everything again on this distribution, but now using ccx_2.19 and the corresponding adapter. The issue seems resolved.

All the best,

Marin

