Pipe flow FSI using OpenFOAM and CalculiX

I was setting up an FSI case for a cylindrical pipe; however, when I try to run the case, both OF and ccx stop at:
"---[precice] Setting up master communication to coupling partner/s"
I am not quite sure what the issue is and would appreciate any help.

leePipe.zip (5.3 MB)

Hi @mishal49,

Here are some things you can try: Help! The participants are not finding each other! - #2 by Makis

Hi @Makis,
I did notice this post on using shell elements. Could this also be a potential issue? I have used shell elements in the Calculix setup.

I tried deleting the precice-run directory and re-ran the case, but the solid participant gives the following error while the fluid one keeps running alone:

Setting up preCICE participant Solid, using config file: config.yml
---[precice] This is preCICE version 2.3.0
---[precice] Revision info: no-info [Git failed/Not a repository]
---[precice] Configuration: Release (Debug and Trace log unavailable)
---[precice] Configuring preCICE with configuration "…/precice-config.xml"
---[precice] I am participant "Solid"
Set ID Found
Read data 'Force' found with ID # '3'.
Write data 'DisplacementDelta' found with ID # '2'.
---[precice] Setting up master communication to coupling partner/s
---[precice] Masters are connected
---[precice] Setting up preliminary slaves communication to coupling partner/s
---[precice] Receive global mesh Fluid-Mesh-Faces
---[precice] Prepare partition for mesh Solid-Mesh
---[precice] Gather mesh Solid-Mesh
---[precice] Send global mesh Solid-Mesh
---[precice] Setting up slaves communication to coupling partner/s
---[precice] Slaves are connected
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
./run.sh: line 6: 2139857 Aborted (core dumped) ccx_preCICE -i tube -precice-participant Solid

I don’t know about shell elements in CalculiX, but what happens if you change that? The (very generic/cryptic) error could be related to that, but could also be some simpler technical issue related to building CalculiX/the adapter (only if you recently changed something there).

Hello @Makis
I have not really made any changes to the installation, and the tutorial cases still run, so I assume it's something else. I changed the *SHELL SECTION in the tube.inp file to *SOLID SECTION, but, naturally, it says that I cannot use that for a tube or beam.

I actually doubt that the problem is in the CalculiX configuration, if the solvers don’t find each other in the first place. Did you already try deleting any stray precice-run/ directory before starting?
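For reference, that cleanup step is just removing the leftover connection directory before restarting (a minimal sketch, assuming you run it from the case directory where the solvers are launched):

```shell
# Remove any stale connection directory left over from a previous
# (crashed) run. preCICE uses precice-run/ for the participants'
# connection handshake, and a leftover copy can make them wait
# forever or fail to connect.
rm -rf precice-run
```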

It could very well be a problem in the CalculiX adapter.
Shell elements should be supported IIRC.
Maybe @boris-martin has an idea how to debug this?

Shell elements are only supported since recent changes that haven't yet been merged into the master branch.
Does it work if you build the adapter from source on the "develop" branch?

Hey @Makis
I tried this and still no luck

Hey @boris-martin,
I looked at what is on the develop branch: GitHub - precice/precice (a coupling library for partitioned multi-physics simulations, including, but not restricted to, fluid-structure interaction and conjugate heat transfer simulations).
Should I just download the entire code or is there a specific part pertinent to the ccx adapter?

I didn’t mean the develop branch of preCICE, but the develop branch of the adapter itself.
GitHub - precice/calculix-adapter at develop, which you’d have to compile.
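Roughly, that build would look something like this (a sketch only: the repository URL is the one linked above, but the Makefile variables and required source paths depend on your preCICE/CalculiX versions, so check the adapter's README):

```shell
# Sketch: fetch and build the CalculiX adapter from its develop branch.
# The Makefile variables mentioned below are assumptions; consult the
# adapter's README for the exact names your version expects.
git clone -b develop https://github.com/precice/calculix-adapter.git
cd calculix-adapter
make clean
make   # you may need to point the Makefile at your preCICE install
       # and CalculiX sources before this succeeds
```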

Hello @boris-martin,
Thank you for the link. I recompiled the new adapter yet the error persists. I get the same error that I mentioned earlier with the solid and the fluid runs on its own.

I also noticed that they both run perfectly fine separately, but coupling them results in this problem.
By the way, I am running this on a cluster.

I tried running it on my PC as well, and it just says "Killed".

Oh, wait, I now saw the "Slaves are connected" part. I once had this issue when mixing up MPI versions, i.e., when the MPI version that preCICE was compiled with differed from the MPI version the solvers were compiled with.
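One way to check for such a mismatch is to compare which MPI library each binary is dynamically linked against (a sketch; the binary names and the library path are examples, adjust them to your installation):

```shell
# Compare the MPI libraries each component links against; the lines
# printed should point to the same MPI installation for all of them.
ldd "$(which ccx_preCICE)" | grep -i mpi
ldd "$(which pimpleFoam)"  | grep -i mpi
# and for the preCICE library itself (path is an example):
ldd /path/to/libprecice.so | grep -i mpi
```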

Although, CalculiX should not have anything to do with MPI.

I tried running the case and both parts seem to work (or at least get further than yours; I stopped after a few iterations).

Wait, the case couples and runs?
They run separately for me but not together

Yes. Maybe your CalculiX issue will disappear once you fix the OpenFOAM side.

What do you think the issue with the OpenFOAM side is?
I am not getting any errors so I’m not sure.

@mishal49 can you comment on this? How did you install preCICE and OpenFOAM in this system?

Hi @Makis
I am sure I used intel/19.1.1 for preCICE, the adapters, and CalculiX. I installed OpenFOAM earlier, so I am not sure how to check the compiler used; however, it says "export WM_MPLIB=INTELMPI" in the OF bashrc file, so I assume I still used Intel.
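In case it helps, the OpenFOAM toolchain settings can be read back from the environment after sourcing its bashrc (a sketch; WM_COMPILER and WM_MPLIB are the standard OpenFOAM variables, and which MPI wrapper answers depends on what is first on your PATH):

```shell
# After sourcing OpenFOAM's etc/bashrc, these report the configured
# compiler and MPI flavour; mpicc/mpirun show which MPI is on PATH.
echo "compiler: $WM_COMPILER   MPI: $WM_MPLIB"
mpirun --version   # or: mpicc --version
```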
Is this the information you were looking for?