Hello everyone,
I have a rather daunting problem establishing communication between my two solvers, OpenFOAM and a commercial FEM solver, while running on a cluster.
I am currently using preCICE v2.5.0 and OpenMPI for OpenFOAM.
The FEM solver, instead, ships with its own pre-compiled preCICE library (also v2.5.0). To run the solid participant in parallel, I have to use the `intra-comm:sockets` option in `precice-config.xml`. Otherwise, this error shows up:
```
[precice] ERROR: Implicit intra-participant communications for parallel participants
are only available if preCICE was built with MPI. Either explicitly define an
intra-participant communication for each parallel participant or rebuild preCICE
with "PRECICE_MPICommunication=ON".
```
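For context, the solid participant block in my config looks roughly like this (meshes, data, and mappings omitted; everything except the participant name "Solid" is a simplified placeholder):

```xml
<participant name="Solid">
  <!-- use-mesh, read-data, write-data, and mapping tags omitted for brevity -->
  <!-- explicit sockets-based intra-participant communication, since the
       pre-compiled preCICE library apparently was built without MPI -->
  <intra-comm:sockets />
</participant>
```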
Everything works perfectly fine with both participants running in parallel **if** OpenFOAM runs on a single node.
When I switch to multiple nodes, OpenFOAM hangs during initialization because it cannot connect the primary rank to the other ranks.
I should also mention that running OpenFOAM on multiple nodes does work if I remove `intra-comm:sockets` from the other participant (hence running it in serial).
To communicate between the two participants I’m using:

```xml
<m2n:sockets from="Fluid" to="Solid" exchange-directory="/complete/path/to/run-folder/" network="ib0" />
```
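If I read the configuration reference correctly, `intra-comm:sockets` accepts the same `network` attribute as `m2n:sockets`, so I am also wondering whether I should pin the intra-participant communication to the InfiniBand interface, for example:

```xml
<!-- untested idea, assuming intra-comm:sockets takes network like m2n:sockets does -->
<intra-comm:sockets network="ib0" />
```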
Do you have any idea why this happens? I have to run very large simulations, and this is a major bottleneck for me.
Thank you in advance,
Alice