Point-to-point communication on a cluster

I’m attempting to run the elastic-tube-3d tutorial case on an HPC cluster, using the OpenFOAM adapter and the CalculiX adapter. Thus far I have not compiled CalculiX to run in parallel, so my goal was to run the FSI simulation on 2 nodes (16 cores each): 1 core to run ccx_precice and 31 cores to run pimpleFoam. This proved to be a bit troublesome.

I am able to run on 1 node just fine. I can use 15 cores for OpenFOAM and 1 core for ccx_precice.

To run on 2 nodes, I found that I have to set enforce-gather-scatter="1" within the m2n communication element of my precice-config.xml file (a sketch of the change is shown after the questions). This leads to a few questions:

  1. Why do I have to enforce gathering all the data onto one core before passing it between the adapters?
  2. How much of a performance hit am I taking when I do this? I’m assuming it will be worse with larger sims.
  3. Is there any way I can keep from having to gather and scatter? I think all of my solvers and adapters were compiled with OpenMPI. I think I read somewhere that this can be a problem?
  4. Any tips on getting CalculiX and the CalculiX adapter to run in parallel across many nodes?
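For reference, here is a sketch of the m2n element I modified. The participant names ("Fluid"/"Solid") and the exchange directory are the tutorial defaults, and the attribute names follow the preCICE v2 schema, so adjust them to your own setup:

```xml
<!-- Sketch only: "Fluid"/"Solid" and exchange-directory are the tutorial
     defaults; enforce-gather-scatter="1" is the workaround in question. -->
<m2n:sockets from="Fluid" to="Solid" exchange-directory=".."
             enforce-gather-scatter="1" />
```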

Hi Mike,

Setting enforce-gather-scatter should not be necessary: it is only an option for debugging (and it would indeed deteriorate performance). But the fact that this works for you might already be a good indicator of why things go wrong.
My best guess is that you did not correctly specify the network interface for inter-node communication (InfiniBand on your cluster?). You might need a

<m2n:sockets ... network="ib0" />
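In context, a minimal sketch (assuming the tutorial’s "Fluid"/"Solid" participant names and preCICE v2 attribute names; adapt to your configuration):

```xml
<!-- Sketch, assuming the tutorial's "Fluid"/"Solid" participants and
     preCICE v2 attribute names. "ib0" is the usual name of the first
     InfiniBand interface; check the available interfaces with `ip addr`. -->
<m2n:sockets from="Fluid" to="Solid" network="ib0" />
```

Without this attribute, the sockets communication binds to the loopback interface ("lo"), which only works as long as both participants run on the same node.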

More information: Communication configuration | preCICE - The Coupling Library

In case this does not help, please share your preCICE configuration file.

The CalculiX adapter is a different story. I just saw that you already asked in a different thread; let’s keep the discussion there, or open an additional thread if necessary.

Indeed! Turning off enforce-gather-scatter and specifying network="ib0" did the trick!

Thank you very much. I am very impressed with how helpful the preCICE community has been in my efforts. Everyone here is awesome!

