Use MPI for my own adapter

I am trying to create my own adapter for WRF (a mesoscale meteorology simulation package); such software often requires its own MPI environment. Currently, I want a single rank to handle the exchange mesh and the data associated with that mesh. The other WRF ranks send their information to this specific rank before it forwards everything to preCICE. My question is: should I invoke the precicef_create subroutine on all WRF ranks, or can I invoke it only on the rank that communicates with preCICE? If the latter, should I specify a rank of 0 and a comm size of 1 in the precicef_create call?

Hi, this depends strongly on how your solver is implemented.

If you use your MPI ranks for parallel processing and need to gather your data on the primary rank (0), then you can create a SolverInterface with size 1 and pass the communicator MPI_COMM_SELF. This essentially means that you have one rank that handles the coupling.
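For this gather-on-one-rank variant, only the designated rank constructs the interface. A minimal Fortran sketch, assuming the participant name "WRF" and the config file name "precice-config.xml" as placeholders (the Fortran bindings take rank and size as plain integers, so rank 0 of size 1 makes preCICE treat this participant as serial):

```
! Only the rank that gathers the coupling data creates the interface.
! "WRF" and "precice-config.xml" are placeholder names.
INTEGER :: rank, ierr
CALL MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
IF (rank == 0) THEN
  ! Tell preCICE this participant runs serially: rank 0 of size 1.
  CALL precicef_create("WRF", "precice-config.xml", 0, 1)
END IF
```

The other ranks then never touch preCICE; they only send their data to rank 0 via your own MPI calls before each coupling step.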

Otherwise, use preCICE on each rank and let preCICE handle the rest.

However, I strongly recommend this latter variant (using preCICE on each rank) unless there is a very good reason against it (is there?): “create preCICE” on every rank, passing the MPI rank and size as arguments, and then use preCICE on every rank. Otherwise, you lose most of the power of preCICE.
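The parallel variant can be sketched as follows in Fortran; participant and config file names are again placeholders:

```
! Parallel variant: every WRF rank participates in the coupling.
INTEGER :: rank, commSize, ierr
CALL MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
CALL MPI_Comm_size(MPI_COMM_WORLD, commSize, ierr)
! Each rank passes its own rank and the total communicator size;
! preCICE then handles the parallel data exchange internally.
CALL precicef_create("WRF", "precice-config.xml", rank, commSize)
```

Each rank afterwards defines only the mesh vertices it owns, and preCICE takes care of partitioning and communication.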

Please keep us updated how the coupling with WRF goes. Are you developing the adapter openly? Could be very interesting for others I assume.


Yes, I will make the WRF adapter publicly available. Currently I am working on coupling WRF with OpenFOAM, using the WRF simulation results to “guide” the OpenFOAM simulation.

Here I have another question, about mapping. Since both coupled participants are 3D, the coupling interface is 2D. I am wondering what happens in the mapping if the coupling interfaces provided by WRF and OpenFOAM are not exactly co-located on the same plane, as the vertices provided by the two participants may lie on two slightly different planes?


If they only differ a bit, no worries. That’s what the mapping is for.
If the plane is axis-aligned and you use an RBF mapping, you might need to switch off the normal direction in the mapping.
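For example, for a plane of constant z, the dead-axis attribute of the RBF mappings removes that direction from the interpolation. A configuration sketch, assuming a preCICE v2 config with placeholder mesh names:

```xml
<!-- z-dead="true" excludes the (constant) z coordinate from the RBF
     basis, which would otherwise make the interpolation ill-conditioned. -->
<mapping:rbf-thin-plate-splines
  direction="read" from="WRF-Mesh" to="OpenFOAM-Mesh"
  constraint="consistent" z-dead="true" />
```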

I am wondering if I coded the adapter correctly. I create the preCICE coupling interface by calling precicef_create on every rank participating in the WRF simulation, and each rank defines its own mesh vertices by calling the precicef_set_vertices subroutine. I checked the vertex coordinates output from WRF, and they are all correct. However, the vertices received by OpenFOAM are somehow wrong. Running OpenFOAM produces the following output:

---[preciceAdapter] Loaded the OpenFOAM-preCICE adapter v1.0.0.
---[preciceAdapter] Reading preciceDict…
---[preciceAdapter] [DEBUG] precice-config-file : /home/sunwei/Work/urbanWindEnvironment/virusCFD/cases/beijing-sample-2020-01-20_18:00:00/foam_run/…/precice.xml
---[preciceAdapter] [DEBUG] participant name : OpenFOAM
---[preciceAdapter] [DEBUG] modules requested :
---[preciceAdapter] [DEBUG] - FF

---[preciceAdapter] [DEBUG] interfaces :
---[preciceAdapter] [DEBUG] - mesh : OpenFOAM-Mesh
---[preciceAdapter] [DEBUG] locations : faceCenters
---[preciceAdapter] [DEBUG] connectivity : 0
---[preciceAdapter] [DEBUG] patches :
---[preciceAdapter] [DEBUG] - Interface
---[preciceAdapter] [DEBUG] writeData :
---[preciceAdapter] [DEBUG] readData :
---[preciceAdapter] [DEBUG] - Velocity
---[preciceAdapter] [DEBUG] - Pressure
---[preciceAdapter] [DEBUG] - TKE
---[preciceAdapter] [DEBUG] - DissipationRate
---[preciceAdapter] [DEBUG] Checking the timestep type (fixed vs adjustable)…
---[preciceAdapter] [DEBUG] Timestep type: fixed.
---[preciceAdapter] [DEBUG] Creating the preCICE solver interface…
---[preciceAdapter] [DEBUG] Number of processes: 1
---[preciceAdapter] [DEBUG] MPI rank: 0
preCICE: This is preCICE version 2.3.0
preCICE: Revision info: v2.3.0-79-g8a4f4845
preCICE: Configuration: Debug
preCICE: Configuring preCICE with configuration “/home/sunwei/Work/urbanWindEnvironment/virusCFD/cases/beijing-sample-2020-01-20_18:00:00/foam_run/…/precice.xml”
preCICE: I am participant “OpenFOAM”
---[preciceAdapter] [DEBUG] preCICE solver interface was created.
---[preciceAdapter] [DEBUG] Creating interfaces…
---[preciceAdapter] [DEBUG] Interface created on mesh OpenFOAM-Mesh
---[preciceAdapter] [DEBUG] Adding coupling data writers…
---[preciceAdapter] [DEBUG] Adding coupling data readers…
---[preciceAdapter] [DEBUG] Initalizing the preCICE solver interface…
preCICE: Setting up master communication to coupling partner/s
preCICE: Masters are connected
preCICE: Setting up preliminary slaves communication to coupling partner/s
preCICE: Prepare partition for mesh OpenFOAM-Mesh
preCICE: Receive global mesh WRF-Mesh

Afterwards, it stops due to an error in
precice::mesh::BoundingBox::expandBy(precice::mesh::Vertex const&) in “/home/sunwei/Apps/precice/lib/libprecice.so.2”

Any thoughts?

I am wondering if I coded the adapter correctly. I create the preCICE coupling interface by calling precicef_create on every rank participating in the WRF simulation, and each rank defines its own mesh vertices by calling the precicef_set_vertices subroutine.

Sounds correct.

Which error message do you get exactly?

Do the coupling meshes of WRF and OpenFOAM describe the same geometry? Exporting the coupling meshes always helps in such situations:
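Mesh export can be switched on per participant in the preCICE configuration; a sketch, with a placeholder output directory name:

```xml
<participant name="OpenFOAM">
  <!-- Writes the coupling meshes (and the coupling data) as VTK
       files, which can then be inspected in ParaView. -->
  <export:vtk directory="precice-exports" />
  <!-- existing use-mesh / read-data / write-data tags go here -->
</participant>
```

Comparing the exported WRF and OpenFOAM meshes in ParaView quickly shows whether the two interfaces actually overlap.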