Running preCICE on a Cluster: The linear system of the RBF mapping from mesh Solid-Mesh to mesh Fluid-Mesh has not converged

Hi,
I’m running a 3D coupled simulation with the OpenFOAM-Adapter.
Launching the simulation on 7 cores works well on my local machine.
However, executing the exact same script on a cluster leads to:

(0) 18:07:51 [mapping::PetRadialBasisFctMapping]:785 in map: ERROR: The linear system of the RBF mapping from mesh Solid-Mesh to mesh Fluid-Mesh has not converged. This means most probably that the mapping problem is not well-posed. Please check if your coupling meshes are correct. Maybe you need to fix axis-aligned mapping setups by marking perpendicular axes as dead?

When I run a 2D case (the 3D case, but with the z axis marked as dead) with the same configuration, it works on the cluster without a problem.
I’m running preCICE v2.2.1 and the OpenFOAM-Adapter v1.0.0.
The case is run via the command `mpirun -np 7 pimpleFoam -case openfoam -parallel`.
The full log file can be found here:
pimpleFoam.log (30.3 KB)

Click here to view my preCICE-config.xml
<?xml version="1.0" encoding="UTF-8" ?>
<precice-configuration>

  <solver-interface dimensions="3">
    <data:vector name="Force_Data" />
    <data:vector name="Displacement_Data" />

    <mesh name="Fluid-Mesh">
      <use-data name="Force_Data" />
      <use-data name="Displacement_Data" />
    </mesh>
    
    <mesh name="Solid-Mesh">
      <use-data name="Displacement_Data" />
      <use-data name="Force_Data" />
    </mesh>

    <participant name="Fluid">
      <use-mesh name="Fluid-Mesh" provide="yes" />
      <use-mesh name="Solid-Mesh" from="Solid" />
      <write-data name="Force_Data" mesh="Fluid-Mesh" />
      <read-data name="Displacement_Data" mesh="Fluid-Mesh" />
      <!--<mapping:rbf-thin-plate-splines direction="write" from="Fluid-Mesh" to="Solid-Mesh" constraint="conservative" z-dead="true"/>
   	  <mapping:rbf-thin-plate-splines direction="read" from="Solid-Mesh" to="Fluid-Mesh" constraint="consistent" z-dead="true"/>-->
   	  <mapping:rbf-compact-polynomial-c0 support-radius="5" direction="write" from="Fluid-Mesh" to="Solid-Mesh" constraint="conservative"/>
   	  <mapping:rbf-compact-polynomial-c0 support-radius="5" direction="read" from="Solid-Mesh" to="Fluid-Mesh" constraint="consistent"/>
    </participant>

    <participant name="Solid">
      <use-mesh name="Solid-Mesh" provide="yes" />
      <write-data name="Displacement_Data" mesh="Solid-Mesh" />
      <read-data name="Force_Data" mesh="Solid-Mesh" />
    </participant>

    <m2n:sockets from="Fluid" to="Solid" exchange-directory="." enforce-gather-scatter="0"/>

    <coupling-scheme:parallel-explicit>
      <time-window-size value="0.0001" />
      <max-time value="0.5" />
      <participants first="Fluid" second="Solid" />
      <exchange data="Force_Data" mesh="Solid-Mesh" from="Fluid" to="Solid" />
      <exchange data="Displacement_Data" mesh="Solid-Mesh" from="Solid" to="Fluid" />
    </coupling-scheme:parallel-explicit>
  </solver-interface>
</precice-configuration>

Maybe somebody has a clue what is going on :slight_smile:

Hi @JulianSchl,

OK, so just to make sure I understand: the overall case is a quasi-2D case, i.e. you have preCICE dimension 3, but the physics only happens in 2D. The simulation works on your local machine without any issues, but on the cluster you need to mark an axis as dead in order to make it work?!

Although the error occurs on the Fluid side in your case, the ‘crucial’ mesh here is the solid mesh (the `from` mesh in case of a read mapping). Which solid solver do you use, and is the solid mesh the same on both machines as well?

If the mapping problem is ill-posed or close to ill-posed, it can happen that one PETSc installation works while another does not. We had similar issues in:

The solution is always to make your mapping problem well-posed (setting axes to dead) or to properly go to 2D.
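In config terms, marking dead axes happens directly on the mapping tag, as in the commented-out thin-plate-spline lines in the configuration above. A minimal sketch for this thread’s meshes:

```xml
<!-- Quasi-2D setup: mark the out-of-plane axis as dead so the RBF
     system stays well-posed (z-dead as in the commented-out mappings) -->
<mapping:rbf-thin-plate-splines direction="read" from="Solid-Mesh" to="Fluid-Mesh"
                                constraint="consistent" z-dead="true" />
```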

Hi @DavidSCN and @uekerman,

thank you for your messages and sorry for my late reply!


Maybe I haven’t expressed myself that well. The case is a full 3D FSI case, coupling around a 3D airfoil.

view picture of the mesh

airfoil3D



Yes, marking the z-axis as dead is a way to make the simulation run on the cluster, but it would physically lead to wrong results, since it is a 3D case.


I’m using MBDyn as the solid solver, and the solid mesh is a simplified version of the fluid mesh.



As mentioned, I’m running a 3D case. Is there another way to solve such a problem?

To visualize the mapping a bit better, I made some screenshots:

solid solver mesh

solid solver mesh (rotated view)

fluid solver patch

fluid solver patch (rotated view)

Note: the airfoil patch in the fluid solver is closed on one side, as you can see in the image. In the solid mesh it is open on both sides. I was expecting this not to be a problem when mapping these meshes. What do you think?

But I still have no clue why the case runs on my machine but not on the cluster. If you think it could be a PETSc issue: on my local machine I installed the preCICE Ubuntu package, while on the cluster I built preCICE from source (including PETSc). Can I deactivate features on the cluster to achieve an environment similar to the one on my machine?

Sorry for the late reply.

You use a support radius of 5. How does this value relate to, say, the span-wise width of your airfoil?
It could be that your mapping system did not diverge, but did not converge either. Good to know: our next release, v2.3, will give more information in such a case.

Just to be sure: both your meshes mesh the same geometry. Meaning, when you plot both at the same time, they lie nearly on top of each other, right?

This could indeed be a problem if you have a fluid mesh partition which lies only on this side. But actually, I would then expect a different error message, something coming from the re-partitioning.
Can you visualize your fluid partitions? Any chance to cut this lid?

To get a bit more information: what happens when you run with a different number of ranks?

Hi :wave:

I selected the support radius just for testing purposes (I thought this value was conservative enough). The airfoil has the following dimensions:
chord length: 0.144 m
maximum thickness: 0.017 m
span length: 0.15 m

see screenshot

Because I’m not exactly sure how to calculate the support radius, I’ve opened a separate topic.

It should be … I was trying to visualize it with ParaView using the export function of preCICE, but with the RBF mapping I get many different files, which don’t show the expected mesh when selecting the glyph filter. Is there a way to plot such meshes with preCICE?

Still, the mapping works on the local machine for > 2000 time windows …

You mean you want to see how the case is decomposed?

I don’t understand what you mean by that …

I tried it with different numbers of processors … 1 … 8 … 35,
but the issue remains the same.

Your support radius is definitely too large. You get a dense mapping matrix, and GMRES might have problems. Try 3-7 times your mesh width.

You can use the VTK export of preCICE. Is this what you already do? Why do you get many files?
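For reference, the export is configured per participant; a sketch of how such a block could look in the configuration above (the directory name is just an example). With a parallel participant, one `.vtu` file per rank plus a master `.pvtu` file per exported time window is expected, which would explain the large number of files:

```xml
<participant name="Fluid">
  <!-- existing use-mesh, read/write-data and mapping tags stay as they are -->
  <export:vtk directory="precice-exports" />
</participant>
```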

I meant really meshing the same geometry with fluid and solid, i.e. not having this additional “closure”. I was referring to:

I have now tried different radii (0.5, 0.8, 1), with the same issue.

Maybe I selected the wrong output files, as there are many when using the RBF mapping.
The following files are available (only the fluid participant has had time to create its files):

view list of files
Fluid-Mesh-Fluid.dt1_master.pvtu   Solid-Mesh-Fluid.dt1_r1.vtu
Fluid-Mesh-Fluid.dt1_r0.vtu        Solid-Mesh-Fluid.dt1_r2.vtu
Fluid-Mesh-Fluid.dt1_r1.vtu        Solid-Mesh-Fluid.dt1_r3.vtu
Fluid-Mesh-Fluid.dt1_r2.vtu        Solid-Mesh-Fluid.dt1_r4.vtu
Fluid-Mesh-Fluid.dt1_r3.vtu        Solid-Mesh-Fluid.dt1_r5.vtu
Fluid-Mesh-Fluid.dt1_r4.vtu        Solid-Mesh-Fluid.dt1_r6.vtu
Fluid-Mesh-Fluid.dt1_r5.vtu        Solid-Mesh-Fluid.init_master.pvtu
Fluid-Mesh-Fluid.dt1_r6.vtu        Solid-Mesh-Fluid.init_r0.vtu
Fluid-Mesh-Fluid.init_master.pvtu  Solid-Mesh-Fluid.init_r1.vtu
Fluid-Mesh-Fluid.init_r0.vtu       Solid-Mesh-Fluid.init_r2.vtu
Fluid-Mesh-Fluid.init_r1.vtu       Solid-Mesh-Fluid.init_r3.vtu
Fluid-Mesh-Fluid.init_r2.vtu       Solid-Mesh-Fluid.init_r4.vtu
Fluid-Mesh-Fluid.init_r3.vtu       Solid-Mesh-Fluid.init_r5.vtu
Fluid-Mesh-Fluid.init_r4.vtu       Solid-Mesh-Fluid.init_r6.vtu
Fluid-Mesh-Fluid.init_r5.vtu       Solid-Mesh-Solid.dt1.vtk
Fluid-Mesh-Fluid.init_r6.vtu       Solid-Mesh-Solid.dt2.vtk
Solid-Mesh-Fluid.dt1_master.pvtu   Solid-Mesh-Solid.final.vtk
Solid-Mesh-Fluid.dt1_r0.vtu

When I try to view Fluid-Mesh-Fluid.init_master.pvtu or Fluid-Mesh-Fluid.dt1_master.pvtu, ParaView doesn’t show anything (the scale in the glyph filter is automatically very small, and increasing it doesn’t help).
Fluid-Mesh-Fluid.init_r*.vtu looks like this:

view screenshots

Blue: Fluid mesh (Fluid-Mesh-Fluid.init_r*.vtu)
Red: Solid mesh (Solid-Mesh-Fluid.init_master.pvtu)



In the end it should be closed on that side … and, as you know, it works on my local machine ^^
Also, since I didn’t create the fluid solver case, setting up a dummy case would be quite some work,
so maybe let’s first try some other things …


I tried the visualization of the same case on my machine:
view screenshots



In my eyes the mesh looks good; what do you think?

I would try to go even lower. Your span has a width of 0.15 and you use roughly 10 elements there, i.e. a mesh width of 0.015; 0.015 * 5 = 0.075. (Your solid mesh is the one to consider, since you use a consistent mapping from solid to fluid.)
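Plugged into the configuration above, that estimate means shrinking the `support-radius` attribute of the compact-polynomial mappings to a few mesh widths. A sketch with an example value in the suggested 3-7 × mesh-width range:

```xml
<!-- Sketch: support radius reduced from 5 to roughly 5 mesh widths
     of the solid mesh (0.015 each) -->
<mapping:rbf-compact-polynomial-c0 support-radius="0.07" direction="read"
                                   from="Solid-Mesh" to="Fluid-Mesh"
                                   constraint="consistent" />
```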

Does it work when you use

use-qr-decomposition="yes"

in your RBF mappings?
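Assuming the attribute goes on the mapping tag itself, like the other RBF options, that suggestion would look roughly like:

```xml
<!-- Sketch: solve the RBF system with a QR decomposition instead of
     the iterative GMRES solver -->
<mapping:rbf-compact-polynomial-c0 support-radius="5" direction="read"
                                   from="Solid-Mesh" to="Fluid-Mesh"
                                   constraint="consistent"
                                   use-qr-decomposition="yes" />
```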

Meshes look good. I still don’t like the “lid”, but as I said, I would actually expect a different error there.

With 0.07 the simulation started :partying_face:
I will keep you updated how long it runs :+1:
So I didn’t need `use-qr-decomposition` after all.

So a big thank you and I will keep you updated :grinning:


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.