Mapping issue: Forces mapped from fluid mesh to solid mesh look weird. How to debug?

Dear preCICE community,

I’m testing our in-house fluid solver coupled to the solid solver CalculiX with preCICE (release version 2.4.0). I’m working on a Linux cluster running CentOS 7. The fluid and solid solvers and preCICE are compiled with Intel oneAPI and its corresponding MPI library. PETSc 3.17.3 was installed and also compiled with Intel oneAPI, and Boost 1.71.0 is installed.

The test case I’m using is the FSI3 case of Hron and Turek. I’m running only 1 time step with 1 iteration in order to compare the fluid forces on the original cell-center fluid mesh with the fluid forces mapped onto the nodes of the structure mesh. I’m observing a weird problem at the tip of the deformable flap:

The picture below shows the fluid forces at each cell center. preCICE seems to get the correct fluid forces and the correct cell-center coordinates:

The picture below shows the fluid forces mapped onto the structure nodes:

On one side (z=0) the forces mapped onto the nodes point in the positive y-direction. On the other side (z=1) the forces mapped onto the nodes point in the negative y-direction. I drew some rectangles to mark the nodes with this issue. The problem here is not the magnitude of the vectors but their opposite directions: they generate a moment about the x-axis, which makes no sense in 2D.

Any idea how to understand and debug this problem?

I attach my xml preCICE config:
precice-config.xml (2.8 KB)

Regards
Guillaume

Hi Guillaume,

Could you please upload your preCICE VTU files as well? This looks strange. It could be an artifact of the RBF mapping, a CalculiX adapter issue (we had some problems with such dangling vertices), or a 2D-3D mapping issue.

A more reliable option for such a 2D scenario could be to use preCICE in 2D mode:

<solver-interface dimensions="2">

The CalculiX adapter should already be able to do this automatically. In FASTEST, you then have to define 2D vertices as well and do the 2D-3D mapping yourself.
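As a minimal sketch (everything inside the solver-interface stays as in your current config):

<?xml version="1.0" encoding="UTF-8" ?>
<precice-configuration>
  <!-- with dimensions="2", all vertex coordinates passed to preCICE are (x, y)
       pairs and the mapping problem becomes purely two-dimensional -->
  <solver-interface dimensions="2">
    <!-- data, meshes, participants, mappings and coupling scheme as before -->
  </solver-interface>
</precice-configuration>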
But let’s look at the vtu files first.

Hi Benjamin,

as requested, I attached the files:
FSI3_mapping_issue.zip (8.8 KB)

These files were generated with the previously attached XML input file. I get exactly the same files when switching from serial-implicit to serial-explicit, which makes sense, since I only iterate once with serial-implicit. If I understood correctly, the constant under-relaxation does not apply to the forces for serial-implicit, so it has no impact on my files. Am I correct on this point?

A few words about the fluid solver configuration:

  • A 3D block-structured grid is used and divided into 15 blocks.
  • These blocks are distributed over 5 processors.
  • Rank 2 has no block with an FSI interface, therefore it does not appear in the VTU files.
  • The surface at the free end of the flap is handled by rank 1.
  • The top surface of the flap is composed of several patches distributed over ranks 3 and 4.
  • The bottom surface is composed of 3 patches belonging to ranks 0, 3 and 4.

During the preCICE initialization done in the fluid solver, node coordinates and cell-center coordinates are sent/written in the following manner:

  • For the nodes: all nodes of a block are sent. Therefore, for 2 interconnected blocks, 2 lines of identical nodes are sent to preCICE. I think this should work: since they have the same coordinates, they will get the same interpolated data. Am I correct on this point?
  • For the cell centers: only the inner cell centers are sent to preCICE, so no duplicate cell centers are present. Is this the right way to do it with preCICE?
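Schematically, the mapping setup looks like this (names and placement simplified here; the exact tags are in the attached config):

<!-- forces: fluid cell centers -> solid mesh, conservative -->
<mapping:rbf-thin-plate-splines direction="write" from="Fluid-Mesh-Centers"
                                to="Solid-Mesh" constraint="conservative" />
<!-- displacements: solid mesh -> fluid nodes, consistent -->
<mapping:rbf-thin-plate-splines direction="read" from="Solid-Mesh"
                                to="Fluid-Mesh-Nodes" constraint="consistent" />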

Thx for your suggestion with the 2D option, I will give it a try.

Is there a list in the documentation with all possible options of the XML input file? I would like to test other RBF mappings…

Cheers

Switching from ‘rbf-thin-plate-splines’ to ‘rbf-compact-tps-c2’ has a big impact on my mapped forces. The definition of the support radius (using rbfShape.py) is not fully clear to me… in other words, I don’t understand it at all. I have already downloaded the Lindner thesis.

The main problem of your setup is that you have two “layers” on the CalculiX side, which differ only in the z-direction. In your configuration, you set z-dead="true", i.e., you ask the mapping to ignore the extent in the z-direction, such that you essentially define the solid mesh twice.
In a conservative setting, the interpolation matrix is built upon the to-mesh, i.e., the CalculiX mesh. Defining vertices (or the entire mesh) multiple times renders the interpolation matrix singular. PETSc might not notice that, since an iterative method is applied to solve the system. For such mapping investigations, it is usually a good idea to use the direct method (use-qr-decomposition="true"), as it will complain if the mapping problem is not well posed (apart from being faster for such small application cases).
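For example (mesh names assumed from your description), the conservative write mapping with the direct solver would look like:

<mapping:rbf-thin-plate-splines direction="write" from="Fluid-Mesh-Centers"
                                to="Solid-Mesh" constraint="conservative"
                                use-qr-decomposition="true" />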
You could of course ‘transform’ the mapping into a well-posed problem by omitting the dead-axis specification, but considering the geometrical setup, I could imagine that the resulting system would still be rather ill-conditioned (without you getting any errors).
To summarize: I would definitely try to run the simulation in 2D, as proposed by @uekerman.

The execution of the mapping is independent of the coupling scheme, so that’s perfectly fine.

Exactly, it applies to the data sent by the second participant to the first participant. In any case, the exported data should never correspond to the ‘accelerated’ data.
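As a sketch with assumed names (Fluid first, Solid second), this is the part of the coupling scheme the constant under-relaxation acts on:

<coupling-scheme:serial-implicit>
  <participants first="Fluid" second="Solid" />
  <max-time-windows value="1" />
  <time-window-size value="1e-3" />
  <max-iterations value="30" />
  <exchange data="Forces"        mesh="Solid-Mesh" from="Fluid" to="Solid" />
  <exchange data="Displacements" mesh="Solid-Mesh" from="Solid" to="Fluid" />
  <relative-convergence-measure data="Displacements" mesh="Solid-Mesh" limit="1e-4" />
  <!-- relaxes only what the second participant (Solid) sends back to the first
       participant (Fluid), i.e. the displacements, not the forces written by Fluid -->
  <acceleration:constant>
    <relaxation value="0.5" />
    <data name="Displacements" mesh="Solid-Mesh" />
  </acceleration:constant>
</coupling-scheme:serial-implicit>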

As a rule of thumb, I would recommend defining vertices only once (also across ranks). There are certain exceptions where a duplicate might be ok, e.g., when reading data in a consistent mapping setup. preCICE probably won’t complain if your ‘write mesh’ in a conservative mapping contains duplicates, but you need to consider that a duplicate there will affect your global sum, i.e., your constraint, and that’s most likely not what you want.
In all other cases, the mapping matrix (assuming RBFs) will be singular (as already written above), and at least the use-qr-decomposition="true" option will indicate that something is wrong. Hence, I would just try to avoid duplicate nodes as long as you don’t know exactly what you are doing.

However, if you use the ‘nodes’ mesh here only on the fluid side for reading displacements (no data writing), it would be ok to duplicate them (one of the exceptions mentioned above).

Yes

If you are really curious about the ‘most accurate’ mapping for your setup, you can easily run a simple test case using ASTE (a tutorial is still pending, unfortunately):
https://github.com/precice/aste/tree/develop/tests/example/nn
and look at the generated errors. ASTE is compatible with the meshes exported from preCICE.

The support radius is just a cut-off threshold for your RBF, i.e., it determines up to which distance nodes are taken into account. The script you mentioned was made for the Gaussian RBF, where the support-radius option was only added at a later point in time. You can safely ignore it.
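As a sketch with assumed names and a placeholder value, it is set directly on the mapping tag:

<!-- support-radius is an absolute length in the same units as the mesh
     coordinates; basis functions are cut off beyond that distance -->
<mapping:rbf-compact-tps-c2 direction="write" from="Fluid-Mesh-Centers"
                            to="Solid-Mesh" constraint="conservative"
                            support-radius="0.05" />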

Thx for your detailed reply.

So if I understood correctly: the problem I get with the forces mapped with ‘rbf-thin-plate-splines’ comes from the fact that the problem I tried to solve is not well posed (a 2D extruded problem solved in 3D). Indeed, you are right: if I set use-qr-decomposition="true", preCICE complains. This only occurs for the conservative mapping; the consistent one works.

Setting ‘rbf-compact-tps-c2’ with a radius of 1.0 “solved” my problem with the mapped forces. The mapped force distribution looks nice and physical.

Considering the duplicate points in the meshes:

  • I have duplicate points in the mesh based on the nodes, which is used for reading the displacements. So it should be ok.
  • I have no duplicate points in the mesh based on the cell centers, which is used for writing the forces. So it should be ok too.

Thank you for the XML reference page. I did not see it before!

I already cloned ASTE, but for the moment it does not compile:

git clone git@github.com:precice/aste.git
yum install metis-devel.x86_64 vtk.x86_64 vtk-devel sympy
cd aste
export CC=icc
export CXX=icpc
cmake3 .
make all

I get the following errors:

CMakeFiles/testing.dir/src/mesh.cpp.o: In function `aste::MeshName::save(aste::Mesh const&, std::string const&) const':
/home/denayer/softwares/aste/src/mesh.cpp:305: undefined reference to `MPI_Abort'
/home/denayer/softwares/aste/src/mesh.cpp:333: undefined reference to `MPI_Comm_rank'
CMakeFiles/testing.dir/src/mesh.cpp.o: In function `aste::BaseName::findAll(aste::ExecutionContext const&) const':
/home/denayer/softwares/aste/src/mesh.cpp:419: undefined reference to `MPI_Finalize'
CMakeFiles/testing.dir/src/mesh.cpp.o: In function `aste::readMesh(aste::Mesh&, std::string const&, int, bool)':
/home/denayer/softwares/aste/src/mesh.cpp:49: undefined reference to `MPI_Abort'
/home/denayer/softwares/aste/src/mesh.cpp:68: undefined reference to `MPI_Abort'
/home/denayer/softwares/aste/src/mesh.cpp:107: undefined reference to `MPI_Abort'
CMakeFiles/testing.dir/src/mesh.cpp.o: In function `aste::readData(aste::Mesh&, std::string const&)':
/home/denayer/softwares/aste/src/mesh.cpp:120: undefined reference to `MPI_Abort'
/home/denayer/softwares/aste/src/mesh.cpp:142: undefined reference to `MPI_Abort'
CMakeFiles/testing.dir/src/mesh.cpp.o:/home/denayer/softwares/aste/src/mesh.cpp:185: more undefined references to `MPI_Abort' follow

Should I define something for MPI to get ASTE compiled?
On the CentOS 7 system I use, Intel oneAPI and its corresponding MPI are in the LIBRARY_PATH.

Regards

I don’t fully understand your setup here, but if it works for you, ok.

You can also generate it yourself locally using precice-tools xml > reference.xml

Please pull again and update me if the error persists.

Neither do I… I found this by trial and error.

Is the value set for the radius a length in the units used by the solvers? Or is it a ratio or a dimensionless length?

Great! It compiles!

As you wrote, there is no tutorial yet. Is there a category on the forum to talk about ASTE? Are there any videos/talks about ASTE with some explanations?

Regards

preCICE is not aware of any units. The support-radius is simply an absolute value and should be selected according to your mesh.

We have pending documentation here: “Add ASTE Documentation in tooling section” by kursatyurt (precice/precice.github.io, Pull Request #105) and here: “ASTE Turbine Tutorial” by kursatyurt (precice/tutorials, Pull Request #244).

I just noticed that using local RBFs instead of global ones is recommended for FSI3 in the preCICE tutorial:

For the double-refined mesh, it is wise to use local basis functions in the RBF data mapping method instead of global ones. You can use:

<mapping:rbf-compact-tps-c2 direction="read" from="Fluid-Mesh-Centers" to="Solid-Mesh"
                            support-radius="0.011" constraint="consistent" />

If you configure the QR decomposition, there is computationally no difference. This comment was added because the most refined case was run on a cluster in a parallel setup without the QR decomposition, where it actually matters.