Test fails for mapping RBF during installation

Hello, I am Prasad ADHAV.

I am new to preCICE.
I want to perform coupled simulation using multiple solvers (OpenFOAM, CalculiX, XDEM).

I followed the instructions on the preCICE wiki as best as I could.

At Step 4, when I run make test_base and make test, the test for RBF mapping (precice.mapping.petrbf) fails.
Also, in make test, the mpiports test times out.

What might be the issue here?
I have attached the logs from the tests here.

Thank you in advance for the help.

log_make_TEST.txt (40.2 KB) log_make_TEST_BASE.txt (25.2 KB)

Hi Prasad,

Welcome to preCICE!

Both problems are presumably known issues. You can find more information in the wiki:

  • OpenMPI does not properly support MPI Ports. I recommend sticking to (TCP/IP) sockets for your coupled simulations; see the tag <m2n:sockets/> in your config.
  • For PETSc, I recommend building at least v3.12 from source if you want to use RBF mapping (tag <mapping:rbf/> in your config); see the config sketch below.
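
For orientation, here is a minimal, hypothetical excerpt of a precice-config.xml that combines both recommendations; the participant and mesh names (Fluid, Solid, Fluid-Mesh, Solid-Mesh) and the thin-plate-splines kernel are placeholders, not taken from your setup:

  <participant name="Fluid">
    <use-mesh name="Fluid-Mesh" provide="yes"/>
    <use-mesh name="Solid-Mesh" from="Solid"/>
    <!-- PETSc-backed RBF mapping; thin-plate-splines is just one of the available RBF kernels -->
    <mapping:rbf-thin-plate-splines direction="write" from="Fluid-Mesh" to="Solid-Mesh" constraint="conservative"/>
  </participant>

  <!-- TCP/IP sockets instead of MPI Ports for the communication between the two solvers -->
  <m2n:sockets from="Fluid" to="Solid"/>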

Concerning OpenFOAM + CalculiX + XDEM, have you seen this video already?

Benjamin


Hello Benjamin,

Yes, I have seen the video.

As for the issues I had with RBF mapping: I have installed PETSc v3.12.5. I am attaching the install log, just in case.

But I am still getting the same error when I run make test_base.

Could you also please confirm whether the installation will fail if I run make install without solving these issues? When I try make install, I get the following output:

-- Found Git: /usr/bin/git (found version "2.17.1") 
-- Revision status: Detection failed
[  0%] Built target GitRevision
[ 63%] Built target precice
[ 64%] Built target binprecice
[100%] Built target testprecice
Install the project...
-- Install configuration: "Debug"
-- Installing: /usr/local/lib/libprecice.so.2.0.2
CMake Error at cmake_install.cmake:53 (file):
  file INSTALL cannot copy file
  "/home/prasad/precice-2.0.2/build/libprecice.so.2.0.2" to
  "/usr/local/lib/libprecice.so.2.0.2".


Makefile:150: recipe for target 'install' failed
make: *** [install] Error 1

I was not able to follow your advice on OpenMPI. Could you please elaborate a bit more?

Thank you.

PETSc_install_log.txt (68.7 KB)

As for the issues I had with RBF mapping: I have installed PETSc v3.12.5. I am attaching the install log, just in case.
But I am still getting the same error when I run make test_base.

Installation looks correct.
When you run cmake on preCICE, does it find the correct PETSc version? You can check this in the output you get.

Could you also please confirm whether the installation will fail if I run make install without solving these issues?

No, these two things are completely independent. Installation of preCICE only means that the library and headers are copied somewhere else. Installation is also optional.

When I try make install, I get the following output:

-- Found Git: /usr/bin/git (found version "2.17.1") 
-- Revision status: Detection failed
[  0%] Built target GitRevision
[ 63%] Built target precice
[ 64%] Built target binprecice
[100%] Built target testprecice
Install the project...
-- Install configuration: "Debug"
-- Installing: /usr/local/lib/libprecice.so.2.0.2
CMake Error at cmake_install.cmake:53 (file):
  file INSTALL cannot copy file
  "/home/prasad/precice-2.0.2/build/libprecice.so.2.0.2" to
  "/usr/local/lib/libprecice.so.2.0.2".

Makefile:150: recipe for target 'install' failed
make: *** [install] Error 1

I guess you have to run with sudo rights: sudo make install. Did you?

I was not able to follow your advice on OpenMPI. Could you please elaborate a bit more?

Sure. For the communication between coupled solvers, preCICE can use two technical implementations: MPI Ports or TCP/IP sockets. You choose in your precice-config.xml which one you want to use. Unfortunately, OpenMPI does not support MPI Ports the way we use them. So when you build preCICE with OpenMPI, you cannot use MPI Ports (m2n:mpi) for the communication between coupled solvers. We still need MPI for other things, though, so it remains important that you link against OpenMPI. Long story short, my advice is:

  • Don’t worry, just use m2n:sockets and not m2n:mpi, and ignore the failing test (see the snippet below).
  • Or, if you are on a large computing cluster and need the last bit of performance, use another MPI implementation (e.g. MPICH or Intel MPI).
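
As a concrete illustration (again with the placeholder participant names Fluid and Solid), the choice boils down to a single line in your precice-config.xml:

  <!-- recommended when preCICE is built against OpenMPI: TCP/IP sockets -->
  <m2n:sockets from="Fluid" to="Solid"/>

  <!-- MPI Ports; only usable with an MPI implementation that supports them, e.g. MPICH -->
  <!-- <m2n:mpi from="Fluid" to="Solid"/> -->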

preCICE installed successfully after I ran make install with sudo. Thank you.

Haha :sweat_smile: … Yes, I was worried it might create problems later. But thanks for the explanation; I understand it a bit better now.

Overall, the preCICE installation is done, along with its dependencies.

Now onwards to the installation of CalculiX and then the tutorials.
Thank you again for all your help.