OpenFOAM adapter on Ubuntu 20.04

Hi,
I am trying to use the OpenFOAM adapter with OpenFOAM v1912 on Ubuntu 20.04. As far as I can tell, OpenFOAM works fine, preCICE 2.0.1 compiles and passes test_base with no errors, and the adapter compiles, but when I try to run a tutorial I get this error:

Starting time loop

---[preciceAdapter] The preciceAdapter was loaded.
#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::sigSegv::sigHandler(int) at ??:?
#2  ? in /lib/x86_64-linux-gnu/libc.so.6
#3  ? at ??:?
#4  orte_init in /usr/lib/x86_64-linux-gnu/libopen-rte.so.20
#5  ompi_mpi_init in /usr/lib/x86_64-linux-gnu/libmpi.so.20
#6  PMPI_Init_thread in /usr/lib/x86_64-linux-gnu/libmpi.so.20
#7  Foam::UPstream::initNull() at ??:?
#8  Foam::functionObjects::preciceAdapterFunctionObject::preciceAdapterFunctionObject(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#9  Foam::functionObject::adddictionaryConstructorToTable<Foam::functionObjects::preciceAdapterFunctionObject>::New(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#10  Foam::functionObject::New(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#11  Foam::functionObjectList::read() at ??:?
#12  Foam::Time::run() const at ??:?
#13  ? in /opt/claudio/OpenFOAM/OpenFOAM-v1912/platforms/linux64GccDPInt32Opt/bin/pimpleFoam
#14  __libc_start_main in /lib/x86_64-linux-gnu/libc.so.6
#15  ? in /opt/claudio/OpenFOAM/OpenFOAM-v1912/platforms/linux64GccDPInt32Opt/bin/pimpleFoam

I don’t know what is going on. Any help would be appreciated.
Claudio

I need to try this out myself, but in the meantime, could you check whether removing this line helps?

This may give you a warning at finalization in serial cases, which you can ignore.

Hi,
I commented the line out and recompiled, but unfortunately the error becomes:

---[preciceAdapter] [DEBUG]   MPI rank: 0
#0  Foam::error::printStack(Foam::Ostream&) at ??:?
#1  Foam::sigSegv::sigHandler(int) at ??:?
#2  ? in /lib/x86_64-linux-gnu/libc.so.6
#3  ? at ??:?
#4  orte_init in /usr/lib/x86_64-linux-gnu/libopen-rte.so.20
#5  ompi_mpi_init in /usr/lib/x86_64-linux-gnu/libmpi.so.20
#6  MPI_Init in /usr/lib/x86_64-linux-gnu/libmpi.so.20
#7  precice::utils::Parallel::initializeMPI(int*, char***) in /usr/local/lib/libprecice.so.2
#8  precice::impl::SolverInterfaceImpl::configure(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) in /usr/local/lib/libprecice.so.2
#9  precice::impl::SolverInterfaceImpl::SolverInterfaceImpl(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, int, void*) in /usr/local/lib/libprecice.so.2
#10  precice::impl::SolverInterfaceImpl::SolverInterfaceImpl(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, int) in /usr/local/lib/libprecice.so.2
#11  precice::SolverInterface::SolverInterface(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, int, int) in /usr/local/lib/libprecice.so.2
#12  preciceAdapter::Adapter::configure() at ??:?
#13  Foam::functionObjects::preciceAdapterFunctionObject::read(Foam::dictionary const&) at ??:?
#14  Foam::functionObjects::preciceAdapterFunctionObject::preciceAdapterFunctionObject(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#15  Foam::functionObject::adddictionaryConstructorToTable<Foam::functionObjects::preciceAdapterFunctionObject>::New(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#16  Foam::functionObject::New(Foam::word const&, Foam::Time const&, Foam::dictionary const&) at ??:?
#17  Foam::functionObjectList::read() at ??:?
#18  Foam::Time::run() const at ??:?
#19  ? in /opt/claudio/OpenFOAM/OpenFOAM-v1912/platforms/linux64GccDPInt32Opt/bin/pimpleFoam
#20  __libc_start_main in /lib/x86_64-linux-gnu/libc.so.6
#21  ? in /opt/claudio/OpenFOAM/OpenFOAM-v1912/platforms/linux64GccDPInt32Opt/bin/pimpleFoam

Thank you
Claudio

How did you install OpenFOAM and, more importantly, did you use the system MPI or the one provided by OpenFOAM (if you built it from source)? This could be an important aspect.
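
If it helps, something like the following should show which MPI OpenFOAM actually picks up (just a generic check, assuming an OpenMPI-based setup; adapt the solver name to your case):

    # Which MPI implementation OpenFOAM was configured with
    echo $WM_MPLIB            # e.g. SYSTEMOPENMPI for the system OpenMPI
    # Which mpirun is on the PATH, and its version
    which mpirun && mpirun --version
    # Which libmpi the solver is actually linked against
    ldd $(which pimpleFoam) | grep libmpi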

Disclaimer once again: I have not yet tried running anything OpenFOAM-related on Ubuntu 20.04, so I don’t know whether this is a general issue or not; it does not necessarily mean you are doing something wrong.

Hi,
I built OpenFOAM from source, following the same steps I used when I compiled it on 18.04. I might be wrong, but I think I am using the system MPI: after sourcing the environment variables, mpirun points to the system executable.

@Claudio I just now built OpenFOAM v1912 from source on Ubuntu 20.04, where I already had OpenMPI etc. installed from APT. For me, the OpenFOAM-OpenFOAM flow-over-plate tutorial runs fine both in serial and in parallel. Does this also fail for you?

@Makis It looks like you have the same configuration as I do, but unfortunately the tutorial does not work for me. I am attaching the logs, both serial and parallel. It looks like there is some inconsistency in the MPI being used, but I am not sure.
Thank you
Fluid.log (6.4 KB) Fluid_parallel.log (2.5 KB) Solid.log (5.5 KB) Solid_parallel.log (2.5 KB)

So, to summarize the state of your system:

  • Ubuntu 20.04, preCICE 2.0.1, OpenFOAM 1912, system OpenMPI
  • OpenFOAM works fine (I assume in serial)
  • preCICE make test_base succeeds

Does OpenFOAM also work fine in parallel?

Where do you see this inconsistency?

Yes, make test_base and also testprecice work, both in serial and in parallel. OpenFOAM works in parallel too.
I believe the inconsistency is that the stack trace shows ompi_mpi_init in /usr/lib/x86_64-linux-gnu/libmpi.so.20. Apparently my system has both libmpi.so.20 and libmpi.so.40, and it seems to me that the system MPI points to libmpi.so.40.
I don’t exactly know how to fix it.

Indeed, this is important and could be the reason: preCICE and the solver need to be built with the same MPI version, as preCICE tries to access the communicator of the solver. I cannot really think of how OpenMPI 2 could have ended up on Ubuntu 20.04. Did you perhaps upgrade from 18.04?
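
One way to spot such a mismatch (just a sketch, assuming default install paths and the usual adapter library name) is to compare which libmpi each component links against:

    # libmpi used by preCICE
    ldd /usr/local/lib/libprecice.so | grep libmpi
    # libmpi used by the adapter (library name/path assumed) and the solver
    ldd $FOAM_USER_LIBBIN/libpreciceAdapterFunctionObject.so | grep libmpi
    ldd $(which pimpleFoam) | grep libmpi

If these report different libmpi versions, that confirms the mismatch.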

I would try to remove OpenMPI 2 and only keep the latest one. Then, I would rebuild (just relink actually) the solvers.
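
On Ubuntu, the OpenMPI 2 runtime usually comes from the libopenmpi2 package (an assumption based on the libmpi.so.20 in your trace), so something along these lines might work:

    # Remove the leftover OpenMPI 2 runtime (package name assumed from libmpi.so.20)
    sudo apt remove libopenmpi2
    # Rebuild preCICE if it was linked against the old MPI, then rebuild the adapter,
    # e.g. from the adapter source directory (assuming the usual Allwmake script):
    ./Allclean && ./Allwmake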

Yes, the problem was indeed a mismatch between OpenMPI versions. I cannot quite work out how it happened in the first place, but rebuilding everything against the latest version solved the issue.
Thank you
Claudio
