CFX-CFX coupling

Hello, I am trying to code up an ANSYS CFX adaptor, but I am getting an error at the initialization stage. Here is the error I currently see.

(0) 00:36:48 [impl::SolverInterfaceImpl]:251 in initialize: Setting up master communication to coupling partner/s
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
WARNING: CFXSTP called recursively.
An error has occurred in cfx5solve:

I am using MPI for communication. The preCICE version is 2.1.1. CFX, preCICE, and my coupling code are all compiled with the Intel compiler.

I also tried sockets for communication, but this gives the following error when executing precicef_create.

*** Error in `/usr/ansys_inc/v202/CFX/bin/linux-amd64/ifort/solver-mpi.exe': free(): invalid pointer: 0x00007ffce9a92fb8 ***

After installing preCICE, I ran ctest. All the tests pass, as shown below.

Test project apps/precice-2.1.1/build
Start 1: precice.acceleration
1/28 Test #1: precice.acceleration … Passed 0.67 sec
Start 2: precice.action
2/28 Test #2: precice.action … Passed 0.39 sec
Start 3: precice.com
3/28 Test #3: precice.com … Passed 1.30 sec
Start 4: precice.com.mpiports
4/28 Test #4: precice.com.mpiports … Passed 0.45 sec
Start 5: precice.cplscheme
5/28 Test #5: precice.cplscheme … Passed 4.71 sec
Start 6: precice.io
6/28 Test #6: precice.io … Passed 0.47 sec
Start 7: precice.m2n
7/28 Test #7: precice.m2n … Passed 0.47 sec
Start 8: precice.m2n.mpiports
8/28 Test #8: precice.m2n.mpiports … Passed 0.38 sec
Start 9: precice.mapping
9/28 Test #9: precice.mapping … Passed 0.64 sec
Start 10: precice.math
10/28 Test #10: precice.math … Passed 0.28 sec
Start 11: precice.mesh
11/28 Test #11: precice.mesh … Passed 0.34 sec
Start 12: precice.partition
12/28 Test #12: precice.partition … Passed 1.49 sec
Start 13: precice.interface
13/28 Test #13: precice.interface … Passed 0.24 sec
Start 14: precice.serial
14/28 Test #14: precice.serial … Passed 12.00 sec
Start 15: precice.parallel
15/28 Test #15: precice.parallel … Passed 4.93 sec
Start 16: precice.query
16/28 Test #16: precice.query … Passed 0.28 sec
Start 17: precice.testing
17/28 Test #17: precice.testing … Passed 0.27 sec
Start 18: precice.utils
18/28 Test #18: precice.utils … Passed 0.27 sec
Start 19: precice.xml
19/28 Test #19: precice.xml … Passed 0.59 sec
Start 20: precice.solverdummy.build.cpp
20/28 Test #20: precice.solverdummy.build.cpp … Passed 8.28 sec
Start 21: precice.solverdummy.build.c
21/28 Test #21: precice.solverdummy.build.c … Passed 2.20 sec
Start 22: precice.solverdummy.build.fortran
22/28 Test #22: precice.solverdummy.build.fortran … Passed 0.83 sec
Start 23: precice.solverdummy.run.cpp-cpp
23/28 Test #23: precice.solverdummy.run.cpp-cpp … Passed 0.53 sec
Start 24: precice.solverdummy.run.c-c
24/28 Test #24: precice.solverdummy.run.c-c … Passed 0.48 sec
Start 25: precice.solverdummy.run.fortran-fortran
25/28 Test #25: precice.solverdummy.run.fortran-fortran … Passed 1.12 sec
Start 26: precice.solverdummy.run.cpp-c
26/28 Test #26: precice.solverdummy.run.cpp-c … Passed 0.47 sec
Start 27: precice.solverdummy.run.cpp-fortran
27/28 Test #27: precice.solverdummy.run.cpp-fortran … Passed 0.49 sec
Start 28: precice.solverdummy.run.c-fortran
28/28 Test #28: precice.solverdummy.run.c-fortran … Passed 0.50 sec

100% tests passed, 0 tests failed out of 28

If you need any more information, please let me know. Thanks in advance.

Hi!

Do you really compile CFX yourself? If you get CFX as a pre-compiled executable, you never know which MPI it uses.

Socket communication is the better choice for closed-source solvers. It might even be a good idea to compile preCICE without MPI (see here for how this works; a configure sketch follows at the end of this post). If you run CFX in parallel, you also need to use sockets for the inter-participant communication in preCICE, but it is better to start with a serial CFX.

If all this does not help, please switch on Debug/Trace output and check where the exception happens (first example here; you need to build preCICE in Debug mode).
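
To make the two build-related suggestions above concrete, here is a minimal configure sketch, assuming a plain CMake build of preCICE 2.x from source. The source and install paths are placeholders, and the option names should be double-checked against the preCICE build documentation.

    # Build preCICE without MPI and with debug information (paths are placeholders).
    mkdir build && cd build
    cmake -DPRECICE_MPICommunication=OFF \
          -DCMAKE_BUILD_TYPE=Debug \
          -DCMAKE_INSTALL_PREFIX=/path/to/precice-install \
          /path/to/precice-2.1.1
    make -j 4
    make install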

Hi @Jiahuan!

ANSYS software packages usually ship with their own MPI, and such an installation makes it notoriously difficult to use MPI communication in preCICE. We gained some experience with this while attempting to couple FLUENT using preCICE. A preCICE configuration that works for ANSYS FLUENT can be found here.
The example case with FLUENT is not in a working state yet (ongoing project), but it could give you clues about how a coupling with pre-installed ANSYS software might look.
A question out of interest: Which operating system are you working with?

I see. Many thanks for your reply @IshaanDesai. I will take a look at your Fluent adaptor. Much appreciated.

I am using CentOS 7.

Sorry, you are right. I just installed CFX rather than compiling it myself.

I ran ldd on $ANSYSHOME/v202/CFX/bin/linux-amd64/ifort/solver-mpi.exe.
It looks like it links against the Intel libraries, but the shared libraries listed below are not found. I am not sure if this is the root cause.

I have tried the sockets option, but it failed even earlier, at the "precicef_create" stage. Anyway, thanks for your reply.

    libmport.so => not found
    libmpi_wrapper.so => not found
    libSysC.FMULibFortran.so => not found
    libSysC.FMULib.so => not found
    libhdf5.so.103 => not found
    libhdf5_hl.so.100 => not found
    libansysfluidscfxdirectwriter.so => not found
    libhoof.so => not found
    libLocationModel.so => not found
    libansliclib.so => not found
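
For reference, "not found" entries from a bare ldd are not necessarily the root cause: solvers started through a vendor launcher often get their library search path set up by the launcher's environment. One way to re-check, with the library directory below being a guess rather than something taken from the CFX documentation, would be:

    # Prepend a guessed CFX library directory and re-check unresolved libraries.
    export LD_LIBRARY_PATH=/usr/ansys_inc/v202/CFX/lib/linux-amd64:$LD_LIBRARY_PATH
    ldd /usr/ansys_inc/v202/CFX/bin/linux-amd64/ifort/solver-mpi.exe | grep "not found"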

Hi @Jiahuan, did you manage to run CFX with preCICE?

Hi @IshaanDesai, not yet. It still throws an error at either the "precicef_create" stage or the initialization stage, depending on which communication method I use.

Okay, that is unfortunate. Since you are calling precicef_create, you must be using the Fortran bindings of preCICE. Are you using the bindings provided inside preCICE or the fortran-module? Can you post the full call you use for precicef_create?
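
For reference, a call through the intrinsic Fortran bindings usually looks roughly like the sketch below; the participant name, config file path, and the serial rank/size values are illustrative placeholders, not taken from your CFX adaptor.

    PROGRAM create_example
      ! Minimal sketch of calling precicef_create from the intrinsic
      ! preCICE Fortran bindings; names and values are placeholders.
      IMPLICIT NONE
      CHARACTER(LEN=50) :: participantName
      CHARACTER(LEN=50) :: configFile
      INTEGER           :: rank, commSize

      participantName = 'Fluid'               ! hypothetical participant name
      configFile      = 'precice-config.xml'  ! assumed config file location
      rank            = 0                     ! index of this solver process
      commSize        = 1                     ! total number of solver processes

      CALL precicef_create(participantName, configFile, rank, commSize)
      ! ... mesh/data setup, precicef_initialize(...), coupling loop ...
      CALL precicef_finalize()
    END PROGRAM create_example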