Failed to establish the connection to preCICE

Hi all,
Recently, I have been working on coupling OpenFOAM with our in-house code. Since the in-house code is a bit complex, I chose to write a simple displacement generator to couple with interFoam as an exercise.

Firstly, since the Python bindings for preCICE are easy to get started with, I used them for a bidirectional coupling (between a displacement generator and interFoam), and it worked well. Then I wrote a module containing all the necessary subroutines and compiled the adapter with gfortran:

gfortran -o adapterTest preCICEmodule.f90 main.f90 -I /lib/x86_64-linux-gnu

There were no errors during compilation. However, when I executed the displacement generator, it seemed that it did not connect to preCICE at all. I found that the time loop in the generator was not controlled by the subroutine precicef_is_coupling_ongoing(ongoing). The generator just kept generating new displacements for the mesh nodes on the interface while the fluid participant (interFoam) was still waiting.
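
From the Python exercise, I understand that the time loop should look roughly like the following sketch (here ongoing is an INTEGER flag and dt is the time-step size returned by preCICE):

CALL precicef_is_coupling_ongoing(ongoing)
DO WHILE (ongoing.NE.0)
  ! generate new displacements for the interface nodes, exchange data
  CALL precicef_advance(dt)
  CALL precicef_is_coupling_ongoing(ongoing)
ENDDO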

I suspect that this might be due to the need for a config file, such as config.yml for the CalculiX solid solver or precice-adapter-config-fsi-s.json for the FEniCS solid solver. But in my previous exercise, the coupling between the displacement generator and interFoam worked fine even though there was no such config file. So I wonder whether this config file is a must.

If this config file is a must, how do I make the adapter read the information in it? In a previous topic, Makis suggested that there are two popular options, YAML and JSON. However, I find that there is no official YAML support for Fortran, so if a config file is required, I would probably need to find some way to read a JSON file.

Any advice will be appreciated!

Sincerely,
Stan

Hi Stan,

This should not have anything to do with the config file. The config file is just there to avoid hard-coding things in the adapter.
I suspect that your fortran code has problems finding/linking preCICE.
Do you use the preCICE fortran bindings or the fortran module?
Any error messages / warnings when compiling your code?
In case this doesn’t help already, could you provide your files preCICEmodule.f90 and main.f90?

Benjamin

Such a file is definitely not a must; you can also hard-code the values you need in your code.

There are unofficial YAML parsers for several languages, here is one I just found for Fortran: GitHub - BoldingBruggeman/fortran-yaml: Lightweight YAML parser written in object-oriented Fortran
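
If you only need a handful of values, a plain Fortran namelist would also work, without any external library. A minimal sketch (the file, group, and variable names are just placeholders):

PROGRAM readconfig
  IMPLICIT NONE
  CHARACTER(LEN=50)  :: participantName
  CHARACTER(LEN=512) :: configFileName
  NAMELIST /adapterconfig/ participantName, configFileName

  ! config.nml contains, e.g.:
  !   &adapterconfig
  !     participantName = "Solid"
  !     configFileName  = "../precice-config.xml"
  !   /
  OPEN(10, FILE="config.nml")
  READ(10, NML=adapterconfig)
  CLOSE(10)
END PROGRAM readconfig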

Hi Uekerman and Makis,

Thanks for your replies. My doubts are cleared now.

Do you use the preCICE fortran bindings or the fortran module?
Any error messages / warnings when compiling your code?
In case this doesn’t help already, could you provide your files preCICEmodule.f90 and main.f90?

I used the preCICE Fortran bindings in precice-develop to build a module.
There were no error messages when I compiled the code.
Yes, of course. Due to the allowed upload formats, these files are uploaded as .txt files.

preCICEmodule.txt (11.4 KB)
main.txt (3.9 KB)
VerticesInCoupling.txt (223 Bytes)

Thanks again for your help!

Best regards,
Stan

Hi Benjamin,

I’ve gone through the code and compared my testing code with the provided Fortran solverdummy. I found that I had forgotten to call precicef_initialize_data() and that I had accidentally commented out the call precicef_is_coupling_ongoing(ongoing) before the time loop.

After fixing these mistakes, I ran both the fluid part and the solid part. However, the fluid part still waits even while the solid part is running. Since the variable “ongoing” controls the time loop, I printed its value to check whether the connection is established. It seems that the solid part is still not connected to preCICE, since in the figure below “ongoing” always keeps the same (apparently uninitialized) value, 185146416, and the solid part keeps generating displacements without communicating with the fluid part.
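
For context, the preCICE-related calls in my code are now ordered roughly like this (a sketch of the call sequence only):

CALL precicef_create(participantName, configFileName, rank, commsize)
! ... define the interface mesh vertices ...
CALL precicef_initialize(dt)
CALL precicef_initialize_data()              ! the call I had forgotten
CALL precicef_is_coupling_ongoing(ongoing)   ! the call I had commented out
! ... time loop as in the solverdummy ...
CALL precicef_finalize()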


Could you please share any advice or experience on this problem?
Thanks in advance!

Stan

You don’t get any preCICE output for your dummy solid, right?
The preCICEmodule you use is something you developed, right? Not something you copied and adapted from preCICE, right?
I am no Fortran expert, but it seems to me that you don’t call preCICE in your module, e.g.

subroutine precicef_create(participantName, configFileName, solverProcessIndex, solverProcessSize)
  CHARACTER :: participantName
  CHARACTER :: configFileName
  INTEGER :: solverProcessIndex
  INTEGER :: solverProcessSize
  !IN:  participantName, configFileName, solverProcessIndex, solverProcessSize
  !OUT: -
end subroutine precicef_create

I would recommend that you either start from the provided Fortran module including its solverdummy, or from the Fortran dummy which directly uses preCICE.
You could start with first coupling two solverdummies with each other. Afterwards you could rewrite the solverdummy to create the dummy displ data and then couple to OpenFOAM.
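
Creating the dummy displacement data inside the time loop could then look roughly like this (just a sketch; displID would come from precicef_get_data_id, and vertexIDs and N from setting up the mesh vertices):

! dummy displacement values for the N interface vertices
displacements(:) = 0.001d0
CALL precicef_write_bvdata(displID, N, vertexIDs, displacements)
CALL precicef_advance(dt)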

You don’t get any preCICE output for your dummy solid, right?

Correct, I don’t get any preCICE output for my dummy solid.

The preCICEmodule you use is something you developed, right?
I would recommend that you either start from the provided Fortran module including its solverdummy, or from the Fortran dummy which directly uses preCICE.

I’ve gone through the Fortran module (precice.f03) and the solverdummy (solverdummy.f03). The provided module uses “intrinsic :: iso_c_binding”. I did not know whether that is suitable for f90 code, and I suspected that it does not directly use preCICE. So I developed the module “preCICEmodule” by copying content from the Fortran bindings of preCICE, and wrote main.f90 by modifying solverdummy.f90.

it seems to me that you don’t call preCICE in your module

In my module, all the subroutines are copied from the Fortran declarations of the functions in “SolverInterfaceFortran.hpp”. If I understand correctly, preCICE should be called in the main program through those subroutines (the functions provided by preCICE), and those subroutines should become available by linking against the preCICE library (libprecice.so or libprecice.so.2) when compiling the main program.

You could start with first coupling two solverdummies with each other. Afterwards you could rewrite the solverdummy to create the dummy displ data and then couple to OpenFOAM.

Thanks for the suggestions. I’ll try to first couple two solverdummies with each other.

I think there is still a misunderstanding.

The f90 bindings of preCICE are directly compiled into preCICE. For f90 codes, you don’t need to write your own wrapper.

For the f90 fortran dummy, a simple

gfortran -o solverdummy solverdummy.f90 -lprecice

should do the job.
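
A minimal skeleton then looks roughly like this (just a sketch, see the provided solverdummy for the complete version):

PROGRAM solverdummy
  IMPLICIT NONE
  CHARACTER*50     :: participantName
  CHARACTER*512    :: config
  INTEGER          :: rank, commsize, ongoing
  DOUBLE PRECISION :: dt

  participantName = 'Solid'
  config = '../precice-config.xml'
  rank = 0
  commsize = 1

  CALL precicef_create(participantName, config, rank, commsize)
  ! ... define the interface mesh and coupling data here ...
  CALL precicef_initialize(dt)
  CALL precicef_is_coupling_ongoing(ongoing)
  DO WHILE (ongoing.NE.0)
    ! ... write and read coupling data here ...
    CALL precicef_advance(dt)
    CALL precicef_is_coupling_ongoing(ongoing)
  ENDDO
  CALL precicef_finalize()
END PROGRAM solverdummy
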
OK?

Hi Uekerman,

Thanks for your advice! I abandoned the unnecessary module I wrote and directly compiled my testing code following your guidance:

gfortran -o solverdummy solverdummy.f90 -lprecice

Now I get some preCICE output for the solverdummy solid, but it is stuck at:

[precice] Setting up master communication to coupling partner/s.

I’ve read the thread help-the-participants-are-not-finding-each-other. Since I start both solvers from sub-folders Fluid and Solid, I configure:

<m2n:sockets from="Fluid" to="Solid" exchange-directory="../" />

Before I ran the two dummies, I deleted the precice-run folder. I also checked the permissions on the exchange directory: I have both read and write permissions. Besides, the network on my PC works fine. In my testing code, I didn’t use the checkpoint-related functions, since I adopted the serial-explicit coupling scheme. So the problem should lie in the displacement-generation part of my testing code rather than in the preCICE-related parts, right?

How and from which locations do you start the two solverdummies?

Does precice-run get created at the correct location?

To describe it more clearly, I have uploaded a figure which shows the structure of my testing case.

I start the two solverdummies by running the shell script runFluid and the compiled executable main (built from my testing code) in two separate terminals.

Does precice-run get created at the correct location?

Yes, it is created at the correct location, i.e. the parent directory of both Fluid and Solid.
The .xml file I used in this test is uploaded here as well:
precice-config.xml (1.9 KB)

What does runFluid do exactly? Does it cd into Fluid?
You have to start both codes from their sub-directories Fluid and Solid.


It executes several operations related to OpenFOAM. It is modified from the runFluid script in the previous preCICE tutorials. Its content is shown in the following figure.

Yes, it does cd into Fluid and execute the selected fluid solver. Both codes are started from the sub-directories.

@Stan I understand that this is the usual runFluid script we had before we restructured the tutorials. If you look closely, the solver is executed on line 42, but from the same directory as the script (see cd .. on line 36).

The -case flag tells OpenFOAM to use these files, but it still runs from the same directory.

You could just remove the cd .. from line 36 and the -case Fluid from lines 40 and 42, and then move the script inside the fluid case directory.


Hi Makis,

I understand that this is the usual runFluid script we had before we restructured the tutorials. If you look closely, the solver is executed on line 42, but from the same directory as the script (see cd .. on line 36).

Yes, it is. Since parallel=0, the case is not run in parallel and the solver is executed on line 42.

You could just remove the cd .. from line 36 and the -case Fluid from lines 40 and 42, and then move the script inside the fluid case directory.

I’ve modified the script as you suggested and ran it in the Fluid directory. However, I get this message:

ERROR: XML parser was unable to open configuration file "precice-config.xml"

Besides, I think that since we put the runFluid script into the Fluid directory, I should remove not only the cd .. from line 36 and the -case Fluid from line 42, but also the -case Fluid flags on lines 23 to 27. Additionally, the cd Fluid on line 28 should also be removed.

After making the above changes, I changed preciceConfig "precice-config.xml" to preciceConfig "../precice-config.xml". Now it seems to work. Thanks very much for your help and for Uekerman’s suggestions!


I am confused: is this now working, or not?

In this runFluid_log.log, I see for the solver:

Exec   : interDyMFoam -case Fluid

meaning that it is still the same script as in the picture.

I’m sorry, I forgot to delete my last post: the change described there does not actually work from the parent directory. If it is OK, I’ll remove my last post. As you suggested, the OpenFOAM solver should be run from the sub-directory. The runFluid script I use now is as follows:

I also made a simpler script which contains only the necessary OpenFOAM commands, and now the two participants work well together. The figure below shows the final output after the simulation finished.


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.