Several challenges while coupling my CFD code (using PETSc) with OpenFOAM

precice-config.xml (2.1 KB)
I am attempting to modify my CFD code so that it couples with OpenFOAM through the preCICE OpenFOAM adapter. I have attached snippets of my code, along with the relevant function documentation and the preCICE configuration file.
However, when running my code (VFS-wind), I encountered a memory-access error (a segmentation fault) :scream:

---- Coupling is open, There are 2 interfaces for VFS-Wind !
test A 
Reading grid.dat 1.000000e+00, 81x152x71
**DM Distribution: 5 6 3
Created DM
test
Simulation start time 0.000000e+00
Re 5.000000e+02 St 1.000000e+00
Initialization
test
test B 
test 3 
test 4 
-----DEBUG 1 is ok!!! ------
[64]PETSC ERROR: ------------------------------------------------------------------------
[64]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[64]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[64]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[64]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[64]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run 
[64]PETSC ERROR: to get more information on the crash.
[64]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[64]PETSC ERROR: Signal received
[64]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
... ...
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
MPI_ABORT was invoked on rank 64 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

Note: The output file containing the error info has been uploaded as error.out
error.out (230.5 KB)

-------------- The code snippets I modified in the CFD program are as follows:
======== main.c ======== This is my main program file:

        ... ... ... ...
	//wzw: added for coupling initialization
	if (CoupleOtherSolver>0){
		int Process_rank,Process_size,i;
		MPI_Comm_rank(PETSC_COMM_WORLD, &Process_rank);
		MPI_Comm_size(PETSC_COMM_WORLD, &Process_size);
		PetscReal cl_center = 1.;
		PetscReal cl_nodes = 1.;
		PetscOptionsGetReal(NULL,PETSC_NULL, "-cl_center_Precice", &cl_center, PETSC_NULL);
		PetscOptionsGetReal(NULL,PETSC_NULL, "-cl_nodes_Precice", &cl_nodes, PETSC_NULL);
		PetscPrintf(PETSC_COMM_WORLD, "-----DEBUG 1 is ok!!! ------\n");

		const char *participantName = "VFS-wind";
		const char *configFileName = "../precice-config.xml";
		precicec_createSolverInterface(participantName, configFileName, Process_rank, Process_size);
		PetscPrintf(PETSC_COMM_WORLD, "-----DEBUG 2 is ok!!! ------\n");


		Initialize_PRECICE(precice_list,&(user[0]),NumInterFaces,cl_center,VFS_OutPut, FSI_OutPut,cl_nodes,ti,deltafunc);
		precice_dt = precicec_initialize();
		PetscPrintf(PETSC_COMM_WORLD, "-----Initialize precice coupling is ok!!! ------\n");
		PetscBarrier(PETSC_NULL);
		ti = 0;
    	if (rstart_flg) ti = tistart;
	}
	//end over
        ... ... ... ...
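For reference, the preCICE time loop that such an initialization leads into typically looks roughly like the sketch below. This is only a sketch: Solve_Flow(), solver_dt and the data read/write calls are placeholders, not the actual VFS-Wind routines; only the precicec_* calls are the real preCICE C API.

	// sketch of a typical preCICE time loop (placeholders, not the actual VFS-Wind solver)
	while (precicec_isCouplingOngoing()) {
		double dt = PetscMin(solver_dt, precice_dt);   // respect the preCICE time-step limit

		// write coupling data here, e.g. via precicec_writeBlockVectorData(...)

		// Solve_Flow(user, dt);                       // placeholder for the flow solve

		precice_dt = precicec_advance(dt);             // advance the coupling, get the new max dt

		// read coupling data here, e.g. via precicec_readBlockVectorData(...)
	}
	precicec_finalize();                               // shut down the coupling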

======== Coupling.c ======== This is my coupling helper file:

PetscErrorCode Initialize_PRECICE(PreciceInfo *precice_list,UserCtx *user,PetscReal NumberOfSurfaces,PetscReal cl,IBMNodes *VFS_OutPut, FSInfo *FSI_OutPut, double reflength_wt,PetscInt ti,int df){
	PetscBool flag;

	MPI_Barrier(PETSC_COMM_WORLD);
	int Process_rank,Process_size,i;
	MPI_Comm_rank(PETSC_COMM_WORLD, &Process_rank);
	MPI_Comm_size(PETSC_COMM_WORLD, &Process_size);
	PetscPrintf(PETSC_COMM_WORLD, "-----Initialize precice is coupling\n");

	PetscInt interfaces; // the number of coupling interfaces; in my case this is 1
	interfaces = NumberOfSurfaces;


	PetscInt dimensions; // the spatial dimension of the case; in my case this is 3
	dimensions = precicec_getDimensions();

	PetscPrintf(PETSC_COMM_WORLD, "-----Coupling dimensions is equal to :%d\n", dimensions);
	PetscPrintf(PETSC_COMM_WORLD, "-----Participant Name is equal to :%s\n", ParticipantName);
	PetscPrintf(PETSC_COMM_WORLD, "-----ConfigFileName is equal to :%s\n", ConfigFileName);
	
	Interfaces_Preprocess(user,NumberOfSurfaces,cl,VFS_OutPut,FSI_OutPut,reflength_wt);
	PetscPrintf(PETSC_COMM_WORLD, "-----Reading Interface files is over!!!\n");


	for(i = 1;i<interfaces+1;i++){
		int index = i-1;
		precice_list[index].dim = dimensions;
		char filen[256];
		sprintf(filen, "/Interface%d.dat", i);
		PetscOptionsInsertFile(PETSC_COMM_WORLD, NULL, filen, PETSC_TRUE);

		char meshName[256];
		PetscOptionsGetString(NULL, PETSC_NULL, "-meshName", meshName, 256, &flag);
		precice_list[index].meshName = strdup(meshName); // copy, so the string outlives this loop iteration

		char writeDataName[256];
		PetscOptionsGetString(NULL, PETSC_NULL, "-writeData", writeDataName, 256, &flag);
		precice_list[index].writeDataName = strdup(writeDataName);

		char readDataName[256];
		PetscOptionsGetString(NULL, PETSC_NULL, "-readData", readDataName, 256, &flag);
		precice_list[index].readDataName = strdup(readDataName);

		char locations[256];
		PetscOptionsGetString(NULL, PETSC_NULL, "-locations", locations, 256, &flag);
		precice_list[index].location = strdup(locations);

		/*
		locations = "faceCenters" or "faceNodes"
		DataName = "Displacement" or "Velocity" or "Pressure" or "Stress" or "Force" or "None", None means no data
		*/
		PetscBarrier(PETSC_NULL);
		precice_list[index].meshID = precicec_getMeshID(precice_list[index].meshName);
		if(strcmp(precice_list[index].readDataName, "None") != 0){
			precice_list[index].readDataID = precicec_getDataID(precice_list[index].readDataName, precice_list[index].meshID);
		}
		if(strcmp(precice_list[index].writeDataName, "None") != 0){
			precice_list[index].writeDataID = precicec_getDataID(precice_list[index].writeDataName, precice_list[index].meshID);
		}
      	PetscPrintf(PETSC_COMM_WORLD, "--------------- meshID is :%d\n", precice_list[index].meshID);

		Set_Vertices(&(precice_list[index]),&(VFS_OutPut[index]));
		PetscPrintf(PETSC_COMM_WORLD, "-----Interface%d\n meshName:%s\n mesh Initialized!!!", i, precice_list[index].meshName);
		
	}

	PetscPrintf(PETSC_COMM_WORLD, "-----Call Initialize_PRECICE() !!!");

}
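For reference, each Interface<i>.dat read above contains one PETSc option per line. A hypothetical example (the mesh and data names here are placeholders and must match those in precice-config.xml):

-meshName Fluid-Mesh
-writeData Force
-readData Displacement
-locations faceCenters

Set_Vertices() is not shown here. A minimal sketch of such a helper, assuming the interface coordinates live in arrays x_bp/y_bp/z_bp with n_v points and that PreciceInfo has nVertices/vertexIDs members (these field names are assumptions, not necessarily the actual VFS-Wind data layout):

void Set_Vertices(PreciceInfo *info, IBMNodes *ibm)
{
	int i;
	int n = ibm->n_v;                                     // number of interface vertices (assumed field)
	double *coords = (double *) malloc(3 * n * sizeof(double));
	info->nVertices = n;                                  // assumed member of PreciceInfo
	info->vertexIDs = (int *) malloc(n * sizeof(int));    // assumed member of PreciceInfo

	for (i = 0; i < n; i++) {                             // pack coordinates as x0,y0,z0,x1,y1,z1,...
		coords[3*i + 0] = ibm->x_bp[i];
		coords[3*i + 1] = ibm->y_bp[i];
		coords[3*i + 2] = ibm->z_bp[i];
	}

	// register the vertices with preCICE; the vertex IDs are needed later for read/write calls
	precicec_setMeshVertices(info->meshID, n, coords, info->vertexIDs);
	free(coords);
}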

Note: Judging by the debug output, the error appears to occur when precicec_createSolverInterface() is called.
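To narrow this down, a minimal standalone test that only calls MPI_Init and precicec_createSolverInterface (the call where the crash happens) can help to check whether preCICE and MPI are set up consistently, independently of the rest of VFS-Wind. This is just a sketch; it reuses the participant name and config path from main.c above and stops before precicec_initialize():

// minimal_precice_test.c -- hypothetical standalone reproducer (not part of VFS-Wind)
#include <stdio.h>
#include <mpi.h>
#include "precice/SolverInterfaceC.h"

int main(int argc, char **argv)
{
	MPI_Init(&argc, &argv);

	int rank, size;
	MPI_Comm_rank(MPI_COMM_WORLD, &rank);
	MPI_Comm_size(MPI_COMM_WORLD, &size);

	// same participant name and configuration file as in main.c above
	precicec_createSolverInterface("VFS-wind", "../precice-config.xml", rank, size);
	printf("rank %d of %d: solver interface created, dimensions = %d\n",
	       rank, size, precicec_getDimensions());

	// no meshes or data are registered, so we stop before precicec_initialize()
	MPI_Finalize();
	return 0;
}

If this small program already crashes inside precicec_createSolverInterface, the problem lies in the preCICE/MPI/PETSc build combination rather than in the coupling code itself.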

The case runs on a distributed cluster with a fluid grid on the order of 700,000 grid points. I have uploaded the job submission script (sbatch.inp).
sbatch.inp (492 Bytes)

I have been stuck on this issue for a long time and would appreciate your advice. :grinning:

Hi @WangZW928,

Are you using the same MPI version for OpenFOAM and preCICE (and PETSc)? They must all be the same.

Thank you for your prompt reply. I will recompile these support libraries and make sure the MPI versions match.
In fact, the setup I am currently running uses two different versions of PETSc: my CFD code is compiled against PETSc 3.8.4, while preCICE uses PETSc 3.13.4. I am not sure whether this could cause the crash, so I am now trying to build both against the same PETSc version.

Hi @Makis,
After fixing the MPI version mismatch, things seem to be better, but there are some new issues!
First, the PETSc-related errors in my program have disappeared.
However, I found that I cannot run both OpenFOAM and my CFD program in parallel on the same cluster node.
The run seems to be stuck; the output file
slurm-error.out (20.0 KB)
stops at this line:

[precice] Setting up primary communication to coupling partner/s

The submit script for the job is as follows:

#!/bin/bash
#SBATCH -p blcy 
#SBATCH -N 1
#SBATCH -n 25 
#SBATCH -J One_WZ
#SBATCH -w i09r1n08

source ./env.sh
cd ./VFS-wind
mpirun -n 4 ./test&
#./test&
sleep 5

cd ..
source /work1/xdsc0070/openfoamv2206/new/OpenFOAM-v2206/etc/rebashrc
cd ./OpenFoam
decomposePar -force
mpirun -np 16  pisoFoam -parallel
#pisoFoam
reconstructPar
rm -r processor*
tar -czvf OpenFoam-Case.tar.gz *

The preCICE configuration file is here:
precice-config.xml (1.8 KB)

Oddly enough, as long as either OpenFOAM or my CFD program runs in serial, the two participants do communicate!
slurm-success.out (262.5 KB)

A demo version of my CFD program is attached here:
test.txt (58.9 KB)

Does this page maybe help?

Thank you for your reply, @Makis :smile:

I have found that if the two jobs are submitted as separate scripts on the same node, they couple successfully!

sbatch RunFirst.sh

#!/bin/bash
#SBATCH -p blcy 
#SBATCH -N 1
#SBATCH -n 10 
#SBATCH -J One_WZ
#SBATCH -w i09r4n02

source ./env.sh
cd ./VFS-wind
mpirun -n 8 ./test
#./test&
#mpiexec -n 4 gdb ./test
#mpiexec -n 4 valgrind ./test

sbatch RunSecond.sh

#!/bin/bash
#SBATCH -p blcy 
#SBATCH -N 1
#SBATCH -n 20 
#SBATCH -J Two_WZ
#SBATCH -w i09r4n02


source ./env.sh
source /work1/xdsc0070/openfoamv2206/new/OpenFOAM-v2206/etc/rebashrc
cd ./OpenFoam
decomposePar -force
mpirun -np 16  pisoFoam -parallel
reconstructPar
rm -r processor*
tar -czvf OpenFoam-Case.tar.gz *

With this setup, the coupled run succeeds!
Next, I'll try to couple the two jobs across multiple nodes.

