Hello,
I started with a 6-way coupling with OpenFOAM+CalculiX+XDEM.
I have since reduced the setup from 6-way to 4-way and now to 2-way coupling, because the issue seems to be only between OpenFOAM and CalculiX.
I am using a modified OpenFOAM solver, PorouspimpleFoam, which accounts for porosity in addition to everything the standard pimpleFoam solver does (attached in the files).
I need this because it will later be coupled with the third solver (XDEM).
I have already used this Fluid participant with deal.II (2-way), and it works. It also works fine in the 6-way and 4-way couplings, so the issue does not seem to be on the Fluid side.
I took the tutorial flap case and replaced its Fluid folder with my own.
With all other settings kept the same as in the tutorial case, I get a PETSc error:
smoothSolver: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 2
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: User provided function() line 0 in unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
So I read about the issues with RBF mapping (Parallel calculation problem for OpenFOAM-CalculiX Perpendicular flap tutorial) and replaced it with nearest-neighbour mapping:
<mapping:nearest-neighbor direction="write" from="Fluid-Mesh-Faces" to="Solid" constraint="conservative" />
<mapping:nearest-projection direction="read" from="Solid" to="Fluid-Mesh-Nodes" constraint="consistent" />
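For context, these mapping lines sit inside the Fluid participant block of my precice-config.xml, roughly like the sketch below (the mesh names are the ones from my config; the data names Force and Displacement and the providing participant name "Solid" are placeholders here, not necessarily the exact names in the attached case):
<participant name="Fluid">
  <!-- "Solid" as providing participant name and Force/Displacement data names are placeholders -->
  <use-mesh name="Fluid-Mesh-Faces" provide="yes" />
  <use-mesh name="Fluid-Mesh-Nodes" provide="yes" />
  <use-mesh name="Solid" from="Solid" />
  <write-data name="Force" mesh="Fluid-Mesh-Faces" />
  <read-data name="Displacement" mesh="Fluid-Mesh-Nodes" />
  <mapping:nearest-neighbor direction="write" from="Fluid-Mesh-Faces" to="Solid" constraint="conservative" />
  <mapping:nearest-projection direction="read" from="Solid" to="Fluid-Mesh-Nodes" constraint="consistent" />
</participant>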
I get the following error:
smoothSolver: Solving for cellDisplacementy, Initial residual = 0, Final residual = 0, No Iterations 2
#0 Foam::error::printStack(Foam::Ostream&) at ??:?
#1 Foam::sigFpe::sigHandler(int) at ??:?
#2 ? in "/lib/x86_64-linux-gnu/libc.so.6"
#3 void Foam::fvc::surfaceIntegrate<double>(Foam::Field<double>&, Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&) in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
#4 Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::surfaceIntegrate<double>(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&) in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
#5 Foam::tmp<Foam::GeometricField<double, Foam::fvPatchField, Foam::volMesh> > Foam::fvc::div<double>(Foam::GeometricField<double, Foam::fvsPatchField, Foam::surfaceMesh> const&) in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
#6 ? in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
#7 ? in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
#8 __libc_start_main in "/lib/x86_64-linux-gnu/libc.so.6"
#9 ? in "/home/prasad/OpenFOAM/prasad-7/platforms/linux64GccDPInt32Opt/bin/PorouspimpleFoam"
./runFluid: line 40: 18109 Floating point exception(core dumped) $solver -case Fluid
No matter which mapping I use, the Fluid participant crashes; only the error message changes between the PETSc one and the stack trace above. What changes with the coupling scheme is the crash time:
When I use <coupling-scheme:serial-explicit>, the Fluid crashes at t = 0.0202.
When I use <coupling-scheme:parallel-explicit>, the Fluid crashes at t = 0.0302.
When I use <coupling-scheme:serial-implicit>, the Fluid crashes at t = 0.0102. This coupling scheme was already present (commented out) in the precice-config.xml that ships with the tutorial case.
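For reference, the serial-implicit block I enabled looks roughly like the following sketch (data names, mesh names, and the acceleration settings are shown as placeholders from memory, not the exact values from the attached config):
<coupling-scheme:serial-implicit>
  <participants first="Fluid" second="Solid" />
  <max-time value="5" />
  <time-window-size value="0.01" />
  <max-iterations value="50" />
  <!-- Force/Displacement data names and the Solid mesh/participant names are placeholders -->
  <exchange data="Force" mesh="Solid" from="Fluid" to="Solid" />
  <exchange data="Displacement" mesh="Solid" from="Solid" to="Fluid" />
  <relative-convergence-measure limit="1e-4" data="Displacement" mesh="Solid" />
  <acceleration:IQN-ILS>
    <data name="Displacement" mesh="Solid" />
    <initial-relaxation value="0.1" />
    <max-used-iterations value="50" />
    <time-windows-reused value="10" />
    <filter type="QR2" limit="1e-2" />
  </acceleration:IQN-ILS>
</coupling-scheme:serial-implicit>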
I need help in solving this issue.
What can I try next? A different mapping, different coupling schemes, or is it something else entirely?
I have attached my case with the solver inside (link).
Thank you very much in advance for all the help.
Here is the Drive link: