Hi @IshaanDesai,
I checked everything again, but the problem still remains. The fixed side is the correct one, both in terms of coordinates and axes. The CalculiX simulation runs perfectly fine on its own, but something is wrong on the OpenFOAM side.
However, I did find something else.
In the tutorial, the front and back faces are excluded in the interface_beam.nam file, which I didn't notice at first. The nodes on those two faces must not be passed to OpenFOAM, of course, because OpenFOAM does not solve along the Z-axis. This could explain the earlier problems, since OpenFOAM would have tried to interpolate over these points anyway. Unfortunately, it doesn't solve the problem...
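For reference, the node set in interface_beam.nam now looks roughly like this (a sketch only; the set name and node IDs here are placeholders, the real ones come from my mesh):

```
** Coupling interface nodes, with the front/back face nodes removed
*NSET,NSET=Ninterface
11, 12, 13, 14,
21, 22, 23, 24
```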
After taking out those nodes, however, I get the following error from OpenFOAM:
```
PIMPLE: iteration 1
smoothSolver: Solving for cellDisplacementx, Initial residual = 0.976232, Final residual = 2.23259e-17, No Iterations 2
smoothSolver: Solving for cellDisplacementy, Initial residual = 0.998748, Final residual = 8.16963e-18, No Iterations 2
DICPCG: Solving for pcorr, Initial residual = 1, Final residual = 0.00083934, No Iterations 76
DICPCG: Solving for pcorr, Initial residual = 0.85702, Final residual = 7.34795e-09, No Iterations 98
time step continuity errors : sum local = 2.98789e-07, global = 2.17537e-08, cumulative = 29.3178
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: User provided function() line 0 in unknown file
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
```
And this one from CalculiX:
```
Adapter calling advance()...
---[precice] relative convergence measure: relative two-norm diff of data "Displacement" = 1.00e+00, limit = 5.00e-03, normalization = 1.18e+04, conv = false
---[precice] relative convergence measure: relative two-norm diff of data "Force" = 1.00e+00, limit = 5.00e-03, normalization = 3.59e+15, conv = false
---[precice] ERROR: Send using sockets failed with system error: write: Broken pipe
```
Since the broken pipe on the CalculiX side presumably only means that OpenFOAM crashed first and closed the connection, the actual problem seems to be the floating point exception in OpenFOAM. I've searched this forum for a solution, but the only thing I've found is a post describing a problem with the IQN-ILS acceleration. Since I haven't changed anything in the precice-config, the acceleration section looks like this, which is the same as in the tutorial:
```xml
<acceleration:IQN-ILS>
  <data name="Displacement" mesh="Solid-Mesh" />
  <data name="Force" mesh="Solid-Mesh" />
  <preconditioner type="residual-sum" />
  <filter type="QR2" limit="1e-2" />
  <initial-relaxation value="0.5" />
  <max-used-iterations value="100" />
  <time-windows-reused value="15" />
</acceleration:IQN-ILS>
</coupling-scheme:parallel-implicit>
```