I am now using OpenFOAM v1906 to simulate the flap case.
In fact, it works very well for a serial run. But when I ran it in parallel mode (./Allrun_parallel), the case stopped producing output after a few seconds while I was following the log with tail -f. I then pressed Ctrl+C to stop it.
I am not sure what is going on; I did not change anything in the openfoam-adapter tutorial.
Here I also upload my output files. (There might be errors in them, since I stopped the case halfway, I guess.) Fluid.log (22.8 KB) Solid.log (3.1 KB)
Great news! However, as I said, nearest-neighbor is just a simpler and less accurate mapping method (its implementation does not require PETSc). Does building PETSc from source solve your problem?
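For anyone else reading along: switching to nearest-neighbor as a PETSc-free workaround only means swapping the mapping tags in precice-config.xml. A minimal sketch, assuming the mesh names Fluid-Mesh and Solid-Mesh stand in for whatever your configuration actually uses:

```xml
<!-- Sketch: replace an RBF mapping such as -->
<!-- <mapping:rbf-thin-plate-splines direction="read" from="Solid-Mesh" to="Fluid-Mesh" constraint="consistent"/> -->
<!-- with the PETSc-free nearest-neighbor mapping: -->
<mapping:nearest-neighbor direction="read"  from="Solid-Mesh" to="Fluid-Mesh" constraint="consistent"/>
<mapping:nearest-neighbor direction="write" from="Fluid-Mesh" to="Solid-Mesh" constraint="conservative"/>
```

This is only a workaround for debugging the parallel hang, not a replacement for RBF accuracy.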
Dear Makis, we are facing the same problem (the perpendicular flap runs in serial mode but hangs while executing the parallel script). OpenFOAM runs a first time step, but transferring data with RBF freezes. We use: OpenFOAM 5.0 (recompiled), preCICE 1.6.1 (recompiled), OpenMPI 2.1.1 (recompiled with the --disable-heterogeneous flag), and PETSc 3.11.3 (recompiled), on Ubuntu 18.04.3. The problem still exists. Any hints on how to solve it? Thanks, Ulrich
@Yongbo do you have any updates? Did you try RBF with PETSc built from source?
If not, could you please increase the log verbosity level and upload your Fluid.log again? In your precice-config.xml change filter="%Severity% > debug" to filter="%Severity% > trace".
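For reference, the verbosity filter lives in the log sink section of precice-config.xml. A sketch of what the changed section might look like (the sink type, output, and format attributes here are only examples; adapt them to what your file already contains):

```xml
<precice-configuration>
  <log>
    <!-- Raise the verbosity from debug to trace to capture more detail -->
    <sink type="stream" output="stdout" filter="%Severity% > trace" enabled="true" />
  </log>
  <!-- ... rest of your existing configuration ... -->
</precice-configuration>
```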
@Ulrich if you are getting exactly the same behavior, could you please upload your log files, also with increased log verbosity? If you see that it is a different situation, could you please open a new thread with a summary instead? If it is exactly the same, posting here is fine.
Sorry for the late reply. We still have the problem, and it seems to be the same as Yongbo's: we get one iteration in OpenFOAM, but then the PetRadialBasisFctMapping hangs in the parallel run, and we cancelled the job. We tried this in our recompiled configuration but see the same problem. We also tried with OpenFOAM 7, but the problem persists. Any ideas what might be wrong in our setup? Should we try another MPI version?
Thanks in advance.
Best regards
We have now installed PETSc 3.12 from source, and this seems to solve all the issues we had with the installation so far. We have meanwhile moved to preCICE 2, but we also had problems running cases in parallel, even with nearest-neighbor mapping. With PETSc 3.12 we can run all our cases with nearest-neighbor mapping in parallel, and the RBF mapping also works in parallel for the cylinder_flap tutorial.