Test project /home/zanella/softwares/precice-2.2.0/build
        Start  1: precice.acceleration
 1/27 Test  #1: precice.acceleration ......................***Failed    0.00 sec
/usr/local/bin/mpiexec: Error: unknown option "-n"

[Tests 2-18 (precice.action, precice.com, precice.cplscheme, precice.io, precice.m2n,
precice.mapping, precice.mapping.petrbf, precice.math, precice.mesh, precice.partition,
precice.interface, precice.serial, precice.parallel, precice.query, precice.testing,
precice.utils, precice.xml) all fail identically in 0.00 sec with:
/usr/local/bin/mpiexec: Error: unknown option "-n"]

        Start 19: precice.solverdummy.build.cpp
19/27 Test #19: precice.solverdummy.build.cpp ............. Passed    0.87 sec
        Start 20: precice.solverdummy.build.c
20/27 Test #20: precice.solverdummy.build.c ............... Passed    0.26 sec
        Start 21: precice.solverdummy.build.fortran
21/27 Test #21: precice.solverdummy.build.fortran ......... Passed    0.29 sec
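Every unit-test suite (tests 1-18) fails before it even starts because the launcher CTest invokes, /usr/local/bin/mpiexec, does not understand the standard -n option. Both Open MPI's and MPICH's mpiexec accept -n, so the launcher recorded by CMake at configure time is most likely not the one belonging to the MPI library preCICE was built against. As a hedged first check (shell commands below; the libprecice.so location is an assumption about where the CMake build places the library), it is worth confirming which mpiexec is being picked up and which MPI preCICE actually links:

  # Hedged diagnostics -- adjust paths to your setup.
  which -a mpiexec mpirun                      # every launcher on PATH, in order
  /usr/local/bin/mpiexec --version             # identify the implementation behind the failing launcher
  mpirun --version                             # compare with the default launcher
  grep MPIEXEC /home/zanella/softwares/precice-2.2.0/build/CMakeCache.txt      # launcher CMake recorded at configure time
  ldd /home/zanella/softwares/precice-2.2.0/build/libprecice.so | grep -i mpi  # MPI libraries preCICE links against (path assumed)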
        Start 22: precice.solverdummy.run.cpp-cpp
22/27 Test #22: precice.solverdummy.run.cpp-cpp ...........***Failed    0.06 sec
--------------------------------------------------------------------------
It looks like opal_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can fail
during opal_init; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  opal_shmem_base_select failed
  --> Returned value -1 instead of OPAL_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like orte_init failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can fail
during orte_init; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  opal_init failed
  --> Returned value Error (-1) instead of ORTE_SUCCESS
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort. There are many reasons that a parallel process can fail
during MPI_INIT; some of which are due to configuration or environment
problems. This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  ompi_mpi_init: ompi_rte_init failed
  --> Returned "Error" (-1) instead of "Success" (0)
--------------------------------------------------------------------------
DUMMY: Running solver dummy with preCICE config file "/home/zanella/softwares/precice-2.2.0/examples/solverdummies/precice-config.xml", participant name "SolverTwo", and mesh name "MeshTwo".
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[megamind:04868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
[The same opal_init, orte_init, and MPI_INIT messages are printed again for the second participant:]
DUMMY: Running solver dummy with preCICE config file "/home/zanella/softwares/precice-2.2.0/examples/solverdummies/precice-config.xml", participant name "SolverOne", and mesh name "MeshOne".
*** An error occurred in MPI_Init
*** on a NULL communicator
*** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
***    and potentially your MPI job)
[megamind:04867] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"

        Start 23: precice.solverdummy.run.c-c
23/27 Test #23: precice.solverdummy.run.c-c ...............***Failed    0.05 sec
[Same opal_init / orte_init / MPI_INIT failures as above. Both dummies abort in MPI_Init: SolverTwo/MeshTwo ([megamind:04895]) and SolverOne/MeshOne ([megamind:04894]).]
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"

        Start 24: precice.solverdummy.run.fortran-fortran
24/27 Test #24: precice.solverdummy.run.fortran-fortran ...***Failed    0.09 sec
DUMMY: Starting Fortran solver dummy...
DUMMY: Starting Fortran solver dummy...
[Same opal_init / orte_init / MPI_INIT failures as above. Both Fortran dummies abort in MPI_Init ([megamind:04921] and [megamind:04922]).]
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"

        Start 25: precice.solverdummy.run.cpp-c
25/27 Test #25: precice.solverdummy.run.cpp-c .............***Failed    0.07 sec
[Same failures as above. SolverTwo/MeshTwo ([megamind:04949]) and SolverOne/MeshOne ([megamind:04948]) abort in MPI_Init.]
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"

        Start 26: precice.solverdummy.run.cpp-fortran
26/27 Test #26: precice.solverdummy.run.cpp-fortran .......***Failed    0.08 sec
DUMMY: Starting Fortran solver dummy...
[Same failures as above. The Fortran dummy ([megamind:04976]) and SolverOne/MeshOne ([megamind:04975]) abort in MPI_Init.]
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"

        Start 27: precice.solverdummy.run.c-fortran
27/27 Test #27: precice.solverdummy.run.c-fortran .........***Failed    0.13 sec
[Same failures as above. SolverOne/MeshOne ([megamind:05004]) aborts in MPI_Init; the Fortran dummy starts ("DUMMY: Starting Fortran solver dummy...") and aborts the same way ([megamind:05005]).]
CMake Error at /home/zanella/softwares/precice-2.2.0/cmake/runsolverdummies.cmake:34 (message):
  An error occured running the solverdummies! Return code : "1"
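The solver-dummy runs (tests 22-27) fail differently from the unit tests: here MPI_Init itself aborts inside Open MPI because opal_shmem_base_select cannot select a usable shared-memory component. That often indicates a broken or mixed Open MPI installation, for example libraries from one install loading MCA components from another, or stale MPI-related environment variables. A hedged way to inspect the Open MPI installation that is actually being picked up (these commands only apply if Open MPI is the intended MPI):

  # Hedged Open MPI sanity checks.
  ompi_info | head -n 20                       # version and installation prefix actually in use
  ompi_info | grep -i shmem                    # shmem MCA components available to opal_shmem_base_select
  env | grep -iE 'OMPI|OPAL|LD_LIBRARY_PATH'   # environment that could pull in a second MPI install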
11% tests passed, 24 tests failed out of 27

Label Time Summary:
Solverdummy    =   1.91 sec*proc (9 tests)
petsc          =   0.00 sec*proc (1 test)

Total Test time (real) =   1.95 sec

The following tests FAILED:
	  1 - precice.acceleration (Failed)
	  2 - precice.action (Failed)
	  3 - precice.com (Failed)
	  4 - precice.cplscheme (Failed)
	  5 - precice.io (Failed)
	  6 - precice.m2n (Failed)
	  7 - precice.mapping (Failed)
	  8 - precice.mapping.petrbf (Failed)
	  9 - precice.math (Failed)
	 10 - precice.mesh (Failed)
	 11 - precice.partition (Failed)
	 12 - precice.interface (Failed)
	 13 - precice.serial (Failed)
	 14 - precice.parallel (Failed)
	 15 - precice.query (Failed)
	 16 - precice.testing (Failed)
	 17 - precice.utils (Failed)
	 18 - precice.xml (Failed)
	 22 - precice.solverdummy.run.cpp-cpp (Failed)
	 23 - precice.solverdummy.run.c-c (Failed)
	 24 - precice.solverdummy.run.fortran-fortran (Failed)
	 25 - precice.solverdummy.run.cpp-c (Failed)
	 26 - precice.solverdummy.run.cpp-fortran (Failed)
	 27 - precice.solverdummy.run.c-fortran (Failed)
Errors while running CTest
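If the checks above show that /usr/local/bin/mpiexec belongs to a different or broken MPI than the one preCICE was compiled with, the usual fix is to reconfigure the build so CMake records the matching launcher, then rerun the tests. A minimal sketch, assuming Open MPI's launcher lives at /usr/bin/mpiexec (substitute whatever path the diagnostics reported):

  # Sketch only: MPIEXEC_EXECUTABLE is the standard CMake FindMPI cache variable;
  # /usr/bin/mpiexec is an assumed example path, not taken from this log.
  cd /home/zanella/softwares/precice-2.2.0/build
  cmake -DMPIEXEC_EXECUTABLE=/usr/bin/mpiexec .
  make -j 4
  ctest --output-on-failure

If several MPI installations coexist on the machine, putting the intended MPI's bin and lib directories first on PATH and LD_LIBRARY_PATH (or removing the stray /usr/local/bin/mpiexec) avoids the mismatch at its source.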