FF coupling with python 1D in-house code for flow distribution

Hello,

First of all, thank you for the software and the effort you have put into its implementation and documentation. I will try to give a brief overview of the problem I am tackling and ask whether you have any previous experience that could be useful.

The physical problem we are trying to solve is the flow through a series of distributing manifolds connected by a (high) number of (essentially) 1D channels. The brute-force approach is to include all of these channels in the 3D fully-turbulent CFD analysis. However, by doing so, most of the computational cost is spent on rather boring turbulent flow in 1D pipes. At the other extreme, I could use a simple 1D approach for the distribution manifolds as well, but then I would lose many important flow features in the manifolds that surely cannot be modelled as 1D flow. Some time ago, I wrote THIS post with a deeper description of my problem, in case you are interested. As you may see, the post received an overwhelming amount of support from the OpenFOAM community.

Ideally, I would like to use OpenFOAM to solve the flow in the manifolds (3D CFD) and an in-house code that we have developed over the last years to solve the flow in the pipes (1DPD). As you may have already realized, I think that preCICE (the FF module) is exactly what I am looking for.

We spent a few days installing things and modifying our in-house code so that it can communicate with preCICE. The code is written in Python, so we use the Python bindings. After some changes here and there, we have both programs nicely talking to each other, and everything seems fine.

Our final problem will involve several manifolds in series, connected by different channels; some of the interfaces would therefore be manifold inlets and others manifold outlets. However, for a first try, let's keep it simple. We are modelling the flow in a box (roughly the size of a shoe box) with one inlet at the top, on which we impose an inlet velocity, and two outlets on one of the lateral walls. The two outlets are connected to two straight pipes that are modelled with our 1DPD in-house code. The interface takes the flowrate from each of the two OpenFOAM outlets and imports it into 1DPD. We then solve the steady-state 1D approximation (simple turbulent pipe flow), which provides the pressure drop. This pressure is passed back from 1DPD to each of the OpenFOAM outlets.
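For reference, the kind of steady-state 1D pipe model described above can be sketched as follows. This is only an illustration (Darcy-Weisbach with the Blasius friction factor), not the actual 1DPD code, and all names and parameter values are made up:

```python
import math

def pipe_pressure_drop(flowrate, length, diameter, rho=1000.0, nu=1e-6):
    """Steady-state turbulent pressure drop in a straight pipe,
    via Darcy-Weisbach with the Blasius friction factor.
    Illustration only; not the actual 1DPD code."""
    area = math.pi * diameter**2 / 4.0
    u = flowrate / area                # mean velocity [m/s]
    re = u * diameter / nu             # Reynolds number
    f = 0.316 * re**-0.25              # Blasius correlation (smooth pipes, turbulent)
    return f * (length / diameter) * 0.5 * rho * u**2
```

The point is just that the 1D side maps a flowrate to a pressure drop, which is what gets exchanged through the interface.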

So far, we have tried a serial-explicit coupling in which we allow the simpleFoam solver (which seeks a steady-state solution) enough iterations between two successive preCICE-1DPD communications to reach a reasonable solution. After a considerable amount of trial and error, the solution only works if we limit the pressure change imposed by 1DPD on OpenFOAM to very small increments (by means of a relaxation factor). We believe part of the problem may be that the pressure mapping (1DPD -> OpenFOAM) is imposed at the faceCenters instead of the faceNodes (which causes backflow in some cases). However, we are not entirely sure the problem comes from this source.
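The relaxation mentioned above can be sketched like this (hypothetical names; only a small fraction omega of the computed pressure jump is applied per coupling step):

```python
def relax_pressure(p_old, p_new_1d, omega=0.01):
    """Under-relax the pressure passed from the 1D code to OpenFOAM:
    apply only a fraction omega of the computed jump per coupling step.
    Sketch with made-up names, not the actual 1DPD code."""
    return p_old + omega * (p_new_1d - p_old)
```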

To change perspective, we are now trying a serial-implicit coupling through preCICE. However, we are getting the following error:

---[precice] ERROR:  The required actions write-iteration-checkpoint are not fulfilled. Did you forget to call "markActionFulfilled"? 
---[precice] ERROR:  Sending data to another participant (using sockets) failed with a system error: write: Broken pipe. This often means that the other participant exited with an error (look there). 

However, cowic = precice.action_write_iteration_checkpoint() is called, and we do call mark_action_fulfilled(cowic), as shown in the code below.

For reference, I quote here the Python code where we call both our in-house code 1DPD and the preCICE interface:

import DPD #in-house program 1DPD to solve steady-state flow in 1D channels
import precice

#% ---- Init PRECICE-------------------------------------------------------------#
configuration_file_name = '../precice-config.xml'
participant_name = 'fluid_1d'
solver_process_index = 0
solver_process_size = 1
pp = precice.Interface(participant_name,configuration_file_name,solver_process_index,solver_process_size)
mesh_id = pp.get_mesh_id('1d_mesh')
coric = precice.action_read_iteration_checkpoint()
cowic = precice.action_write_iteration_checkpoint()

#% ---- Init 1DPD-------------------------------------------------------------#
interface_mesh=DPD.initialize()

#% ---- Set interface mesh into precice-------------------------------------------------------------#
vertex_ids = pp.set_mesh_vertices(mesh_id,interface_mesh)
id_p = pp.get_data_id('Pressure',mesh_id)
id_u = pp.get_data_id('Velocity',mesh_id)

#% ---- Time loop-------------------------------------------------------------#
relaxation_factor=0.01
t = 0.0
dt_max = pp.initialize()
while pp.is_coupling_ongoing():
    #READ DATA FROM OPENFOAM
    if pp.is_read_data_available():
        u_readed = pp.read_block_vector_data(id_u,vertex_ids)

    if pp.is_action_required(cowic):
        #no action is required in 1DPD in order to saveOldState
        #Because the solver is steady-state and independent of initial conditions
        pp.mark_action_fulfilled(cowic)
        
    #SOLVE 1DPD
    DPD.updateBC(u_readed)
    p_to_send=DPD.solve(relaxation_factor)
    
    if pp.is_action_required(cowic):
        pp.mark_action_fulfilled(cowic)

    #SEND DATA BACK TO precice->OPENFOAM
    pp.write_block_scalar_data(id_p,vertex_ids,p_to_send)
    
    #ADVANCE TIME
    dt_max = pp.advance(dt_max)
    t += dt_max   

    if pp.is_action_required(coric):
        #no action in order to reloadOldState
        pp.mark_action_fulfilled(coric)

pp.finalize()

And I attach the config file:
precice-config.xml (2.4 KB)

So, our questions are:

  • Do you think pressure mapping in the FF module could be made available at the faceNodes? And do you think this is actually the source of our problems?

  • Do you see any overall problem in our approach, or do you think our problem is more suitable for another strategy?

  • Do you see anything in our serial-implicit coupling code that prevents its correct execution?

I hope I have summarized the problem so that you can easily understand it. If you have any other questions or need more details on the coupling code, please let me know.

Best regards

Eduardo

Edit: formatting issues

Hi @ERodriguez,

looks like we are doing some similar research! I am also coupling 1D pipes with 3D CFD simulations (though from more of a toy-problem perspective engineering-wise, as I am focusing on the software).

Some notes:

  • The error you get is probably because you call action_write_iteration_checkpoint() outside the loop, so you don't update the required actions in the following iterations; these can take different values after each advance(). This part of the API is getting simpler in the upcoming preCICE v3.
  • It makes sense that you want to couple your 1D Python code with OpenFOAM, instead of simulating the complete domain in OpenFOAM. This is a common example also in nuclear reactor cooling, blood flow simulations, and more.
  • The current version of the FF module is quite basic. We will release some updates, including coupled boundary conditions, which you will probably want to try. These are already in the develop branch, but that only works with the develop branch of preCICE (soon to be preCICE v3). See Updates to the FF module by thesamriel · Pull Request #281 · precice/openfoam-adapter · GitHub, as well as Markus’ thesis: mediaTUM - Media and Publication Server
  • In preCICE v3, we are also introducing a (very simplistic and experimental) geometric multiscale feature: https://github.com/precice/precice/pull/1675 This would allow you to do the mapping directly in preCICE, but it will probably only be useful from the later v3.1 onwards. Feedback is very welcome!
  • simpleFoam sounds wrong to me for this use case. The coupled problem is transient; why add a steady-state solver into the mix?
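Regarding the checkpoint actions in the first point: the canonical preCICE v2 serial-implicit loop queries and fulfills both actions inside the loop, saving state before solving and restoring it after advance() when the time window repeats. Here is a runnable schematic of that structure; StubInterface is invented for illustration (it pretends every window needs exactly two iterations), and with the real library you would use precice.Interface and the precice.action_*_iteration_checkpoint() constants instead:

```python
# Schematic of the serial-implicit coupling loop (preCICE v2 call pattern).
# StubInterface is a made-up stand-in so the pattern runs without preCICE.

WRITE_CKP = "write-iteration-checkpoint"
READ_CKP = "read-iteration-checkpoint"

class StubInterface:
    """Pretends every time window needs exactly two implicit iterations."""
    def __init__(self, n_windows=3, iters_per_window=2):
        self.n_windows = n_windows
        self.iters_per_window = iters_per_window
        self.window = 0
        self.iteration = 0
        self.repeat = False            # True right after an unconverged advance()

    def is_coupling_ongoing(self):
        return self.window < self.n_windows

    def is_action_required(self, action):
        if action == WRITE_CKP:
            return self.iteration == 0     # checkpoint at the start of a window
        if action == READ_CKP:
            return self.repeat             # reload when the window repeats
        return False

    def mark_action_fulfilled(self, action):
        pass

    def advance(self, dt):
        if self.iteration < self.iters_per_window - 1:
            self.iteration += 1            # not converged: repeat the window
            self.repeat = True
        else:
            self.window += 1               # converged: move to the next window
            self.iteration = 0
            self.repeat = False
        return dt

interface = StubInterface()
dt = 0.1
state = 0.0          # the solver state we checkpoint
checkpoint = state
solves = 0

while interface.is_coupling_ongoing():
    if interface.is_action_required(WRITE_CKP):
        checkpoint = state                 # save state at the window start
        interface.mark_action_fulfilled(WRITE_CKP)

    state += dt                            # stand-in for read / solve / write
    solves += 1

    dt = interface.advance(dt)

    if interface.is_action_required(READ_CKP):
        state = checkpoint                 # roll back and redo the window
        interface.mark_action_fulfilled(READ_CKP)

print(solves)            # 6: three windows, two iterations each
print(round(state, 1))   # 0.3: each window ultimately advances state by dt
```

The key point is the placement: the write-checkpoint check comes before the solve, the read-checkpoint check comes after advance(), and both live inside the loop.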

Yes, yes, I know, many future references. v3 is coming… soon!?! Should still be this year. :sweat_smile:

If you are feeling brave, feel free to already try the develop branches of everything, but expect rough edges.

Hello, thanks for your answer and the interest; we are indeed working on very similar problems, and we agree that this looks like a very interesting application for preCICE. Toy problems are very cool! We are indeed working on our shoebox-like flow distribution system to get the software running.

Thanks for spotting this; we thought that action_write_iteration_checkpoint() was only needed once, outside the loop, to obtain the action handle. We will try what you suggest and let you know.

Some of the examples you quote (cooling applications) are exactly what we are aiming for. It seems such a waste of resources to mesh the channels for full CFD…

We are aware of the new updates to the FF module. We are not in a hurry with this development, but we definitely want it to work at some point, so we may as well wait for version 3.0 (or higher), especially because we rely on the Python bindings, since our 1D in-house code is written in Python.

We are currently using the coupled inletOutlet boundary condition, which sets the backflow to (0,0,0) (or any specified value) whenever fluid is entering (not leaving) the domain through the outlet sections. However, this did not help: the flow became unstable either way (just with zeros at the outlet). It seems that the boundary condition is imposed after the actual preCICE computation is carried out; perhaps this is what you are referring to. On the other hand, I read somewhere (but I can't find the link right now) that mixed boundary conditions may work better. This means specifying both the value and the gradient of a variable (say, the pressure at the outlet) to prevent backflow into the domain. Perhaps you have already tried something along these lines, but I mention it in case it helps.

I agree that 1D-3D mapping is a bit of a stretch for the current capabilities. However, for our present purposes we can work around it. The two main problems we found are:

  • Pressure (from the 1D mesh to the 3D mesh) can only be mapped at the faceCenters. Having faceNodes as an option may also be beneficial (though I am not really sure how OpenFOAM imposes this boundary condition). That said, the pressure mapping (from 1D into 3D) works relatively well, since the pressure at the single 1D node can simply be applied to the whole 3D interface mesh.
  • On the contrary, we found that we need some tricks to map the 3D velocity field (at the faceCenters) onto the 1D mesh. Since our code (and most likely other codes as well) needs the flowrate through the pipe system, we need to integrate the velocities over the 3D mesh. This integration (a sum of fluxes, Phi) requires knowing the area of each face in the 3D mesh, which is currently not accessible. If the mesh is regular at the outlet, the sum of velocities (what a conservative mapping sends to the 1D mesh) can simply be multiplied by the face area in our in-house code. This requires a pre-processing stage in which we read these areas from the OpenFOAM mesh and save them in text files or similar. However, it becomes more complicated if the face areas are not uniform at the outlet (the usual situation for a real-world CFD mesh). In that case, with the current approach, we would need to pre-process the individual area of every face at the outlet and then do the mapping on the in-house-code side of preCICE; that is, we would need to send the 3D faceCenters mesh to the 1D code and then sum each velocity multiplied by its associated area (obtained from the external pre-processing). A possible way to bypass this would be to send Phi instead of U from OpenFOAM through preCICE; perhaps this is not possible at the moment, but it seems doable. Another option would be for the face areas to also be sent when the mapping is done on the 1D side of the coupling.
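As a sketch of the workaround described in the second point (all names hypothetical; the face areas would come from the external pre-processing step), the area-weighted integration on the 1D side would look like:

```python
def flowrate_from_faces(u_normal, face_areas):
    """Integrate face-normal velocities over the outlet patch: Q = sum_i u_i * A_i.
    u_normal comes through preCICE from the 3D side; face_areas must currently
    be exported separately (e.g., to a text file) in a pre-processing step.
    Hypothetical sketch, not the actual 1DPD code."""
    return sum(u * a for u, a in zip(u_normal, face_areas))
```

For a uniform outlet mesh this reduces to the simpler trick mentioned above: the conservative sum of velocities times a single face area.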

This is a good debate point. In fact, before you replied we had already tested the transient solver (pimpleFoam), and it is kind of working. However, the overall problem remains: the solution is rather unstable, and if we generate large pressure jumps from one coupling time step to the next, backflow appears and the problem diverges. Therefore, we need to restrict the pressure increments to a small fraction of what we would like. We do this by artificially cropping the pressure increment at each time step in our 1D code. Essentially, this is analogous to having a valve at the outlet and preventing the valve operator from opening/closing it too quickly. It makes sense that large pressure jumps entail flow instabilities.
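The "valve" limiting described above amounts to clamping the per-step pressure increment; a sketch with hypothetical names:

```python
def limit_pressure_step(p_old, p_target, dp_max):
    """Crop the pressure change applied per coupling step to +/- dp_max,
    like preventing a valve from being opened or closed too quickly.
    Sketch only, not the actual 1DPD code."""
    dp = p_target - p_old
    dp = max(-dp_max, min(dp_max, dp))
    return p_old + dp
```

Unlike a relaxation factor (which scales the jump), this saturates it at a fixed maximum per step.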

In any case, for some situations (low turbulence, nice geometry, lack of high-energy eddies, etc.) the flow may be steady. For some of these cases, we have obtained a successful flow distribution using a steady-state solver (with the consequent saving in computational time), so we thought this could be applicable here. In a way, simpleFoam also iterates over a pseudo-time in order to find a steady-state solution, so the principle does not differ too much. The behaviour is slightly better if we move to pimpleFoam, but the improvement is not drastic: we are still limited to very small pressure increments before backflow appears and the solution diverges.

We look forward to it. In the meantime, if any of our use cases can be helpful for your development, feel free to ask; we will continue trying this over the next weeks/months, both for toy problems and for more real-world applications.

Regards
Eduardo

We have not tried a Robin-Robin coupling yet, but there is some literature on it. I have started an issue to keep track of the discussion: Robin-Robin FF coupling: mixed boundary conditions for flow coupling · Issue #231 · precice/openfoam-adapter · GitHub

Try using a compressible solver instead (e.g., buoyantPimpleFoam). This could make a big difference in stability.

Maybe it is already the right time to have a meeting about this. Please send me an email about that, and let’s keep documenting the important points here. I am definitely looking for users for the geometric multiscale feature!