Nonuniform grid coupling

For an FSI problem:
Solid grid is uniform: 0 1 2 3 4 5 6 7 8 9 10
Fluid grid is non-uniform: 0 0.5 1 1.5 2 3 4 5 6 7 8 9 10
Both the nearest-neighbor and the rbf-thin-plate-splines mapping schemes give a ‘nan’ error.
If both grids are uniform (but non-matching), there is no problem.

I want to confirm whether it is possible to use non-uniform grids for coupling. If yes, how should I set up the mapping scheme to resolve this problem?

Coupling between non-matching/non-conforming meshes is definitely possible. Could you describe the setup a bit more and maybe post the log files showing the error, as well as the preCICE configuration? This would help us understand the problem better. Where does the nan error appear? In preCICE or in one of your solvers?

precice-config-explicit.xml (1.8 KB)
The configuration is attached.

I’m testing and learning the coupling procedure for a later FSI problem, so I use Python for both the Fluid and Solid solvers.
The solid solver calculates the deflection of a flap based on the force from the fluid solver. The grid points are created artificially in a one-dimensional fashion. Since preCICE needs 2D or 3D meshes, I set all x-coordinates to zero for a vertical flap and only set the y-coordinates accordingly.
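For context, here is a minimal sketch of what the preCICE side of such a Python solid solver typically looks like with the v2 Python bindings (pyprecice). The mesh name "Solid-Mesh", the data names "Force" and "Displacement", and the placeholder solid solve are assumptions for illustration, not necessarily what is in the attached configuration:

```python
import numpy as np
import precice

ns = 11          # number of vertices along the flap (assumed)
dimensions = 2   # preCICE needs at least 2D coordinates
length = 0.05    # flap length (assumed)

interface = precice.Interface("Solid", "precice-config-explicit.xml", 0, 1)
mesh_id = interface.get_mesh_id("Solid-Mesh")

# 1D interface embedded in 2D: x stays zero, y spans the flap
grid = np.zeros([ns, dimensions])
grid[:, 1] = np.linspace(0, length, ns)

vertex_ids = interface.set_mesh_vertices(mesh_id, grid)
force_id = interface.get_data_id("Force", mesh_id)
displacement_id = interface.get_data_id("Displacement", mesh_id)

dt = interface.initialize()
while interface.is_coupling_ongoing():
    force = interface.read_block_vector_data(force_id, vertex_ids)
    displacement = 0.0 * force  # placeholder for the actual solid solve
    interface.write_block_vector_data(displacement_id, vertex_ids, displacement)
    dt = interface.advance(dt)
interface.finalize()
```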

For the fluid, the first half uses a small dx and the second half uses a larger increment.
```python
grid = np.zeros([ns, dimensions])
grid[:, 1] = np.linspace(0, length, ns)  # y component
dx = grid[1, 1] - grid[0, 1]

for i in range(1, ns):
    if i < ns / 2:
        grid[i, 1] = grid[i - 1, 1] + dx / 2
    else:
        grid[i, 1] = grid[i - 1, 1] + 3 / 2 * dx
```

The solid grid is uniformly created:
```python
grid = np.zeros([ns, dimensions])
grid[:, 1] = np.linspace(0, length, ns)  # y component
grid[:, 0] = 0  # x component, left at zero
```

It seems the solution blows up after some iterations with the non-matching mesh. With non-conforming but uniform meshes, it runs well.

The error message given by the solid participant is as follows:

Solid: Advanceing in time
---[precice] Time window completed
---[precice] iteration: 1, time-window: 277, time: 2.76 of 100, time-window-size: 0.01, max-timestep-length: 0.01, ongoing: yes, time-window-complete: yes,
OK3
Iter = 276, Err = 9.76966e-01

Solid: Advanceing in time
---[precice] Time window completed
---[precice] iteration: 1, time-window: 278, time: 2.77 of 100, time-window-size: 0.01, max-timestep-length: 0.01, ongoing: yes, time-window-complete: yes,
OK3
Iter = 277, Err = 1.00000e+00

Solid: Advanceing in time
---[precice] Time window completed
---[precice] iteration: 1, time-window: 279, time: 2.78 of 100, time-window-size: 0.01, max-timestep-length: 0.01, ongoing: yes, time-window-complete: yes,
OK3
Iter = 278, Err = nan

Solid: Advanceing in time
---[precice] Time window completed
---[precice] iteration: 1, time-window: 280, time: 2.79 of 100, time-window-size: 0.01, max-timestep-length: 0.01, ongoing: yes, time-window-complete: yes,
Iter = 279, Err = nan

Solid: Advanceing in time
---[precice] Time window completed
---[precice] iteration: 1, time-window: 281, time: 2.8 of 100, time-window-size: 0.01, max-timestep-length: 0.01, ongoing: yes, time-window-complete: yes,
OK3
Iter = 280, Err = nan

As a first guess: are you sure that you want a consistent mapping for both data sets? Maybe one of your quantities needs a conservative mapping. More information is in the documentation, and you can also have a look at the perpendicular flap example and its configuration.
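For illustration, a minimal sketch of how the two constraints could be combined in one participant, loosely following the perpendicular flap tutorial. The mesh and data names ("Fluid-Mesh", "Solid-Mesh", "Force", "Displacement") are assumptions and may differ from the attached configuration:

```xml
<participant name="Fluid">
  <use-mesh name="Fluid-Mesh" provide="yes"/>
  <use-mesh name="Solid-Mesh" from="Solid"/>
  <write-data name="Force" mesh="Fluid-Mesh"/>
  <read-data  name="Displacement" mesh="Fluid-Mesh"/>
  <!-- Displacement is a field value: map it with a consistent constraint -->
  <mapping:rbf-thin-plate-splines direction="read" from="Solid-Mesh" to="Fluid-Mesh"
                                  constraint="consistent"/>
  <!-- Force is an integral quantity: map it with a conservative constraint -->
  <mapping:rbf-thin-plate-splines direction="write" from="Fluid-Mesh" to="Solid-Mesh"
                                  constraint="conservative"/>
</participant>
```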

Besides that, explicit coupling is not necessarily stable. You might have to switch to an implicit (i.e., iterative) coupling. Maybe somebody with more knowledge about that will comment as well.
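As a rough sketch, an implicit coupling scheme could look like the following in the configuration. The participant, mesh, and data names, the convergence limit, and the Aitken relaxation value are assumptions for illustration, not a recommendation for your specific case:

```xml
<coupling-scheme:serial-implicit>
  <participants first="Fluid" second="Solid"/>
  <max-time value="100"/>
  <time-window-size value="0.01"/>
  <max-iterations value="50"/>
  <exchange data="Force" mesh="Solid-Mesh" from="Fluid" to="Solid"/>
  <exchange data="Displacement" mesh="Solid-Mesh" from="Solid" to="Fluid"/>
  <relative-convergence-measure data="Displacement" mesh="Solid-Mesh" limit="1e-4"/>
  <acceleration:aitken>
    <data name="Displacement" mesh="Solid-Mesh"/>
    <initial-relaxation value="0.5"/>
  </acceleration:aitken>
</coupling-scheme:serial-implicit>
```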

It may seem weird that both data sets are mapped with a consistent constraint. That’s because I just want to transfer the raw data between fluid and solid.

Since explicit coupling works well for uniform grids, I expected it to also work for non-uniform grids (changed only slightly from uniform). However, it is not stable (it diverged), as you mentioned.

After switching to implicit coupling, the solution now converges, but it is not correct.

I read the documentation and tutorials you recommended. One thing I do not understand: since the data is exchanged in both directions (each participant needs to read and write some data), why is the mapping set in the Fluid participant rather than in the Solid one? Or should it be set in both participants?

It seems the error is not from preCICE but from my Python code.
Strangely, if I add a for loop that modifies the coordinates, the error appears, even though the coordinates are not actually changed.

```python
grid = np.zeros([11, 2])
grid[:, 1] = np.linspace(0, 1, 11)  # y component
print(grid)  # y = 0, 0.1, 0.2, 0.3, ..., 1.0; x = 0

dy = grid[1, 1] - grid[0, 1]

# The following loop does not actually change the coordinates,
# but it causes the problem.
# If I delete the for block, the coupling has no issue.
for i in range(1, 11):
    if i < 11 / 2:    # first half section
        grid[i, 1] = grid[i - 1, 1] + dy
    else:             # second half section
        grid[i, 1] = grid[i - 1, 1] + dy
print(grid)  # gives the same grid as before the loop

vertex_ids = interface.set_mesh_vertices(mesh_id, grid)
```

Could anybody take a quick look at the code above and point out the mistake?

Thanks a lot!

You can set the mapping on either participant:

  • both in the Fluid
  • one in the Fluid, one in the Solid
  • both in the Solid

The important part is that the arrows in the config visualization are “flowing” in the same direction.
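For example, the “one in the Fluid, one in the Solid” variant could look roughly like this, with each participant mapping the data it reads. The mesh and data names and the nearest-neighbor method are assumptions for illustration:

```xml
<participant name="Fluid">
  <use-mesh name="Fluid-Mesh" provide="yes"/>
  <use-mesh name="Solid-Mesh" from="Solid"/>
  <write-data name="Force" mesh="Fluid-Mesh"/>
  <read-data  name="Displacement" mesh="Fluid-Mesh"/>
  <mapping:nearest-neighbor direction="read" from="Solid-Mesh" to="Fluid-Mesh"
                            constraint="consistent"/>
</participant>

<participant name="Solid">
  <use-mesh name="Solid-Mesh" provide="yes"/>
  <use-mesh name="Fluid-Mesh" from="Fluid"/>
  <write-data name="Displacement" mesh="Solid-Mesh"/>
  <read-data  name="Force" mesh="Solid-Mesh"/>
  <mapping:nearest-neighbor direction="read" from="Fluid-Mesh" to="Solid-Mesh"
                            constraint="consistent"/>
</participant>
```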

I don’t see any problem here, but I also don’t understand how the for loop helps: doesn’t it result in exactly the same grid as above?

Could you please upload the full log of your solver and your preCICE configuration file?

Also, have you tried visualizing the exported meshes? Maybe you find out that they are geometrically completely disparate.
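If mesh export is not already enabled, a line like the following inside a `<participant>` block writes the coupling meshes and data as VTK files once per time window; the directory name is just an example:

```xml
<!-- Inside a <participant> block: export coupling meshes and data as VTK files -->
<export:vtk directory="precice-exports"/>
```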

logs-no-loop.zip (5.3 KB)
logs-using-loop.zip (3.5 KB)
Hi Makis,
Thank you for your reply and hints.

Does this mean it does not matter where I set the mapping, as long as I make sure the data is transferred the way I want?

> You can set the mapping on either participant:
>
>   • both in the Fluid
>   • one in the Fluid, one in the Solid
>   • both in the Solid
>
> The important part is that the arrows in the config visualization (Config visualization | preCICE - The Coupling Library) are “flowing” in the same direction.
>
> I don’t see any problem here, but I also don’t understand how the for loop helps: doesn’t it result in exactly the same grid as above?

The for loop is not strictly necessary, but I just found where the problem comes from. I’m afraid there is some memory problem when transferring the data, but I cannot tell for sure: it runs well when I set the grid manually, but it blows up when I use a loop to set the same grid.

Set the grid using a for loop:

```python
ns = 11
l = 0.05
grid = np.zeros([ns, 2])

grid[:, 1] = np.linspace(0, 1, num=ns)  # y component
dy = grid[1, 1] - grid[0, 1]
for i in range(1, ns):
    if i < ns / 2:
        grid[i, 1] = grid[i - 1, 1] + dy * 1.5
    else:
        grid[i, 1] = grid[i - 1, 1] + dy * 0.5
grid = grid * l
```

Set the grid manually:

```python
ns = 11
l = 0.05
grid = np.zeros([ns, 2])
grid[:, 1] = np.linspace(0, 1, num=ns)  # y component
grid[:, 1] = [0, 0.15, 0.30, 0.45, 0.60, 0.75, 0.80, 0.85, 0.90, 0.95, 1.0]
grid = grid * l
```

> Could you please upload the full log of your solver and your preCICE configuration file?

I uploaded the log files for both cases; they are attached. I could not extract useful information from them myself. In case you cannot open the zip files, please change the extension from zip to rar; I made them using WinRAR.

> Also, have you tried visualizing the exported meshes? Maybe you find out that they are geometrically completely disparate.

I exported the meshes (good to know I can do that), but ParaView does not show the grid points (probably because my grid is only 1D). I opened the VTK file and the coordinates are correct.


Here is my non-uniform fluid grid. I rotated it by 90 degrees for display. In the code, x = 0 for all points and y ranges in [0, 0.05].

It can matter in some cases (specific mapping combinations when running parallel simulations, or performance differences when running highly unbalanced parallel simulations), but this is not an issue here.

It sounds to me that there must be a Python coding issue somewhere. I would check that the grid has exactly the same type and shape in both cases. Also, whenever I see l as a variable name, I get a bit scared about it accidentally being mixed up with 1 later on. :grimacing:
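For example, a quick check (a sketch reusing the two grid constructions from the snippets above) to confirm that both arrays really have the same dtype, shape, and values:

```python
import numpy as np

ns, l = 11, 0.05

# Grid built with the loop
grid_loop = np.zeros([ns, 2])
grid_loop[:, 1] = np.linspace(0, 1, num=ns)
dy = grid_loop[1, 1] - grid_loop[0, 1]
for i in range(1, ns):
    factor = 1.5 if i < ns / 2 else 0.5
    grid_loop[i, 1] = grid_loop[i - 1, 1] + dy * factor
grid_loop = grid_loop * l

# Grid set manually
grid_manual = np.zeros([ns, 2])
grid_manual[:, 1] = [0, 0.15, 0.30, 0.45, 0.60, 0.75, 0.80, 0.85, 0.90, 0.95, 1.0]
grid_manual = grid_manual * l

print(grid_loop.dtype, grid_loop.shape)       # expect float64 (11, 2) in both cases
print(grid_manual.dtype, grid_manual.shape)
print(np.abs(grid_loop - grid_manual).max())  # should be at round-off level
```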

Since your interface does not contain mesh connectivity, ParaView does not know how to display these points by default. I usually add a Glyph filter, as described in the documentation.

This should be totally fine for preCICE. We use non-uniform grids all the time, also in our tutorials (e.g., in Flow over heated plate | preCICE - The Coupling Library).

