The link is unfortunately private. I guess you have to adjust some sharing option in Google Drive.
There is no simple answer to this. Some guidelines:
- The more past information you use, the better the performance in general (as long as you don’t use too much; otherwise the conditioning of the underlying system degrades). But with more past information, the coupling scheme also has more trouble reacting to sudden changes. An interesting future direction of research would be to make IQN more adaptive.
- The filter should do its job, meaning it should not filter out too little, but also not too much. You can check the number of filtered-out columns per time step in the “iterations” file of the second participant. A configuration sketch follows right after this list.
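To make this more concrete, here is a minimal sketch of an acceleration block (assuming preCICE v2 syntax; the mesh name, data name, and all values are just placeholders for your own configuration). `max-used-iterations` and `time-windows-reused` control how much past information enters the quasi-Newton system, and the `filter` tag configures the filtering:

```xml
<coupling-scheme:parallel-implicit>
  <!-- ... participants, time window size, exchanges, convergence measures ... -->
  <acceleration:IQN-ILS>
    <!-- coupling data the quasi-Newton acceleration acts on (placeholder names) -->
    <data name="Displacement" mesh="Solid-Mesh" />
    <preconditioner type="residual-sum" />
    <!-- QR2 filter; a larger limit filters out more columns -->
    <filter type="QR2" limit="1e-3" />
    <initial-relaxation value="0.1" />
    <!-- how much past information is reused -->
    <max-used-iterations value="100" />
    <time-windows-reused value="15" />
  </acceleration:IQN-ILS>
</coupling-scheme:parallel-implicit>
```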
This sounds like an interesting use case for the waveform relaxation we want to implement in preCICE, @BenjaminRodenberg’s PhD project. There are preliminary results here, but it might still take a year until we have this directly in preCICE. At the moment, there is no good solution for this.
Well, hard question again. The black-box philosophy would say that both solvers should use whatever mesh is best for them, so as to balance the discretization error between both. The data mapping then just has to cope with those meshes. But, of course, the meshes also have an influence on the mapping error.
Maybe some hope here: we are currently working on an artificial testing environment for data mappings, so that you can test mappings between your meshes without needing to run the complete coupled simulation. In the end, things like this simply need a lot of testing and comparison.
This is odd. If you have z-dead="true", the RBF mapping should not see the width in z direction at all. Could CalculiX have trouble with very small aspect ratios?
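For reference, this is the kind of mapping tag I mean, as a minimal sketch (mesh names, basis function, direction, and constraint are placeholders for your setup); with z-dead="true" the z coordinate is ignored when the RBF system is built:

```xml
<mapping:rbf-thin-plate-splines
  direction="read"
  from="Fluid-Mesh"
  to="Solid-Mesh"
  constraint="consistent"
  z-dead="true" />
```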
You do give force values in the z direction. CalculiX then also computes a deformation in the z direction and, thus, you get different displacements along the z axis. Could an easy solution be to set the z components of the forces to 0?
That is, unfortunately, beyond my expertise. Maybe @KyleDavisSA knows more. Or you could ask on the CalculiX mailing list.