Parallel running on Windows - D-Flow Flexible Mesh - Delft3D
Parallel running on Windows
So I've tried to run a simulation in parallel on 4 partitions.
First I used the example script run_flow2d3d.bat to get the correct path; after adjusting the path the script worked and I got the same results as when running the standard way (through the Delft3D window).
Next, I opened the script run_flow2d3d_parallel.bat and read that I have to execute smpd -install first. I created a new bat file in TextPad containing:
and placed it in my bin folder, where the smpd.exe file was. After double clicking, the script ran, saying "Unable to remove the previous installation, install failed."
Then I tried to run:
smpd.exe
pause
(I had changed the content of the smpd.bat file), and saw a cmd window explaining what smpd -install, smpd -d, smpd -s, etc. mean.
Then I tried to run: smpd -d
and got a new cmd window (Picture1).
Then I tried to run run_flow2d3d_parallel.bat and it worked, but the results were unrealistic and nothing like the results I got the standard way (through the Delft3D window).
Please, any suggestions or overlooked details?
My run_flow2d3d_parallel.bat script is in the attachments.
It seems that smpd is started too often. You can check this in the Windows Task Manager: there should be only one process named "smpd".
You don't have to start smpd again for each calculation, nor after restarting your PC. Just start it once and it will keep running on your machine.
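To check this without opening the Task Manager, the smpd entries in the output of the Windows tasklist command can be counted. A minimal sketch in Python, assuming a Windows machine; the helper names and the sample output are made up for illustration:

```python
# Count how many "smpd" processes appear in the output of the Windows
# "tasklist" command, as a scriptable alternative to opening Task Manager.
# The helper names are made up; the sample output below is illustrative only.
import subprocess

def count_processes(tasklist_output: str, name: str) -> int:
    """Count tasklist output lines whose image name matches (case-insensitive)."""
    return sum(
        1
        for line in tasklist_output.splitlines()
        if line.lower().startswith(name.lower())
    )

def smpd_count() -> int:
    """On Windows, run tasklist and count the smpd.exe instances."""
    result = subprocess.run(["tasklist"], capture_output=True, text=True, check=True)
    return count_processes(result.stdout, "smpd.exe")

# Illustrative check on a made-up tasklist snippet:
sample = "smpd.exe      1234 Services    0    3,456 K\nexplorer.exe  4321 Console     1   10,000 K"
print(count_processes(sample, "smpd.exe"))  # 1
```

If smpd_count() returns more than 1, restarting the PC (as suggested below) is the simplest way to get back to a single instance.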
About the rubbish results, I can think of two possible causes:
- It may have to do with multiple smpd processes running on your machine. To check this: restart your PC and use the Task Manager to verify that only one "smpd" process is running.
- The model contains features that currently do not work when running in parallel. Can you try running "https://svn.oss.deltares.nl/repos/delft3d/trunk/examples/01_standard/run_flow2d3d_parallel.bat", using 2 partitions?
I've tried to run the script using 2 partitions, and while it was running there was only one smpd process in the Windows Task Manager.
The results are still incorrect, though.
Picture2 in the attachments shows the final cmd window after the simulation ends.
Maybe the answer is there.
Today I solved a library mismatch problem that may also solve your problem (depending on the version and the tools you are using).
Can you try starting with a clean directory, checking out the trunk (revision 1846 or higher), recompiling everything (release mode) and running example 01 in parallel?
Today I finally tried your suggestion, and using QuickPlot as a result viewer I got matching results for all parameters except instantaneous discharge.
Every time I want to see instantaneous discharge, QuickPlot shows me cumulative discharge (parallel run).
The cumulative discharge graphs are the same for the standard (Delft3D window) and parallel (script) runs.
The same thing happened for both example 01 and my own model.
P.S. The same thing happened with GPP.
You're right; I can reproduce this problem. It seems to be a bug in Delft3D-FLOW. We will solve this, but that may take a while.
Thanks for reporting.
I have also discovered a problem with exporting data to a csv file. For example: if I export the time series of water level (for some observation point) to a csv file, the dates I get in that file do not match the associated picture (Quick View). Data exported to a mat or tekal file do match the associated picture.
I checked it for the "01_standard" example and it looks fine (the date is 1990-08-05, the time starts at zero and increases by 5 minutes each step). Can you describe in more detail what goes wrong?
The attachments contain two examples, for two observation points, o1 and up, in the same model.
The difference in the results is not big for point o1, but it is for point up.
So it's a simple bathymetry with a bridge modeled as dry points. Every time I use total discharge as both the upstream and downstream boundary (in every model I have worked on), I get instabilities and the model breaks, or I need a long simulation time for the model to stabilize.
For this model I made a HEC-RAS 1D model to get the water level at the downstream boundary for the wanted discharge (400 m3/s).
Then in D3D I used the same bathymetry, a uniform value (103) as initial condition, total discharge as upstream boundary (26-09 00:00: 0; 26-09 06:00: -400; 27-09 00:00: -400) and water level as downstream boundary, the same at both ends (26-09 00:00: 103; 26-09 06:00: 105.4, from HEC; 27-09 00:00: 105.4, from HEC). I ran the model to get a restart file at 27-09 00:00.
Then I ran the model again using the restart file as initial condition, with total discharge as upstream boundary (27-09 00:00: -400; 27-09 06:00: -400; 28-09 00:00: -400) and total discharge as downstream boundary (27-09 00:00: -400; 27-09 06:00: -400; 28-09 00:00: -400). I also changed the reference time to 27-09 00:00.
The discharge values are negative because of the orientation of the m,n coordinate system.
The weird part is that the model behaves normally for about an hour and then goes crazy and breaks (you can see this in the map file using a longitudinal cross-section).
Up and down are the names of the observation points and cross-sections near the boundaries.
It would be quite useful to be able to use total discharge as the downstream boundary condition in cases where there are no water level measurements.
Any suggestion on how to work around this and use total discharge as the downstream boundary condition?
Thanks for posting the data.
About your "water level" files:
Comparing the csv with the tek file and the figure, everything seems to be fine. The only difference is that the csv file contains 6 digits and the tek file 15. Is that the problem? Note that Delft3D-FLOW writes output in single precision, so the 15 digits in the tek file are not all significant.
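The effect of single precision is easy to demonstrate. A small sketch, independent of Delft3D-FLOW itself, that round-trips a value through a 32-bit float with Python's standard struct module; the water level value is just an example number from this thread:

```python
# Demonstrate why only about 7 of the 15 printed digits are significant when
# the data were written in single precision: store a value as a 32-bit float
# (Python floats are 64-bit) and read it back.
import struct

def to_single(x: float) -> float:
    """Round-trip x through a little-endian 32-bit float."""
    return struct.unpack("<f", struct.pack("<f", x))[0]

value = 105.4            # e.g. a water level from this thread
stored = to_single(value)
print(f"{stored:.15g}")  # 105.400001525879 - digits beyond ~7 are noise
```

So when the tek file shows 15 digits, everything after roughly the 7th significant digit is an artifact of the single-precision storage, not real model output.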
About your "example" files:
The files "trih-example.def" and "trim-example.def" are missing. They are needed to be able to view the data.
Your method of restarting looks fine.
To get more help about your model, you have the following options:
- Hopefully someone reacts to your posts.
- Use one of our service packages, see services.
- Follow one of our courses, see also at services.
About the obs. point: I see what the problem is.
When I open the csv file in Excel some of the dates are missing (Excel interprets the data as dates), but using TextPad I get matching results.
I have attached the def files so you can have a look.
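One way to confirm that the csv itself is complete, independent of how Excel renders date cells, is to scan it as plain text. A minimal sketch; the assumed (date, value) column layout is hypothetical and may differ from the actual export:

```python
# Scan an exported csv as plain text and report data rows whose first
# (date) field is empty, independent of Excel's date rendering.
# The (date, value) column layout is an assumption for illustration.
import csv

def rows_missing_date(path: str) -> list[int]:
    """Return the 1-based indices of data rows whose first field is empty."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row, if any
        return [
            i for i, row in enumerate(reader, start=1)
            if not row or not row[0].strip()
        ]
```

If this returns an empty list, every row in the file has a date and the missing entries are purely an Excel display issue.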
Thank you for everything.
QuickPlot cannot show the results of the map file you posted; I don't know what went wrong. With VSI I managed to have a look at the files and they seem to contain normal data.
Some things to try/check:
- When restarting, be sure to use the same model with minimal input changes; more changes may mess up the simulation.
- If the (restarted) simulation starts normally but suddenly gives strange results, the Courant condition may not be satisfied (locally). Does the simulation run with a halved time step?
- Write a lot of map fields around the time at which the problems begin. Maybe you can find the location where the problems start; that may be a clue to what goes wrong.
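The Courant check in the second point can be estimated by hand. A rough sketch using the generic wave-based criterion C = dt * (|u| + sqrt(g*h)) / dx, which is not the exact check Delft3D-FLOW performs; all numbers are made-up placeholders:

```python
# Rough Courant-number estimate for a shallow-water model, to judge whether
# halving the time step could help. This is the generic 1D wave-based
# criterion, not Delft3D-FLOW's internal check; all values are placeholders.
import math

G = 9.81  # gravitational acceleration [m/s^2]

def courant(u: float, h: float, dt: float, dx: float) -> float:
    """Courant number for velocity u [m/s], depth h [m],
    time step dt [s] and grid size dx [m]."""
    return dt * (abs(u) + math.sqrt(G * h)) / dx

dt, dx = 30.0, 50.0   # placeholder time step and grid spacing
u, h = 1.5, 2.4       # placeholder velocity and water depth
c = courant(u, h, dt, dx)
print(round(c, 2))    # 3.81
if c > 1.0:
    print("consider a smaller time step")
```

Since C is proportional to dt, halving the time step halves the Courant number, which is why the halved-time-step test above is a quick way to see whether a (local) Courant violation is the cause.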
The bug with the instantaneous discharge when running parallel is solved in revision 1890.
Thank you for replying. I've been quite busy lately, which is why I'm late with my response.
I couldn't make it work with these boundary conditions, but I found a way to get the results I wanted.
Great news about the instantaneous discharge!