Water quality modelling from MPI simulation and multiple COM files - Delwaq - Delft3D
DELWAQ is the engine of the D-Water Quality and D-Ecology programmes of the Delft3D suite. It is based on a rich library from which relevant substances and processes can be selected to quickly put water and sediment quality models together.
The processes library covers many aspects of water quality and ecology, from basic tracers, dissolved oxygen, nutrients, organic matter, inorganic suspended matter, heavy metals, bacteria and organic micro-pollutants, to complex algae and macrophyte dynamics. High performance solvers enable the simulation of long periods, often required to capture the full cycles of the processes being modelled.
The finite volume approach underlying DELWAQ allows it to be coupled to both the structured grid hydrodynamics of the current Delft3D-FLOW engine and the upcoming D-Flow Flexible Mesh engine (1D-2D-3D) of the Delft3D Flexible Mesh Suite (or even other models such as TELEMAC).
'DELWAQ in open source' is our invitation to all leading experts to collaborate in further development and research in the field of water quality, ecology and morphology using Delft3D. Feel free to post your DELWAQ-related questions or comments in this dedicated forum space. If you are new to DELWAQ, the tutorial (in the user manual) is a good place to start. A list of DELWAQ-related publications is available here.
Ben Williams, modified 7 years ago.
I'd like to be able to do some water quality modelling using MPI simulation results (as opposed to domain decomposition).
Are there any workarounds for getting the DELWAQ module to recognize multiple COM files? For example, mocking up a ddb file (would this work?). Or maybe there are some tools to stitch COM files together?
Leo Postma, modified 7 years ago.
If your hydrodynamic simulation has produced a number of .com files for the different MPI nodes, and if they are fully compatible with the ddb type of multi-domain definition in Delft3D, then you can use a standard procedure. You need the GUI for that: click Water Quality => General => Coupling => Define input => Hydrodynamics => DD-bound, select your dd-bound file there, and perform your coupling.
Out comes one hydrodynamic dataset for use with Delwaq that will run in one instance (there is no MPI or DD facility for Delwaq yet; Delwaq only supports OpenMP parallelism).
Your first Delwaq simulation is simple. You select your hydrodynamic result in the Delwaq GUI and set the integration option under Numerical options to 15 (a robust implicit solver). You save your settings (this creates an ASCII input file for Delwaq with one substance, called 'continuity', that has 1.0 as its initial condition and 1.0 at all boundaries). Then, by pushing Waq(1), selecting your saved input file and then pushing Waq(2), you are up and running the so-called 'constancy preserving test' on the hydrodynamics.
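The idea behind this test can be sketched in a few lines of plain Python (a toy 1D finite-volume example, not Delwaq code; all numbers are made up): if the cell volumes and fluxes are mutually consistent, a tracer that starts at 1.0 everywhere, with 1.0 at all boundaries, must stay exactly 1.0.

```python
# Toy illustration of the 'constancy preserving test' (not Delwaq code).
# A 1D chain of cells with constant volume and a constant through-flow,
# so continuity holds exactly: inflow equals outflow in every cell.

def constancy_test(n_cells=10, n_steps=100, volume=100.0, flow=5.0, dt=1.0):
    """Explicit upwind finite-volume transport of a uniform tracer."""
    conc = [1.0] * n_cells
    boundary = 1.0  # inflow boundary concentration, as in the test
    for _ in range(n_steps):
        new = []
        for i, c in enumerate(conc):
            upstream = boundary if i == 0 else conc[i - 1]
            # volume stays constant because inflow equals outflow
            new.append((volume * c + dt * flow * (upstream - c)) / volume)
        conc = new
    return conc

result = constancy_test()
print(max(abs(c - 1.0) for c in result))  # prints 0.0
```

In the real test, any deviation from 1.0 points at an inconsistency between the volumes and fluxes in the coupled hydrodynamic dataset.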
From the installation of the GUI you can identify the underlying scripts, and you can create batch files and edit input files to do things without the GUI, but it is always good to have a simple sample simulation running with the GUI first. You can also use the GUI to create your input on MS Windows and use it on Linux for your simulations.
Some additional remarks on Leo's answer:
We have been working on this and we were able to create a single Delwaq domain, with the following constraints:
use the built-in coupling in the FLOW executable (see B.18 'Creating Delft3D-WAQ input files' in the FLOW manual)
use a FLOW executable from revision 2068 onwards
you have to create your own ddb file based on the flow partitioning
use the ddcouple program to merge the domains; from revision 25853 onwards the ddcouple program takes a ddb file on the command line as argument (ddcouple.exe <name ddb file>)
This version of ddcouple is included in the user interface release for DELWAQ open source.
We are planning automated creation of the ddb file, but this is not implemented yet.
I hope this gets you going,
Did you implement the automated creation of the ddb file?
Yes, the automatic generation of the ddb file when running FLOW in parallel is implemented. The latest check-in with these changes took place on 17/12/2013, revision 3220.
There is more to mention on the subject. The ddcouple program, which glues the domains back together for WAQ, would not produce exactly the same grid as the original FLOW grid: ddcouple always adds domains together in the M direction of the grid, and the separate domains have extra grid lines at the domain boundaries which stay there in the overall grid.
We changed the ddcouple program so that, if the -parallel option is added to the command line, the resulting WAQ grid is exactly the same as if the FLOW run had not run in parallel. This way initial conditions and parameters as generated by QUICKIN are directly usable, and results from WAQ runs based on FLOW runs with different parallelisations are compatible.
This was implemented in ddcouple on 7/1/2014. ddcouple is not part of the open source tree but is made available as an executable to the community. I hereby attach the latest version of ddcouple.
First of all, thanks for your quick response...
I'm trying to run DELWAQ with an MPI simulation of FLOW (16 cores) on a Linux cluster, so I have 16 com-files. I've already read all the posts in the forum about this matter, but I still have some questions because I cannot solve the problem yet. I've probably missed something, so I'll describe the full procedure that I've followed:
1. use the built-in coupling in the FLOW executable (see B.18 'Creating Delft3D-WAQ input files' in the FLOW manual)
In my FLOW simulation, I've added these three additional keywords: "Flwq 1440 60 23040; ilAggr 1 1 1 1 1 1 1 1 1 1; WaqAgg #active only#". As a result, I've obtained the same FLOW results as in my previous simulations and, additionally, several com-files with different extensions (.are, .atr, .cco, .dps, .flo, .hyd, .len, .lga, .lgo, .lgt, .poi, .sal, .srf, .tau, .tem, .vdf, .vol, .wdt), which are the typical files that you obtain when you couple a serial simulation.
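For reference, those keywords sit in the .mdf file as separate entries, roughly like this (values copied from the post above; the alignment is illustrative, and section B.18 of the FLOW manual gives the exact meaning and syntax of each keyword):

```
Flwq   = 1440 60 23040
ilAggr = 1 1 1 1 1 1 1 1 1 1
WaqAgg = #active only#
```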
2. you have to create your own ddb file based on the flow partitioning.
For this purpose, I've read the 16 com-files with QUICKPLOT and exported the morphological grid of each to a grid file (.grd). Next, I've loaded the 16 grid files in RFGRID; as you mention, there is an overlap between the sub-domains. So my questions are: where should I locate the dd-boundaries? On all the boundaries of each sub-domain, or on the overlapping cells? Do I have to delete the overlapping grid cells? Finally, once I've compiled the ddb file, how should I name each dd-boundary? I've attached the generated ddb file (base_caso.ddb). Is this the right procedure to create the ddb file?
3. the automatic generation of the ddb file when using flow in parallel is implemented. The latest checkin with these changes took place 17/12/2013 revision 3220.
I've not downloaded this revision yet, but the question is: if I run an MPI simulation with this code, is the ddb file produced automatically, or do I have to activate something, as in step 1?
4. We changed the ddcouple program in such a way that if the -parallel option is added to the command line the resulting waq grid will be exactly the same as if the flow run did not run in parallel. This way initial conditions and parameters as generated by quickin are directly usable. Also results from waq runs using different runs from FLOW with different parallelisation are compatible this way.
With the method I've described, I've tried to run the new ddcouple.exe that you attached yesterday in a DOS window, as follows: "ddcouple.exe base_caso.ddb -parallel". I get the error messages attached in the file (base_caso-ddcouple.out). It looks like there is a problem allocating memory...
5. Finally, I would like to mention that this whole procedure was carried out outside the GUI; should I use it at any step?
Cheers and thanks again,
1) You should get these files for every domain (partition).
2) The ddb file is very systematic, so you should be able to create it in a text editor. It should have only one boundary between two domains, over the full length of the grid line. For partitioning in the N direction it would look like:
Suances-001.dat 1 nmax1-3 mmax nmax1-3 Suances-002.dat 1 3 mmax 3
Suances-002.dat 1 nmax2-3 mmax nmax2-3 Suances-003.dat 1 3 mmax 3
Suances-003.dat 1 nmax3-3 mmax nmax3-3 Suances-004.dat 1 3 mmax 3
Where mmax is the mmax of the overall domain, nmax1 is the number of n lines in the first domain, nmax2 that of the second, etc.
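That pattern is regular enough to script. A minimal sketch of generating the ddb lines for partitioning in the N direction (the file names and dimensions below are hypothetical, following the Suances example above):

```python
# Sketch: build the ddb boundary lines for N-direction partitioning.
# Each line couples row nmax_i-3 of domain i to row 3 of domain i+1,
# over the full M extent of the overall grid.

def ddb_lines(base, nmax_per_domain, mmax):
    """One boundary line per pair of adjacent domains, following:
    <dom_i> 1 nmax_i-3 mmax nmax_i-3 <dom_i+1> 1 3 mmax 3"""
    lines = []
    for i, nmax_i in enumerate(nmax_per_domain[:-1], start=1):
        left = f"{base}-{i:03d}.dat"
        right = f"{base}-{i + 1:03d}.dat"
        lines.append(f"{left} 1 {nmax_i - 3} {mmax} {nmax_i - 3} "
                     f"{right} 1 3 {mmax} 3")
    return lines

# Hypothetical example: 4 partitions with these n-line counts, mmax = 80
for line in ddb_lines("Suances", [120, 118, 118, 121], mmax=80):
    print(line)
```

This only automates the bookkeeping described in the post; the per-domain dimensions still have to be taken from the actual FLOW partitioning.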
3) The ddb file is generated automatically when the coupling is activated.
4) It looks like it does not read the mmax and nmax of the first domain correctly, and then runs into problems. It might be that for your version not all files are written correctly (see also point 1).
5) This should not be necessary.
I strongly advise you to get the latest version and see if it works, since several issues regarding the coupling to WAQ in parallel have been solved.
Thanks for your responses and advice.
I've already downloaded the following version: delft3d - Revision 3402: /tags/3058. I'm trying to run an MPI simulation with FLOW and an OMP simulation with WAQ in this version, following your advice. I hope I'll be able to do this...
Finally, I was able to run a FLOW-MPI simulation and couple it to a WAQ-OMP simulation with the latest version of Delft3D. The steps were:
1. Activate the 'Export WAQ' option in the output tab of the GUI; as you mentioned, all the necessary files, including the ddb file, are created automatically.
2. Run 'ddcouple.exe' (the version attached in this thread) to obtain the coupled WAQ files in a single domain from the multiple MPI domains.
3. With these files, run WAQ simulations to check that everything works properly.