
D-Flow Flexible Mesh

D-Flow Flexible Mesh (D-Flow FM) is the new software engine for hydrodynamic simulations on unstructured grids in 1D, 2D and 3D. Together with the familiar curvilinear meshes from Delft3D 4, the unstructured grid can consist of triangles, pentagons (etc.) and 1D channel networks, all in one single mesh. It combines proven technology from the hydrodynamic engines of Delft3D 4 and SOBEK 2 and adds flexible administration, resulting in:

  • Easier 1D-2D-3D model coupling, intuitive setup of boundary conditions and meteorological forcings (amongst others).
  • More flexible 2D gridding in delta regions, river junctions, harbours, intertidal flats and more.
  • High performance by smart use of multicore architectures, and grid computing clusters.
 
The D-Flow FM team would be delighted if you would participate in discussions on mesh generation, the specification of boundary conditions, running computations, and all kinds of other relevant topics. Feel free to share your smart questions and/or brilliant solutions!

 

=======================================================
We have launched a new website (still under construction, so expect continuous improvements) and a new forum dedicated to Delft3D Flexible Mesh.

Please follow this link to the new forum: 
/web/delft3dfm/forum

Post your questions, issues, suggestions and difficulties related to our Delft3D Flexible Mesh Suite on the new forum.

=======================================================

** PLEASE TAG YOUR POST! **

 

 

Sub groups
D-Flow Flexible Mesh
DELWAQ
Cohesive sediments & muddy systems

 


Message Boards

Delft3D-FLOW error in parallel (Z-layer) on Linux; works well on Windows

shanas pr, modified 1 Year ago.


Padawan · Posts: 27 · Join Date: 10/20/11

Dear all modelers,


I am working on a fully 3D hydrodynamic flow simulation with Delft3D tagged version 7545, compiled and built successfully with MPI on a Cray HPC cluster (all the test cases work fine).

The domain covers 465 x 265 grid points at ~50 m resolution, with 10 layers in Z (zeta) coordinates in the vertical. The layer thicknesses are 2, 3, 4, 6, 8, 10, 12, 15, 20, 20 (%) for the 10 layers, and the maximum depth is 400 m.
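As a quick sanity check on this vertical schematisation (plain arithmetic, not a Delft3D tool): the Z-layer thickness percentages should cover the full reference depth, which they do here:

```python
# Z-layer thicknesses (%) as listed above, top to bottom, for 10 layers.
thicknesses = [2, 3, 4, 6, 8, 10, 12, 15, 20, 20]

total = sum(thicknesses)
print(total)  # 100 -> the layers cover 100% of the reference depth

# Thickness of each layer in metres at the 400 m maximum depth:
max_depth_m = 400
layers_m = [t / 100 * max_depth_m for t in thicknesses]
print(layers_m[0], layers_m[-1])  # 8.0 80.0
```

So the input itself is consistent; the top layer is 8 m thick and the bottom one 80 m at the deepest point.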

I have applied astronomical boundary conditions, plus transport conditions with salinity and temperature at 10 layers at the boundary. Meteo input with spatially varying wind and atmospheric pressure is imposed. The model runs without any error on the Windows platform (Delft3D 4.00.01), but it is a bit time-consuming: a one-month simulation with a 5-second time step takes more than one day.

Now I have a few questions:

 

i) Why does the FLOW module crash at 4% of the simulation when run in parallel on Linux? The error is the usual one: water level changes become too large (>25 m) and the run blows up.

ii) Are there any limitations on the number of computing nodes or processors for a specific domain?

(As per my understanding, the number of nodes cannot exceed one third of the kmax grid count?)

iii) What is the best way to select the number of MPI parallel nodes? What are the limitations, and how can we increase the number of cores?

iv) Could the error be due to the Z-layer thickness? (The same error occurs when sigma coordinates are used, and again it works well on Windows.)
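On questions ii) and iii): Delft3D-FLOW partitions the domain into strips along one horizontal direction, and each strip needs a minimum number of grid rows to remain valid, which effectively caps the useful number of MPI processes. The minimum strip width used below is an illustrative assumption (it varies by version and features); check the Delft3D-FLOW manual for the actual constraint. A rough back-of-the-envelope check looks like:

```python
# Illustrative sketch only: estimate an upper bound on the number of MPI
# partitions for strip-wise domain decomposition. The minimum strip width
# (min_rows) is an ASSUMPTION here, not an official Delft3D limit.
def max_partitions(rows_in_split_direction: int, min_rows: int = 3) -> int:
    """Upper bound on partitions if each strip needs >= min_rows grid rows."""
    return rows_in_split_direction // min_rows

# For a 465 x 265 grid split along the 265-row direction:
print(max_partitions(265))  # 88
```

With a minimum of 3 rows per strip (matching the "one third" rule of thumb mentioned above), this domain could in principle take up to 88 partitions, though in practice communication overhead makes far fewer processes optimal.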

 

 

Kindly suggest the best way forward.


Thanks in advance


regards
shanas