HPC Model Review
= Introduction =
TUFLOW HPC is an unconditionally stable, mass conservative 2D solution scheme. Because it remains stable, HPC can "hide" poor data or model setup. For commercial use, modellers must take all reasonable steps to correct, or at the very least understand, areas of poor representation in their models.<br>
= TUFLOW Log File (*.tlf) =
The first step of reviewing an HPC model is no different to reviewing a TUFLOW Classic model. Start by opening the TUFLOW log file (*.tlf) and confirm at the bottom of the file that the model run finished successfully, indicated by "Simulation FINISHED", and that the final mass error reported is acceptable.<br>
Though HPC is volume conservative and shouldn't produce any mass error, the final reported value should still be checked. A 'healthy' model will usually report a final mass error at or very close to zero; a larger value warrants investigation.<br>
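As a practical aid, the checks above can be scripted. The sketch below scans log text for the finish message and a final mass error figure; the exact wording of the mass error line in the *.tlf is an assumption here, as is the ±1% acceptance threshold, so adjust both to match your log files.

```python
# Sketch: scan TUFLOW log text for the finish message and the final
# reported mass error. The "mass error" line format and the default
# tolerance are assumptions; adapt them to your *.tlf files.

def check_tlf(log_text, max_error_pct=1.0):
    """Return (finished, mass_error_pct, ok) parsed from *.tlf text."""
    finished = "Simulation FINISHED" in log_text
    mass_error = None
    for line in log_text.splitlines():
        if "mass error" in line.lower():
            # Take the last numeric token on the line as the percentage.
            for token in line.replace("%", " ").split():
                try:
                    mass_error = float(token)
                except ValueError:
                    continue
    ok = finished and mass_error is not None and abs(mass_error) <= max_error_pct
    return finished, mass_error, ok

log = "...\nFinal Mass Error: 0.02%\nSimulation FINISHED\n"
print(check_tlf(log))  # (True, 0.02, True)
```

A run that crashed, or one with a large final mass error, fails the check and should be investigated before results are used.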
= HPC TUFLOW Log File (*.hpc.tlf) =
The *.hpc.tlf log file records the model timestep, control numbers and volume of water in the model at each timestep. It also shows repeated timesteps if the <u>[[HPC_Adaptive_Timestepping | control number limits]]</u> were exceeded or there was a significant change in the volume of water in the model.
'''Note:''' TUFLOW HPC allows the <u>[[HPC_Adaptive_Timestepping | control numbers]]</u> to be breached by up to 20%. The timestep will not be repeated within this tolerance; if a control number is exceeded by more than 20%, the timestep is repeated with a reduced dt.
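The repeat rule described in the note can be sketched as follows. This is illustrative only (the real solver logic is internal to TUFLOW); the limits of 1.0 for Nu and Nc and 0.3 for Nd are the defaults discussed on this page.

```python
# Illustrative sketch of the 20% tolerance rule described above.
# Default limits (Nu 1.0, Nc 1.0, Nd 0.3) are taken from this page;
# the actual HPC solver logic is internal to TUFLOW.

LIMITS = {"Nu": 1.0, "Nc": 1.0, "Nd": 0.3}
TOLERANCE = 1.2  # control numbers may be breached by up to 20%

def step_is_repeated(control_numbers):
    """True if any control number exceeds its limit by more than 20%."""
    return any(control_numbers[k] > LIMITS[k] * TOLERANCE for k in LIMITS)

print(step_is_repeated({"Nu": 1.1, "Nc": 0.8, "Nd": 0.2}))  # False: within tolerance
print(step_is_repeated({"Nu": 1.3, "Nc": 0.8, "Nd": 0.2}))  # True: Nu > 1.2
```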
<br>
[[File: HPC_repeating_timestep.png]]
The *.hpc.tlf file can also be used to review the dt, control number values and the water volume in the model; however, these may be easier to review using the *.hpc.dt.csv output.
The last column of the *.hpc.tlf file records the timestepping efficiency output. A value of 100% indicates that the HPC timesteps are perfectly aligned with the minimum stability timestep for complying with the three control numbers representing the Courant, Wave Celerity and Diffusion criteria.<br>
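One assumed interpretation of that efficiency figure (the exact formula in the *.hpc.tlf is not documented here) is the ratio of the adopted timestep to the stability-limited timestep:

```python
# Assumed interpretation of the timestepping efficiency column:
# the percentage of the stability-limited timestep actually used.
# This is a sketch for review purposes, not TUFLOW's documented formula.

def timestep_efficiency(dt_adopted, dt_stability):
    """Percent of the available stability timestep actually used."""
    return 100.0 * dt_adopted / dt_stability

print(timestep_efficiency(0.9, 1.0))  # 90.0
```

Persistently low values suggest the adopted timestep is lagging well behind what the control numbers would allow, for example after repeated timesteps.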
If virtual pipes are used in the model, the following information will also be reported in the hpc.tlf and console window:
* qInlet: flow extracted from the 2D domain.
* qSurcharge: flow surcharging out of inlets, due to the outlet flow being limited.
* qOutput: flow entering the 2D domain.
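A simple consistency check on these reported flows is a volume balance. Assuming negligible storage in the virtual pipe network (an assumption, not stated on this page), the time-integrated qInlet should roughly equal the integrated qSurcharge plus qOutput:

```python
# Sketch: volume-balance check on the virtual pipe flows listed above.
# Assumption: negligible storage in the virtual pipe network, so
# integral(qInlet) ~= integral(qSurcharge) + integral(qOutput).

def pipe_volume_balance(times, q_inlet, q_surcharge, q_output):
    """Trapezoidal time-integration of each flow series; returns the imbalance."""
    def integrate(q):
        return sum(0.5 * (q[i] + q[i + 1]) * (times[i + 1] - times[i])
                   for i in range(len(times) - 1))
    v_in = integrate(q_inlet)
    v_out = integrate(q_surcharge) + integrate(q_output)
    return v_in - v_out

t = [0.0, 60.0, 120.0]  # seconds
print(pipe_volume_balance(t, [0.0, 2.0, 2.0], [0.0, 0.5, 0.5], [0.0, 1.5, 1.5]))  # 0.0
```

A persistent imbalance would suggest the virtual pipe flows warrant a closer look in the *.hpc.tlf.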
= HPC dt Time Series Output (*.hpc.dt.csv) =
This csv output provides a timeseries of the model timestep and the value of each control number at each timestep.<br>
[[File: Hpc_dt_csv.png |450px]]
<br>
It is recommended that modellers review the *.hpc.dt.csv file to check the dt timeseries output. There are two key features to look for: erratic bouncing of the dt values, and extremely low dt values. Anything less than 1/10 of a healthy TUFLOW Classic timestep could be considered low, though this varies by study; for example, a dam break assessment with high velocities and depths may legitimately require a much lower timestep. Both features become easier to observe by graphing the dt timeseries.<br>
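These two checks can also be automated. The sketch below flags low dt values and consecutive-step "bouncing" in a *.hpc.dt.csv-style table; the column name "dt" and both thresholds are assumptions to adjust per study.

```python
# Sketch: flag low or erratically bouncing dt values in *.hpc.dt.csv-style
# data. The "dt" column name, the low-dt threshold and the bounce ratio
# are assumptions; tune them for the study at hand.
import csv
import io

def review_dt(csv_text, low_dt=0.1, bounce_ratio=2.0):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    dts = [float(r["dt"]) for r in rows]
    low = [i for i, dt in enumerate(dts) if dt < low_dt]
    # "Bouncing": dt more than doubling or halving between consecutive steps.
    bounce = [i for i in range(1, len(dts))
              if max(dts[i], dts[i - 1]) / min(dts[i], dts[i - 1]) > bounce_ratio]
    return low, bounce

sample = "time,dt\n0,1.0\n1,0.9\n2,0.05\n3,1.0\n"
print(review_dt(sample))  # ([2], [2, 3])
```

Flagged indices can then be matched back to simulation times to see when the timestep was struggling.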
This becomes valuable when trying to determine what is causing the low timestep and how to resolve it. For example:<br>
*An Nu value of 1.0 or greater may indicate that the velocity is unusually high.
*An Nc value of 1.0 or higher could be caused by an erroneously low cell elevation, resulting in an artificially large water depth.
*An Nd value of 0.3 or higher might suggest that there is poor boundary setup, or insufficient SX cells linked to a 1D structure.
Because each relates to a hydraulic condition, they provide insight into what might be causing an issue in the model.<br>
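The diagnosis above can be scripted against the *.hpc.dt.csv: find the timestep minimum and see which control number sits closest to its limit there. The column names and the default limits (Nu 1.0, Nc 1.0, Nd 0.3) used below are assumptions based on this page.

```python
# Sketch: find the lowest dt in *.hpc.dt.csv-style data and report which
# control number is closest to its limit there. Column names and default
# limits (Nu 1.0, Nc 1.0, Nd 0.3) are assumptions based on this page.
import csv
import io

LIMITS = {"Nu": 1.0, "Nc": 1.0, "Nd": 0.3}

def governing_control_number(csv_text):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    worst = min(rows, key=lambda r: float(r["dt"]))
    # Ratio to limit: the number closest to (or over) 1.0 is governing.
    ratios = {k: float(worst[k]) / LIMITS[k] for k in LIMITS}
    name = max(ratios, key=ratios.get)
    return worst["time"], name, ratios[name]

sample = ("time,dt,Nu,Nc,Nd\n"
          "0,1.0,0.4,0.6,0.10\n"
          "1,0.2,0.5,0.7,0.29\n")
print(governing_control_number(sample))  # Nd governs at time 1 in this sample
```

Knowing which criterion governs (Courant, celerity or diffusion) narrows down which hydraulic condition, and hence which model input, to inspect.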
= dt Map Output =
The dt map output is a grid map output that displays the calculated minimum timestep at each grid cell, and does not necessarily align with the timestep adopted in the model (the timestep in the model could be less if the previous timesteps were smaller). This helps identify which cells in the model are controlling the model timestep.<br>
<font color="blue"><tt>Map Output Data Types </tt></font><font color="red"><tt>==</tt></font><tt> dt </tt> writes the minimum dt (timestep) calculated for each cell in the model to the specified Map Output Format (XMDF, DAT, ASC, FLT etc.). Reviewing this output helps modellers identify which cells in the model have the lowest timestep and thereby control the model timestep. These locations are likely to have the greatest depth, velocity or turbulence in the model, forcing TUFLOW to lower the timestep to satisfy the conditions of the controlling numbers mentioned in the <u>[[HPC_Adaptive_Timestepping | HPC Adaptive Timestepping]]</u> page. If the timestep is extremely low the HPC model might be “hiding” poor data or model setup in this location. The simulation timestep will always be limited by the minimum occurring timestep anywhere in the model, hence it is important to ensure timestepping isn’t being unnecessarily restricted due to poor model configuration in one spot, even if only in one cell. <br>
<br>
[[File: Min dt raster 01.png]]
<br>
The cause of the low timestep may become clearer when observing it geographically in the dt Map Output, as the low timestep may occur close to specific hydraulic features. This is seen in the example above, where an unusually low dt value is observed upstream, where the road intersects a local channel. It is helpful to review this in conjunction with the hpc.dt.csv to know which of the controlling numbers is causing the timestep to be lowered.<br>
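When the dt map is exported in ASC format, locating the controlling cell can be scripted. The sketch below parses a standard ESRI ASCII grid and returns the map coordinates of the lowest dt; the sample grid values are made up for illustration.

```python
# Sketch: locate the cell with the lowest dt in an ASC-format dt map
# output. The ESRI ASCII grid layout is standard; the sample values
# below are invented for illustration.

def min_dt_cell(asc_text, nodata=-9999.0):
    lines = asc_text.strip().splitlines()
    header = {}
    for line in lines[:6]:  # ncols, nrows, xllcorner, yllcorner, cellsize, NODATA_value
        key, value = line.split()
        header[key.lower()] = float(value)
    best = (None, None, float("inf"))  # (row, col, dt)
    for row, line in enumerate(lines[6:]):
        for col, token in enumerate(line.split()):
            dt = float(token)
            if dt != nodata and dt < best[2]:
                best = (row, col, dt)
    row, col, dt = best
    # Convert grid indices to map coordinates (cell centres, row 0 at top).
    x = header["xllcorner"] + (col + 0.5) * header["cellsize"]
    y = header["yllcorner"] + (header["nrows"] - row - 0.5) * header["cellsize"]
    return x, y, dt

asc = """ncols 3
nrows 2
xllcorner 0
yllcorner 0
cellsize 10
NODATA_value -9999
0.9 0.8 0.7
-9999 0.05 0.6"""
print(min_dt_cell(asc))  # (15.0, 5.0, 0.05)
```

The returned coordinates can be loaded into GIS alongside the model inputs to inspect the topography and boundaries at that location.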
= Control Number Factor Sensitivity Testing =
A sensitivity run can be done by reducing the timestep using <font color="blue"><tt>Control Number Factor </tt></font><font color="red"><tt>==</tt></font><tt> 0.8</tt>. This reduces the timestep (as dictated by the three control numbers) by a factor of 0.8, causing the simulation to take roughly 25% longer. If comparing the two results shows negligible differences, it is a strong indicator that the default control number limits are satisfactory. If there are unacceptable measurable differences, the location of these impacts may also help to spatially identify where the solution was struggling at the default control numbers. These areas can then be inspected for input topography or boundary condition errors. Models shouldn’t need the Control Number Factor lowered any further than this; if they do, it more likely indicates an underlying model configuration issue that needs attention.
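The comparison step can be sketched as a cell-by-cell difference of the peak result grids from the two runs. The arrays below stand in for the exported grids, and the 0.02 m tolerance is an illustrative assumption; what counts as "negligible" is study-specific.

```python
# Sketch: compare peak water level grids from the base run and the
# Control Number Factor == 0.8 run, flagging cells that differ by more
# than a tolerance. Arrays stand in for exported result grids; the
# 0.02 m tolerance is an illustrative assumption.

def cnf_difference_cells(base, cnf08, tol=0.02):
    """Return (row, col, diff) for cells whose peak levels differ by > tol."""
    flagged = []
    for r, (row_a, row_b) in enumerate(zip(base, cnf08)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > tol:
                flagged.append((r, c, round(b - a, 3)))
    return flagged

base  = [[10.00, 10.10], [10.20, 10.30]]
cnf08 = [[10.00, 10.11], [10.20, 10.38]]
print(cnf_difference_cells(base, cnf08))  # [(1, 1, 0.08)]
```

The flagged cells map out where the solution was sensitive to the control number limits, which is where inputs should be inspected first. Note the runtime trade-off: scaling dt by 0.8 roughly multiplies the number of timesteps by 1/0.8 = 1.25.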
<br>
<br>
{{Tips Navigation
|uplink=[[ HPC_Modelling_Guidance | Back to HPC Modelling Guidance]]
}}