The U.S. Army Corps of Engineers (USACE) Risk Management Center (RMC) developed the Reservoir Frequency Analysis software (RMC-RFA) to facilitate, enhance, and expedite flood hazard assessments within the USACE Dam Safety Program. RMC-RFA is a stochastic flood modeling software package that employs advanced statistical and computing techniques, allowing a user to perform a screening-level stage-frequency analysis on a desktop PC with runtimes on the order of seconds to a few minutes. RMC-RFA utilizes an inflow volume-based stochastic simulation framework that treats the seasonal occurrence of the flood event, the antecedent reservoir stage, the inflow volume, and the inflow flood hydrograph shape as uncertain variables rather than fixed values. To construct uncertainty bounds for reservoir stage-frequency estimates, RMC-RFA employs a two-loop, nested Monte Carlo methodology. The natural variability of the reservoir stage is simulated in the inner loop, defined as a realization, which comprises many thousands of events, while the knowledge uncertainty in the inflow volume-frequency distribution is simulated in the outer loop, which comprises many realizations.
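The nested structure described above can be illustrated with a minimal sketch. Everything here is hypothetical: the distributions, parameter values, and the stand-in routing function are illustrative placeholders, not RMC-RFA's actual reservoir model. The outer loop samples uncertain volume-frequency parameters (knowledge uncertainty); the inner loop samples thousands of events (natural variability) to build one realization.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_stage(volume, antecedent_stage):
    # Stand-in reservoir routing: stage rises with inflow volume (illustrative only).
    return antecedent_stage + 0.5 * np.log1p(volume)

n_realizations = 200   # outer loop: knowledge uncertainty
n_events = 5000        # inner loop: natural variability
quantile = 0.99        # 1-in-100 AEP stage

stages_q = []
for _ in range(n_realizations):
    # Outer loop: sample uncertain parameters of the inflow volume-frequency distribution.
    mu = rng.normal(10.0, 0.3)        # log-mean of inflow volume (hypothetical)
    sigma = abs(rng.normal(0.8, 0.1)) # log-standard deviation (hypothetical)
    # Inner loop: one realization comprising many thousands of flood events.
    volumes = rng.lognormal(mu, sigma, n_events)
    antecedent = rng.uniform(95.0, 100.0, n_events)  # antecedent reservoir stage (m)
    stages = simulate_stage(volumes, antecedent)
    stages_q.append(np.quantile(stages, quantile))

# Spread across realizations gives the uncertainty bounds on the frequency curve.
lo, med, hi = np.percentile(stages_q, [5, 50, 95])
print(f"1% AEP stage: median {med:.2f} m, 90% bounds [{lo:.2f}, {hi:.2f}] m")
```

Each pass through the outer loop yields one stage-frequency estimate; the percentile spread across realizations is what forms the uncertainty bounds.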
Stage-frequency curves derived with RMC-RFA are compared to those derived with more complex, precipitation-based simulation frameworks, such as the Monte Carlo Reservoir Analysis Model (MCRAM), the Stochastic Event Flood Model (SEFM), and the Watershed Analysis Tool (HEC-WAT). The inflow volume-based framework employed by RMC-RFA produces stage-frequency curves that strongly agree with the more complex, precipitation-based methods. Furthermore, the results from the alternative methods fall within the RMC-RFA uncertainty bounds, demonstrating its robustness. In this sense, the RMC-RFA simulation framework lends itself to a value of information approach to risk management, where knowledge uncertainty can be efficiently quantified at a screening-level assessment, and then the value of performing more complex and sophisticated studies to reduce uncertainty can be considered.
Population at Risk (PAR) estimation involves quantification of people who could be exposed to flooding in the event of a dam failure. Conventionally, estimates of PAR involve manual and subjective assessment of individual structures located downstream of dams. To reduce the reliance on subjective judgement and better leverage publicly available population datasets, an automated method of PAR assessment was developed. This approach used the Geoscape dataset of building representations to disaggregate Australian Bureau of Statistics 2016 Census data for a study area around Gawler, South Australia.
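The disaggregation step can be sketched as a simple proportional (dasymetric) allocation. The population count and footprint areas below are hypothetical stand-ins, not Geoscape or Census values:

```python
import numpy as np

# Hypothetical 2016 Census mesh-block population count
block_population = 240

# Hypothetical building footprint areas (m^2) within the block, sheds already filtered
footprint_area = np.array([180.0, 150.0, 210.0, 60.0, 120.0])

# Allocate the block population to buildings in proportion to footprint area
weights = footprint_area / footprint_area.sum()
building_pop = block_population * weights

# The allocation conserves the census total for the block
print(round(building_pop.sum(), 6))
```

The key property is conservation: building-level estimates always sum back to the published block total, so no population is created or lost by the disaggregation.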
Representative day and night spatial distributions of PAR were constructed to characterise the diurnal movement of people between homes and workplaces or other day activities. Flows of people were directly quantified to reduce reliance on high-level assumptions regarding exposure. A Random Forest model was used to filter sheds and other unpopulated structures from the Geoscape dataset.
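A filtering step of this kind can be sketched as follows. The features, synthetic labels, and thresholds are hypothetical illustrations, not the paper's actual model or the Geoscape attribute set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000

# Hypothetical footprint features: plan area (m^2), eave height (m), solar-panel flag
area = rng.uniform(5.0, 400.0, n)
height = rng.uniform(2.0, 10.0, n)
has_panels = rng.integers(0, 2, n)
X = np.column_stack([area, height, has_panels])

# Synthetic training labels: small, low structures treated as sheds (1 = unpopulated)
y = ((area < 50.0) & (height < 3.5)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:1500], y[:1500])
accuracy = clf.score(X[1500:], y[1500:])

# Retain only structures the model classifies as potentially populated
populated = X[clf.predict(X) == 0]
```

In practice the labels would come from a manually classified training sample rather than a rule, and the filtered set would feed the proportional disaggregation of census counts.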
The largest deficiency in this approach is the lack of highly detailed data with which to classify building usage. It is recommended that the potential for automation of PAR assessment be continually revisited as more datasets become available.
Following the catastrophic failure of the bottom outlet conduits of the Massingir Dam, a rehabilitation project was launched involving the installation of steel liners and the rehabilitation of the hydromechanical equipment. This paper describes the testing of an emergency gate for possible use as a control gate to maintain supply to downstream water users. It further describes the innovative use of alternative access for concreting and other services, the use and benefits of self-compacting concrete for infill concreting between the steel liner and the existing concrete, and the programme and cost benefits of pressurising the steel conduit prior to concrete encasement.
Two-dimensional hydraulic modelling technology has advanced significantly in recent years, providing powerful and flexible tools that are now routinely used for a wide variety of flood risk assessments. Assessing the downstream impacts of catastrophic dam failure represents an extreme test for the accuracy and stability of hydraulic models. Catastrophic dam failure can present an extreme risk to downstream infrastructure and public safety. Hence, it is important to have confidence in the estimated magnitude of potential impacts to design suitable, cost-effective mitigation measures. The highly visual output of two-dimensional models adds credibility to their results. However, validation data for extreme hydraulic conditions is rarely available, resulting in uncertainty in the accuracy of model predictions and in the risks associated with dam failure. By validating numerical model results against analytical solutions for cases of simple geometry, and also against real-world data, an improved level of confidence can be obtained in the accuracy of the model representation of these extreme hydraulic conditions. In this paper, we assessed the capability of the TUFLOW hydraulic modelling software package to accurately simulate an idealised dam break scenario by comparing the model results to analytical solutions. We also compared the model results for coastal inundation by a tsunami to real-world data from the 2004 Banda Aceh (Indonesia) tsunami. The results showed that the HPC solver version of TUFLOW correctly captures the dam break flood fronts and the flood wave propagation, and that TUFLOW HPC is well suited to dam break flood modelling.
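A standard analytical benchmark for idealised dam break validation of this kind is the Ritter (1892) solution for instantaneous dam removal over a dry, frictionless, horizontal bed. A minimal sketch (the reservoir depth and evaluation points are arbitrary illustrative values, and the source does not state which benchmark the authors used):

```python
import numpy as np

g = 9.81
h0 = 10.0             # initial reservoir depth (m); dam located at x = 0
c0 = np.sqrt(g * h0)  # initial wave celerity (m/s)

def ritter_depth(x, t):
    """Ritter analytical flow depth for an instantaneous dam break on a dry bed."""
    xi = x / t
    return np.where(
        xi <= -c0, h0,                              # undisturbed reservoir
        np.where(xi >= 2.0 * c0, 0.0,               # dry bed ahead of the wave front
                 (2.0 * c0 - xi) ** 2 / (9.0 * g))  # rarefaction fan between
    )

t = 10.0
x = np.linspace(-200.0, 400.0, 7)
print(ritter_depth(x, t))

# At the dam site (x = 0) the depth is constant at 4/9 of the initial depth
print(float(ritter_depth(np.array([0.0]), t)[0]))
```

A numerical solver's depth profile at time t can be compared point-by-point against this closed-form profile; the constant 4h0/9 depth at the dam site is a particularly convenient check.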
A number of software packages have been developed to conduct Probabilistic Seismic Hazard Assessments (PSHAs), each with its own advantages and disadvantages. Two such programs are compared: the licenced, subscription-based EZ-FRISK software package developed by Fugro USA Land, Inc. and the open-source OpenQuake-engine (OQ) software package by the Global Earthquake Model (GEM) Foundation. Both packages use the classical PSHA methodology described by Cornell (1968) and modified by McGuire (1976). Each offers different advantages: OQ is freely distributed, code-based, and provides easy access to a number of tools, while EZ-FRISK does not rely on command-line tools, instead providing an easy user interface with quick access to plots to check results. EZ-FRISK is also computationally faster than OQ.
A simple rectangular source model with four sites was used to investigate the degree of agreement between these two software packages. Results indicate that hazard estimates from the two packages agree to within 4% for the two closest sites. At long return periods for the two furthest sites, the difference is larger.
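The classical Cornell-McGuire calculation both packages implement can be sketched for a single source. The source parameters, distance, and ground-motion model below are hypothetical placeholders, not the rectangular source model used in the paper:

```python
import numpy as np
from math import erfc, log, sqrt

# Hypothetical single seismic source at a fixed distance (illustrative only)
nu = 0.05                       # annual rate of events with M >= m_min
m_min, m_max, b = 5.0, 7.5, 1.0 # truncated Gutenberg-Richter parameters
R = 30.0                        # source-to-site distance (km)

def gmpe(m, r):
    """Hypothetical ground-motion model: mean ln PGA (g) and log-standard deviation."""
    return -3.5 + 0.9 * m - 1.1 * np.log(r + 10.0), 0.6

def annual_exceedance_rate(a, n_m=500):
    """Classical hazard integral over magnitude for one source (Cornell 1968)."""
    beta = b * np.log(10.0)
    m = np.linspace(m_min, m_max, n_m)
    # Truncated Gutenberg-Richter magnitude density
    f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    mu, sigma = gmpe(m, R)
    z = (log(a) - mu) / sigma
    # P(PGA > a | m, R) from the lognormal ground-motion model
    p_exceed = np.array([0.5 * erfc(zi / sqrt(2.0)) for zi in z])
    # Rectangle-rule integration of the hazard integrand
    dm = m[1] - m[0]
    return nu * float(np.sum(p_exceed * f_m) * dm)

for a in (0.05, 0.1, 0.2, 0.4):
    print(f"PGA > {a:.2f} g: lambda = {annual_exceedance_rate(a):.3e} /yr")
```

Cross-checking two implementations of this integral on simple sources, as the paper does, isolates differences in numerical integration and distance handling rather than in the underlying methodology.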
Since publication in 2003, the ANCOLD Guidelines for Risk Assessment have reached broad acceptance and use in Australia. In practice, dam owners use the principles of risk assessment to drive business investment decisions. As the guidelines undergo revision, it is timely to assess whether our practices need to evolve to more holistically consider all types of consequences, rather than our current focus on loss of life, in decision-making. This paper aims to prompt dam owners and consultants alike to re-assess our focus on loss of life in risk assessment decision-making, and whether we should more meaningfully consider alternative or broader indicators.
An industry survey found that large dam owners are generally satisfied with the current system of dam safety decision-making. However, the survey responses did identify difficulties in justifying investment below the limit of tolerability, where decisions are subject to ALARP principles. In a small number of cases, dam owners found it difficult to justify investment when life safety was not a significant factor.
Building on the industry survey and subsequent discussions with practitioners, this paper discusses how the current approach to risk-based decision-making may result in sub-optimal decisions. It further discusses the important role that economics could play in providing a universally accepted framework for assessing trade-offs and consistent evidence to support decision-making.