Oroville Dam is located on the Feather River in northern California (USA). At 234.7 m (770 ft) tall, this earthfill embankment dam is the tallest in the United States. With 4.3 billion m³ (3.5 million acre-feet) of storage, Lake Oroville is the second-largest reservoir in California, supplying water to cities as far south as Los Angeles. The Oroville Dam, its reservoir (Lake Oroville), and the associated hydropower plant together form the flagship facility of the State Water Project (SWP), which is owned and operated by the State of California, Department of Water Resources (DWR).
The notion of probability and its various interpretations creates numerous opportunities for errors and misunderstandings. This is particularly true of contemporary risk analyses for dams, which mostly consider geotechnical, hydraulic, and structural capacities subjected to extreme loads treated as independent events. In these analyses, subjective “degree of belief” probability plays a major role, both in modelling the risk in the system by means of event trees based on inductive reasoning and in assigning probabilities to the events in those trees. There are numerous situations in which physically possible conditions are eliminated from consideration in a risk analysis on the basis of probabilities judged too low to be relevant. This is despite the fact that assigning a non-zero probability to a condition means that the occurrence of that event or condition is inevitable at some time, with the added complication that the time of occurrence is unknown and unknowable. Although there is no relationship between a remote probability and the possibility (or credibility) of an event in the event tree actually occurring, it is quite common for physically feasible conditions to be eliminated, or their importance discounted, on the basis of low probability in a risk assessment of a dam. Twenty-five years ago, this elimination process might have been referred to as “judicious pruning of the event tree”. In more modern parlance, the elimination is based on consideration of whether the condition or sequence of events is so remote a possibility as to be non-credible, or not reasonable to postulate. In contrast to the framing of extreme loads versus structural or geotechnical capacities, experience has shown that many dam failures, and perhaps the majority of dam incidents, result not from extreme geophysical loads but from operational factors.
These incidents and failures occur because an unusual combination of reasonably common events occurs, and that combination has a bad outcome. For example: a moderately high reservoir inflow occurs, nowhere near extreme; the sensor and SCADA system fail to provide early warning for some unanticipated reason; one or more spillway gates are unavailable due to maintenance, an operator makes an error, or there is no operator on site and it takes a long time for one to arrive; and the pool happens to be uncommonly high at the time. This chain of reasonable events, none by itself particularly dangerous, can in combination lead to an incident or even a failure. This leads to two unnerving conclusions: our estimates of risk, made according to best available practice using the best available estimates, will understate the actual risk; and the extent of that underestimate is unknowable. This paper examines why these improbable events occur and what can be done to prevent them. Some implications for the endeavour of risk evaluation are also considered.
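The compounding effect of such a chain can be illustrated with a purely hypothetical calculation. The individual probabilities below are invented for illustration only, and independence is assumed, which real operational events may not satisfy:

```python
# Hypothetical, illustrative per-season probabilities (not taken
# from any real dam study):
p_moderate_inflow  = 0.10   # moderately high inflow, well below extreme
p_scada_failure    = 0.05   # sensor/SCADA fails to give early warning
p_gate_unavailable = 0.08   # gate out for maintenance, or operator delay/error
p_high_pool        = 0.20   # reservoir pool uncommonly high

# Assuming (optimistically) independence, the joint probability per season:
p_combined = (p_moderate_inflow * p_scada_failure
              * p_gate_unavailable * p_high_pool)
print(f"Joint probability per season: {p_combined:.1e}")   # 8.0e-05

# Over a 100-year service life, the chance of at least one such coincidence:
p_lifetime = 1 - (1 - p_combined) ** 100
print(f"Chance of at least one occurrence in 100 years: {p_lifetime:.2%}")
```

Each ingredient is unremarkable on its own, yet the coincidence approaches a one-percent chance over a long service life, which illustrates why assigning any probability at all makes eventual occurrence a matter of time.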
The U.S. Army Corps of Engineers (USACE) Risk Management Center (RMC) developed the Reservoir Frequency Analysis software (RMC-RFA) to facilitate, enhance, and expedite flood hazard assessments within the USACE Dam Safety Program. RMC-RFA is a stochastic flood modeling software package that employs advanced statistical and computing techniques, allowing a user to perform a screening-level stage-frequency analysis on a desktop PC with runtimes on the order of seconds to a few minutes. RMC-RFA utilizes an inflow volume-based stochastic simulation framework that treats the seasonal occurrence of the flood event, the antecedent reservoir stage, the inflow volume, and the inflow flood hydrograph shape as uncertain variables rather than fixed values. To construct uncertainty bounds for reservoir stage-frequency estimates, RMC-RFA employs a two-loop, nested Monte Carlo methodology. The natural variability of the reservoir stage is simulated in the inner loop, termed a realization, which comprises many thousands of events, while the knowledge uncertainty in the inflow volume-frequency distribution is simulated in the outer loop, which comprises many realizations.
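The nested structure can be sketched in a few lines. This is not RMC-RFA's actual model: the lognormal volume-frequency form, its parameter values, and the toy "routing" relation below are all invented for illustration, and only the loop structure (knowledge uncertainty outside, natural variability inside) reflects the methodology described above:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_volume_frequency(rng):
    # Outer-loop draw: one plausible inflow volume-frequency distribution
    # (knowledge uncertainty). Lognormal form and parameters are illustrative.
    mean = rng.normal(10.0, 0.5)      # log-space mean of event inflow volume
    sd = abs(rng.normal(1.0, 0.1))    # log-space standard deviation
    return mean, sd

def simulate_realization(rng, mean, sd, n_events):
    # Inner loop: thousands of flood events, each with an uncertain
    # antecedent pool, inflow volume, and hydrograph shape, routed to a
    # peak reservoir stage by a deliberately toy relation.
    antecedent = rng.uniform(150.0, 170.0, n_events)   # m, starting pool
    volume = np.exp(rng.normal(mean, sd, n_events))    # event inflow volume
    shape = rng.uniform(0.8, 1.2, n_events)            # hydrograph-shape factor
    return antecedent + 1e-4 * volume * shape          # peak stage, m

n_realizations, n_events = 200, 5000
# One 1%-AEP (1-in-100) stage estimate per realization:
q = np.array([
    np.quantile(simulate_realization(rng, *sample_volume_frequency(rng), n_events), 0.99)
    for _ in range(n_realizations)
])

print(f"1% AEP stage, median estimate: {np.median(q):.1f} m")
print(f"90% uncertainty bounds: {np.quantile(q, 0.05):.1f} to {np.quantile(q, 0.95):.1f} m")
```

The spread of the per-realization quantiles in `q` is what yields the uncertainty bounds on the stage-frequency curve, while each realization's internal scatter represents natural variability.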
Stage-frequency curves derived with RMC-RFA are compared to those derived with more complex, precipitation-based simulation frameworks, such as the Monte Carlo Reservoir Analysis Model (MCRAM), the Stochastic Event Flood Model (SEFM), and the Watershed Analysis Tool (HEC-WAT). The inflow volume-based framework employed by RMC-RFA produces stage-frequency curves that agree strongly with the more complex, precipitation-based methods. Furthermore, the results from the alternative methods fall within the RMC-RFA uncertainty bounds, demonstrating the robustness of the approach. In this sense, the RMC-RFA simulation framework lends itself to a value-of-information approach to risk management, in which knowledge uncertainty can be quantified efficiently at a screening-level assessment, and the value of performing more complex and sophisticated studies to reduce that uncertainty can then be considered.
For intraplate regions such as Australia, identifying and quantifying activity on tectonic faults for inclusion in probabilistic seismic hazard assessments can be challenging due to the typically long return periods for ground-rupturing earthquakes associated with these structures. Return periods of tens of thousands to millions of years mean that surface displacement evidence is prone to degradation through erosion and burial, and paleoseismological ‘trench’ excavations may not uncover geology old enough to reveal previous events. As a consequence, there is often little or no preserved evidence of past ground-rupturing events on these structures. Rather than ignoring faults which show no evidence of neotectonic displacement, we present an alternative approach: in addition to considering active faults (movement in the last 35,000 years) and neotectonic faults (movement in the last 10 Myr) in seismic hazard assessments, we also consider faults which otherwise show no evidence of neotectonic activity but which are aligned favourably with the current stress regime and are therefore potential sources of earthquakes and accompanying strong ground motion.
Junction and Clover Dams are central spillway slab-and-buttress dams located in Victoria. Previous safety reviews and assessments of the dams concluded that neither dam met modern dam design standards, and remedial works were recommended, including infilling the slab-and-buttress structures with mass concrete to sustain seismic loadings. These conclusions were based largely on the assessed seismic hazard at the site, the results of response spectrum analyses, and the observed condition of the dams, including alkali-aggregate reaction of the concrete. AECOM applied current seismic hazard assessment techniques, conducted concrete investigations and testing, assessed long-term surveillance monitoring results, and used modern finite element techniques to demonstrate that no upgrade works were required at either dam, resulting in a significant saving for AGL.
Population at Risk (PAR) estimation involves quantification of people who could be exposed to flooding in the event of a dam failure. Conventionally, estimates of PAR involve manual and subjective assessment of individual structures located downstream of dams. To reduce the reliance on subjective judgement and better leverage publicly available population datasets, an automated method of PAR assessment was developed. This approach used the Geoscape dataset of building representations to disaggregate Australian Bureau of Statistics 2016 Census data for a study area around Gawler, South Australia.
Representative day and night spatial distributions of PAR were constructed to characterise the diurnal movement of people between homes and workplaces or other daytime activities. Flows of people were directly quantified to reduce reliance on high-level assumptions regarding exposure. A Random Forest model was used to filter sheds and other unpopulated structures from the Geoscape dataset.
The largest deficiency in this approach is the lack of sufficiently detailed data with which to classify building usage. It is recommended that the potential for automation of PAR assessment be continually revisited as more datasets become available.
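The filter-then-disaggregate pattern described above can be sketched as follows. The feature choice (footprint area and roof height), the synthetic training labels, the building records, and the equal-share allocation of a night-time census count are all simplifying assumptions for illustration; the real workflow draws on Geoscape attributes and ABS Census data and also builds a daytime distribution:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training sample: [footprint area m^2, estimated roof height m].
# Labels: 1 = populated dwelling, 0 = shed/outbuilding. A real sample
# would be manually classified, not generated like this.
dwellings = np.column_stack([rng.normal(180, 40, 200), rng.normal(5.5, 1.0, 200)])
sheds = np.column_stack([rng.normal(40, 15, 200), rng.normal(2.5, 0.5, 200)])
X = np.vstack([dwellings, sheds])
y = np.array([1] * 200 + [0] * 200)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Disaggregate one census area's night-time count of 120 people across
# the buildings the model classifies as dwellings (equal shares).
area_population = 120
buildings = np.array([[200.0, 6.0],   # large, tall -> dwelling
                      [35.0, 2.4],    # small, low  -> shed
                      [160.0, 5.0],
                      [50.0, 2.8],
                      [210.0, 5.8]])
is_dwelling = clf.predict(buildings).astype(bool)
par_per_building = np.where(is_dwelling, area_population / is_dwelling.sum(), 0.0)
print(par_per_building)
```

Because the filtered-out sheds receive zero people, the full census count is conserved across the remaining dwellings, which is the property that lets building-level PAR be re-aggregated over any flood extent.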