Simon Lang, Peter Hill, Wayne Graham
The empirical method developed by Graham (1999) is the most widely used in Australia to estimate potential loss of life from dam failure, and it is likely to remain so while spatially based dynamic simulation models (e.g. LIFESim, HEC-FIA and LSM) are not publicly available. When the Graham (1999) approach was first developed, spatial data was far less prevalent and computers were much slower. In addition, most people did not have mobile phones, social media was in its infancy, and automatic emergency alert telephone systems were 10 years from being used in Australia. Graham (1999) was intended to be applied to populations at risk (PAR) lumped into a discrete number of reaches, with fatality rates for the PAR in each reach selected on the basis of average flood severity and dam failure warning time. Today, much more spatially distributed data is typically available to those undertaking dam failure consequence assessments. Often a property database is available that identifies the location of each individual building where PAR may be present, along with estimates of flood depth and velocity at those buildings. News of severe flooding is likely to be circulated by Facebook, Twitter and e-mail, in conjunction with official warnings provided by emergency agencies through radio, television and emergency alert telephone systems.
This raises the question of how Graham (1999) is best applied in today’s digital age. This paper explores some of the issues, including the estimation of dam failure warning time, the use of Graham (1999) to estimate loss of life in individual buildings, and the suitability of Graham (1999) for estimating loss of life for very large PAR.
Keywords: loss of life, dam safety, risk analysis.
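The per-building application discussed above can be sketched as a lookup of a fatality rate by flood severity and warning category, summed over a property database. The following is a minimal illustration only: the severity thresholds, fatality rates and building records below are hypothetical placeholders, not Graham's (1999) published values.

```python
# Hedged sketch: applying a Graham-style fatality-rate lookup to individual
# buildings instead of lumped reaches. All rates, thresholds and records
# are illustrative placeholders, NOT Graham's (1999) published values.

# Illustrative fatality rates keyed by (flood severity, warning category)
FATALITY_RATE = {
    ("high", "no_warning"): 0.75,
    ("high", "some_warning"): 0.20,
    ("medium", "no_warning"): 0.15,
    ("medium", "some_warning"): 0.04,
    ("low", "no_warning"): 0.01,
    ("low", "some_warning"): 0.002,
}

def severity(depth_m: float, velocity_ms: float) -> str:
    """Classify flood severity at a building from depth x velocity (illustrative thresholds)."""
    dv = depth_m * velocity_ms
    if dv >= 4.6:
        return "high"
    if dv >= 0.5:
        return "medium"
    return "low"

def estimated_loss_of_life(buildings, warning: str) -> float:
    """Sum PAR x fatality rate over every building in the property database."""
    return sum(
        b["par"] * FATALITY_RATE[(severity(b["depth"], b["velocity"]), warning)]
        for b in buildings
    )

# Hypothetical property database records
buildings = [
    {"par": 4, "depth": 2.5, "velocity": 2.0},  # dv = 5.0  -> high
    {"par": 3, "depth": 0.8, "velocity": 1.0},  # dv = 0.8  -> medium
    {"par": 2, "depth": 0.3, "velocity": 0.5},  # dv = 0.15 -> low
]
print(estimated_loss_of_life(buildings, "no_warning"))  # 4*0.75 + 3*0.15 + 2*0.01 = 3.47
```

The same structure makes the paper's question concrete: with building-level depths and velocities, the analyst must decide whether reach-average severity classes and warning times remain the right level of aggregation.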
Robert Wark, R.N.M. Nixon
Sediment inflows to Lake Argyle, the reservoir formed by the construction of the Ord River Dam, were seen as a significant threat to the Ord Irrigation Project when the scheme was being developed through the 1960s. Sediment monitoring was built into the operation of Lake Argyle when the Ord River Dam was completed in 1971. The paper describes the strategies that have been in place to assess sediment loads and monitor sediment build up in the reservoir.
A spectacular reduction in sediment flows has been achieved through the development of a comprehensive catchment management program. The program commenced in the early 1960s and was adapted and modified as progress was made. The paper describes the steps taken to identify the areas of the catchment at risk, the measures implemented and the current status of the catchment.
A key feature of the catchment management program has been the willingness to critically review progress and adapt the program. A variety of sediment tracing techniques have been used to help confirm the sources of sediment in the catchment, and the paper describes these techniques, the broad range of results obtained, and how they have helped direct the work on catchment management.
Keywords: sediment, monitoring, catchment management, Lake Argyle, Ord River
P C Styles, A L Garrard
The Victorian town of Nathalia was surrounded by flood water during the March 2012 floods in Northern Victoria.
Nathalia is protected by earthen levees of various sizes and ages. Portable aluminium levees were installed during the March 2012 flood event, generally in areas where a permanent levee would restrict access to a park or obstruct views. The flood level came within 200 mm of the crest of many of the levees and remained high for nearly two weeks.
The paper describes the emergency management issues and procedures, which relied on engineering advice to provide targeted and relevant remedial works on the levee system as potential problems arose. Engineers worked alongside the SES, CFA, Victoria Police, ADF and other volunteers to monitor, repair and reinforce the levee system on a 24-hour basis. The engineering support continued over a period of approximately two weeks, from the time the flood waters commenced rising until they had receded sufficiently for the evacuation orders for the town to be rescinded.
Keywords: Nathalia, floods, levees, emergency management
The design of tailings dams under earthquake loading is challenging because tailings materials are generally liquefiable under earthquake shaking, and the design becomes more complicated when the dam foundation is also liquefiable. While assessment of liquefaction potential is well developed in practice, assessment of liquefaction-induced deformation ranges from the simplest Newmark displacement method to the more complex effective stress dynamic analysis approach. It is generally accepted that the simplified method can be used for cases involving non-liquefiable materials. However, its use for cases involving liquefaction may result in overly conservative designs to cater for the many simplifying assumptions in the method. With the advance of computer technology, time and cost are no longer obstacles to using the more appropriate method for estimating liquefaction-induced deformations of a tailings dam and achieving an optimum dam design.
This paper critically discusses issues in the seismic design of tailings dams and provides an example of the use of the effective stress dynamic analysis method to estimate the liquefaction-induced deformations of a tailings dam, and its importance in optimising the design. The approach used is capable of estimating the pore pressure response of liquefiable materials at any given time during shaking. The effective stress analysis method used herein is embedded in FLAC software using a specially written FISH routine. Using this method, it can be demonstrated that although liquefaction is an issue, it does not necessarily need to be prevented: as long as the deformation is acceptable, liquefaction is not necessarily a ‘show stopper’ for the project.
Keywords: liquefaction, seismic deformation, tailings dam design.
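The liquefaction-potential screening that the abstract describes as well developed in practice is commonly based on the Seed–Idriss simplified procedure, comparing a cyclic stress ratio (CSR) against a cyclic resistance ratio (CRR). A minimal sketch follows; the soil profile values and CRR are hypothetical, chosen only to show the calculation.

```python
# Hedged sketch of the Seed-Idriss simplified procedure for liquefaction
# triggering: cyclic stress ratio (CSR) demand vs cyclic resistance ratio
# (CRR) capacity. Profile values below are hypothetical.

def stress_reduction_rd(z_m: float) -> float:
    """Depth reduction factor r_d (Liao & Whitman linear form, shallow depths)."""
    if z_m <= 9.15:
        return 1.0 - 0.00765 * z_m
    return 1.174 - 0.0267 * z_m

def cyclic_stress_ratio(a_max_g: float, sigma_v: float,
                        sigma_v_eff: float, z_m: float) -> float:
    """CSR = 0.65 * (a_max/g) * (sigma_v / sigma_v') * r_d."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * stress_reduction_rd(z_m)

# Hypothetical tailings layer at 6 m depth (stresses in kPa)
csr = cyclic_stress_ratio(a_max_g=0.25, sigma_v=110.0, sigma_v_eff=65.0, z_m=6.0)
crr = 0.18  # hypothetical resistance from penetration-test correlations
fs_liq = crr / csr
print(f"CSR = {csr:.3f}, FS against liquefaction = {fs_liq:.2f}")
```

A factor of safety below 1 flags triggering, which is where the deformation question the paper addresses begins: triggering alone does not decide whether the resulting displacements are acceptable.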
Richard R. Davidson, Nate Snorteland, Doug Boyer, John France
The US Army Corps of Engineers (USACE) has embarked upon a monumental journey in applying risk-informed decision making to the management of the safety of the 650 major dams for which it is responsible. This process has shifted safety criteria from a fully deterministic to a probabilistic basis. There has also been a shift from decentralised district-based decision-making to centralised management of resources through the new Risk Management Center (RMC) and the Senior Oversight Group (SOG), a group of senior engineers and managers from across the USACE organisation. The risk process began about five years ago with a portfolio prioritisation using screening-level risk assessments of the entire dam inventory, culminating in Dam Safety Action Classifications (DSAC) for each of the dams. Based on this risk prioritisation, Issue Evaluation Studies (IES) were initiated for the highest risk DSAC I and II dams, with each study including detailed failure mode and risk analyses for each dam. Because the Corps was relatively new to dam safety risk analyses, and its dam design history was one of following codified manuals of practice, various risk tools were prepared to provide guidance when assessing the risk of potential static, seismic and flood failure modes, as well as the life loss and economic consequences of dam failure. Although these tools provided useful guidance to a relatively large population of inexperienced risk estimators, many of these early risk assessments were flawed: they provided unrealistically high estimates of failure probabilities, and the tools did not help estimators understand or explain each failure mode. To assist the RMC in bringing more defensible risk estimates to the table and improve the consistency of the evaluations, the Quality Control and Consistency (QCC) review process was initiated about two years ago.
The QCC process provides high level review of IES activities, including detailed reviews of risk analyses, by a small group of experienced dam safety risk estimators. Not only has this brought risk estimates into a more reasonable range, it has provided valuable training for risk estimators, and important checks and balances on the risk-informed decision making process for moving dam safety upgrade projects forward. The justification for a number of very expensive projects has been challenged and, in some cases, re-prioritised, and other projects have risen to the prominence they deserve.
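The screening-level prioritisation described above amounts to ranking a portfolio by annualised risk, summing load probability × conditional failure probability × consequence over each dam's failure modes. The sketch below is illustrative only: the dams, probabilities and consequences are hypothetical, and the ranking logic is a generic risk calculation, not the USACE DSAC criteria.

```python
# Hedged sketch of a screening-level portfolio prioritisation: annualised
# life-loss risk = P(load) * P(failure | load) * estimated life loss,
# summed over failure modes. All dams and numbers are hypothetical; this
# is NOT the USACE DSAC methodology, only the generic arithmetic behind it.

dams = {
    # dam: list of (annual load probability, P(failure | load), life loss)
    "Dam A": [(1e-2, 1e-2, 100), (1e-4, 1e-1, 300)],
    "Dam B": [(1e-3, 1e-3, 50)],
    "Dam C": [(1e-2, 1e-1, 20)],
}

def annualised_risk(modes) -> float:
    """Total annualised life-loss risk across all failure modes of one dam."""
    return sum(p_load * p_fail * loss for p_load, p_fail, loss in modes)

# Rank the portfolio from highest to lowest annualised risk
ranked = sorted(dams, key=lambda d: annualised_risk(dams[d]), reverse=True)
for dam in ranked:
    print(dam, f"{annualised_risk(dams[dam]):.2e} lives/year")
```

The QCC review in effect audits the inputs to exactly this kind of calculation: an order-of-magnitude error in one conditional failure probability can move a dam several places in the ranking.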
Zhenhe Song, Arjuna Dissanayake, Shunqin Luo
One of the potential tailings dam failure modes commonly evaluated is earthquake-induced crest displacement in relation to available freeboard. Seismically induced displacement of tailings dams can be estimated using simplified approaches, namely the analytical methods of Newmark (1965), Makdisi and Seed (1978) and Bray and Travasarou (2007), and the empirical methods of Swaisgood (2003) and Pells and Fell (2003).
Seismically induced displacements have been estimated using these simplified methods and using numerical methods in FLAC and PLAXIS. The results from the numerical modelling were compared with results derived from the simpler analytical and empirical methods. The results indicate that the numerical analyses agree reasonably well with the empirical methods of Swaisgood (2003) and Pells and Fell (2003) and can be used to provide additional confidence in the seismic stability of tailings embankments. However, the simplified analytical methods of Newmark (1965), Makdisi and Seed (1978) and Bray and Travasarou (2007) could underestimate the seismically induced displacements.
Keywords: tailings dam, seismic analysis, numerical analysis, simplified analysis, liquefaction
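Of the simplified methods compared above, Newmark (1965) is the most mechanistic: whenever ground acceleration exceeds a yield acceleration, a rigid block slides, and double integration of the excess acceleration gives the permanent displacement. A minimal sketch follows; the yield coefficient and the synthetic sine-pulse motion are hypothetical stand-ins for a real design record.

```python
import math

# Hedged sketch of Newmark (1965) rigid sliding-block analysis. When ground
# acceleration exceeds the yield acceleration a_y, the block accelerates
# relative to the ground; sliding continues until the relative velocity
# returns to zero. Integrating relative velocity gives permanent
# displacement. The input motion is a synthetic pulse, not a real record.

G = 9.81   # gravitational acceleration, m/s^2
KY = 0.05  # hypothetical yield coefficient (fraction of g)
DT = 0.005 # integration time step, s

# Synthetic ground motion: 0.2 g, 1 Hz sine pulses for 4 s
accel = [0.2 * G * math.sin(2 * math.pi * 1.0 * i * DT)
         for i in range(int(4.0 / DT))]

def newmark_displacement(accel, ky, dt):
    """Downslope-only sliding block: integrate (a - a_y) while sliding."""
    a_y = ky * G
    vel = 0.0   # relative velocity of block w.r.t. ground, m/s
    disp = 0.0  # accumulated permanent displacement, m
    for a in accel:
        if a > a_y or vel > 0.0:
            vel = max(vel + (a - a_y) * dt, 0.0)  # sliding stops at vel = 0
            disp += vel * dt
    return disp

print(f"permanent displacement ~ {newmark_displacement(accel, KY, DT):.3f} m")
```

The rigid-block idealisation is precisely what the paper's comparison probes: it ignores the pore pressure generation and strength loss that the numerical FLAC/PLAXIS models capture, which is consistent with the finding that the analytical methods can underestimate displacements for liquefiable tailings.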