For intraplate regions such as Australia, identifying and quantifying activity on tectonic faults for inclusion in probabilistic seismic hazard assessments can be challenging because of the typically long return periods for ground-rupturing earthquakes on these structures. Return periods of tens of thousands to millions of years mean that surface displacement evidence is prone to degradation through erosion and burial, and paleoseismological ‘trench’ excavations may not expose geology old enough to record previous events. As a consequence, there is often little or no preserved evidence of past ground-rupturing events on these structures. Rather than ignoring faults that show no evidence of neotectonic displacement, we present an alternative approach: in addition to considering active faults (movement in the last 35,000 years) and neotectonic faults (movement in the last 10 Myr) in seismic hazard assessments, we also consider faults that show no evidence of neotectonic activity but are favourably aligned with the current stress regime, and are therefore potential sources of earthquakes and accompanying strong ground motion.
Earthquakes are a well-known threat to the safety of dams. While this threat is subdued for Australian dams, the potential for earthquake-induced failure of a dam requires that risk to the downstream community be minimised through monitoring and emergency response procedures. This paper details WaterNSW’s development of a Seismic Monitoring Strategy intended to align the business and ensure an appropriate post-seismic response.
The strategy also identifies that a proactive approach to seismic instrumentation can reduce business risk by aiding decision making should a dam be in a damaged post-seismic state.
The interim outcome of implementing the Seismic Monitoring Strategy has been faster emergency response times and reduced overreaction and distraction of dam safety resources during insignificant seismic events. There is an opportunity for other Australian dam owners to implement systems similar to WaterNSW’s and achieve similar results.
Two-dimensional hydraulic modelling technology has advanced significantly in recent years, providing powerful and flexible tools that are now routinely used for a wide variety of flood risk assessments. Assessing the downstream impacts of catastrophic dam failure represents an extreme test for the accuracy and stability of hydraulic models. Catastrophic dam failure can present an extreme risk to downstream infrastructure and public safety. Hence, it is important to have confidence in the estimated magnitude of potential impacts in order to design suitable, cost-effective mitigation measures. The highly visual output of two-dimensional models adds credibility to their results. However, validation data for extreme hydraulic conditions is rarely available, resulting in uncertainty in the accuracy of model predictions and in the risks associated with dam failure. By validating numerical model results against analytical solutions for cases of simple geometry, and also against real-world data, an improved level of confidence can be obtained in the accuracy of the model representation of these extreme hydraulic conditions. In this paper, we assessed the capability of the TUFLOW hydraulic modelling software package to accurately simulate an idealised dam break scenario by comparing the model results to analytical solutions. We also compared the model results for coastal inundation by a tsunami to real-world data from the 2004 Banda Aceh (Indonesia) tsunami. The results showed that the HPC solver version of TUFLOW correctly captures the dam break flood fronts and the propagation of the flood wave, and that TUFLOW HPC is well suited to dam break flood modelling.
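A classical analytical benchmark of the kind used for such validation is Ritter’s (1892) solution for instantaneous failure of a dam over a dry, frictionless, horizontal bed. The abstract does not state which analytical solutions were used, so the sketch below is illustrative only:

```python
import math

def ritter_profile(x, t, h0, g=9.81):
    """Ritter (1892) analytical dam-break solution on a dry, frictionless,
    horizontal bed. The dam sits at x = 0 with initial reservoir depth h0
    for x < 0 and a dry bed for x > 0. Returns (depth, velocity) at (x, t).
    """
    c0 = math.sqrt(g * h0)   # initial shallow-water wave celerity
    xi = x / t               # similarity variable x/t
    if xi <= -c0:            # undisturbed reservoir upstream
        return h0, 0.0
    if xi >= 2.0 * c0:       # ahead of the advancing wet/dry front
        return 0.0, 0.0
    # Rarefaction fan between the two limits
    h = (2.0 * c0 - xi) ** 2 / (9.0 * g)   # water depth
    u = (2.0 / 3.0) * (xi + c0)            # depth-averaged velocity
    return h, u
```

A useful check is the depth at the dam site itself, which the solution fixes at 4/9 of the initial reservoir depth for all t > 0; comparing a model’s simulated profile against such closed-form values is the essence of the validation approach described above.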
Lessons learned from recent major incidents and related enquiries in Victoria, in concert with the adoption of an all-emergencies all-communities philosophy, have informed both the scope and reach of the current emergency management and dam safety regulatory environment. Victorian dam owners now have a statutory obligation to implement an all-emergencies all-communities approach to risk assessment at their assets and, as part of that, to adopt this approach as part of their “business as usual” activities. A major outcome of this requirement is that for major dams, risk management is now being driven from Board and senior management level: the implementation of controls and actions is formalised. As a consequence, there is a better understanding across the organisation of new and emerging risks that require new technologies, thinking and expertise, and an improved appreciation of asset interdependencies and the risk posed to reliant stakeholders. With other reforms including oversight and audit arrangements in place, the move from “doing enough” to striving for “good” industry practice, aided by an improved regulatory regime and statutory processes, is well established. A brief consideration of the lessons learned from the February 2017 Oroville Dam incident in this context concludes the paper.
Physical modelling of dam structures remains a preferred method for validating and improving dam designs. Flow behaviour in the approach and over the crest of a dam can be accurately studied with traditional methods such as pressure transducers, piezometers and current meters due to the relatively smooth and steady flow conditions. However, characterising flows within a stilling basin is far more difficult due to the complex, aerated and highly turbulent flow conditions. Recent work on detailed measurement of hydraulic jumps using a line-scanning Lidar was adapted for measurement of stilling basin surface profiles in a 1:50 scale model of Somerset Dam, QLD. Lidar was shown to be an effective and efficient tool for assessing the toe jump, boil and flow into the downstream channel.
A number of software packages have been developed to conduct Probabilistic Seismic Hazard Assessments (PSHAs), each with its own advantages and disadvantages. Two such programs are compared: the licensed, subscription-based EZ-FRISK software package developed by Fugro USA Land, Inc. and the open-source OpenQuake-engine (OQ) software package by the Global Earthquake Model (GEM) Foundation. Both of these packages use the classical PSHA methodology as described by Cornell (1968) and modified by McGuire (1976). Each package offers different advantages: OQ is freely distributed, code based and provides easy access to a number of tools; EZ-FRISK does not rely on command-line tools and instead provides an easy user interface with quick access to plots to check results. EZ-FRISK is computationally faster than the OQ program.
A simple rectangular source model with four sites was used to investigate the degree of agreement between these two software packages. Results indicate that hazard estimates from the two packages agree to within 4% for the two closest sites. At long return periods for the two furthest sites, the difference is larger.
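The classical Cornell–McGuire methodology that both packages implement integrates a ground-motion exceedance probability over magnitude and distance, weighted by their occurrence rates. The toy sketch below is not the implementation of either package; the rates, b-value and ground-motion model coefficients are illustrative placeholders only:

```python
import math

# Toy classical PSHA (Cornell 1968 / McGuire 1976) hazard integral:
#   lambda(PGA > a) = nu * sum_m sum_r P(PGA > a | m, r) p(m) p(r)
# where nu is the source's annual earthquake rate, p(m) a Gutenberg-
# Richter magnitude distribution and p(r) a source-to-site distance
# distribution. All numeric values here are hypothetical.

def gr_pmf(m_min=4.5, m_max=7.5, b=1.0, dm=0.1):
    """Discretised, doubly truncated Gutenberg-Richter magnitude PMF."""
    n = int(round((m_max - m_min) / dm))
    mags = [m_min + dm * (i + 0.5) for i in range(n)]
    w = [10.0 ** (-b * m) for m in mags]
    s = sum(w)
    return [(m, wi / s) for m, wi in zip(mags, w)]

def toy_gmpe(m, r_km):
    """Hypothetical GMPE: median ln(PGA in g) and lognormal sigma."""
    ln_pga = -4.0 + 1.0 * m - 1.3 * math.log(r_km + 10.0)
    return ln_pga, 0.6

def norm_sf(z):
    """Standard normal survival function via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(a_g, nu=0.05, dists_km=(10.0, 30.0, 60.0)):
    """Annual rate of exceeding PGA a_g at a site, for a single source
    with rate nu and equally likely rupture distances dists_km."""
    lam = 0.0
    for m, pm in gr_pmf():
        for r in dists_km:
            ln_med, sigma = toy_gmpe(m, r)
            p_exceed = norm_sf((math.log(a_g) - ln_med) / sigma)
            lam += nu * pm * (1.0 / len(dists_km)) * p_exceed
    return lam
```

Evaluating this rate over a range of ground-motion levels yields the hazard curve that both EZ-FRISK and OQ produce; small differences in how each package discretises the magnitude-distance integration are one plausible source of the few-percent disagreement reported above.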