Underground Mining Solutions

Publications

McGarr (1977) described the correlation between seismic deformation and volumetric stope closure, and suggested that it may be possible to plan mine geometries that keep problems associated with large tremors at a predictable level by controlling convergence.

We show that, for Mponeng Mine, the modelled volumetric closure correlates well with the recorded potency. We also show that maximum closure follows the same trend as volumetric closure under controlled conditions, and that maximum closure can therefore be used as a modelling criterion in mine design. Finally, we offer a brief description of the strategy change at Mponeng Mine based on the maximum closure criterion developed for the mine.
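
The comparison described above can be sketched numerically: seismic potency (P = M0/G) summed over a catalogue has units of volume and can be correlated against modelled closure. The moments, closure values, and shear modulus below are illustrative assumptions, not Mponeng data.

```python
import numpy as np

G = 30e9  # assumed shear modulus for hard rock (Pa)

# Hypothetical catalogue of seismic moments (N*m) and modelled cumulative
# closure volumes (m^3) at matching times; the layout is an assumption.
moments = np.array([1.2e10, 5.0e9, 3.3e11, 8.1e10])
modelled_closure = np.array([0.5, 0.9, 11.0, 14.0])

# Seismic potency P = M0 / G (m^3); its cumulative sum tracks deformation.
cumulative_potency = np.cumsum(moments / G)

corr = np.corrcoef(cumulative_potency, modelled_closure)[0, 1]
print(f"correlation of cumulative potency vs modelled closure: {corr:.2f}")
```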


For seismically active mines, the analysis of mining-induced seismicity forms an important part of the geotechnical risk management and design process. However, the quality of seismic data is seldom scrutinized, resulting in lower-quality databases and, therefore, unreliable results.

These results are used to make decisions affecting both the safety and productivity of mine sites. For this reason, assessing and quantifying the database quality is important.

Many mine sites rely on seismic service providers to help maintain the seismic system and to provide good-quality seismic data for use in seismic analysis. Mine personnel assume the data is of good quality and do not have the tools or expertise to evaluate the integrity of a database. In our experience, most mines experience data quality problems to some degree, and at some mines the quality has a serious impact on decision making.

This paper presents a method for assessing seismic data quality. The method highlights the areas of the database that are most contaminated with bad data and thus provides a first step towards rectifying the problem.

Quality indices are also developed to objectively quantify the database quality.
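
As a toy illustration of what such a quality index might look like (the paper's actual indices are not reproduced here), one can score the fraction of events carrying complete, physically plausible source parameters. All field names and thresholds below are assumptions.

```python
def quality_index(events):
    """Fraction of events with complete, plausible source parameters.

    `events` is a list of dicts; the field names and plausibility
    bounds are illustrative assumptions, not the paper's indices.
    """
    def ok(e):
        return (
            e.get("energy") is not None and e["energy"] > 0
            and e.get("moment") is not None and e["moment"] > 0
            and e.get("n_triggers", 0) >= 5        # enough sensors used
            and 0.1 < e.get("es_ep", 0) < 100.0    # plausible Es/Ep range
        )
    return sum(ok(e) for e in events) / max(len(events), 1)

events = [{"energy": 1e3, "moment": 1e9, "n_triggers": 8, "es_ep": 12.0},
          {"energy": -1.0, "moment": 1e9, "n_triggers": 3, "es_ep": 5.0}]
print(quality_index(events))  # 0.5
```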

The methods presented in this paper are a work in progress that the authors will continue to improve in the near future.


Over the years there have been several attempts to undertake routine, real-time microseismic monitoring of open pit mine slopes. The technique has been commonly used in underground operations to manage induced seismicity and rockbursts.

However, microseismic monitoring in open pits is still experimental and further studies are required. In this paper, we analyzed data from MMG Century mine, where a microseismic system was installed in November 2013 to monitor a large-scale unstable slope.

Design of the system and installation of the instruments were performed by the Institute of Mine Seismology. Seismic events were recorded both in a triggered scheme and in continuous mode. As part of our research project, the data was given to four independent groups to analyze and report their own results.

One group applied a routine method to the triggered data, processed it manually, and made the results available to the engineers on site within 10 minutes. The other three groups later reanalyzed the data using both the triggered and continuous waveforms.

Our work compared the different results obtained and highlighted some of the key points engineers should be aware of when designing and implementing a microseismic system in an open pit mine.


The hazard posed by large seismic events is often high enough to warrant the exclusion or evacuation of personnel from underground workings. A period of exclusion is often imposed following blasts or large events due to the increased risk.

The length of the exclusion period before re-entry is a decision for site geotechnical engineers and mine management, one that must balance the potential risk to personnel against lost production time and associated costs. There is currently no widely accepted method for determining re-entry times, and mine sites typically develop their own rules for exclusions after blasts and large events. A systematic and evidence-based approach to the development of re-entry protocols could reduce the risk to personnel from an early re-entry, or reduce the production lost to an unnecessary exclusion.

Four methods of re-entry assessment have been considered in this paper. The seismic responses at three mines have been modelled and used to optimize each assessment method and gauge the relative success through back-analysis. These same techniques are available for other mines to review their own data and potentially improve their current re-entry protocols. The results of this research indicate that a real-time re-entry assessment method can offer improved outcomes compared to blanket re-entry rules by reducing the average exclusion time while still capturing the same number of large events.
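
For illustration (a sketch of one generic rate-based approach, not necessarily one of the four methods assessed in the paper), re-entry can be flagged once the post-blast event rate in a sliding window decays back to an assumed background rate:

```python
import numpy as np

def reentry_time(event_times_h, background_rate_per_h, window_h=1.0):
    """First time after the blast at which the event rate in a sliding
    window has decayed to the background rate (None if it never does).

    event_times_h: event times in hours after the blast.
    """
    t = np.sort(np.asarray(event_times_h))
    for t0 in np.arange(window_h, t.max() + window_h, 0.25):
        rate = np.sum((t > t0 - window_h) & (t <= t0)) / window_h
        if rate <= background_rate_per_h:
            return t0
    return None

# Hypothetical decaying post-blast sequence: event density drops with time.
rng = np.random.default_rng(1)
times = np.sort(rng.exponential(scale=2.0, size=80))
print(f"suggested re-entry: {reentry_time(times, background_rate_per_h=2.0)} h")
```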

Incorporating event size in the assessment can yield better results than using event count alone. Vallejos and McKinnon (2009) developed a probabilistic framework for re-entry assessment, but in this study the method was found to be less efficient than the blanket rule in the majority of cases.

The method would also result in more administration and uncertainty for mine planning and scheduling. Several potential improvements to the analysis techniques, and avenues for further research, have been discussed.


Due to the complex nature of the seismic response to mining, geotechnical engineers often rely on back-analysis to provide a baseline against which to interpret future behavior.

This practice assumes, and relies on, the database being consistent in space and time. Few tools dedicated to quantifying the consistency of a seismic database, and to identifying systematic inconsistencies within it, are available to geotechnical engineers. A methodical approach is required to warn geotechnical engineers of unexpected systematic shifts in their database as soon as they arise, so that timeous and appropriate action can be taken. The industry as a whole also requires a systematic approach to quantifying the consistency of seismic databases.

A technique is proposed to adequately address these aims. The technique is fast and efficient and can be easily employed on any database. By continuously updating results, users would know within a few tens to hundreds of events when data shifts have occurred. This would allow for the effective management of these errors in the database.
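
A minimal sketch of such a continuously updated check (not the authors' actual technique) compares the running median of a log-scaled source parameter in a recent window against a longer reference window; the window sizes and tolerance below are assumptions.

```python
import numpy as np

def detect_shift(values, ref_n=500, win_n=100, tol=0.3):
    """Flag a systematic shift when the median of the latest `win_n`
    values of a log-scaled source parameter moves more than `tol`
    away from the median of the preceding `ref_n` values.
    """
    v = np.asarray(values)
    if len(v) < ref_n + win_n:
        return False
    ref = np.median(v[-(ref_n + win_n):-win_n])
    recent = np.median(v[-win_n:])
    return abs(recent - ref) > tol

# e.g. run on log10(energy) or log10(Es/Ep) as each new event arrives
```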

Application of the method to some current industry databases showed that the shifts are significant enough to render some widely used analysis techniques unreliable, and that shifts in the data have a significant influence on the interpretation of source parameters.

Systematic errors are causing significant artefacts in seismic databases. Of the twenty databases investigated, 70% had one or more systematic shifts per year, and only one database showed no shifts at all. There is justified concern regarding systematic inconsistencies in seismic databases across the industry. Such inconsistencies can lead to the misinterpretation of seismic analysis results, with knock-on effects on other parts of an operation.


The occurrence of seismicity in high stress hard rock mines poses a challenge to geotechnical engineers and mine management around the world. Only a few practical options are available when the mitigation of seismic risk is considered.

One of the most widely used options is the implementation of a re-entry protocol. These protocols are useful for limiting personnel exposure to the elevated seismic hazard associated with a firing. Several methodologies are available for determining an appropriate re-entry time, and their success rate varies between sites.

Recent work by Vallejos and McKinnon (2010) suggests a new approach to the re-entry problem. They provide a methodology that could be implemented at any mine site with a seismic system. The method evaluates the current response in terms of the statistical properties of the rock mass, based on historic responses.

However, discussion of the practical implementation of the method on a site-wide basis was limited and did not indicate what could quantitatively be expected from the method.

The Vallejos and McKinnon method could be automated and practically implemented at most mine sites with a comprehensive seismic data record. It was shown that the methodology may, in some cases, be an improvement on the widely used 'blanket' rule implemented at many mine sites.
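
The flavour of the statistical comparison can be sketched as follows (a loose illustration of the general idea, not the published algorithm): historic post-blast responses define percentile envelopes against which the current response is judged. The data layout and percentiles are assumptions.

```python
import numpy as np

def response_envelope(historic_counts, q=(0.50, 0.95)):
    """Percentile envelope of historic post-blast responses.

    historic_counts: 2D array, one row per historic blast, columns
    being cumulative event counts on a common time grid (an assumed
    layout).
    """
    return {p: np.quantile(historic_counts, p, axis=0) for p in q}

def is_anomalous(current_counts, envelope, p=0.95):
    """True if the current response has exceeded the p-th percentile."""
    c = np.asarray(current_counts)
    return bool(np.any(c > envelope[p][: len(c)]))
```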


Reliable assessment of seismic hazard plays a vital role in addressing geotechnical risk in many mines. The data quality and assessed risk can be adversely affected by bandwidth limitations of the sensors in the mine’s seismic network.

The influence of sensor bandwidth on recorded waveforms is well understood and was observed as early as 1976 by Hanks and Johnson. Boore (1986) and Di Bona and Rovelli (1988) provide analytical formulations describing the influence of sensor bandwidth on calculated source parameters. At the most recent RaSiM8 conference, Mendecki (2013) revisited this work and discussed the impact on different seismic source parameters.

This paper investigates the impact of bandwidth limitations on the assessment of seismic hazard.

We found that this phenomenon is present in mine seismic databases. The assessment of mmax (defined as the next largest credible event) may be sensitive to bandwidth limitations, while hazard assessment methods that do not depend heavily on mmax are less affected.

Finally, for magnitude calculations depending only on the radiated seismic energy, the effect of the bandwidth limitations is less severe.
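
To illustrate the mechanism (a generic sketch using a Brune source model, not the paper's own analysis), one can integrate the velocity power spectrum over a limited sensor passband and compare it with a near-complete band. The corner frequency and passband below are assumptions.

```python
import numpy as np
from scipy.integrate import quad

def brune_velocity_psd(f, f0, omega0=1.0):
    """Squared velocity spectrum of a Brune source (arbitrary units)."""
    disp = omega0 / (1.0 + (f / f0) ** 2)   # displacement spectrum
    return (2.0 * np.pi * f * disp) ** 2

def band_energy(f0, fmin, fmax):
    """Radiated-energy proxy: integral of the velocity power spectrum."""
    return quad(brune_velocity_psd, fmin, fmax, args=(f0,))[0]

f0 = 50.0                                  # assumed corner frequency (Hz)
full = band_energy(f0, 0.1, 10_000.0)      # near-complete band
clipped = band_energy(f0, 3.0, 200.0)      # assumed sensor passband
print(f"fraction of energy recovered in-band: {clipped / full:.2f}")
```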


It is generally accepted that the ratio of the energy associated with the S-wave (Es) to that of the P-wave (Ep) depends on the focal mechanism (Mendecki 2013). In the mining industry, the Es/Ep ratio is regarded as an important indicator of the type of focal mechanism, with the ratio being lower for explosive sources and higher for fault slip (Cai et al. 1998; Mendecki 2013).

In pure shear, Es is considerably larger than Ep (Es/Ep > 20). For the tensile model, Sato (1978) has shown that Ep and Es are approximately equal. Gibowicz et al. (1991) and Gibowicz and Kijko (1994) suggest that when Es/Ep < 10, the source mechanism involves a tensile failure component. Boatwright and Fletcher (1984) suggest that pure shear corresponds with Es/Ep > 10. Hudyma and Potvin (2010) suggest that for events with Es/Ep < 3, the mechanism is non-shear.
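
These literature thresholds can be expressed as a coarse classification heuristic (a sketch only; the ratio is an indicator, not a definitive mechanism solution):

```python
def classify_es_ep(es, ep):
    """Rough mechanism indication from the Es/Ep ratio, using the
    literature thresholds quoted above (a coarse heuristic)."""
    r = es / ep
    if r < 3:
        return "non-shear (Hudyma & Potvin 2010)"
    if r < 10:
        return "possible tensile component (Gibowicz et al. 1991)"
    return "shear-dominated (Boatwright & Fletcher 1984)"

print(classify_es_ep(es=2.0e4, ep=1.0e3))  # shear-dominated
```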

This paper investigates the Es/Ep ratio and its sensitivity to different seismic service setups. It does so by examining the consistency of the parameter for three different scenarios.


The management of seismic risk in metalliferous mines operating in developed mining countries such as Australia, Canada, Chile, and Sweden has been highly successful over the last decade. The occurrence and magnitude of large seismic events in deep mines have continued to increase as mining reaches deeper horizons, yet injuries and fatalities due to rockbursts remain very rare in these countries.

Although there are many common practices used to manage seismic risks in mines, there is no recognized process to do so. In 2017, Newcrest Mining Ltd, in collaboration with the Australian Centre for Geomechanics (ACG), undertook a benchmarking campaign to document the different seismic risk management practices currently implemented in mines which are considered leaders in this area.

Data was gathered from sixteen mines operating in five countries and experiencing different degrees of seismicity. Analysis of the data from the benchmarking study led to a better understanding of seismic risk management practices applied in the industry.

One of the important outcomes of this project was the development of a flowchart describing in detail a generic seismic risk management process. The process is broken into four different layers of activities: data collection, seismic response to mining, control measures, and seismic risk assessment.

Within each layer of activity, there are a number of components, and within each component, there are a number of practices, which have been benchmarked and are discussed in this paper.

In addition to providing a road map for managing seismicity in underground metalliferous mines, this work enables users to assess their own practices against standard and advanced practices in the management of seismic risks. A full description of the seismic risk management process is available to the mining industry at https://acg.uwa.edu.au/srmp.


The occurrence of seismicity in high stress underground environments is common and often leads to underground damage, affecting both the safety and profitability of a mine. To manage this risk, it is typical for a mine to install a seismic sensor array, which is used to assess the seismic conditions around the excavations and to monitor any changes to them.

Arguably, one of the most important seismic parameters determined by any seismic network is event location. An accurate catalogue of event locations and times is a requirement for a good understanding of the seismic response to mining and of the sources of seismicity. Unfortunately, the process of event location is challenging, and large location errors are typical, especially in mining environments with large openings (e.g. caves).

In this paper, we discuss the results of a new method for estimating event location uncertainties. We also discuss how this method can be used both to improve the seismic sensor array so as to minimise location uncertainties, and to help geotechnical teams account for the impact of inaccurate event locations in seismic interpretations.
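
As a generic illustration of how location uncertainty can be estimated (this is not the new method described above), one can perturb the arrival-time picks with an assumed picking error and relocate repeatedly, taking the scatter of the solutions as an uncertainty proxy. The sensor geometry, velocity, and pick error below are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
sensors = rng.uniform(-500, 500, size=(8, 3))   # hypothetical array (m)
V_P = 5800.0                                    # assumed homogeneous P speed (m/s)

def locate(picks_s):
    """Least-squares location from P arrival times; origin time is
    eliminated by demeaning the travel-time residuals."""
    def misfit(src):
        tt = np.linalg.norm(sensors - src, axis=1) / V_P
        res = picks_s - tt
        return np.sum((res - res.mean()) ** 2)
    return minimize(misfit, x0=np.zeros(3), method="Nelder-Mead").x

true_src = np.array([100.0, -50.0, 200.0])
tt_true = np.linalg.norm(sensors - true_src, axis=1) / V_P

# Monte Carlo: perturb picks with an assumed 2 ms pick error, relocate,
# and take the scatter of the solutions as a location-uncertainty proxy.
locs = np.array([locate(tt_true + rng.normal(0, 0.002, 8)) for _ in range(200)])
print("1-sigma location spread (m):", locs.std(axis=0))
```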