Can statistical analysis reverse the trend of declining discovery rates?

by Virginia Heffernan on September 11, 2012

What if you could generate a map that identified not only exploration targets, but also the likelihood of those targets becoming economic deposits? The authors of the cover article for July’s edition of SEG Newsletter say this kind of “radical approach” to exploration is both possible and necessary.

“The decision about where to look is too often based on an informal interpretation of partial information rather than quantitative analysis of all the available data,” say Colin Barnett and Peter Williams, principals of data mining firm BW Mining and authors of “A Radical Approach to Exploration: Let the Data Speak for Themselves.” “If we are to improve the discovery rate, we need to put more effort into targeting, or deciding where to look.”

The authors say explorers can increase their chances of success severalfold by using statistical analysis in known camps with good-quality databases. Advances in pattern recognition, combined with greater access to government datasets and lower costs for data storage and processing, make it possible not only to pinpoint targets but also to determine the expected economic costs and rewards of each target.
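
As a toy illustration of the expected-value arithmetic this implies (all figures below are invented, not taken from the study):

```python
# Hypothetical expected-value screen for a single target.
p_success = 0.10          # modeled probability the target becomes a mine
reward = 500_000_000      # payoff if it does (USD, invented)
test_cost = 2_000_000     # cost of drill-testing the target (USD, invented)

expected_value = p_success * reward - test_cost
print(f"Expected value of testing this target: ${expected_value:,.0f}")
```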

That approach had a 50% success rate in finding more gold in the Porcupine camp of northern Ontario. The jury is still out on the duo’s most recent study area, the Eastern Goldfields North (EGN) portion of the Yilgarn craton in Western Australia.

“The favorable ground is almost completely taken in the EGN, so we need to come to suitable arrangements with current license holders,” Barnett said in an e-mail. “But there are targets on land held by most of the major players.”

The EGN covers about 165,000 km², a landmass roughly the size of Wisconsin or Uruguay. The area offers a high concentration of known gold deposits, public access to modern datasets, and less than 10% outcrop, making it a prime candidate for detailed statistical analysis of geological, geochemical and geophysical results.

Barnett and Williams’s analysis may be beyond the grasp of the average geoscientist, but in essence it involves feeding all the layers of primary exploration data, such as magnetics and lithology, along with derivatives of that data, into neural networks. After taking into account all the available data for the EGN, for example, the authors ended up with more than 250 layers representing a mind-boggling 15 GB of gridded data files.
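
To make the idea concrete, here is a minimal sketch, not BW Mining's actual workflow: a small neural network (scikit-learn's MLPClassifier) trained on a hypothetical stack of gridded layers, then used to score every grid cell for prospectivity. The layer names, labels and numbers are all invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Pretend stack of gridded layers, flattened to (n_cells, n_layers),
# e.g. magnetics, lithology, biogeochemistry, distance to nearest shear zone.
n_cells, n_layers = 10_000, 4
X = rng.normal(size=(n_cells, n_layers))

# Known deposits and occurrences supply positive labels at a few cells
# (synthetic here: cells with strong biogeochemistry and structure scores).
y = (X[:, 2] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_cells)) > 1.5

X_std = StandardScaler().fit_transform(X)
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
model.fit(X_std, y)

# Prospectivity "map": probability of a deposit-like signature per cell.
scores = model.predict_proba(X_std)[:, 1]
print("highest cell score:", round(scores.max(), 3))
```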

The statistical approach allows explorers to quantify the relevance of each data set. In the EGN case, biogeochemistry (sampling the leaves of mulga trees) and geological structure (the area’s gold occurrences are commonly associated with major shear zones, secondary faults and hinge areas of antiforms) proved to be the most informative, while magnetic data was the least helpful.
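
The authors don't spell out how they measure each layer's contribution, but permutation importance is one standard way to do it; this snippet continues the sketch above, reusing its model, X_std and y, with the same invented layer names.

```python
from sklearn.inspection import permutation_importance

layer_names = ["magnetics", "lithology", "biogeochemistry", "structure"]
result = permutation_importance(model, X_std, y, n_repeats=10, random_state=0)

# Rank layers by how much shuffling each one degrades the model's accuracy.
for name, score in sorted(zip(layer_names, result.importances_mean),
                          key=lambda pair: -pair[1]):
    print(f"{name:>16s}: {score:.3f}")
```

On this synthetic data, the biogeochemistry and structure layers come out on top by construction, loosely mirroring the EGN result described above.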

The EGN study generated more than a dozen high-priority targets. Some of them are brand new, but most occur in established camps within a few kilometres of operating mines or old workings.

Will this kind of high-level statistical analysis revolutionize exploration in known camps and other areas with plentiful data? Or do the cost and expertise required outweigh the benefits of being able to pinpoint targets with a high probability of success?


Editor's note:

This article has generated lively discussion on Earth Explorer's LinkedIn group. Group members pointed out some limits to universal application of a statistical analysis approach, perhaps the biggest hurdle being the need for comprehensive geological, geophysical and geochemical datasets from areas that have already undergone some degree of exploration.

Lyndon Hardy, senior exploration geologist for Abra Mining in Perth, put it best: "Until there is sufficient regional data acquired and you can feed the various parameters into the model, you will have a situation of garbage-in-garbage-out."

Another concern was that the approach, which uses neural networks to crunch the data, lacks a formal process to apply weights to different datasets to ensure that drill hole data, for instance, gets a higher weighting than samples taken at surface.

Others prefer tried and true approaches to exploration, such as using old mine workings as a prospecting tool, or simply visiting the site to "kick, lick and spit" (thank you to Perth-based consultant Susan Lawson for that image).

Barnett responded to these concerns. He says the statistical approach can be used with sparser exploration data as long as the resulting "broad brush" targets are recognized as such. And weights need not be estimated because the modeling process automatically determines the contribution of each data set. For those concerned with costs, Barnett argues that the expense associated with statistical analysis becomes a tiny fraction of the reward if the method succeeds in pinpointing an economic deposit.

Although nothing replaces finely tuned geological intuition, there are probably several mining camps that could benefit from this level of quantitative analysis of all existing data to pinpoint orebodies hiding in the shadow of the headframe. It's already succeeded in finding new gold in the Porcupine camp of northern Ontario, for example.

As Shanti Kumar, a consultant in Hyderabad, advises: "keep trying neural networks, but validate (them) critically with conventional geological observation."