The people, science and technology behind discovery

Freeing up time for mine-finding: the Cameco solution

by Virginia Heffernan on April 2, 2014

Cameco is clear on its corporate objective: to become a more efficient, streamlined and standardized organization. The exploration department’s data management strategy and implementation are striving toward that goal while applying best practices.

Saskatoon-based Cameco relies on geophysics and downhole geology to find new mineralization. Shown above, typical Athabasca basin drilling landscape.

Members of Cameco’s exploration team are anticipating a time when their data management is so efficient that they can carve out an extra day in their weekly schedule to look for mines instead of spending those valuable hours searching for and manipulating data.

“We’re hoping that everything we’re doing right now will make us at least 20% more efficient, not to reduce manpower, but to satisfy employees’ concerns that they just don’t have enough time to be geologists,” says Mike McClelland, director of land tenure and geospatial information, who is spearheading the uranium miner’s efforts to upgrade and standardize its data repositories and provide a common interface to access them.

Creating standards

Saskatoon-based Cameco has already built separate repositories for its key GIS, geological, geochemical, geophysical, and land management data. Every bit of data from the company’s active projects in Saskatchewan, Australia and northern Canada is being examined and either accepted into or rejected from the repositories.

Senior management approved the data management process in early 2013 after recognizing that although the exploration team had spent millions of dollars to collect and interpret data over the years, the information was scattered among various repositories in different stages of analysis, and sometimes even solely in the brains of those maintaining the data.

The biggest challenge was to standardize the data in the same formats for all of Cameco’s projects around the world. Until recently, the company’s geoscientists were using a variety of software programs depending on personal preference. Now about 80% of their interpretation work relies on standard platforms.

The other 20% (such as modelling) is so specialized that it can’t be centralized. “It’s impossible to standardize everything and we don’t want to constrain our really deep thinkers,” says McClelland. “I’ve seen more failures than successes at getting data in order by trying to account for every minute detail.”

Transparency also had to improve so that data could be found consistently in the same place and shared by everyone. Yet another challenge was to overcome natural human resistance to change. “If it wasn’t for the support of the end-users in exploration and their determination to adapt to change, our implementation efforts would never have been realized,” says McClelland.

But with support from the executive team and a general recognition that the painful adaptation would be worth it in the end, Cameco was able to make significant progress throughout the year.

“Compared to 12 months ago, the team I lead is able to access data much more quickly without having to ask others for it,” says Dave Thomas, director of exploration geoscience for Cameco. “They are operating much more efficiently now that everything is self-serve.”

Geosoft DAP Seeker has greatly improved efficiency in locating centralized geophysical data.

Using DAP to store hundreds of surveys

Cameco uses ESRI’s ArcServer for its GIS data, Geosoft’s DAP server for its geophysics and acQuire for geochemistry. Each repository is assigned a “subject matter specialist” responsible for ensuring QA/QC and uploading data to the server.

Cameco spent $45 million of its 2013 exploration budget, continuing to focus its efforts on Australia and Saskatchewan and probing deep into the Athabasca basin. The depth of targets in the basin limits the use of surface geochemistry tools, so the company relies almost entirely on geophysics and downhole geology to find new mineralization.

“Because of the size of the targets and depths at which we are working, we put a lot of time, effort and thought into geology, more than for any other commodity,” says McClelland. “We’re not looking for porphyries at 800 metres, we’re looking for pseudo-vein structures at 800 metres.”

Cameco uses a variety of geophysical tools to find those structures. EM surveys target conductive horizons in the basement rocks that are proxies for the faults that control mineralization. For mapping alteration around the fault structures, resistivity and gravity are essential tools. And seismic techniques are applied to enhance regional exploration and help the mining group understand the geology of certain advanced projects before they break ground.

As a result, getting the DAP server up and running to handle results from about 600 geophysical surveys on several active projects was a key step in the data management plan. DAP allows explorers to efficiently catalog, manage, deliver and visualize large geospatial datasets. So far, Cameco has uploaded about 40% of its survey results to DAP and hopes to complete the rest of the upload by the end of 2014. Once all the active project data is documented, the company will start working on historical data.
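To make the cataloguing and upload-tracking step more concrete, here is a minimal sketch in Python of how survey metadata might be tracked during a campaign like the one described above. It is an illustration only, not Geosoft DAP’s API or Cameco’s actual system; the SurveyRecord fields, the example paths and the upload_progress helper are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SurveyRecord:
    """Hypothetical metadata entry for one geophysical survey."""
    project: str            # project name (illustrative)
    survey_type: str        # "EM", "resistivity", "gravity", "seismic", ...
    acquired: date          # acquisition date
    crs: str                # coordinate reference system, e.g. "EPSG:26913"
    source_path: str        # where the survey files currently live
    uploaded: bool = False  # has this survey been loaded into the repository?

def upload_progress(catalog: list[SurveyRecord]) -> float:
    """Fraction of catalogued surveys already loaded into the repository."""
    if not catalog:
        return 0.0
    return sum(r.uploaded for r in catalog) / len(catalog)

# Example: a small slice of a ~600-survey campaign
catalog = [
    SurveyRecord("Project A", "EM", date(2012, 8, 1), "EPSG:26913", "/surveys/a_em.grd", True),
    SurveyRecord("Project A", "gravity", date(2013, 3, 15), "EPSG:26913", "/surveys/a_grav.grd", False),
    SurveyRecord("Project B", "seismic", date(2013, 6, 2), "EPSG:26912", "/surveys/b_seis.sgy", False),
]
print(f"Uploaded so far: {upload_progress(catalog):.0%}")
```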

Taking out the trash

But before the information is uploaded, it has to be cleansed. By identifying duplicates and triplicates of data, different versions of the same grid and files that had no value, Geosoft’s service team was able to reduce the size of the database by 55%, from two terabytes to about 800 GB. Cameco reduced the size even further, to about 500 GB, after assigning a full-time DAP administrator.
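The duplicate-hunting part of that cleanse can be pictured with a short sketch. The Python below is an illustration under stated assumptions rather than a description of Geosoft’s actual service tooling: it simply groups files by a content hash so that byte-identical copies of the same grid can be flagged and reduced to a single copy.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 of their contents.

    Returns only the groups with more than one file, i.e. the
    duplicates and triplicates that can be trimmed to a single copy.
    """
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            # Whole-file read is fine for a sketch; a real tool would hash in chunks.
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {h: paths for h, paths in groups.items() if len(paths) > 1}

# Example: report how much space exact duplicates are wasting (path is hypothetical)
for digest, paths in find_duplicates("/data/geophysics").items():
    wasted = sum(p.stat().st_size for p in paths[1:])
    print(f"{len(paths)} copies of {paths[0].name}: {wasted / 1e9:.2f} GB recoverable")
```

A real cleanup also has to recognize near-duplicates, such as different versions of the same grid, and files with no remaining value, which is where a dedicated DAP administrator’s judgement comes in.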

“We only have to do this right once and it’s done for eternity, unlike a folder structure where you have to keep continuously cleaning out the folders year after year,” says McClelland. “That value can be realized on present or future projects or even sold to third parties. You don’t really know what you’ve got until it’s all in one place.”

Designing a common interface

The downside of having different types of exploration data in separate repositories is that while data integrity is ensured, interoperability suffers. So now Cameco is working with Geosoft on a common web interface that connects the repositories on the back end so that explorers can retrieve data quickly and easily.

The Geospatial Envision Technology & Information Transfer (GET-IT) System provides a single web interface to data that is managed by the ArcServer (GIS, geology, and land management data), DAP (geophysical data) and acQuire (drillhole and geochemical data) servers. Geoscientists, management and other permitted users will be able to access the GET-IT system via a web browser and then search, preview, interrogate, and extract data from the connected servers and make use of simple map-making capabilities.

Drill core is the final and most expensive product of Cameco’s exploration process. Geologist Nathan Barsi ensures no details are overlooked.

In other words, geoscientists will be able to zoom into an area of interest and find all of the data ever collected from that area, whether that be drill summaries, geochemical reports or land permits. Once they’ve found the data, they’ll be able to click on an icon to launch everything onto their desktop application for further analysis, using “Cameco-centric” symbols, colours and templates that are consistent throughout.
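Under the hood, that workflow can be thought of as a single area-of-interest query fanned out to each repository and merged into one list of results. The sketch below illustrates the idea only; the BBox, Result and Repository types, the search_all function and the adapter names in the usage comment are hypothetical stand-ins, not the GET-IT, ArcServer, DAP or acQuire APIs.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class BBox:
    """Geographic area of interest (WGS84 degrees)."""
    west: float
    south: float
    east: float
    north: float

@dataclass
class Result:
    repository: str  # which back-end server the record came from
    title: str       # e.g. drill summary, geochemical report, land permit
    link: str        # where to fetch the data for desktop analysis

class Repository(Protocol):
    """Anything that can answer an area-of-interest query."""
    def search(self, area: BBox) -> list[Result]: ...

def search_all(repositories: list[Repository], area: BBox) -> list[Result]:
    """Fan the same query out to every repository and merge the answers."""
    results: list[Result] = []
    for repo in repositories:
        results.extend(repo.search(area))
    return results

# Usage, assuming hypothetical adapter classes wrapping the GIS, geophysical
# and geochemical servers:
# hits = search_all([GISAdapter(), DAPAdapter(), AcquireAdapter()],
#                   BBox(west=-106.2, south=57.1, east=-105.4, north=57.8))
# for hit in hits:
#     print(hit.repository, hit.title, hit.link)
```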

“We’ll actually be able to get down to the interpretation instead of spending days trying to find the data, massage it and manipulate it without ever really knowing if we have the right data in front of us,” says Thomas. “GET-IT will allow us to put our brainpower into how and where to drill the next hole.”

McClelland says the work Cameco is doing now to organize, standardize and make data accessible should put the company in an excellent position to take advantage of future software applications that will be able to analyze massive amounts of exploration data based on certain criteria and, from that analysis, identify targets.

“Hopefully, as we get into cloud environments and perhaps even a different realm of partnerships and joint ventures, companies and individuals will be able to combine data in a systematic manner and use computer algorithms to help process it.”


