Interpreting spectral gamma-ray logs

Before you can start interpreting spectral gamma-ray logs (or, indeed, any kind of data), you need to ask about quality.

Calibrate your tool...

The main issues affecting the quality of the logs are tool calibration and drilling mud composition. I think there's a tendency to assume that delivered logs have been rigorously quality checked, but... they haven't. The only safe assumption is that nobody cares about your logs as much as you. (There is a huge opportunity for service companies here — but in my experience they tend to be focused on speed and quantity, not quality.)

Calibration is critical. The measurement device in the tool consists of a thallium-doped NaI crystal and a photomultiplier. Both of these components are sensitive to temperature, so calibration is especially important when the temperature of the tool is changing often. If the surface temperature is very different from the downhole temperature (winter in Canada, for instance), calibrate often.

Drilling mud containing KCl (to improve borehole stability) increases the apparent potassium content of the formation, while barite acts as a gamma-ray absorber and reduces the count rates, especially in the low energies (potassium).

One of the key quality control indicators is negative readings on the uranium log. A few negative values are normal, but many zero-crossings may indicate that the tool was improperly calibrated. It is imperative to quality control all of the logs, for bad readings and pick-up effects, before doing any quantitative work.
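As a sketch of that QC step, here is a minimal check for an improbable number of negative uranium readings. The 20% threshold is my own illustrative choice, not an industry standard:

```python
def qc_uranium(u_ppm, max_negative_fraction=0.2):
    """Count negative readings in a uranium log and flag the log if
    there are too many. The 20% threshold is an illustrative
    assumption, not an industry standard."""
    n_negative = sum(1 for u in u_ppm if u < 0)
    fraction = n_negative / len(u_ppm)
    return {
        "n_negative": n_negative,
        "negative_fraction": fraction,
        "suspect_calibration": fraction > max_negative_fraction,
    }

# One slightly negative sample is normal; a log riddled with them is not.
good = qc_uranium([1.2, 0.8, -0.1, 2.3, 1.1, 0.9, 1.4, 1.0, 0.7, 1.6])
bad = qc_uranium([-0.5, 1.0, -0.3, -0.2, 0.1, -0.4, 0.9, -0.1, 0.2, -0.6])
```

A check like this is no substitute for looking at the log, but it catches the worst cases in bulk.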

...and your interpretation

Most interpretations of spectral-gamma ray logs focus on the relationships between the three elemental concentrations. In particular, Th/K and Th/U are often used for petrophysical interpretation and log correlation. In calculating these ratios, Schlumberger uses the following cut-offs: if uranium < 0.5 then uranium = 0.5; if potassium < 0.004 then potassium = 0.001 (according to my reference manual for the natural gamma tool).
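The cut-offs above are easy to apply in code. A minimal sketch, where the function name and the potassium-as-fraction convention are my own assumptions:

```python
def th_ratios(th_ppm, u_ppm, k_frac):
    """Th/U and Th/K with the cut-offs quoted above: uranium floored
    at 0.5 ppm; potassium (as a fraction) below 0.004 reset to 0.001."""
    u = max(u_ppm, 0.5)
    k = 0.001 if k_frac < 0.004 else k_frac
    return th_ppm / u, th_ppm / k

# With uranium at 0.2 ppm the floor kicks in: Th/U = 8.0 / 0.5 = 16
th_u, th_k = th_ratios(th_ppm=8.0, u_ppm=0.2, k_frac=0.02)
```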

In general, high K values may be caused by the presence of potassium feldspars or micas. Glauconite usually produces a spike in the K log. High Th values may be associated with the presence of heavy minerals, particularly in channel deposits. Increased Th values may also be associated with an increased input of terrigenous clays. Increases in U are frequently associated with the presence of organic matter. For example, according to the ODP, particularly high U concentrations (> 5 ppm) and low Th/U ratios (< 2) often occur in black shale deposits.
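Those ODP numbers make a handy screening rule. A hedged sketch (the function and its defaults are illustrative, and no single log is diagnostic):

```python
def black_shale_candidate(u_ppm, th_ppm, u_cutoff=5.0, th_u_cutoff=2.0):
    """Screen for the ODP black-shale signature quoted above:
    U > 5 ppm and Th/U < 2. A screening aid, not a diagnosis."""
    return u_ppm > u_cutoff and (th_ppm / u_ppm) < th_u_cutoff

hot = black_shale_candidate(u_ppm=8.0, th_ppm=6.0)    # Th/U = 0.75
cold = black_shale_candidate(u_ppm=3.0, th_ppm=12.0)  # U too low
```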

The logs here, from Kansas Geological Survey open-file report 90-27 by Macfarlane et al., show a quite overt interpretive approach, with the Th/K log labelled with minerals (feldspar, mica, illite–smectite) and the Th/U log labelled with uranium 'fixedness', a proxy for organic matter.

Sounds useful. But really, you can probably find a paper to support just about any interpretation you want to make. Which isn't to say that spectral gamma-ray is no use — it's just not diagnostic on its own. You need to calibrate it to your own basin and your own stratigraphy. This means careful, preferably quantitative, comparison of core and logs.

Further reading 

What is spectral gamma-ray?

The spectral gamma-ray log is a measure of the natural radiation in rocks. The amplitude of the signal from the gamma-ray tool, which is just a sensor with no active source, is proportional to the energy of the gamma-ray photons it encounters. Being able to differentiate between photons of different energies turns out to be very handy. Compared to the ordinary gamma-ray log, which ignores the energies and only counts the photons, it's like seeing in colour instead of black and white.

Why do we care about gamma radiation?

First, what are gamma rays? Highly energetic photons: electromagnetic radiation with very short wavelengths. 

Being able to see different energies, or 'colours', means we can differentiate between the radioactive decay of different elements. Elements decay by radiating energy, and the 'colour' of that energy is characteristic of that element (actually, of each isotope). So, we can tell by looking at the energy of a photon if we are seeing a potassium atom (40K) or a uranium atom (238U) decay. These are very different isotopes, with very different habits. We can do geology!

In fact, all sorts of radioisotopes occur naturally in the earth. By far the most abundant are potassium 40K, thorium 232Th and uranium 238U. Of these, potassium is the most abundant in sedimentary rocks, but thorium and uranium are present in small quantities, and have particular sedimentological implications.

What exactly are we measuring?

Potassium 40K decays to argon about 10% of the time, with γ-emission at 1.46 MeV (the other 90% of the time it decays to calcium). However, the 232Th and 238U decay series proceed by α- and β-particle decay, which does not always result in photon emission. The tool in fact measures γ-radiation from the decay of thallium 208Tl in the 232Th series (right), and from bismuth 214Bi in the 238U series. The spectral gamma-ray tool must be calibrated against known samples to convert its readings into concentrations of 232Th and 238U. Proper calibration is vital, and is temperature-sensitive (of note in Canada!).
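Those three peak energies suggest a toy peak classifier. A sketch only: a real tool integrates counts over calibrated energy windows rather than matching single peaks:

```python
# Principal gamma-ray peak energies in MeV, as given in the text.
PEAKS = {
    1.46: "40K",
    1.76: "214Bi (238U series)",
    2.61: "208Tl (232Th series)",
}

def identify_peak(energy_mev):
    """Name the isotope whose principal peak lies nearest an observed
    energy. A toy classifier for illustration only."""
    nearest = min(PEAKS, key=lambda e: abs(e - energy_mev))
    return PEAKS[nearest]
```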

The concentrations of the three elements are estimated from the spectral measurements. The concentration of potassium is usually measured in percent (%) or per mil (‰), or sometimes in kilograms per tonne, which is equivalent to per mil. The other two elements are measured in parts per million (ppm).
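The unit arithmetic is simple but easy to trip over. A small sketch (the function names are mine):

```python
def k_percent_to_permil(k_percent):
    """Potassium: percent to per mil (multiply by 10)."""
    return 10.0 * k_percent

def k_permil_to_kg_per_tonne(k_permil):
    """Per mil is parts per thousand by mass: 1 permil = 1 g/kg = 1 kg/t,
    so the numbers are identical."""
    return k_permil
```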

Here is the gamma-ray spectrum from a single sample from 509 m below the sea-floor at ODP Site 1201. The final spectrum (heavy black line) is shown after removing the background spectrum (gray region) and applying a three-point mean boxcar filter. The thin black line shows the raw spectrum. Vertical lines mark the interval boundaries defined by Peter Blum (an ODP scientist at Texas A&M). Prominent energy peaks relating to certain elements are identified at the top of the figure. The inset shows the spectrum for energies >1500 keV at an expanded scale. 
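The background subtraction and three-point boxcar described here are easy to reproduce. A sketch, assuming simple lists of counts per energy bin:

```python
def boxcar3(counts):
    """Three-point mean boxcar filter; endpoints are left unsmoothed."""
    out = list(counts)
    for i in range(1, len(counts) - 1):
        out[i] = (counts[i - 1] + counts[i] + counts[i + 1]) / 3.0
    return out

def clean_spectrum(raw, background):
    """Subtract the background spectrum (clipping at zero), then smooth."""
    residual = [max(r - b, 0.0) for r, b in zip(raw, background)]
    return boxcar3(residual)

smoothed = boxcar3([0.0, 3.0, 6.0, 3.0, 0.0])
```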

We wouldn't normally look at these spectra. Instead, the tool provides logs for K, Th, and U. Next time, I'll look at the logs.

Spectrum illustration by Wikipedia user Inductiveload, licensed GFDL; decay chain by Wikipedia user BatesIsBack, licensed CC-BY-SA.

Machines can read too

The energy industry has a lot of catching up to do. Humanity is faced with difficult, pressing problems in energy production and usage, yet our industry remains as secretive and proprietary as ever. One rich source of innovation we are seriously under-utilizing is the Internet. You have probably heard of it.

Machine experience design


Web sites are just the front-end of the web. Humans have particular needs when they read web pages — attractive design, clear navigation, etc. These needs are researched and described by the rapidly growing field of user experience design, often called UX. (Yes, the ways in which your intranet pages need fixing are well understood, just not by your IT department!)

But the web has a back-end too. Rather than being for human readers, the back-end is for machines. Just like human readers, machines—other computers—also have particular needs: structured data, and a way to make queries. Why do machines need to read the web? Because the web is full of data, and data makes the world go round. 

So website administrators need to think about machine experience design too. As well as providing beautiful web pages for humans to read, they should provide a widely accepted machine-readable format such as JSON or XML, and a way to make queries.

What can we do with the machine-readable web?

The beauty of the machine-readable web, sometimes called the semantic web, or Web 3.0, is that developers can build meta-services on it. For example, a website like hipmunk.com that finds the best flights, wherever they are. Or a service that provides charts, given some data or a function. Or a mobile app that knows where to get the oil price. 

In the machine-readable web, you could do things like:

  • Write a program to analyse bibliographic data from SEG, SPE and AAPG.
  • Build a mobile app to grab log mnemonics info from SLB's, HAL's, and BHI's catalogs.
  • Grab course info from AAPG, PetroSkills, and Nautilus to help people find training they need.

Most wikis have a public application programming interface, giving direct, machine-friendly access to the wiki's database. Here are two views of one wiki page — click on the images to see the pages:
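As a sketch of what that machine access looks like, here is a MediaWiki-style query. The endpoint path and page title are illustrative assumptions, and to stay self-contained the example parses a canned response instead of hitting the network:

```python
import json
from urllib.parse import urlencode

# Build a query against a MediaWiki-style API. The endpoint path and
# page title here are illustrative assumptions.
params = {
    "action": "query",
    "titles": "Gamma ray log",
    "prop": "revisions",
    "rvprop": "content",
    "format": "json",
}
url = "https://www.subsurfwiki.org/api.php?" + urlencode(params)

# A trimmed sample of the JSON such an API returns. In real use you
# would fetch `url` with urllib.request.urlopen and read the body.
sample_response = '{"query": {"pages": {"42": {"title": "Gamma ray log"}}}}'
pages = json.loads(sample_response)["query"]["pages"]
```

Structured data plus a query interface: that is all a meta-service needs.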

At SEG last year, I suggested to a course provider that they might consider offering machine access to their course catalog—so that developers can build services that use their course information and thus send them more students. They said, "Don't worry, we're building better search tools for our users." Sigh.

In this industry, everyone wants to own their own portal, and tends to be selfish about their data and their users. The problem is that you don't know who your users are, or rather who they could be. You don't know what they will want to do with your data. If you let them, they might create unimagined value for you—as hipmunk.com does for airlines with reasonable prices, good schedules, and in-flight Wi-Fi. 

I can't wait for the Internet revolution to hit this industry. I just hope I'm still alive.

Dream geoscience courses

MOOCs mean it's never been easier to learn something new. This is an appeal for opinions. Please share your experiences and points of view in the comments.

Are you planning to take any technical courses this year? Are you satisfied with the range of courses offered by your company, or the technical societies, or the commercial training houses (PetroSkills, Nautilus, and so on)? And how do you choose which ones to take — do you just pick what you fancy, seek recommendations, or simply aim for field classes at low latitudes?

At the end of 2012, several geobloggers wrote about courses they'd like to take. Some of them sounded excellent to me too... which of these would you take a week off work for?

Here's my own list, complete with instructors. It includes some of the same themes...

  • Programming for geoscientists (learn to program!) — Eric Jones
  • Solving hard problems about the earth — hm, that's a tough one... Bill Goodway?
  • Communicating rocks online — Brian Romans or Maitri Erwin
  • Data-driven graphics in geoscience — the figure editor at Nature Geoscience
  • Mathematics clinic for geoscientists — Brian Russell
  • Becoming a GIS ninja — er, a GIS ninja
  • Working for yourself — needs multiple points of view

What do you think? What's your dream course? Who would teach it?

Making images or making prospects?

Well-rounded geophysicists will have experience in each of the following three areas: acquisition, processing, and interpretation. Generally speaking, these three areas make up the seismic method, each requiring highly specialized knowledge and tools. Historically, energy companies controlled the entire spectrum, owning the technology, the know-how, and the risk, but that is no longer the case. Now, service companies do the acquisition and the processing. Interpretation is largely hosted within E&P companies, the ones who buy land and drill wells. Not only has it become unreasonable for a single geophysicist to be proficient across the board, but organizational structures constrain any particular technical viewpoint.

In line with this industry structure, if you are a geophysicist, you likely fall into one of two camps: those who make images, or those who make prospects. One set of people makes the data; the other does the interpretation.

This seems very un-scientific to me.

Where does science fit in?

Science, the standard approach of rational inquiry and accruing knowledge, is largely absent from the applied geophysical business landscape. But when science is used as a model, making images and making prospects are inseparable.

Can applied geophysics use scientific behaviour as a central anchor across disciplines?

There is a significant amount of science needed in the way that we produce observations, in the way that we make images. But a business landscape built on linear procedures leaves no wiggle room for additional testing and refinement. How do processors get better if they never hear about their results? As a way of compensating, processing has drifted away from being a science of questioning, testing, and analysis, and become, well... a process.

The sure-fire way to build knowledge and decrease uncertainty is through experimentation and testing. In this sense, the notion of selling 'solutions' is incompatible with scientific behaviour. Science doesn't claim to give solutions or answers, but it does promise to address uncertainty; to tell you what you know.

In studying the earth, we have to accept a lack of clarity in our data, but we must not accept mistakes, errors, or mediocrity due to shortcomings in our shared methodologies.

We need a new balance. We need more connectors across these organizational and disciplinary divides. That's where value will be made as industry encounters increasingly tougher problems. Will you be a connector? Will you be a subscriber to science?

Hall, M (2012). Do you know what you think you know? CSEG Recorder 37 (2), February 2012, p 26–30. Free to download from CSEG. 

Filters that distort vision

Almost two weeks ago, I had LASIK vision correction surgery. Although the recovery took longer than average, I am seeing better than I ever did before with glasses or contacts. Better than 20/20. Here's why.

Low order and high order refractive errors

Most people (like me) who have (had) poor vision fall short of pristine correction because lenses only correct low order refractive errors. Still, any correction gives a dramatic improvement to the naked eye; further refinements may be negligible or imperceptible. Higher order aberrations, caused by small scale structural irregularities of the cornea, can still affect one's refractive power by up to 20%, and they can only be corrected using customized surgical methods.

It occurs to me that researchers in optometry, astronomy, and seismology face a common challenge: how to accurately measure, and subsequently correct for, structural deformations in refractive media and the aberrations in wavefronts caused by such higher-order irregularities.

The filter is the physical model

Before surgery, a wavefront imaging camera was used to make detailed topographic maps of my corneas, and estimate point spread functions for each eye. The point spread function is a 2D convolution operator that fuzzies the otherwise clear. It shows how a ray is scattered and smeared across the retina. Above all, it is a filter that represents the physical eye.

Point spread function (similar to mine prior to LASIK) representing refractive errors of the cornea (top two rows), and corrected vision (bottom row). Point spread functions are filters that distort both the visual and seismic realms. The seismic example is a segment of inline 25, Blake Ridge 3D seismic survey, available from the Open Seismic Repository (OSR).

Observations in optics and seismology alike are only models of the physical system, models that are constrained by the filters. We don't care about the filters per se, but they do get in the way of the underlying system. Luckily, the behaviour of any observation can be expressed as a combination of filters. In this way, knowing the nature of reality literally means quantifying the filters that cause distortion. Change the filter, change the view. Describe the filter, describe the system.

The seismic experiment yields a filtered earth; a smeared reality. Seismic data processing is the analysis and subsequent removal of the filters that distort geological vision. 
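The PSF idea can be made concrete with a few lines of code. A sketch of the forward model only (blurring, not deblurring), using plain 2D convolution:

```python
import numpy as np

def apply_psf(image, psf):
    """Blur an image with a point spread function (PSF) by direct 2D
    convolution: zero padding, 'same' output size. This is the forward
    model: sharp scene in, filtered observation out."""
    ih, iw = image.shape
    ph, pw = psf.shape
    padded = np.zeros((ih + ph - 1, iw + pw - 1))
    padded[ph // 2:ph // 2 + ih, pw // 2:pw // 2 + iw] = image
    flipped = psf[::-1, ::-1]  # convolution flips the kernel
    out = np.empty_like(image, dtype=float)
    for i in range(ih):
        for j in range(iw):
            out[i, j] = np.sum(padded[i:i + ph, j:j + pw] * flipped)
    return out

# A point source viewed through a blurring filter takes on the shape
# of the PSF itself, which is exactly why the PSF characterizes the system.
point = np.zeros((5, 5))
point[2, 2] = 1.0
psf = np.array([[0.0, 0.2, 0.0],
                [0.2, 0.2, 0.2],
                [0.0, 0.2, 0.0]])
blurred = apply_psf(point, psf)
```

Deconvolution, the processing step, is the much harder inverse of this operation.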

This image was made using the custom filter manipulation tool in FIJI. The seismic data is available from OpendTect's Open Seismic Repository.

5 ways to kickstart an interpretation project

Last Friday, teams around the world started receiving external hard drives containing this year's datasets for the AAPG's Imperial Barrel Award (IBA for short). I competed in the IBA in 2008 when I was a graduate student at the University of Alberta. We were coached by the awesome Dr Murray Gingras (@MurrayGingras), we won the Canadian division, and we placed 4th in the global finals. I was the only geophysical specialist on the team alongside four geology graduate students.

Five things to do

Whether you are a staff geoscientist, a contractor, or a competitor, it can help to do these things first:

  1. Make a data availability map (preferably in QGIS or ArcGIS). A graphic and geospatial representation of what you have been given.
  2. Make well scorecards: as a means to demonstrate not only that you have wells, but what information you have within the wells.
  3. Make tables, diagrams, maps of data quality and confidence. Indicate if you have doubts about data origins, data quality, interpretability, etc.
  4. Background search: the key word is search, not research. Use Mendeley to organize, tag, and search through the array of literature.
  5. Use Time-Scale Creator to make your own stratigraphic column. You can manipulate the vector graphic, and make it your own. Much better than copying an old published figure. But use it for reference.
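Item 2 lends itself to a few lines of code. A minimal sketch of a well scorecard; the well names and curve mnemonics are made up for illustration:

```python
# Which curves exist in each well: the essence of a well scorecard.
# These wells and mnemonics are hypothetical.
wells = {
    "W-01": {"GR", "RHOB", "NPHI", "DT"},
    "W-02": {"GR", "DT"},
    "W-03": {"GR", "RHOB"},
}
curves = ["GR", "RHOB", "NPHI", "DT"]

def scorecard(wells, curves):
    """Render a text table: one row per well, x marks a curve present."""
    lines = ["well  " + " ".join(f"{c:>5}" for c in curves)]
    for well in sorted(wells):
        row = " ".join(f"{'x' if c in wells[well] else '-':>5}" for c in curves)
        lines.append(f"{well}  {row}")
    return "\n".join(lines)

table = scorecard(wells, curves)
```

In practice you would build this from LAS file headers, but even a hand-made table beats no table.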

All of these things can be done before assigning roles, before saying who needs to do what. All of this needs to happen before the geoscience and the prospecting can start. Skirting around it means missing the real work, and being complacent. Instead of being a hammer looking for a nail, lay out your materials and get a sense of what you can build. This will enable educated conversations about how to spend your geoscientific manpower, division of labour, resources, and time.

Read more, then go apply it 

In addition to these tips for launching out of the blocks, I have also selected and categorized blog posts that I think might be most relevant and useful. We hope they are helpful to all geoscientists, but especially for students. Visit the Agile blog highlights list on SubSurfWiki.

I wish a happy and exciting IBA competition to all participants, and their supporting university departments. If you are competing, say hi in the comments and tell us where you hail from. 

Great geophysicists #7: Leonhard Euler

Leonhard Euler (pronounced 'oiler') was born on 15 April 1707 in Basel, Switzerland, but spent most of his life in Berlin and St Petersburg, where he died on 18 September 1783. He was blind from the age of 50, but took this handicap stoically—when he lost sight in his right eye at 28 he said, "Now I will have less distraction".

It's hard to list Euler's contributions to the toolbox we call seismic geophysics—he worked on so many problems in maths and physics. For example, much of the notation we use today was invented or at least popularized by him: f(x), e, i, π. He reconciled Newton's and Leibniz's versions of calculus, making huge advances in solving difficult real-world equations. But he made some particularly relevant advances that resonate still:

  • Leonardo and Galileo both worked on mechanical stress distribution in beams, but didn't have the luxuries of calculus or Hooke's law. Daniel Bernoulli and Euler developed an isotropic elastic beam theory, and eventually convinced people you could actually build things using their insights. 
  • Euler's equations of fluid dynamics pre-date the more complicated (i.e. realistic) Navier–Stokes equations. Nonetheless, this work continued into vibrating strings, getting Euler (and Bernoulli) close to a general solution of the wave equation. They missed the mark, however, leaving it to Jean-Baptiste le Rond d'Alembert.
  • Optics (also wave behaviour). Though many of Euler's ideas about dispersion and lenses turned out to be incorrect (e.g. Pedersen 2008, DOI 10.1162/posc.2008.16.4.392), he did at least advance the idea that light is a wave, helping scientists move away from Newton's corpuscular theory.

The moment of Euler's death was described by the Marquis de Condorcet in a eulogy:

He had full possession of his faculties and apparently all of his strength... after having enjoyed some calculations on his blackboard concerning the laws of ascending motion for aerostatic machines... [he] spoke of Herschel's planet and the mathematics concerning its orbit and a little while later he had his grandson come and play with him and took a few cups of tea, when all of a sudden the pipe that he was smoking slipped from his hand and he ceased to calculate and live.

"He ceased to calculate," I love that.

Blurry vision and refractive power

I'm getting LASIK eye surgery today, so I've been preparing myself by learning about the eye's optics, and the surgical procedure that enhances handicapped eyes like my own. Unsurprisingly, there are some noteworthy parallels with seismic.

The eye as a gather

The human eye is akin to a common depth point (CDP) gather. Both are like cameras constructed to focus rays at an imaging point: the retina, in the case of the eye; the reflection boundary, in the case of the gather. In the eye, there are exactly four refracting interfaces at which light rays bend towards the midline and ultimately converge on the retina. In the earth, there are an unknown number of interfaces, surely more than four.

Myopia, or near-sightedness, is the condition where images are focused just in front of the retina. Hyperopia, or far-sightedness, is the condition where the eyeball is too short and images would be focused behind the retina. The structure and density of the tissues in the eye have to be aligned just so for perfect vision. If any combination of them is out of whack, you get blurry vision. Really blurry, in my case.

Characterizing blurry vision can be thought of as a two-step process of measurement and validation. First, measurements of the refractive power of the eye are made with an autorefractor, quantifying the amount of first-order correction needed. The correction is applied, verified, and fine-tuned by a qualitative visual assessment test. The measurement gets you close to the perfect correction; any residual adjustments may be negligible or imperceptible. And the patient, a subjective observer, is the final judge of clarity and quality of vision.

Four corrections

There are at least four ways to correct for common vision problems. Each is a different way to force the ray geometry:

  • refract the light before it enters the eye (glasses),
  • refract the light just above the cornea (contact lenses), 
  • change the shape of the cornea using LASIK or PRK surgery, or 
  • change the shape or structure of the lens (cataract surgery or implants). 
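All four corrections manipulate the same physics: Snell's law. A sketch, using the commonly quoted refractive index of about 1.376 for the cornea:

```python
import math

def refract(theta1_deg, n1, n2):
    """Snell's law: n1 sin(t1) = n2 sin(t2). Returns the transmitted
    angle in degrees, or None beyond the critical angle (total
    internal reflection)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# Light entering the cornea (n ~ 1.376) from air (n ~ 1.0) bends
# toward the normal: about 21.3 degrees for a 30-degree incident ray.
theta2 = refract(30.0, 1.000, 1.376)
```

Glasses, contacts, and surgery all amount to adjusting the inputs of this one equation until rays land on the retina.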

If the earth were an eye

Seismic processing is the act of measuring the refractive structure of the earth and correcting for its natural blurriness. Static correction is done first, in an effort to align the rays into a plane wave before they enter the 'eye'. Seismic velocity analysis is carried out on the rays, as a crude measurement of the earth's 'refractive power'. Migration is the process of forcing geometries, mathematically instead of surgically, in order to rearrange ray paths and improve focusing. Generally speaking, it's the same two-step process: measurement and validation. As with the eye, the quality of the final image is perceptual, coming down to subjective visual assessment. But unlike the eye, fortunately, multiple observers can share the same image, and even talk about it. That changes the entire discussion about what acuity really means.

The process of vision correction goes sequentially from low order to high order. In the next post I will talk about higher-order anomalies within the eye that, once corrected, can give super-human vision. Measurements and maps of how the eye sees show surgeons how to correct optical images. In the same vein, measurements and maps of how the seismic experiment sees show geophysicists how to correct images in the seismic realm.

The plainest English

If you're not already reading xkcd — the must-read sciencey thrice-weekly comic strip — then please give it a try. It's good for you. Check out this wonderful description of the Saturn V rocket, aka Up Goer Five, using only the 1000 most common words in English →

This particular comic took on a life of its own last week, when Theo Sanderson built a clever online text editor that parses your words and highlights the verboten ones. Then, following the lead of @highlyanne, a hydrologist, scientists all over Twitter quickly started describing and sharing parsimonious descriptions of what they do. Anne and her partner in crime, @Allochthonous, then compiled a log of every description they could find. It's worth looking at, though it would take a while to read them all. 
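Theo's editor is easy to imitate in miniature. A toy sketch: the real thing checks against the 1000 most common English words, where this stand-in vocabulary has only a handful:

```python
def upgoer_check(text, allowed):
    """Toy version of the editor: return the words in `text` not in
    the `allowed` vocabulary. The real editor uses the 1000 most
    common English words; this tiny set is a stand-in."""
    words = [w.strip(".,!?;:'\"").lower() for w in text.split()]
    return [w for w in words if w and w not in allowed]

allowed = {"we", "make", "a", "very", "loud", "sound", "on", "the", "land"}
flagged = upgoer_check("We make a very loud, seismic sound on the land.", allowed)
```

Everything interesting is in the word list; the checking itself is a one-liner.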

What's it like using only the simplest words? I tried to define a well...

A deep, round, empty space in the ground that is only about as wide as your hand. The empty space is very deep: up to about seven tens of hundreds of times as deep as a man is tall. It is full of water. After making the empty space, we can lower small computers into it. As we pull them out, the computers tell us things about the rocks they can 'see' — like how fast waves move through them, or how much water the rocks have in them.

It's quite hard. But refreshingly so. Here's reflection seismic...

We make a very loud, short sound on the land or in the water — like a cracking sound. The sound waves go down through the rocks under the ground. As they do so, some of them come back — just as waves come back from the side of a body of water when you throw in a small rock. We can listen to the sound waves that come back, and use a computer to help make a picture of what it looks like under the ground.

Is a world without jargon dumbed down, or opened up? What is it we do again?...

It is very hard to do this work. It takes a lot of money and a long time. The people that do it have to think hard about how to do it without hurting other people or the world we live in. We don't always manage to do it well, but we try to learn from the past so we can do better next time. Most people think we should stop, but if we did, the world would go dark, our homes would be cold (or hot), and people would not be able to go very far.

Check out Up Goer Six — Theo's new editor that colour codes each word according to just how common it is. Try it — what do you do for a living? 

The image is licensed CC-BY-NC-2.5 by Randall Munroe at xkcd.com.