Are you a poet or a mathematician?

Geologists can be rather prone to a little woolliness in their language. Perhaps because you cannot prove anything in geology (prove me wrong), or because everything we do is doused in interpretation, opinion and even bias, we like to beat about the bush. A lot.

Sometimes this doesn't matter much. We're just sparing our future self from a guilty binge of word-eating, and everyone understands what we mean—no harm done. But there are occasions when a measure of unambiguous precision is called for. When we might want to be careful about the technical meanings of words like approximately, significant, and certain.

Sherman Kent was a CIA analyst in the Cold War, and he tasked himself with bringing quantitative rigour to the language of intelligence reports. He struggled (and eventually failed), meeting what he called aesthetic opposition:

What slowed me up in the first instance was the firm and reasoned resistance of some of my colleagues. Quite figuratively I am going to call them the poets—as opposed to the mathematicians—in my circle of associates, and if the term conveys a modicum of disapprobation on my part, that is what I want it to do. Their attitude toward the problem of communication seems to be fundamentally defeatist. They appear to believe the most a writer can achieve when working in a speculative area of human affairs is communication in only the broadest general sense. If he gets the wrong message across or no message at all—well, that is life.

Sherman Kent, Words of Estimative Probability, CIA Studies in Intelligence, Fall 1964

Kent proposed specific words to convey specific levels of certainty. We have used these words in our mobile app Risk*. The only modification I made was setting P = 0.99 for Certain, and P = 0.01 for Impossible (see my remark about proving things in geology).
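For the quantitatively curious, Kent's scheme is easy to capture as a lookup table. This is an illustrative sketch only: the central values are approximate midpoints from Kent's paper, with our 0.99/0.01 tweaks applied, and the `describe` helper is our invention, not Kent's.

```python
# Approximate central probabilities for Kent's estimative words,
# with 'certain' capped at 0.99 and 'impossible' floored at 0.01
# as described in the post.
ESTIMATIVE_WORDS = {
    "certain": 0.99,
    "almost certain": 0.93,
    "probable": 0.75,
    "chances about even": 0.50,
    "probably not": 0.30,
    "almost certainly not": 0.07,
    "impossible": 0.01,
}

def describe(p):
    """Return the estimative word whose central probability is nearest to p."""
    return min(ESTIMATIVE_WORDS, key=lambda w: abs(ESTIMATIVE_WORDS[w] - p))

print(describe(0.8))   # probable
print(describe(0.5))   # chances about even
```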

There are other schemes. Most petroleum geologists know Peter Rose's work. A common language, with some quantitative meaning, can dull the pain of prospect risking sessions. Almost certainly. Probably.

Do you use systematic descriptions of uncertainty? Do you think they help? How can we balance the poetic side of geology with the mathematical?

The cratering hypothesis

A few years ago, I was interpreting the Devonian of northern Alberta in a beautiful 3D seismic reflection survey. Because the target zone was rather shallow, we had acquired a dataset with a very high trace density: a lot of spatial samples. This gave us a lot of detail in timeslices, even if the vertical section views weren't particularly high resolution in these deeper, high velocity sediments of the shallow Givetian carbonate seas.

A circular feature caught my eye. Unfortunately I can't show it to you because the data are proprietary, but it was quite conspicuous and impossible to ignore: perfectly round, about 1.5 km across, and with a central mound a couple of hundred metres in diameter. I showed it to a few people and everyone said, 'yeah, impact crater'. Or maybe I just always heard 'impact crater'. 

I really wanted it to be an impact crater. Bias number 1. 

My first action was to re-read one of my favourite papers ever: Simon Stewart's 1999 paper on circular geological features. I love papers like this: basic, practical advice for interpreters. His figure 1 is a lovely graphic. Stewart himself is rather enamoured with impact structures—he was the 'for' advocate in the recent debate over the Silverpit structure in the North Sea. You can read some more about it here and here.

The paper gives some equations which compute the probability that, given some assumptions about meteorite flux and so forth, a bolide has cratered right where you are standing at some point in geological history. I built this little Wolfram|Alpha Widget so you can try them yourself (need help?). Of course, this is far from the same thing as there being a crater preserved, or visible in seismic, but it's a start. Bias number 2: Numbers, even dubious ones, look like evidence.

I admit it, I got carried away. Bias number 3.

But then... we shot another survey. There turned out to be another crater. And then another. My biases weren't enough to save the impact hypothesis: the new craters finished it. According to those equations, the probability of having one in a 600 km² survey spanning 200 Ma of preservable time is 0.14, a 14% chance. Pretty good. But the probability of two is 0.012, and three is 0.0007. And these were contemporaneous. And, just as with Silverpit, there was salt.
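As a back-of-envelope check, the quoted figures behave like a Poisson model for crater counts. The snippet below is my reconstruction, not necessarily the equations in Stewart's paper: it solves for the rate that makes P(exactly one crater) = 0.14, then computes the probabilities of two and three.

```python
import math

def poisson(k, lam):
    """Poisson probability of exactly k events at rate lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Fixed-point iteration for the rate satisfying lam * exp(-lam) = 0.14
lam = 0.14
for _ in range(50):
    lam = 0.14 * math.exp(lam)

print(round(poisson(1, lam), 3))   # 0.14, as quoted
print(round(poisson(2, lam), 3))   # 0.012, as quoted
print(round(poisson(3, lam), 4))   # 0.0006, close to the quoted 0.0007
```

The near-match suggests a Poisson-style flux model underlies the numbers; the small discrepancy in the third figure presumably reflects details of the original equations.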

It should have been obvious all along. (Bias number 4.)

Reference
Stewart, S (1999). Seismic interpretation of circular geological structures. Petroleum Geoscience 5, p 273–285. DOI: 10.1144/petgeo.5.3.273

Image from Stewart 1999 is copyright of the Geological Society of London and the European Association of Geoscientists and Engineers, and is used here with permission and with thanks.

Four days of oil

The long-awaited news of oil in the Falkland Islands arrived in May last year when UK company Rockhopper Exploration drilled a successful well in the North Falkland Basin. After testing a second well, the estimated volume of recoverable oil in the field, called Sea Lion, was upped last month to 325 million barrels. A barrel is one bathtub, or 42 gallons, or 159 litres, or 0.159 m³. Let's be scientific and stick to SI units: the discovery is about 52 million cubic metres. Recoverable means the oil can be produced with foreseeable technology; about half the oil will likely never be produced, remaining in the ground forever. Or until humans are desperate enough to get it out.
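The arithmetic behind the title is easy to reproduce. The consumption figure is an assumption on my part: roughly 85 million barrels per day is a commonly quoted global number for this period.

```python
BARREL_M3 = 0.159          # one barrel in cubic metres, per the post

recoverable_bbl = 325e6    # Sea Lion recoverable estimate, barrels
recoverable_m3 = recoverable_bbl * BARREL_M3
print(round(recoverable_m3 / 1e6))   # 52 (million cubic metres)

# Assumed yardstick: global consumption of about 85 million barrels/day
WORLD_BBL_PER_DAY = 85e6
print(round(recoverable_bbl / WORLD_BBL_PER_DAY, 1))   # 3.8 (days of oil)
```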

On its own, a claim of 325 million barrels is meaningless to those outside the oil business. But this is a good-sized discovery, certainly a company-maker for a small player like Rockhopper. And as we read news of recent big discoveries in the Gulf of Mexico by BP, Chevron and ExxonMobil, it's worth having some sort of yardstick to help visualize these strange units and huge numbers...

Read More

Workshop? Talkshop

Day 4 of the SEG Annual Meeting. I attended the workshop entitled Geophysical data interpretation for unconventional reservoirs. It was really about the state of the art of seismic technologies for shale gas exploration and exploitation, but an emergent theme was the treatment of the earth as an engineering material, as opposed to an acoustic or elastic medium.

Harvey Goodman from Chevron started the workshop by asking the packed room, "Are there any engineers in the room?" Hilariously, a single lonesome hand was raised. "Well," he said, "this talk is for you." Perhaps this wasn't the venue for it; so much for spreading cross-disciplinary love and the geophysical technical vernacular.

Mark Zoback from Stanford presented decades' worth of laboratory measurements on the elastic/plastic properties of shales, specifically the effects of illite and TOC concentrations on mechanical stiffness and creep. When it came to questions, he provided the most competent and cogent responses of the day: every one was gold. Your go-to guy for shale geomechanics.

Marita Gading of Statoil presented some fascinating technology called Source Rock from Seismic (we mentioned this on Monday)—a way to estimate total organic carbon from seismic for basin modeling and play evaluation. She listed the factors controlling the acoustic properties of shales as

  1. organic content;
  2. compaction or porosity;
  3. lithotype and mineral composition;
  4. seismic to microscale anisotropy.

She showed an empirically derived acoustic impedance transform coupled with more interpretive methods, and the results are compelling. It's not clear how well this might work in ancient shales onshore, but it has apparently worked for Statoil in younger, offshore basins.

Galen Treadgold from Global Geophysical gave a visually stunning presentation showing the value of expansive data sets in the Eagle Ford shale. He showed 1000 km² of 3D seismic that had been stitched together, highlighting the need to look at a regional picture. Patchwork data fails to give the same clarity of variation in mechanical stratigraphy.

The session shifted to the state of microseismic technology and 'getting beyond the dots'. Speakers from rival companies MicroSeismic, ESG Solutions, and Pinnacle described how microseismic waveforms are now being used to resolve moment tensors. These provide not only the location and magnitude but also the failure characteristic of every single event. While limited by uncertainty, they may be the way to get the industry beyond the prevailing bi-wing paradigm.

The session was a nice blend of disciplines, with ample time for question and answer. I struggle, though, to call it a workshop; it felt like a continuation of the huge number of talks that have been going on in the same room all week. Have you ever been to a stellar workshop? What made it great?

More from our SEG 2011 experience.

Curvelets, dreamlets, and a few tears

Day 3 of the SEG Annual Meeting came and went in a bit of a blur. Delegates were palpably close to saturation, getting harder to impress. Most were no longer taking notes, happy to let the geophysical tide of acoustic signal, and occasional noise, wash over them. Here's what we saw.

Gilles Hennenfent, Chevron

I (Evan) loved Gilles's talk Interpretive noise attenuation in the curvelet domain. For someone who is merely a spectator in the arena of domain transforms and noise removal techniques, I was surprised to find it digestible and well-paced. Coherent noise can be difficult to remove independently of coherent signal, but using dyadic partitions of the frequency-wavenumber (f-k) domain, sectors called curvelets can be muted or amplified to reduce noise and boost signal. Curvelets have popped up in a few talks, because they can be a very sparse representation of seismic data.

Speaking of exotic signal decompositions, Ru-Shan Wu, University of California at Santa Cruz, took his audience to new heights, or depths, or widths, or... something. Halfway through his description of the seismic wavefield as a light-cone in 4D Fourier time-space best characterized by drumbeat beamlets—or dreamlets—we realized that we'd fallen through a wormhole in the seismic continuum and stopped taking notes.

Lev Vernik, Marathon

Lev dished a delicious spread of tidbits crucial for understanding the geomechanical control on hydraulic fracture stimulations. It's common practice to drill parallel to the minimum horizontal stress direction to optimize fracture growth away from the well location. For isotropic linear elastic fracture behaviour, the breakdown pressure of a formation is a function of the maximum horizontal stress, the vertical stress, the pore pressure, and the fracture toughness. Unfortunately, rocks we'd like to frack are not isotropic, and need to be understood in terms of anisotropy and inelastic strains.
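We didn't note Lev's exact expression, but the classic isotropic linear elastic form (the Hubbert–Willis-style breakdown equation, with tensile strength standing in for fracture resistance) can be sketched as below. Treat it as a textbook approximation, not Lev's formulation.

```python
def breakdown_pressure(s_a, s_b, pore_pressure, tensile_strength):
    """Classic isotropic linear elastic breakdown pressure for the two
    principal stresses acting normal to the wellbore. For a horizontal
    well drilled along the minimum horizontal stress, these are the
    maximum horizontal stress and the vertical stress, as in the talk.
    All quantities in consistent units, e.g. MPa."""
    s_min, s_max = sorted((s_a, s_b))
    return 3 * s_min - s_max - pore_pressure + tensile_strength

# Hypothetical numbers: 60 and 70 MPa wellbore-normal stresses,
# 30 MPa pore pressure, 5 MPa tensile strength.
print(breakdown_pressure(60, 70, 30, 5))   # 85
```

As the talk emphasized, this isotropic form is exactly what breaks down (so to speak) for real shales, which are anisotropic and inelastic.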

Lastly, we stopped in to look at the posters. But instead of being the fun-fest of awesome geoscience we were looking forward to (we're optimistic people), it was a bit of a downer and made us rather sad. Posters are often a bit unsatisfactory for the presenter: they are difficult to make, and often tucked away in a seldom-visited corner of the conference. But there was no less-frequented corner of San Antonio, and possibly the state of Texas, than the dingy poster hall at SEG this year. There were perhaps 25 people looking at the 400-or-so posters. Like us, most of them were crying.

More posts from SEG 2011.

Randomness and thin beds

Day 2 of the SEG Annual Meeting brought another 93 talks in the morning, and 103 in the afternoon, leaving us bewildered again: how to choose ten or so talks to see? (We have some ideas on this; more on that another day.) Matt tried just sitting through a single session (well, almost), whereas Evan adopted the migrant approach again. These are our picks, just for you.

Stewart Trickett, Kelman

There has never been a dull or difficult talk from Stewart, one of Calgary's smartest and most thoughtful programmer–processors. He has recently addressed the hot topic of 5D interpolation, a powerful process for making the dream of cheap, dense sampling a reality. Today, he explained why we now need to think about optimizing acquisition not for imaging, but for interpolation. And interpolation really likes pseudorandom sampling, because it helps negotiate the terms & conditions of Nyquist and avoid spatial aliasing. He went on to show a 3D survey subsampled then imaged three ways: remove every other shot line, remove every other shot, or remove a random shot from every pair of shots. All reduce the fold to 12% of the full data. The result: pseudorandom sampling wins every time. But don't panic: the difference in the migrated images was much smaller than in the structure stacks.
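To see why regular decimation is riskier than pseudorandom decimation, consider the standard frequency-folding formula: regular sampling maps any component above Nyquist onto a perfectly coherent, and perfectly wrong, lower frequency, whereas irregular sampling breaks that coherence. This little illustration is ours, not Stewart's.

```python
def aliased_frequency(f, fs):
    """Apparent frequency of a component at true frequency f (Hz, or
    cycles/km for spatial sampling) when sampled regularly at rate fs.
    Anything above the Nyquist frequency fs/2 folds back below it."""
    return abs(f - round(f / fs) * fs)

# A 60-cycle component sampled at 80 samples folds to an apparent 20:
print(aliased_frequency(60, 80))   # 20
# Below Nyquist, the frequency survives intact:
print(aliased_frequency(30, 80))   # 30
```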

Gaynor Paton, ffA

In what could have been a product-focused marketing talk, Gaynor did a good job of outlining five easy-to-follow, practical workflows for interpreters working on thin beds. She showed frequency- and phase-based methods that exploit near-tuning, unresolved doublets in the waveform. A nice-looking bandwidth enhancement result was followed up with ffA's new high-resolution spectral decomposition, which we covered recently. Then she showed how negative spikes in instantaneous frequency can reveal subtle doublets in the waveform. This was extended with a skeletonized image, a sort of band-limited reflectivity display. Finally, she showed an interesting display of signed trace slope, which seemed to reveal the extent of just-resolved doublets quite nicely.
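Gaynor's workflows weren't formulaic, but the usual rule of thumb for where thin-bed tuning begins is a quarter of the dominant wavelength. A minimal sketch, with hypothetical numbers:

```python
def tuning_thickness(velocity_m_s, dominant_freq_hz):
    """Approximate thickness (m) at which a bed's top and base reflections
    begin to interfere: a quarter of the dominant wavelength."""
    wavelength = velocity_m_s / dominant_freq_hz
    return wavelength / 4

# e.g. a 3000 m/s interval imaged with a 30 Hz dominant frequency:
print(tuning_thickness(3000, 30))   # 25.0 (metres)
```

Beds thinner than this are the 'unresolved doublets' these frequency- and phase-based methods try to tease apart.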

Scott MacKay, consultant

Scott MacKay shared some of his deep experience with depth imaging, but specially translated for interpreters. And this is only right: depth imaging is first and foremost an interpretive, iterative process, not a product. He gave some basic heuristics, guiding principles for interpreters. The first velocity model should be smooth—really smooth. Iterations should be only incrementally less smooth, 'creeping up' on the solution. Structure should get less, not more, complex with each step. Gathers should be flattish, not flat. Be patient, and let the data speak. And above all, Don't Panic. Always good advice.

More posts about SEG 2011.

Diffractions, Dust, and sub-Hertz imaging

Even the keenest geophysicist (and we are the keenest geophysicists) had to miss 96 presentations today. The first afternoon of talks comprised an impressive 104 talks in 13 sessions. If only 10% of talks are great, you might see one, and you would miss at least nine. Fortunately there are two of us, so we double our chances. These are our highlights.

Will Burnett, UT Austin

Diffractions, usually thought of as noise, emanate predominantly from faults and discontinuities. Will wants not to eliminate them but to use them as a complementary signal to boost imaging. His talk on diffraction velocity analysis described how, instead of picking an exact velocity model, a range of velocities is used to compute independent test images of diffraction events. Because the apex of a diffraction is the same no matter what velocity is applied, a stack of test images results in a concentration of the diffractor at the apex; the remaining events are stacked out. Blending this image with conventionally processed reflection seismic yields a more vivid result. Also, this work was done using Madagascar... yay, open source!

Kris Pister, Dust Networks

The power of mobile devices is impressive, but Dust Networks can build an accelerometer with optical communication and a microcontroller in a 5 mm³ box. The autonomous sensors build a time-synchronized mesh protocol with channel-hopping (yeah, they do!), meaning you end up with an internet-like network that tolerates dead nodes and other failures. Now Dust build such networks of all kinds of sensors, of all sizes, in industrial applications, and surely will soon be appearing in a wellbore or seismic array near you. One to watch.

Rebecca Saltzer, ExxonMobil

Easily the most enthusiastic presentation of the day was a rip-roaring tale from Wyoming. ExxonMobil buried fifty-five low-frequency Guralp CMG3T seismometers at their LaBarge oil and gas field. The devices were arranged in a line pointing towards the Pacific, to ensure a good supply of earthquakes: the source for this grand experiment. The P-waves they intended to image with have a dominant frequency of about 1 Hz, hence the seismometers, with their 0.08 to 50 Hz bandwidth. And image they did: the result was a velocity model with 500 m vertical resolution and good agreement with a 1000-well velocity model.

More posts about SEG 2011.

Frontiers at the Forum

The SEG Forum was the main attraction on Day 1 of the SEG Annual Meeting in San Antonio. Several people commented, however, that the turnout was rather poor, with no more than 400 people sitting in the Lila Cockrell Theatre, even at the start. Perhaps the event needs more publicity. There was plenty of time for questions from the audience, all of which the panel discussed quite candidly.

David Lawrence, Executive VP of Exploration and Commercial at Shell gave, predictably, a rather dry corporate presentation. We understand how presentations like this get hijacked by lawyers and corporate communications departments, but wish more executives would stand up to their captors, especially for a short presentation to a technical audience. Despite his shackles, he had some eyebrow-raising technology to brag about: futuristic autonomous-vehicle marine nodes, and a million-channel sensor network they're developing with HP, of all companies.

Tim Dodson, Executive VP of Exploration at Statoil and once Matt's boss there, seemed similarly held captive by his corporation's presentation sanitizers. Saved by his charisma, Tim characterized Statoil's steady approach in exploration: deep pockets, patience, and being comfortable with risk. They seem to have the same approach to technology innovation, as Tim highlighted their Source Rock from Seismic method for characterizing source rocks and the high-resolution spectral decomposition technology we wrote about recently. Both projects took several years to develop, and have paid off in discoveries like Aldous and Skrugard respectively.

Susan Cunningham, Senior VP of Exploration at Noble Energy, spoke about her company's approach to frontier exploration. Despite her chronic use of buzz-phrases (innovative thinking, integrated objective assessment, partner of choice), Susan gave a spirited outlook on the human angles of Noble's frontier thinking. She discussed Noble's perseverance in the 8.5 Tcf Tamar discovery in the Levant Basin of the Eastern Mediterranean, and went on to describe Noble as a large company in a small company framework, but we're not sure what that means. Is it good?

Carl Trowell, president of WesternGeco and the youngest panelist, was the most engaging (and convincing) speaker. Shell's corporate communications people need to see presentations like this one: more powerful and trustworthy for its candid, personal style. As you'd expect, he had deep insight into where seismic technology is going. He lamented that seismic is not used enough in risk mitigation for frontier wells; for example, borehole seismic-while-drilling, imaged in the time it takes to trip out of the hole, can help predict pore pressure and other hazards in near-real-time. His forward-looking, energetic style was refreshing and inspiring.

It was a slightly dry, but basically up-beat, kick-off to the meeting. Some high-altitude perspective before we helicopter down to the nitty-gritty of the talks this afternoon.

Click here for all the posts about SEG 2011

Bad Best Practice

Applied scientists get excited about Best Practice. New professionals and new hires often ask where 'the manual' is, and senior technical management or chiefs often want to see such documentation being spread and used by their staff. The problem is that the scientists in the middle strata of skill and influence think Best Practice is a difficult, perhaps even ludicrous, concept in applied geoscience. It's too interpretive, too creative.

But promoting good ideas and methods is important for continuous improvement. At the 3P Arctic Conference in Halifax last week, I saw an interesting talk about good seismic acquisition practice in the Canadian Arctic. The presenter was Michael Enachescu of MGM Energy, well known in the industry for his intuitive and integrated approach to petroleum geoscience. He outlined some problems with the term best practice, advocating instead phrases like good practice:

  • There's a strong connotation that it is definitively superlative
  • The corollary to this is that other practices are worse
  • Its existence suggests that there is an infallible authority on the subject (an expert)
  • Therefore the concept stifles innovation and even small steps towards improvement

All this is reinforced by the way Best Practice is usually written and distributed:

  • Out of frustration, a chief commissions a document
  • One or two people build a tour de force, taking 6 months to do it
  • The read-only document is published on the corporate intranet alongside other such documents
  • Its existence is announced and its digestion mandated

Unfortunately, the next part of the story is where things go wrong:

  • Professionals look at the document and find that it doesn't quite apply to their situation
  • Even if it does apply, they are slightly affronted at being told how to do their job
  • People know about it but lack the technology or motivation to change how they were already working
  • Within 3 years there is enough new business, new staff, and new technology that the document is forgotten and obsolete, until a high-up commissions a document...

So the next time you think to yourself, "We need a Best Practice for this", think about trying something different:

  • Forget top-down publishing, and instead seed editable, link-rich documents like wiki pages
  • Encourage discussion and ownership by the technical community, not by management
  • Request case studies, which emphasize practical adaptability, not theory and methodology
  • Focus first on the anti-pattern: common practice that is downright wrong

How do you spread good ideas and methods in your organization? Does it work? How would you improve it?

Pseudogeophysics

Seventy-five years ago, the first paper of the first issue of the journal Geophysics was published: Black magic in geophysical prospecting by Ludwig Blau of the Humble Oil and Refining Company. If you are an exploration geoscientist, you must read it. Then go and get someone else to read it too.

The paper is remarkable for a few reasons, apart from simply being first:

  • it is scientific, but warm and humorous, and utterly compelling
  • it is subtle but devastating in its condemnation of some of the day's technology
  • the critical approach Blau obliquely describes is still relevant to us today

How to crack a nut

There are two parts to the paper: a brief spotter's guide to black magic, followed by eighteen examples from the author's own experience.

Blau's guide to the characteristics of a nutty inventor is timeless. It presages John Baez's wonderful Crackpot Index. Here are the highlights:

  • The inventor has been working alone for many years, usually about 20
  • The inventor has no formal training, regarding this as a hindrance
  • The inventor has many Nobel prize-winning scientist friends
  • None of these friends understand the contraption in question (owing to their hindrances)

Thus it was proved...

Yet more enlightening is Blau's categorization of geophysical technology. He identifies five modes of detection, from the merely implausible to the downright bizarre:

  • Particle radiation, akin to α or β radiation
  • Non-gravitational forcefields, attracting 'bait' oil
  • Radiant vibrations, detectable by skilled divination
  • Electromagnetic waves, readily received by a radio
  • Sexual emanations. No, really.

But it's the vivid descriptions of the contraptions and their inventors that light the paper up. Blau's brief but scathing reviews are so drily delivered, one imagines he must have been a man of few, but golden, words.

Here is Blau describing the conclusion, and coup de grâce, of a hotel meeting with a pair of gentlemen peddling a stick which, when primed with a capsule of oil (or any other sought-after substance), points decisively to the nearest reservoir:

When the “bait” was changed to whiskey, the device in the hands of the inventor stubbornly pointed to a leather bag lying on the bed; the inventor asked his friend how this could possibly be explained since they had finished the last bottle that morning and he had not bought more. Upon opening the bag a pint bottle was revealed and the friend admitted having bought it that afternoon without telling the inventor about it. Thus it was proved that the device was not manipulated or influenced by the operator.

I would give my eye teeth to have been a fly on the wall during that scene.

Houston in 1927

References

Blau, L (1936). Black magic in geophysical prospecting. Geophysics 1 (1). DOI:10.1190/1.1437076

I am very grateful to the Society of Exploration Geophysicists, Tulsa, OK, for permission to reproduce the first page and quote freely from this paper. The quoted passages are copyright of the SEG. The image of Houston, dating from 1927, is in the public domain and was obtained from Wikimedia Commons. The drawings are original.