Diffractions, Dust, and sub-Hertz imaging

Even the keenest geophysicist (and we are the keenest geophysicists) had to miss 96 presentations today. The first afternoon comprised an impressive 104 talks in 13 sessions. If only 10% of talks are great, you might see one, and you would miss at least nine. Fortunately there are two of us, so we double our chances. These are our highlights.

Will Burnett, UT Austin

Diffractions, usually thought of as noise, emanate predominantly from faults and discontinuities. Will wants not to eliminate them but to use them as a complementary signal to boost imaging. His talk on diffraction velocity analysis described how, instead of picking a single exact velocity model, a range of velocities is used to compute independent test images of the diffraction events. Because the apex of a diffraction stays in the same place no matter what velocity is applied, stacking the test images concentrates energy at the apex; the remaining events are stacked out. Blending this image with a conventionally processed reflection image yields a more vivid result. Also, this work was done using Madagascar... yay, open source!

Kris Pister, Dust Networks

The power of mobile devices is impressive, but Dust Networks can build an accelerometer with optical communication and a microcontroller in a 5 mm³ box. The autonomous sensors build a time-synchronized mesh protocol with channel-hopping (yeah, they do!), meaning you end up with an internet-like network that tolerates dead nodes and other failures. Now Dust builds such networks of all kinds of sensors, of all sizes, in industrial applications, and they will surely be appearing in a wellbore or seismic array near you soon. One to watch.

Rebecca Saltzer, ExxonMobil

Easily the most enthusiastic presentation of the day was a rip-roaring tale from Wyoming. ExxonMobil buried fifty-five low-frequency Guralp CMG3T seismometers at their LaBarge oil and gas field. The devices were arranged in a line pointing towards the Pacific, a reliable source of earthquakes: the energy source for this grand experiment. The P-waves they intended to image with have a dominant frequency of about 1 Hz, hence the choice of seismometers, with their 0.08 to 50 Hz bandwidth. And image they did: the result was a velocity model with 500 m vertical resolution and good agreement with a 1000-well velocity model.
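As a sanity check on that resolution figure, the usual rule of thumb puts vertical resolution at a quarter of the dominant wavelength, λ/4 = v/4f. The velocity below is my own round-number assumption for illustration, not a figure from the talk:

```python
# Quarter-wavelength rule of thumb for vertical seismic resolution.
# The velocity is an assumed round number for illustration only;
# it is not a value quoted in the talk.
f = 1.0       # dominant frequency of the teleseismic P-waves, Hz
v = 2000.0    # assumed average P-wave velocity, m/s
resolution = v / (4 * f)   # quarter wavelength: 500.0 m
```

With a faster average velocity the quarter-wavelength estimate would be coarser, so 500 m resolution at 1 Hz is an impressive result.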


Frontiers at the Forum

The SEG Forum was the main attraction on Day 1 of the SEG Annual Meeting in San Antonio. Several people commented that the turnout was rather poor, however, with no more than 400 people sitting in the Lila Cockrell Theatre, even at the start. Perhaps the event needs more publicity. There was plenty of time for questions from the audience, all of which the panel discussed quite candidly.

David Lawrence, Executive VP of Exploration and Commercial at Shell, gave, predictably, a rather dry corporate presentation. We understand how presentations like this get hijacked by lawyers and corporate communications departments, but wish more executives would stand up to their captors, especially for a short presentation to a technical audience. Despite his shackles, he had some eyebrow-raising technology to brag about: futuristic autonomous-vehicle marine nodes, and a million-channel sensor network they're developing with HP, of all companies.

Tim Dodson, Executive VP of Exploration at Statoil and once Matt's boss there, seemed similarly held captive by his corporation's presentation sanitizers. Saved by his charisma, Tim characterized Statoil's steady approach to exploration: deep pockets, patience, and comfort with risk. They seem to take the same approach to technology innovation: Tim highlighted their Source Rock from Seismic method for characterizing source rocks, and the high-resolution spectral decomposition technology we wrote about recently. Both projects took several years to develop, and have paid off in discoveries like Aldous and Skrugard respectively.

Susan Cunningham, Senior VP of Exploration at Noble Energy, spoke about her company's approach to frontier exploration. Despite her chronic use of buzz-phrases (innovative thinking, integrated objective assessment, partner of choice), Susan gave a spirited outlook on the human angles of Noble's frontier thinking. She discussed Noble's perseverance in the Levant Basin of the Eastern Mediterranean, culminating in the 8.5 Tcf Tamar discovery, and went on to describe Noble as 'a large company in a small company framework', but we're not sure what that means. Is it good?

Carl Trowell, president of WesternGeco and the youngest panelist, was the most engaging (and convincing) speaker. Shell's corporate communications people need to see presentations like this one: more powerful and trustworthy for its candid, personal style. As you'd expect, he had deep insight into where seismic technology is going. He lamented that seismic is not used enough in risk mitigation for frontier wells; for example, borehole seismic-while-drilling, imaged in the time it takes to trip out of the hole, can help predict pore pressure and other hazards in near-real-time. His forward-looking, energetic style was refreshing and inspiring.

It was a slightly dry, but basically up-beat, kick-off to the meeting. Some high-altitude perspective before we helicopter down to the nitty-gritty of the talks this afternoon.


Bad Best Practice

Applied scientists get excited about Best Practice. New professionals and new hires often ask where 'the manual' is, and senior technical management or chiefs often want to see such documentation being spread and used by their staff. The problem is that the scientists in the middle strata of skill and influence think Best Practice is a difficult, perhaps even ludicrous, concept in applied geoscience. It's too interpretive, too creative.

But promoting good ideas and methods is important for continuous improvement. At the 3P Arctic Conference in Halifax last week, I saw an interesting talk about good seismic acquisition practice in the Canadian Arctic. The presenter was Michael Enachescu of MGM Energy, well known in the industry for his intuitive and integrated approach to petroleum geoscience. He outlined some problems with the term best practice, advocating instead phrases like good practice:

  • There's a strong connotation that it is definitively superlative
  • The corollary to this is that other practices are worse
  • Its existence suggests that there is an infallible authority on the subject (an expert)
  • Therefore the concept stifles innovation and even small steps towards improvement

All this is reinforced by the way Best Practice is usually written and distributed:

  • Out of frustration, a chief commissions a document
  • One or two people build a tour de force, taking 6 months to do it
  • The read-only document is published on the corporate intranet alongside other such documents
  • Its existence is announced and its digestion mandated

Unfortunately, the next part of the story is where things go wrong:

  • Professionals look at the document and find that it doesn't quite apply to their situation
  • Even if it does apply, they are slightly affronted at being told how to do their job
  • People know about it but lack the technology or motivation to change how they were already working
  • Within 3 years there is enough new business, new staff, and new technology that the document is forgotten about and obsolete, until a high-up commissions a document...

So the next time you think to yourself, "We need a Best Practice for this", think about trying something different:

  • Forget top-down publishing, and instead seed editable, link-rich documents like wiki pages
  • Encourage discussion and ownership by the technical community, not by management
  • Request case studies, which emphasize practical adaptability, not theory and methodology
  • Focus first on the anti-pattern: common practice that is downright wrong

How do you spread good ideas and methods in your organization? Does it work? How would you improve it?

Pseudogeophysics

Seventy-five years ago, the first paper of the first issue of the journal Geophysics was published: Black magic in geophysical prospecting, by Ludwig Blau of the Humble Oil and Refining Company. If you are an exploration geoscientist, you must read it. Then go and get someone else to read it too.

The paper is remarkable for a few reasons, apart from simply being first:

  • it is scientific, but warm and humorous, and utterly compelling
  • it is subtle but devastating in its condemnation of some of the day's technology
  • the critical approach Blau obliquely describes is still relevant to us today

How to crack a nut

There are two parts to the paper: a brief spotter's guide to black magic, followed by eighteen examples from the author's own experience.

Blau's guide to the characteristics of a nutty inventor is timeless. It presages John Baez's wonderful Crackpot Index. Here are the highlights:

  • The inventor has been working alone for many years, usually about 20
  • The inventor has no formal training, regarding this as a hindrance
  • The inventor has many Nobel prize-winning scientist friends
  • None of these friends understand the contraption in question (owing to their hindrances)

Thus it was proved...

Yet more enlightening is Blau's categorization of geophysical technology. He identifies five modes of detection, from the merely implausible to the downright bizarre:

  • Particle radiation, akin to α or β radiation
  • Non-gravitational force fields, attracting 'bait' oil
  • Radiant vibrations, detectable by skilled divination
  • Electromagnetic waves, readily received by a radio
  • Sexual emanations. No, really.

But it's the vivid descriptions of the contraptions and their inventors that light the paper up. Blau's brief but scathing reviews are so drily delivered, one imagines he must have been a man of few, but golden, words.

Here is Blau describing the conclusion, and coup de grâce, of a hotel meeting with a pair of gentlemen peddling a stick which, when primed with a capsule of oil (or any other sought-after substance), points decisively to the nearest reservoir:

When the “bait” was changed to whiskey, the device in the hands of the inventor stubbornly pointed to a leather bag lying on the bed; the inventor asked his friend how this could possibly be explained since they had finished the last bottle that morning and he had not bought more. Upon opening the bag a pint bottle was revealed and the friend admitted having bought it that afternoon without telling the inventor about it. Thus it was proved that the device was not manipulated or influenced by the operator.

I would give my eye teeth to have been a fly on the wall during that scene.

Houston in 1927

References

Blau, L (1936). Black magic in geophysical prospecting. Geophysics 1 (1). DOI:10.1190/1.1437076

I am very grateful to the Society of Exploration Geophysicists, Tulsa, OK, for permission to reproduce the first page and quote freely from this paper. The quoted passages are copyright of the SEG. The image of Houston, dating from 1927, is in the public domain and was obtained from Wikimedia Commons. The drawings are original.

Geophysical stamps 4: Seismology

This is the last in a series of posts about some stamps I bought on eBay in May. I don't collect stamps, but these were irresistible to me. They are 1980-vintage East German stamps, not with cartoons or schematic illustrations but precisely draughted drawings of geophysical methods. I have already written about the gravimeter, the sonic tool, and the geophone stamps; today it's time to finish off and cover the 50 pfennig stamp, depicting global seismology.

The 50 pfennig stamp in the series of four shows not an instrument, but the method of deep-earth seismology. Earthquakes' seismic traces, left-most, are the basic pieces of data. Seismologists analyse the paths of these signals through the earth's crust (Erdkruste), mantle (Mantel) and core (Erdkern), right-most. The result is a model of the earth's deep interior, centre. Erdkrustenforschung translates as earth crust surveying. The actual size of the stamp is 43 × 26 mm.

To petroleum geophysicists and interpreters, global seismology may seem like a poor sibling of reflection seismology. But the science began with earthquake monitoring, which is centuries old. Earthquakes are the natural source of seismic energy for global seismological measurements; Zoeppritz wrote his equations about earthquake waves. (I don't know, but I can imagine seismologists feeling a guilty pang of anticipation when hearing of even quite deadly earthquakes.)

The M9.2 Sumatra-Andaman earthquake of 2004 lasted for an incredible 500 seconds, compared to a few seconds or tens of seconds for most earthquakes felt by humans. Giant events like this are rare (about once a decade), and especially valuable because of the broad band of frequencies and very high amplitudes they generate. This allows high-fidelity detection by precision digital instruments like the Streckeisen STS-1 seismometer, positioned all over the world in networks like the United States' Global Seismographic Network, and the meta-network coordinated by the Federation of Digital Seismograph Networks, or FDSN. Wavefields like these demand truly global receiver arrays.

The basic structure of the earth was famously elucidated decades ago by these patterns of wave reflection and refraction through the earth's fundamentally concentric spheres of the solid inner core, liquid outer core, viscous mantle, and solid crust. For example, the apparent inability of the outer core to support S-waves is the primary evidence for its interpretation as a liquid. Today, global seismologists are more concerned with the finer points of this structure, and eking more resolution out of the intrinsically cryptic filter that is the earth. Sound familiar? What we do in exploration seismology is just a high-resolution, narrow-band, controlled-source extension of these methods. 
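The S-wave argument is worth unpacking: shear-wave velocity is Vs = √(μ/ρ), where μ is the shear modulus and ρ the density, and a fluid cannot sustain shear stress, so μ = 0 and Vs vanishes. A one-liner makes the point; the numbers below are illustrative round figures, not measured outer-core properties:

```python
import math

# Vs = sqrt(mu / rho): no shear modulus, no shear wave. The density
# and the solid-like shear modulus are illustrative round numbers only.
def vs(mu, rho):
    """Shear-wave velocity (m/s) from shear modulus (Pa) and density (kg/m^3)."""
    return math.sqrt(mu / rho)

rho = 10000.0           # kg/m^3, illustrative deep-earth density
v_solid = vs(80e9, rho) # solid-like mu gives a few km/s
v_fluid = vs(0.0, rho)  # fluid: mu = 0, so Vs = 0 and S-waves cannot propagate
```

Hence the S-wave 'shadow' cast by the outer core is read as direct evidence of its liquid state.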

Wherefore art thou, Expert?

I don't buy the notion that we should be competent at everything we do. Unless you have chosen to specialize, as a petrophysicist or geophysical analyst perhaps, you are a generalist. Perhaps you are the manager of an asset team, or a lone geophysicist in a field development project, or a junior geologist looking after a drilling program. You are certainly being handed tasks you have never done before, and being asked to think about problems you didn't even know existed this time last year. If you're anything like me, you are bewildered at least 50% of the time.

In this post, I take a look at some problems with assuming technical professionals can be experts at everything, especially in this world of unconventional plays and methods. And I even have some ideas about what geoscientists, specialists and service companies can do about them...

Read More

Niobrara shale field trip

On my last day in Colorado, I went on a field trip to learn about the geology of the area. The main event was a trip to the Lyons Cemex quarry north of Boulder, where they mine the Niobrara formation to make cement. Interestingly, the same formation is being penetrated for oil and gas beneath the surface only a few thousand metres away. Apparently, the composition of the Niobrara is not desirable for construction or building materials, but it makes the ideal cement for drilling and completion operations. I find it almost poetic that the western, uplifted part of the formation is mined so that the eastern, deeper parts can be drilled; a geologic skin-graft, of sorts...
Read More

The last chat chart

The 1IWRP technical program closed with a one-hour brainstorming session, an attempt to capture the main issues and ideas going forward. This was great stuff, and I was invited to jot down the bombardment of shout-outs from the crowd.

Admittedly, no list is fully comprehensive, and this flip chart is almost laughable in its roughness. However, I think it represents the diversity of this crowd and the relevant issues that people will be working on in the future. The main points were:

  • Creating a common model and data sharing
  • The future of digital rock physics
  • Dealing with upscaling and scale-dependent measurements
  • The use of rock physics for improving sub-salt AVO analyses
  • Strengthening the connection between rock physics and geomechanical applications

I have transcribed this into a more legible form, and put some expanded commentary on AgileWiki if you want to read more about these points.

Do you disagree with anything on this list? Have we missed something?

More 1IWRP highlights

As I reported on Wednesday, I've been at 1IWRP, a workshop on rock physics in the petroleum industry. Topics ranged from lab core studies to 3D digital scanners, and from seismic attenuation and dispersion to shales and anisotropy. Rock physics truly crosses a lot of subject areas.

Here are a few of the many great talks that really stood out for me:

Mark Chapman from the University of Edinburgh presented a new formulation for frequency-dependent AVO analysis. He suggested that if a proper rock physics model of the reservoir is specified, frequency-dependent information can be decomposed from seismic gathers for improved reservoir characterization. Some folks in the crowd warned that the utility of this work might be limited to select cases with a full-band impedance change, but his method appears to be a step beyond the traditional AVO workflow.

Arthur Cheng from Halliburton talked about modeling techniques to estimate anisotropic parameters from borehole measurements. He described the state of the art in acoustic logging tools, and used a ray-traced VSP forward model to show a significant smear of reflection points through an anisotropic earth layer. He touched on the importance of close interaction between service companies and end users, especially those working in complex environments. In particular: service companies have a good understanding of data precision and accuracy, but it is usually not adequately transferred to the interpreter.

Colin Sayers from Schlumberger presented several talks, but I really enjoyed what he had to say about sonic and seismic anisotropy and how it is relevant to characterizing shale gas reservoirs. Fracture propagation depends on the 3D stress state in the rock: hard to capture with a 1D earth model. He showed an example of how hydraulic fracture behaviour could be more accurately predicted by incorporating anisotropic, stress-dependent elastic properties. I hope this insight permeates throughout the engineering community.

Rob Lander from Geocosm showed some fresh-out-of-the-oven simulations of coupled diagenesis and rock physics models for predicting reservoir properties away from wells. His company's workflow has a basis in petrography, integrating cathodoluminescence microscopy and diagenetic modeling. Really inspiring and integrated stuff. I submit to you that this presentation would be equally enjoyed at a meeting of AAPG, SPE, SPWLA, SEG, or SCA — that's not something you can say about every talk.

Every break heralded a new discussion. The delegates were very actively engaged. 

Today, I am going on a field trip to the Niobrara Shale Quarry. After four days indoors, I'm looking forward to getting outside and hammering some rocks! 

Digital rocks and accountability

There were three main sessions on the first day of the First International Workshop on Rock Physics, 1IWRP: Experimental methods, Digital rock physics, and Methods in rock physics, a softer, more philosophical session on perspectives in the lab and in the field. There have been several discussion sessions too, occurring after every five presentations or so, which has been a refreshing addition to the program. I am looking for talks that will change the way we do things, and two really stood out for me.

Mark Knackstedt from Digitalcore in Australia gave a visually stunning presentation on the state of the art in digital rock physics. You can browse Digitalcore's website and view some of the animations that he showed. A few members of the crowd were skeptical about the nuances of characterizing microcracks and grain contacts near or below the resolution limits, as these tiny elements have a dominating role on a material's effective properties.

In my opinion, in order to get beyond 3D visualizations, and the computational aspect of pixel counting, digital rock physicists need to integrate with petrophysicists to calibrate with logging tools. One exciting opportunity is deducing a link between laboratory and borehole-based NMR measurements for pore space and fluid characterization. 

In an inspired and slightly offbeat talk, Bill Murphy III from e4sciences challenged the community to make the profession better by increasing accountability. Being accountable means acknowledging what you know and what you don't know. He offered Atul Gawande's surgical writings as a model for all imperfect sciences. Instead of occupying a continuum from easy to hard, rock physics problems span a ternary space from simple to complicated to complex. Simple is something that can be described by a recipe or a definite measurement, complicated is like going to the moon, and complex is like raising a child, where there's an element of unpredictability. Part of our profession should be recognizing where our problems fall in this ternary space, and that should drive how we deal with these problems.

He also explained that ours is a science full of paradoxes:

  • Taking more measurements means that we need to make more hypotheses, not fewer
  • Ubiquitous uncertainty must be met with increased precision and rigor
  • Acknowledging errors is essential for professional and scientific accountability

The next time you are working on a problem, why not estimate where it plots in this ternary space? It's likely to contain some combination of all three, and it might evolve as the project progresses. And ask your colleagues where they would place the same problem—it might surprise you.