Workshop? Talkshop

Day 4 of the SEG Annual Meeting. I attended the workshop entitled Geophysical data interpretation for unconventional reservoirs. It was really about the state of the art of seismic technologies for shale gas exploration and exploitation, but an emergent theme was the treatment of the earth as an engineering material, as opposed to an acoustic or elastic medium.

Harvey Goodman from Chevron started the workshop by asking the packed room, "Are there any engineers in the room?" Hilariously, a single lonesome hand was raised. "Well," he said, "this talk is for you." Perhaps this wasn't the venue for it; so much for spreading cross-disciplinary love and the geophysical technical vernacular.

Mark Zoback from Stanford presented decades' worth of laboratory measurements on the elastic and plastic properties of shales, specifically the effects of illite and TOC concentrations on mechanical stiffness and creep. When it came to questions, he provided the most competent and cogent responses of the day: every one was gold. Your go-to guy for shale geomechanics.

Marita Gading of Statoil presented some fascinating technology called Source Rock from Seismic (we mentioned this on Monday)—a way to estimate total organic carbon from seismic for basin modeling and play evaluation. She listed the factors controlling the acoustic properties of shales as:

  1. organic content;
  2. compaction or porosity;
  3. lithotype and mineral composition;
  4. seismic to microscale anisotropy.

She showed an empirically derived acoustic impedance transform coupled with more interpretive methods, and the results are compelling. It's not clear how well this might work in ancient shales onshore, but it has apparently worked for Statoil in younger, offshore basins.

Galen Treadgold from Global Geophysical gave a visually stunning presentation showing the value of expansive data sets in the Eagle Ford shale. He showed 1000 km² of 3D seismic that had been stitched together, highlighting the need to look at a regional picture. Patchwork data fails to give the same clarity of variation in mechanical stratigraphy.

The session shifted to the state of microseismic technology and 'getting beyond the dots'. Speakers from rival companies MicroSeismic, ESG Solutions, and Pinnacle described how microseismic waveforms are now being used to resolve moment tensors. These provide not only the location and magnitude but also the failure characteristic of every single event. While limited by uncertainty, they may be the way to get the industry beyond the prevailing bi-wing paradigm.

The session was a nice blend of disciplines, with ample time for question and answer. I struggle, though, to call it a workshop; it felt like a continuation of the huge number of talks that have been going on in the same room all week. Have you ever been to a stellar workshop? What made it great?

More from our SEG 2011 experience.

Curvelets, dreamlets, and a few tears

Day 3 of the SEG Annual Meeting came and went in a bit of a blur. Delegates were palpably close to saturation, getting harder to impress. Most were no longer taking notes, happy to let the geophysical tide of acoustic signal, and occasional noise, wash over them. Here's what we saw.

Gilles Hennenfent, Chevron

I (Evan) loved Gilles's talk Interpretive noise attenuation in the curvelet domain. For someone who is merely a spectator in the arena of domain transforms and noise removal techniques, I was surprised to find it digestible and well-paced. Coherent noise can be difficult to remove independently of coherent signal, but using dyadic partitions of the frequency-wavenumber (f-k) domain, sectors called curvelets can be muted or amplified to reduce noise and increase signal. Curvelets have popped up in a few talks, because they can be a very sparse representation of seismic data.
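Gilles's curvelet workflow is well beyond a blog post, but the flavour of dip-sector muting can be sketched with a plain 2D Fourier transform. This toy is ours, not his (and the 20% dip threshold is arbitrary): we build a flat event plus a steeply dipping "noise" event, then mute everything in f-k space outside a near-flat sector.

```python
import numpy as np

nt, nx = 256, 64
t = np.arange(nt)[:, None]   # time samples
x = np.arange(nx)[None, :]   # trace positions

flat_event = np.cos(2 * np.pi * (16 / nt) * t) * np.ones((1, nx))
dipping_noise = np.cos(2 * np.pi * ((16 / nt) * t - (8 / nx) * x))
data = flat_event + dipping_noise

# Transform to the f-k domain and keep only near-flat dips:
# small |wavenumber| relative to |frequency|.
fk = np.fft.fft2(data)
f = np.fft.fftfreq(nt)[:, None]
k = np.fft.fftfreq(nx)[None, :]
mask = np.abs(k) <= 0.2 * np.abs(f) + 1e-12   # crude hard sector mute, not a curvelet
filtered = np.fft.ifft2(fk * mask).real
```

Because both events here are exact Fourier modes, the hard mute recovers the flat event almost perfectly; real data needs the smooth, overlapping dyadic partitions Gilles described, precisely to avoid the ringing a hard cut would cause.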

Speaking of exotic signal decompositions, Ru-Shan Wu, University of California at Santa Cruz, took his audience to new heights, or depths, or widths, or... something. Halfway through his description of the seismic wavefield as a light-cone in 4D Fourier time-space best characterized by drumbeat beamlets—or dreamlets—we realized that we'd fallen through a wormhole in the seismic continuum and stopped taking notes.

Lev Vernik, Marathon

Lev dished a delicious spread of tidbits crucial for understanding the geomechanical control on hydraulic fracture stimulations. It's common practice to drill parallel to the minimum horizontal stress direction to optimize fracture growth away from the well location. For isotropic linear elastic fracture behaviour, the breakdown pressure of a formation is a function of the maximum horizontal stress, the vertical stress, the pore pressure, and the fracture toughness. Unfortunately, rocks we'd like to frack are not isotropic, and need to be understood in terms of anisotropy and inelastic strains.
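For reference, the isotropic relation Lev alluded to is usually quoted in the Hubbert–Willis form. This is a hedged sketch (sign conventions vary between authors, and we've written the tensile strength term where the fracture toughness effect enters); for a horizontal well drilled along the minimum horizontal stress, the stresses acting in the plane normal to the well are the maximum horizontal stress and the vertical stress, hence their appearance above:

```latex
% Breakdown pressure of an isotropic, linear elastic, impermeable formation
% (horizontal well drilled along \sigma_{h\min}):
P_b = 3\,\sigma_{\min} - \sigma_{\max} - P_p + T_0
% \sigma_{\min}, \sigma_{\max}: smaller and larger of \sigma_{H\max} and \sigma_V
% P_p: pore pressure; T_0: tensile strength
```

Lev's point is that the anisotropy and inelasticity of real shales break the assumptions behind every term in this tidy expression.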

Lastly, we stopped in to look at the posters. But instead of being the fun-fest of awesome geoscience we were looking forward to (we're optimistic people), it was a bit of a downer and made us rather sad. Posters are often a bit unsatisfactory for the presenter: they are difficult to make, and often tucked away in a seldom-visited corner of the conference. But there was no less-frequented corner of San Antonio, and possibly the state of Texas, than the dingy poster hall at SEG this year. There were perhaps 25 people looking at the 400-or-so posters. Like us, most of them were crying.

More posts from SEG 2011.

Randomness and thin beds

Day 2 of the SEG Annual Meeting brought another 93 talks in the morning, and 103 in the afternoon, leaving us bewildered again: how to choose ten or so talks to see? (We have some ideas on this; more on that another day.) Matt tried just sitting through a single session (well, almost), whereas Evan adopted the migrant approach again. These are our picks, just for you.

Stewart Trickett, Kelman

There has never been a dull or difficult talk from Stewart, one of Calgary's smartest and most thoughtful programmer–processors. He has recently addressed the hot topic of 5D interpolation, a powerful process for making the dream of cheap, dense sampling a reality. Today, he explained why we now need to think about optimizing acquisition not for imaging, but for interpolation. And interpolation really likes pseudorandom sampling, because it helps negotiate the terms & conditions of Nyquist and avoid spatial aliasing. He went on to show a 3D survey subsampled and then imaged three ways: remove every other shot line, remove every other shot, or remove a random shot from every pair of shots. All reduce the fold to 12% of the full data. The result: pseudorandom sampling wins every time. But don't panic: the difference in the migrated images was much smaller than in the structure stacks.
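Stewart's point about pseudorandom sampling can be illustrated with a one-dimensional toy of our own (not his experiment): regularly decimating a steeply dipping event folds its energy into coherent aliases as strong as the event itself, while randomly dropping the same number of samples smears that energy into weak, incoherent noise that interpolation can reject.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
t = np.arange(n)
event = np.cos(2 * np.pi * (200 / n) * t)   # wavenumber well above the Nyquist
                                            # of a 4x-decimated grid

def amplitude_spectrum(keep):
    """Zero the dropped samples and normalize by the live-sample count."""
    return np.abs(np.fft.rfft(np.where(keep, event, 0.0))) / keep.sum()

regular = np.zeros(n, bool)
regular[::4] = True                          # keep every 4th sample
random_keep = np.zeros(n, bool)
random_keep[rng.choice(n, regular.sum(), replace=False)] = True

spec_reg = amplitude_spectrum(regular)       # strong coherent aliases appear
spec_rnd = amplitude_spectrum(random_keep)   # alias energy spread into weak noise
```

In the regularly decimated spectrum the alias peaks are as strong as the true event; in the random one the true peak (bin 200) survives and the spurious energy is several times weaker and incoherent.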

Gaynor Payton, ffA

In what could have been a product-focused marketing talk, Gaynor did a good job of outlining five easy-to-follow, practical workflows for interpreters working on thin beds. She showed frequency- and phase-based methods that exploit near-tuning, unresolved doublets in the waveform. A nice-looking bandwidth enhancement result was followed up with ffA's new high-resolution spectral decomposition, which we covered recently. Then she showed how negative spikes in instantaneous frequency can reveal subtle doublets in the waveform. This was extended with a skeletonized image, a sort of band-limited reflectivity display. Finally, she showed an interesting display of signed trace slope, which seemed to reveal the extent of just-resolved doublets quite nicely.

Scott MacKay, consultant

Scott MacKay shared some of his deep experience with depth imaging, but specially translated for interpreters. And this is only right: depth imaging is first and foremost an interpretive, iterative process, not a product. He gave some basic heuristics, guiding principles for interpreters. The first velocity model should be smooth—really smooth. Iterations should be only incrementally less smooth, 'creeping up' on the solution. Structure should get less, not more, complex with each step. Gathers should be flattish, not flat. Be patient, and let the data speak. And above all, Don't Panic. Always good advice.

More posts about SEG 2011.

Diffractions, Dust, and sub-Hertz imaging

Even the keenest geophysicist (and we are the keenest geophysicists) had to miss 96 presentations today. The first afternoon comprised an impressive 104 talks in 13 sessions. If only 10% of talks are great, you might see one, and you would miss at least nine. Fortunately there are two of us, so we double our chances. These are our highlights.

Will Burnett, UT Austin

Diffractions, usually thought of as noise, emanate predominantly from faults and discontinuities. Will wants not to eliminate them but to use them as a complementary signal to boost imaging. His talk on diffraction velocity analysis described how, instead of picking an exact velocity model, a range of velocities are used to compute independent test images of diffraction events. Because the apex of a diffraction is the same no matter what velocity is applied, a stack of test images results in a concentration of the diffractor at the apex; the remaining events are stacked out. Blending this image with a reflection processed seismic yields a more vivid image. Also, this work was done using Madagascar... yay, open source!
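The key fact Will leans on—that a diffraction's apex doesn't move when you get the velocity wrong—is easy to verify with a toy traveltime curve (our sketch, with made-up numbers):

```python
import numpy as np

x = np.linspace(-1000.0, 1000.0, 201)   # surface positions (m), 10 m spacing
x0, t0 = 120.0, 0.8                     # diffractor lateral position (m) and
                                        # zero-offset two-way time (s)

def diffraction_traveltime(v):
    """Two-way time to a point diffractor: a hyperbola with its apex at x0."""
    return np.sqrt(t0**2 + (2 * (x - x0) / v)**2)

# Whatever (wrong) velocity we test, the minimum of the curve stays at x0,
# so a stack of test images piles up energy at the apex.
apexes = [x[np.argmin(diffraction_traveltime(v))] for v in (1500.0, 2500.0, 4000.0)]
```

Stacking test images over a fan of velocities therefore reinforces the apex while the mispositioned flanks cancel, which is how the method concentrates the diffractors.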

Kris Pister, Dust Networks

The power of mobile devices is impressive, but Dust Networks can build an accelerometer with optical communication and a microcontroller in a 5 mm³ box. The autonomous sensors build a time-synchronized mesh protocol with channel-hopping (yeah, they do!), meaning you end up with an internet-like network that tolerates dead nodes and other failures. Now Dust builds such networks of all kinds of sensors, of all sizes, in industrial applications, and they will surely soon be appearing in a wellbore or seismic array near you. One to watch.

Rebecca Saltzer, ExxonMobil

Easily the most enthusiastic presentation of the day was a rip-roaring tale from Wyoming. ExxonMobil buried fifty-five low-frequency Güralp CMG-3T seismometers at their LaBarge oil and gas field. The devices were arranged in a line pointing towards the Pacific, to ensure a good supply of earthquakes, the source for this grand experiment. The P-waves they intended to image with have a dominant frequency of about 1 Hz, hence the seismometers, with their 0.08 to 50 Hz bandwidth. And image they did: the result was a velocity model with 500 m vertical resolution and good agreement with a 1000-well velocity model.

More posts about SEG 2011.

Frontiers at the Forum

The SEG Forum was the main attraction on Day 1 of the SEG Annual Meeting in San Antonio. Several people commented that the turnout was rather poor, however, with no more than 400 people sitting in the Lila Cockrell Theatre, even at the start. Perhaps the event needs more publicity. There was plenty of time for questions from the audience, all of which the panel discussed quite candidly.

David Lawrence, Executive VP of Exploration and Commercial at Shell, gave, predictably, a rather dry corporate presentation. We understand how presentations like this get hijacked by lawyers and corporate communications departments, but wish more executives would stand up to their captors, especially for a short presentation to a technical audience. Despite his shackles, he had some eyebrow-raising technology to brag about: futuristic autonomous-vehicle marine nodes, and a million-channel sensor network they're developing with HP, of all companies.

Tim Dodson, Executive VP of Exploration at Statoil and once Matt's boss there, seemed similarly held captive by his corporation's presentation sanitizers. Saved by his charisma, Tim characterized Statoil's steady approach in exploration: deep pockets, patience, and being comfortable with risk. They seem to have the same approach to technology innovation, as Tim highlighted their Source Rock from Seismic method for characterizing source rocks and the high-resolution spectral decomposition technology we wrote about recently. Both projects took several years to develop, and have paid off in discoveries like Aldous and Skrugard respectively.

Susan Cunningham, Senior VP of Exploration at Noble Energy, spoke about her company's approach to frontier exploration. Despite her chronic use of buzz-phrases (innovative thinking, integrated objective assessment, partner of choice), Susan gave a spirited outlook on the human angles of Noble's frontier thinking. She discussed the perseverance behind Noble's 8.5 Tcf Tamar discovery in the Levant Basin of the Eastern Mediterranean, and went on to describe Noble as a large company in a small company framework, but we're not sure what that means. Is it good?

Carl Trowell, president of WesternGeco and the youngest panelist, was the most engaging (and convincing) speaker. Shell's corporate communications people need to see presentations like this one: more powerful and trustworthy for its candid, personal style. As you'd expect, he had deep insight into where seismic technology is going. He lamented that seismic is not used enough in risk mitigation for frontier wells; for example, borehole seismic-while-drilling, imaged in the time it takes to trip out of the hole, can help predict pore pressure and other hazards in near-real time. His forward-looking, energetic style was refreshing and inspiring.

It was a slightly dry, but basically up-beat, kick-off to the meeting. Some high-altitude perspective before we helicopter down to the nitty-gritty of the talks this afternoon.

More posts about SEG 2011.

Bad Best Practice

Applied scientists get excited about Best Practice. New professionals and new hires often ask where 'the manual' is, and senior technical management or chiefs often want to see such documentation being spread and used by their staff. The problem is that the scientists in the middle strata of skill and influence think Best Practice is a difficult, perhaps even ludicrous, concept in applied geoscience. It's too interpretive, too creative.

But promoting good ideas and methods is important for continuous improvement. At the 3P Arctic Conference in Halifax last week, I saw an interesting talk about good seismic acquisition practice in the Canadian Arctic. The presenter was Michael Enachescu of MGM Energy, well known in the industry for his intuitive and integrated approach to petroleum geoscience. He outlined some problems with the term best practice, advocating instead phrases like good practice:

  • There's a strong connotation that it is definitively superlative
  • The corollary to this is that other practices are worse
  • Its existence suggests that there is an infallible authority on the subject (an expert)
  • Therefore the concept stifles innovation and even small steps towards improvement

All this is reinforced by the way Best Practice is usually written and distributed:

  • Out of frustration, a chief commissions a document
  • One or two people build a tour de force, taking 6 months to do it
  • The read-only document is published on the corporate intranet alongside other such documents
  • Its existence is announced and its digestion mandated

Unfortunately, the next part of the story is where things go wrong:

  • Professionals look at the document and find that it doesn't quite apply to their situation
  • Even if it does apply, they are slightly affronted at being told how to do their job
  • People know about it but lack the technology or motivation to change how they were already working
  • Within 3 years there is enough new business, new staff, and new technology that the document is forgotten about and obsolete, until a high-up commissions a document...

So the next time you think to yourself, "We need a Best Practice for this", think about trying something different:

  • Forget top-down publishing, and instead seed editable, link-rich documents like wiki pages
  • Encourage discussion and ownership by the technical community, not by management
  • Request case studies, which emphasize practical adaptability, not theory and methodology
  • Focus first on the anti-pattern: common practice that is downright wrong

How do you spread good ideas and methods in your organization? Does it work? How would you improve it?

Pseudogeophysics

Seventy-five years ago, the first paper of the first issue of the journal Geophysics was published: Black magic in geophysical prospecting, by Ludwig Blau of the Humble Oil and Refining Company. If you are an exploration geoscientist, you must read it. Then go and get someone else to read it too.

The paper is remarkable for a few reasons, apart from simply being first:

  • it is scientific, but warm and humorous, and utterly compelling
  • it is subtle but devastating in its condemnation of some of the day's technology
  • the critical approach Blau obliquely describes is still relevant to us today

How to crack a nut

There are two parts to the paper: a brief spotter's guide to black magic, followed by eighteen examples from the author's own experience.

Blau's guide to the characteristics of a nutty inventor is timeless. It presages John Baez's wonderful Crackpot Index. Here are the highlights:

  • The inventor has been working alone for many years, usually about 20
  • The inventor has no formal training, regarding this as a hindrance
  • The inventor has many Nobel prize-winning scientist friends
  • None of these friends understand the contraption in question (owing to their hindrances)

Thus it was proved...

Yet more enlightening is Blau's categorization of geophysical technology. He identifies five modes of detection, from the merely implausible to the downright bizarre:

  • Particle radiation, akin to α or β radiation
  • Non-gravitational forcefields, attracting 'bait' oil
  • Radiant vibrations, detectable by skilled divination
  • Electromagnetic waves, readily received by a radio
  • Sexual emanations. No, really.

But it's the vivid descriptions of the contraptions and their inventors that light the paper up. Blau's brief but scathing reviews are so drily delivered, one imagines he must have been a man of few, but golden, words.

Here is Blau describing the conclusion, and coup de grâce, of a hotel meeting with a pair of gentlemen peddling a stick which, when primed with a capsule of oil (or any other sought-after substance), points decisively to the nearest reservoir:

When the “bait” was changed to whiskey, the device in the hands of the inventor stubbornly pointed to a leather bag lying on the bed; the inventor asked his friend how this could possibly be explained since they had finished the last bottle that morning and he had not bought more. Upon opening the bag a pint bottle was revealed and the friend admitted having bought it that afternoon without telling the inventor about it. Thus it was proved that the device was not manipulated or influenced by the operator.

I would give my eye teeth to have been a fly on the wall during that scene.

Houston in 1927

References

Blau, L (1936). Black magic in geophysical prospecting. Geophysics 1 (1). DOI:10.1190/1.1437076

I am very grateful to the Society of Exploration Geophysicists, Tulsa, OK, for permission to reproduce the first page and quote freely from this paper. The quoted passages are copyright of the SEG. The image of Houston, dating from 1927, is in the public domain and was obtained from Wikimedia Commons. The drawings are original.

Geophysical stamps 4: Seismology

This is the last in a series of posts about some stamps I bought on eBay in May. I don't collect stamps, but these were irresistible to me. They are 1980-vintage East German stamps, not with cartoons or schematic illustrations but precisely draughted drawings of geophysical methods. I have already written about the gravimeter, the sonic tool, and the geophone stamps; today it's time to finish off and cover the 50 pfennig stamp, depicting global seismology.

The 50 pfennig stamp in the series of four shows not an instrument, but the method of deep-earth seismology. Earthquakes' seismic traces, left-most, are the basic pieces of data. Seismologists analyse the paths of these signals through the earth's crust (Erdkruste), mantle (Mantel) and core (Erdkern), right-most. The result is a model of the earth's deep interior, centre. Erdkrustenforschung translates as earth crust surveying. The actual size of the stamp is 43 × 26 mm.

To petroleum geophysicists and interpreters, global seismology may seem like a poor sibling of reflection seismology. But the science began with earthquake monitoring, which is centuries old. Earthquakes are the natural source of seismic energy for global seismological measurements; Zoeppritz wrote his equations about earthquake waves. (I don't know, but I can imagine seismologists feeling a guilty pang of anticipation when hearing of even quite deadly earthquakes.)

The M9.2 Sumatra-Andaman earthquake of 2004 lasted for an incredible 500 seconds—compared to a few seconds or tens of seconds for most earthquakes felt by humans. Giant events like this are rare (once a decade), and especially valuable because of the broad band of frequencies and very high amplitudes they generate. This allows high-fidelity detection by precision digital instruments like the Streckeisen STS-1 seismometer, positioned all over the world in networks like the United States' Global Seismographic Network, and the meta-network coordinated by the Federation of Digital Seismograph Networks, or FDSN. And these wavefields need global receiver arrays. 

The basic structure of the earth was famously elucidated decades ago by these patterns of wave reflection and refraction through the earth's fundamentally concentric spheres of the solid inner core, liquid outer core, viscous mantle, and solid crust. For example, the apparent inability of the outer core to support S-waves is the primary evidence for its interpretation as a liquid. Today, global seismologists are more concerned with the finer points of this structure, and eking more resolution out of the intrinsically cryptic filter that is the earth. Sound familiar? What we do in exploration seismology is just a high-resolution, narrow-band, controlled-source extension of these methods. 

Wherefore art thou, Expert?

I don't buy the notion that we should be competent at everything we do. Unless you have chosen to specialize, as a petrophysicist or geophysical analyst perhaps, you are a generalist. Perhaps you are the manager of an asset team, or a lone geophysicist in a field development project, or a junior geologist looking after a drilling program. You are certainly being handed tasks you have never done before, and being asked to think about problems you didn't even know existed this time last year. If you're anything like me, you are bewildered at least 50% of the time.

In this post, I take a look at some problems with assuming technical professionals can be experts at everything, especially in this world of unconventional plays and methods. And I even have some ideas about what geoscientists, specialists and service companies can do about them...

Read More

Niobrara shale field trip

Mike Batzle explaining rock physics in the field

On my last day in Colorado, I went on a field trip to learn about the geology of the area. The main event was a trip to the Lyons Cemex quarry north of Boulder, where they mine the Niobrara formation to make cement. Interestingly, the same formation is being penetrated for oil and gas beneath the surface only a few thousand metres away. Apparently, the composition of the Niobrara is not desirable for construction or building materials, but it makes the ideal cement for drilling and completion operations. I find it almost poetic that the western, uplifted part of the formation is mined so that the eastern, deeper parts can be drilled; a geologic skin-graft, of sorts...
Read More