Randomness and thin beds

Day 2 of the SEG Annual Meeting brought another 93 talks in the morning, and 103 in the afternoon, leaving us bewildered again: how to choose ten or so talks to see? (We have some ideas on this; more on that another day.) Matt tried just sitting through a single session (well, almost), whereas Evan adopted the migrant approach again. These are our picks, just for you.

Stewart Trickett, Kelman

There has never been a dull or difficult talk from Stewart, one of Calgary's smartest and most thoughtful programmer–processors. He has recently addressed the hot topic of 5D interpolation, a powerful process for making the dream of cheap, dense sampling a reality. Today, he explained why we now need to think about optimizing acquisition not for imaging, but for interpolation. And interpolation really likes pseudorandom sampling, because it helps negotiate the terms & conditions of Nyquist and avoid spatial aliasing. He went on to show a 3D survey subsampled and then imaged three ways: remove every other shot line, remove every other shot, or remove a random shot from every pair of shots. All reduce the fold to 12% of the full data. The result: pseudorandom sampling wins every time. But don't panic: the difference in the migrated images was much smaller than in the structure stacks.
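Here's a toy numerical sketch of the idea (ours, not Stewart's; the geometry and numbers are invented): kill half the traces on a line either regularly or pseudorandomly, and compare the wavenumber spectra of a single steeply dipping event.

```python
import numpy as np

np.random.seed(0)
ntr = 256                                   # traces along a line
x = np.arange(ntr)
signal = np.cos(2 * np.pi * 76 * x / ntr)   # steeply dipping event: 76 cycles across the line

regular = signal.copy()
regular[1::2] = 0.0                         # remove every other trace

random_kill = signal.copy()
for pair in range(0, ntr, 2):               # remove one trace at random from every pair
    random_kill[pair + np.random.randint(2)] = 0.0

def spectrum(s):
    return np.abs(np.fft.rfft(s)) / len(s)

for name, s in [("regular", regular), ("pseudorandom", random_kill)]:
    peaks = np.sort(spectrum(s))
    print(f"{name:12s}  two largest spectral peaks: {peaks[-1]:.3f}, {peaks[-2]:.3f}")
```

Regular decimation concentrates the missing energy into a coherent alias peak that rivals the true event; pseudorandom decimation scatters it as weak, incoherent noise, which is exactly the kind of energy an interpolator can separate from signal.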

Gaynor Payton, ffA

In what could have been a product-focused marketing talk, Gaynor did a good job of outlining five easy-to-follow, practical workflows for interpreters working on thin beds. She showed frequency- and phase-based methods that exploit near-tuning, unresolved doublets in the waveform. A nice-looking bandwidth enhancement result was followed by ffA's new high-resolution spectral decomposition, which we covered recently. Then she showed how negative spikes in instantaneous frequency can reveal subtle doublets. This was extended with a skeletonized image, a sort of band-limited reflectivity display. Finally, she showed an interesting display of signed trace slope, which seemed to reveal the extent of just-resolved doublets quite nicely.
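For the instantaneous-frequency workflow, here is a rough synthetic sketch of the principle (ours, not ffA's algorithm; every parameter is made up). The analytic trace from a Hilbert transform gives instantaneous frequency directly, and interference at a closely spaced doublet produces sharp excursions, often dipping negative, where the envelope is pinched.

```python
import numpy as np
from scipy.signal import hilbert

dt = 0.002                                   # 2 ms sampling
t = np.arange(0, 1.0, dt)

def ricker(t, t0, f):
    """Zero-phase Ricker wavelet of peak frequency f (Hz), centred at t0 (s)."""
    a = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# One isolated reflector, plus a doublet near 0.6 s built from contrasting wavelets
trace = ricker(t, 0.30, 30) + ricker(t, 0.60, 25) - 0.9 * ricker(t, 0.61, 40)

z = hilbert(trace)                           # analytic (complex) trace
inst_freq = np.diff(np.unwrap(np.angle(z))) / (2 * np.pi * dt)

near_single = (t[1:] > 0.27) & (t[1:] < 0.33)
near_doublet = (t[1:] > 0.55) & (t[1:] < 0.67)
print(f"inst. freq. near the lone reflector: ~{inst_freq[near_single].mean():.0f} Hz")
print(f"inst. freq. range at the doublet: {inst_freq[near_doublet].min():.0f}"
      f" to {inst_freq[near_doublet].max():.0f} Hz")
```

On real data, the negative excursions are the ones to map; they tend to track the lateral extent of the unresolved pair.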

Scott MacKay, consultant

Scott MacKay shared some of his deep experience with depth imaging, but specially translated for interpreters. And this is only right: depth imaging is first and foremost an interpretive, iterative process, not a product. He gave some basic heuristics, guiding principles for interpreters. The first velocity model should be smooth—really smooth. Iterations should be only incrementally less smooth, 'creeping up' on the solution. Structure should get less, not more, complex with each step. Gathers should be flattish, not flat. Be patient, and let the data speak. And above all, Don't Panic. Always good advice.

More posts about SEG 2011.

Diffractions, Dust, and sub-Hertz imaging

Even the keenest geophysicist (and we are the keenest geophysicists) had to miss 96 presentations today. The first afternoon comprised an impressive 104 talks in 13 sessions. If only 10% of talks are great, you might see one, and you would miss at least nine. Fortunately there are two of us, so we double our chances. These are our highlights.

Will Burnett, UT Austin

Diffractions, usually thought of as noise, emanate predominantly from faults and discontinuities. Will wants not to eliminate them but to use them as a complementary signal to boost imaging. His talk on diffraction velocity analysis described how, instead of picking an exact velocity model, a range of velocities is used to compute independent test images of diffraction events. Because the apex of a diffraction stays in the same place no matter what velocity is applied, a stack of test images concentrates the diffractor at its apex, while the remaining events are stacked out. Blending this image with conventionally processed reflection data yields a more vivid image. Also, this work was done using Madagascar... yay, open source!
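Here is a crude sketch of the principle (not Will's implementation; the grid, velocities, and diffractor position are invented): migrate a single diffraction with a fan of trial velocities and stack the test images. The apex maps to the same place every time, so it stacks constructively; the mispositioned flanks land somewhere different for each velocity and cancel.

```python
import numpy as np

dt, dx = 0.004, 10.0                      # sample interval (s), trace spacing (m)
nt, nx = 200, 60
t = np.arange(nt) * dt
x = np.arange(nx) * dx

v_true, x0, t0 = 2000.0, 300.0, 0.35      # point diffractor (made-up position)

# Zero-offset data panel containing one diffraction hyperbola
data = np.zeros((nt, nx))
t_diff = np.sqrt(t0**2 + 4 * (x - x0)**2 / v_true**2)
data[np.round(t_diff / dt).astype(int), np.arange(nx)] = 1.0

def migrate(panel, v):
    """Toy Kirchhoff-style zero-offset migration: sum along diffraction curves."""
    image = np.zeros_like(panel)
    for ix, xi in enumerate(x):
        for it, ti in enumerate(t):
            th = np.sqrt(ti**2 + 4 * (x - xi)**2 / v**2)
            idx = np.round(th / dt).astype(int)
            ok = idx < nt
            image[it, ix] = panel[idx[ok], np.arange(nx)[ok]].sum()
    return image

# Stack independent test images over a fan of trial velocities
stack = sum(migrate(data, v) for v in np.linspace(1500, 2500, 9))

it0, ix0 = int(round(t0 / dt)), int(round(x0 / dx))
print("stacked amplitude at the true diffractor:", stack[it0, ix0])
print("median stacked amplitude elsewhere      :", np.median(stack))
```

Even this toy version focuses: every trial velocity puts some energy at the apex, so the stack peaks there and stays low everywhere else.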

Kris Pister, Dust Networks

The power of mobile devices is impressive, but Dust Networks can build an accelerometer with optical communication and a microcontroller in a 5 mm³ box. The autonomous sensors form a time-synchronized, channel-hopping mesh network (yeah, they do!), meaning you end up with an internet-like network that tolerates dead nodes and other failures. Now Dust builds such networks of all kinds of sensors, of all sizes, in industrial applications, and they will surely soon be appearing in a wellbore or seismic array near you. One to watch.

Rebecca Saltzer, ExxonMobil

Easily the most enthusiastic presentation of the day was a rip-roaring tale from Wyoming. ExxonMobil buried fifty-five low-frequency Guralp CMG3T seismometers at their LaBarge oil and gas field. The devices were arranged in a line pointing towards the Pacific, to ensure a good source of earthquakes: the source for this grand experiment. The P-waves they intended to image with have a dominant frequency of about 1 Hz, hence the seismometers, with their 0.08 to 50 Hz bandwidth. And image they did: the result was a velocity model with 500 m vertical resolution and good agreement with a 1000-well velocity model.

More posts about SEG 2011.

Frontiers at the Forum

The SEG Forum was the main attraction on Day 1 of the SEG Annual Meeting in San Antonio. Several people commented that the turnout was rather poor, however, with no more than 400 people sitting in the Lila Cockrell Theatre, even at the start. Perhaps the event needs more publicity. There was plenty of time for questions from the audience, all of which the panel discussed quite candidly.

David Lawrence, Executive VP of Exploration and Commercial at Shell, gave, predictably, a rather dry corporate presentation. We understand how presentations like this get hijacked by lawyers and corporate communications departments, but wish more executives would stand up to their captors, especially for a short presentation to a technical audience. Despite his shackles, he had some eyebrow-raising technology to brag about: futuristic autonomous-vehicle marine nodes, and a million-channel sensor network they're developing with HP, of all companies.

Tim Dodson, Executive VP of Exploration at Statoil and once Matt's boss there, seemed similarly held captive by his corporation's presentation sanitizers. Saved by his charisma, Tim characterized Statoil's steady approach in exploration: deep pockets, patience, and being comfortable with risk. They seem to have the same approach to technology innovation, as Tim highlighted their Source Rock from Seismic method for characterizing source rocks and the high-resolution spectral decomposition technology we wrote about recently. Both projects took several years to develop, and have paid off in discoveries like Aldous and Skrugard respectively.

Susan Cunningham, Senior VP of Exploration at Noble Energy, spoke about her company's approach to frontier exploration. Despite her chronic use of buzz-phrases (innovative thinking, integrated objective assessment, partner of choice), Susan gave a spirited outlook on the human angles of Noble's frontier thinking. She discussed the perseverance behind Noble's 8.5 Tcf Tamar discovery in the Levant Basin of the Eastern Mediterranean, and went on to describe Noble as 'a large company in a small company framework', but we're not sure what that means. Is it good?

Carl Trowell, president of WesternGeco and the youngest panelist, was the most engaging (and convincing) speaker. Shell's corporate communications people need to see presentations like this one: more powerful and trustworthy for its candid, personal style. As you'd expect, he had deep insight into where seismic technology is going. He lamented that seismic is not used enough in risk mitigation for frontier wells; for example, borehole seismic-while-drilling, imaged in the time it takes to trip out of the hole, can help predict pore pressure and other hazards in near-real-time. His forward-looking, energetic style was refreshing and inspiring.

It was a slightly dry, but basically up-beat, kick-off to the meeting. Some high-altitude perspective before we helicopter down to the nitty-gritty of the talks this afternoon.

Click here for all the posts about SEG 2011

Broken ice

Until today, I was an SEG virgin; I can now see what all the fuss is about.

The SEG Annual Meeting is big. Massive. And it feels important, or at least significant. It is clear that exploration geophysics lives here. Every step takes you past something cool... there's FairfieldNodal's seismic node exhibit, and here's Transform Software's stained-glass-window spectral display. And every other step is like flicking through an issue of Geophysics... there's Sergey Fomel, here's Öz Yilmaz. Although I know only a few people here, I have a strong feeling of familiarity and belonging. I like it. No: I love it.

I taught my writing course this morning. It was the smallest course in the world, with a grand total of three students of the written word. Fortunately, they turned out to be wonderful company, and taught me at least twice as much as I taught them. We spent much of the morning talking about new directions in science writing, openness in industry and academia, and the competition for attention. The SEG showed considerable faith in me and my subject matter in offering this course, because it has faltered before over the years. But clearly something needs to change if we agree to offer it again... It seems that honing soft skills is not what people are looking for. Perhaps a course like mine is better suited to online consumption in bite-size webcasts. Or maybe I just needed more elliptic partial differential equations.

What do you think of courses like this? Too fluffy? Too long? Too boring?

Click here for all the posts about SEG 2011

Follow the SEG conference

The eighty-first annual meeting of the Society of Exploration Geophysicists (SEG) will be held in San Antonio next week. The technical sessions will include over 600 oral and poster presentations, and the exposition hall will host more than 350 companies, government agencies, and research and educational institutions. More than 8000 people from 85 countries will attend.

Whether you are roaming on-site, or stuck in your office, find out what people are saying, what's happening, and get involved in the conversation. You can follow live updates throughout the week by coming back to this post or searching for the hashtag #SEG11 using Twitter's search.


If you are going to the conference, consider sharing your ideas and engaging with this community. Or tell us what you think by leaving a comment.

Click here for all the posts about SEG 2011

G is for Gather

When a geophysicist speaks about pre-stack data, they are usually talking about a particular class of gather. A gather is a collection of seismic traces that share some common geometric attribute. The term gather usually refers to a common depth point (CDP) or common mid-point (CMP) gather. Gathers are sorted from field records in order to examine how amplitude, signal-to-noise ratio, moveout, frequency content, phase, and other attributes important for data processing and imaging vary with acquisition geometry.

Common shot or receiver gather: Basic quality assessment tools in field acquisition. When the traces of the gather come from a single shot and many receivers, it is called a common shot gather. A single receiver with many shots is called a common receiver gather. It is very easy to inspect traces in these displays for bad receivers or bad shots.

Image: shot gather. Credit: gamut.to.it, CC-BY-NC-ND.

Common midpoint gather, CMP: The stereotypical gather: traces are sorted by surface geometry to approximate a single reflection point in the earth. Data from several shots and receivers are combined into a single gather. The traces are sorted by offset in order to perform velocity analysis for data processing and hyperbolic moveout correction. Only shot–receiver geometry is required to construct this type of gather.
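As the definition says, only the geometry is needed. Here is a minimal sketch of a CMP sort (a synthetic 2D line with invented coordinates; real surveys carry shot and receiver positions in the trace headers):

```python
import numpy as np

rng = np.random.default_rng(1)
n_traces = 1000
shot_x = rng.uniform(0, 5000, n_traces)                 # shot coordinate of each trace (m)
rcvr_x = shot_x + rng.uniform(-2000, 2000, n_traces)    # its receiver coordinate (m)

midpoint = 0.5 * (shot_x + rcvr_x)     # all the geometry a CMP sort needs
offset = np.abs(rcvr_x - shot_x)

bin_size = 25.0                        # CMP bin width (m)
cmp_bin = np.floor(midpoint / bin_size).astype(int)

# Pull out one CMP gather and sort it by offset, ready for velocity analysis and NMO
target = cmp_bin[0]
in_gather = np.where(cmp_bin == target)[0]
in_gather = in_gather[np.argsort(offset[in_gather])]
print(f"CMP bin {target}: {in_gather.size} traces, offsets "
      f"{offset[in_gather].min():.0f} to {offset[in_gather].max():.0f} m")
```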

Common depth point gather, CDP: A more sophisticated collection of traces that takes dipping reflector geometry and other subsurface properties into account. CDPs can be stacked to produce a structure stack, and could be used for AVO work, though most authors recommend using image gathers or CIPs [see the update below for a description of CIPs]. A priori information about the subsurface, usually a velocity model, must be applied with the shot–receiver geometry in order to construct this type of gather. [This paragraph has been edited to reflect the update below.]

Common offset gather, COFF: Used for basic quality control, because it approximates a structural section. Since all the traces are at the same offset, it is also sometimes used in AVO analysis; one can quickly inspect the approximate spatial extent of a candidate AVO anomaly. If the near offset trace is used for each shot, this is called a brute stack.

Variable azimuth gather: If the offset between source and receiver is constant, but the azimuth is varied, the gather can be used to study variations in travel-time anisotropy caused by elliptical stress fields or reservoir fracturing. The fast and slow traveltime directions can be mapped from the sinusoidal curve. It can also be used as a pre-stack data quality indicator.
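Here is a back-of-the-envelope version of that sinusoid fit (entirely synthetic; the traveltimes, azimuths, and anisotropy are invented): fit t(az) = a0 + a1 cos 2az + a2 sin 2az to the picks on a common-offset, variable-azimuth gather and read off the fast and slow directions.

```python
import numpy as np

rng = np.random.default_rng(0)
az = np.radians(np.arange(0, 360, 15))              # trace azimuths, one per trace
t0, dt_aniso, fast_true = 1.200, 0.008, np.radians(30)

# Fast direction = traveltime minimum; add a little picking noise
t_obs = t0 - dt_aniso * np.cos(2 * (az - fast_true)) + rng.normal(0, 0.0005, az.size)

# Least-squares fit of t(az) = a0 + a1*cos(2*az) + a2*sin(2*az)
G = np.column_stack([np.ones_like(az), np.cos(2 * az), np.sin(2 * az)])
a0, a1, a2 = np.linalg.lstsq(G, t_obs, rcond=None)[0]

slow_az = 0.5 * np.degrees(np.arctan2(a2, a1)) % 180   # azimuth of the traveltime maximum
fast_az = (slow_az + 90) % 180
print(f"fast ≈ {fast_az:.0f}°, slow ≈ {slow_az:.0f}°, "
      f"peak-to-peak anisotropy ≈ {2000 * np.hypot(a1, a2):.1f} ms")
```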

Check out the wiki page for more information. Are there any gather types or applications that we have missed?

Find other A to Z posts

AVO* is free!

The two-bit experiment is over! We tried charging $2 for one of our apps, AVO*, as a sort of techno-socio-geological experiment, and the results are in: our apps want to be free. Here are our download figures, as of this morning: 

You also need to know when these apps came out. I threw some of the key statistics into SubSurfWiki and here's how they stack up when you account for how long they've been available:

It is clear that AVO* has performed quite poorly compared to its peers! The retention rate (installs/downloads) is 100% — perhaps the price tag buys loyalty, and even a higher perceived value? But the hit in adoption is too much to take.

There are other factors: quality, relevance, usefulness, ease-of-use. It's hard to be objective, but I think AVO* is our highest quality app. It certainly has the most functionality, hence this experiment. It is rather niche: many geological interpreters may have no use for it. But it is certainly no more niche than Elastic*, and has about four times the functionality. On the downside, it needs an internet connection for most of its juicy bits.

In all, I think that we might have expected 200 installs for the app by now, from about 400–500 downloads. I conclude that charging $2 has slowed down its adoption by a factor of ten, and hereby declare it free for everyone. It deserves to be free! If you were one of the awesome early adopters that paid a toonie for it, I have only this to say to you: we love you.

So, if you have an Android device, scan the code or otherwise hurry to the Android Market!

News of the week

Dips from pics

Image: folds in Algeria.

In collaboration with the Geological Survey of Canada, Pangaea Software has built a very nifty tool, Orion, for computing dip from satellite images and digital elevation models. With these two pieces of data, and some assumptions about scale, it's possible to deduce the dip of strata without getting your boots muddy. Matt heard all about this tool from the GSC collaborator, Paul Budkewitsch, at the 3P Arctic conference in Halifax last week; here's their abstract.
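We don't know the details of Orion's method, but the geometry underneath this kind of tool is the classic three-point problem: pick three points along a single bedding trace on the image, read their elevations from the DEM, and fit a plane. A sketch with invented coordinates:

```python
import numpy as np

# Three picks on one bedding trace: easting, northing, elevation (m), all hypothetical
p1 = np.array([0.0,   0.0,   520.0])
p2 = np.array([850.0, 120.0, 460.0])
p3 = np.array([300.0, 900.0, 585.0])

n = np.cross(p2 - p1, p3 - p1)        # normal to the bedding plane
if n[2] < 0:
    n = -n                            # keep the normal pointing upward

dip = np.degrees(np.arctan2(np.hypot(n[0], n[1]), n[2]))
dip_azimuth = np.degrees(np.arctan2(n[0], n[1])) % 360   # bearing of steepest descent
print(f"dip ≈ {dip:.1f}° toward {dip_azimuth:.0f}°")
```

In practice you would fit many more than three points by least squares, but the principle is the same.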

Ocean bottom investment

Image: CGGVeritas Trilobit node.

CGGVeritas has made a commitment to manufacture 800 new Trilobit four-component deepwater nodes for seismic acquisition, to add to its existing pool. The device has three oriented accelerometers plus a hydrophone in addition to an onboard battery and recording system. This all-in-one design can be deployed on the seabed by most ROVs, making it easy to place near platforms and other infrastructure that towed streamer and cable systems cannot access. 

Arguably the industry leader in cableless systems is FairfieldNodal, who are already deploying more than a thousand nodes. It's great to see a big player like CGGVeritas coming to compete with this potentially transformative technology.

Update for Insight Earth

Colorado-based software company TerraSpark has just announced the release of Insight Earth 1.6, an integrated volume interpretation tool. Enhancements include a more interactive data import and export interface, improved velocity modeling, and upgrades to the automated fault extraction. In a January post, Evan highlighted an article by Stan Hammon of TerraSpark on the computational and psychological factors affecting intelligent design. It's inspired stuff.

Re-introducing SubSurfWiki

AgileWiki is now SubSurfWiki, at subsurfwiki.org. Please change your bookmarks! We felt the old name was a little too Agile-centric, and we want the wiki to be an open web-space for anything subsurface. We want it to grow, deepen and diversify, and above all be useful. So check it out and let us know if you have any feedback on utility, appearance and content.

More news... If you like this, check out previous news posts from Agile*

Orion is a trademark of Pangaea Software. Insight Earth is a trademark of TerraSpark. SubSurfWiki is a trademark of Agile Geoscience. The satellite image is copyright of Google. This regular news feature is for information only. We aren't connected with any of these organizations, and don't necessarily endorse their products or services.

What we did over the summer holidays

The half-life of a link is hilariously brief, so here is an attempt to bring some new life back into the depleted viewership of our summer-time blogging. Keep in mind that you can search for any of the articles on our blog using the search tool in the sidebar, or sign up for email updates a little lower down, for hands-free, automated Agile goodness every time we post something new.

Well worth showing off, 4 July: This post was a demonstration of the presentation tool Prezi applied to pseudo-digital geoscience data. Geoscience is inherently visual and scale-dependent, so we strive to work and communicate in a helicoptery way. I used Prezi to navigate a poster presentation on sharing geo-knowledge beyond the experts.

Geophysical stamps—Geophone, 15 July: Instalment 3 of Matt's vintage German postage stamps was a tribute to the geophone. This post prompted a few readers to interject with suggestions and technical corrections. We strive for an interactive, dynamic and malleable blog, and their comments certainly improved the post. It was a reminder to be ready to react when you realize someone is actually reading your stuff. 

Petrophysics cheatsheet, 25 July and its companion post: Born out of a desire to make a general quick reference for well logs, we published the Petrophysics cheatsheet, the fourth in our series of cheatsheets. In this companion post, you can read why petrophysics is hard. It sits in a middle ground between drilling operations, geoscience, and reservoir engineering, and ironically petrophysical measurements seldom measure the properties we are actually interested in. Wireline data comes from a tangle of service providers, tool options, and data formats, with historical and exhaustive naming conventions.

How to cheat at spot the difference, 3 Aug: Edward Tufte says, "to clarify, add detail". Get all your data into one view to assist your audience in making a comparison. In this two-part post Matt demonstrated the power of visual crossplotting using two examples: a satellite photo of a pyroclastic flow, and a subsurface horizon with seismic attributes overlain. Directly mapping spatially varying properties is better than data abstractions (graphs, tables, numbers, etc). Richer images convey more information, and he showed us how to cheat at spot the difference using simple image processing techniques.

Digital rocks and accountability, 10 Aug: At the First International Workshop in Rock Physics, I blogged about two exciting talks from the first day of the conference: the promise of digital rock physics, and how applied scientists should strive to be better in their work. Atul Gawande's ternary space of complexity could serve as a tool for mapping out geoscience investigations. Try it out on your next problem and ask your teammates to expose the problem as they see it.

Wherefore art thou, Expert?, 24 Aug: Stemming from a LinkedIn debate on the role of service companies in educating and empowering their customers, Matt reflected on the role of the bewildered generalist in today's upstream industry. Information systems have changed, perfection is a myth, and domain expertise runs too deep. Generalists can stop worrying about not knowing enough, specialists can build shallower and more accessible tools, and service companies can serve instead of sell.

Pseudogeophysics, 31 Aug: Delusion, skepticism, and how to crack a nut. This post drew comments about copyright control and the cost of lost opportunity; make sure to read the comments section.

So yeah, now go catch up on your reading. 

Bad Best Practice

Applied scientists get excited about Best Practice. New professionals and new hires often ask where 'the manual' is, and senior technical management or chiefs often want to see such documentation being spread and used by their staff. The problem is that the scientists in the middle strata of skill and influence think Best Practice is a difficult, perhaps even ludicrous, concept in applied geoscience. It's too interpretive, too creative.

But promoting good ideas and methods is important for continuous improvement. At the 3P Arctic Conference in Halifax last week, I saw an interesting talk about good seismic acquisition practice in the Arctic of Canada. The presenter was Michael Enachescu of MGM Energy, well known in the industry for his intuitive and integrated approach to petroleum geoscience. He outlined some problems with the term best practice, advocating instead phrases like good practice:

  • There's a strong connotation that it is definitively superlative
  • The corollary to this is that other practices are worse
  • Its existence suggests that there is an infallible authority on the subject (an expert)
  • Therefore the concept stifles innovation and even small steps towards improvement

All this is reinforced by the way Best Practice is usually written and distributed:

  • Out of frustration, a chief commissions a document
  • One or two people build a tour de force, taking 6 months to do it
  • The read-only document is published on the corporate intranet alongside other such documents
  • Its existence is announced and its digestion mandated

Unfortunately, the next part of the story is where things go wrong:

  • Professionals look at the document and find that it doesn't quite apply to their situation
  • Even if it does apply, they are slightly affronted at being told how to do their job
  • People know about it but lack the technology or motivation to change how they were already working
  • Within 3 years there is enough new business, new staff, and new technology that the document is forgotten and obsolete, until a high-up commissions a document...

So the next time you think to yourself, "We need a Best Practice for this", think about trying something different:

  • Forget top-down publishing, and instead seed editable, link-rich documents like wiki pages
  • Encourage discussion and ownership by the technical community, not by management
  • Request case studies, which emphasize practical adaptability, not theory and methodology
  • Focus first on the anti-pattern: common practice that is downright wrong

How do you spread good ideas and methods in your organization? Does it work? How would you improve it?