Great geophysicists #9: Ernst Chladni

Ernst Chladni was born in Wittenberg, eastern Germany, on 30 November 1756, and died on 3 April 1827, at the age of 70, in the Prussian city of Breslau (now Wrocław, Poland). Several of his ancestors were learned theologians, but his father was a lawyer, and his mother and stepmother were from lawyerly families. So young Ernst did well to break away into a sound profession, ho ho, making substantial advances in acoustic physics. 

Chladni, 'the father of acoustics', conducted a large number of experiments with sound, measuring the speed of sound in various solids, and — more adventurously — in several gases too, including oxygen, nitrogen, and carbon dioxide. Interestingly, though I can find only one reference to it, he found that the speed of sound in Pinus sylvestris was 25% faster along the grain than across it — is this the first observation of acoustic anisotropy? 

The experiments Chladni is known for, however, are the plates. He effectively extended the 1D explorations of Euler and Bernoulli in rods, and d'Alembert in strings, to the 2D realm. You won't find a better introduction to Chladni patterns than this wonderful blog post by Greg Gbur. Do read it — he segues nicely into quantum mechanics and optics, firmly linking Chladni with the modern era. To see the patterns forming for yourself, here's a terrific demonstration (very loud!)...

The drawings from Chladni's book Die Akustik are almost as mesmerizing as the video. Indeed, Chladni toured most of mainland Europe, demonstrating the figures live to curious Enlightenment audiences. When I look at them, I can't help wondering if there is some application for exploration geophysics — perhaps we are missing something important in the wavefield when we sample with regular acquisition grids?

References

Chladni, E, Die Akustik, Breitkopf und Härtel, Leipzig, 1830. Amazingly, this publishing company still exists.

Read more about Chladni in Wikipedia and in monoskop.org — an amazing repository of information on the arts and sciences. 

This post is part of a not-very-regular series of posts on important contributors to geophysics. It's going rather slowly — we're still in the eighteenth century. See all of them, and do make suggestions if we're missing some!

Grand challenges, anisotropy, and diffractions

Some more highlights from the two final days of the SEG Annual Meeting in Houston.

Grand challenges

On Friday, I reported on Chevron's take on the unsolved problems in petroleum geoscience. It was largely about technology. Ken Tubman, VP of Geoscience and Reservoir Engineering at ConocoPhillips, gave an equally compelling outlook on a different set of issues. He had five points:

  • Protect the base — Fighting the decline of current production is more challenging than growing production.
  • Deepwater — Recent advances in drilling are providing access to larger fields in deep water, and compressed sampling in seismic will make exploration more efficient.
  • Unconventionals — In regard to the shale gas frenzy, it is not yet obvious why these reservoirs produce the way that they do. Also, since resource plays are so massive, a big challenge will be shooting larger surveys on land.
  • Environment and safety — Containment assurance is more critical than pay-zone management, and geophysics will find an expanding role in preventing and diagnosing environmental and safety issues.
  • People — Corporations are concerned about maintaining world-class people, which will only become more difficult as the demographic bump of senior knowledge heads off into retirement.

The Calgary crowd that harvested the list of unsolved problems at our unsession in May touched on many of these points, and identified many others that went unmentioned in this session.

Driving anisotropic ideas

In the past, seismic imaging and wave propagation were almost exclusively driven by isotropic ideas. In the final talk of the technical program, Leon Thomsen asserted that the industry has been doing AVO wrong for 30 years, and doing geomechanics wrong for 5 years. Three take-aways:

  • Isotropy is no longer an acceptable approximation. It is conceptually flawed to relate Young's modulus (an elastic property), to brittleness (a mode of failure). 
  • Abolish the terms vertical transverse isotropy (VTI) and horizontal transverse isotropy (HTI) from our vocabulary; how confusing to have types of anisotropy with isotropy in the name! Use polar anisotropy (for VTI) and azimuthal anisotropy (for HTI) instead.
  • λ13 is a simple expression of the P-wave modulus, M, and Thomsen's polar anisotropy parameter, δ, so it should be attainable from logs.

Bill Goodway, whose work with elasticity has been criticized by Thomsen, walked to the microphone and pointed out to both the speaker and the audience that the tractability of λ13 is what he has been saying all along. Colin Sayers then stood up to reiterate that geomechanics is the statistics of extremes. Anisotropic rock physics is incontestable, but the challenge remains to find correlations with things we actually measure.

Thomas Young's sketch of 2-slit diffraction, which he showed to the Royal Society in 1803.

Imaging fractures using diffractions

Diffractions are fascinating physical phenomena that occur when the conditions of wave propagation change dramatically. They are a sort of grey zone between reflection and scattering, and can be used to resolve fractures in the subsurface. The question is whether or not there is enough diffraction energy to detect the fractures; it can be 10× weaker than a specular reflection, so one needs very good data acquisition. Problem is, we must subtract reflections — which we deliberately optimized for — from the wavefield to get diffractions. Evgeny Landa, from Opera Geophysical, was terse: 'we must first study noise, in this case the noise is the reflections... We must study the enemy before we kill it.'

Prospecting with plate tectonics

The Santos, Campos, and Espírito Santo basins off the coast of Brazil contain prolific oil discoveries and, through the application of plate tectonics, explorers have been able to extend the play concepts to offshore West Africa. John Dribus, Geological Advisor at Schlumberger, described a number of discoveries as 'kissing cousins' on either side of the Atlantic, using fundamental concepts of continental margin systems and plate tectonics (read more here). He spoke passionately about big ideas, and acknowledged collaboration as a necessity: 'if we don't share our knowledge we re-invent the wheel, and we can't do that any longer'.

In the discussion session afterwards, I asked him to comment on offshore success rates, which have historically hovered around 14–18%. He noted that a step change — up to about 35% — in success occurred in 2009, and he gave three causes for it: 

  • Seismic imaging around 2005 started dealing with anisotropy appropriately, getting the images right.
  • Improved understanding of maturation and petroleum system elements that we didn’t have before.
  • Access to places we didn’t have access to before.

Although the workshop format isn't all that different from the relentless PowerPoint of the technical talks, it did have an entirely different feeling. Was it the ample discussion time, or the fact that the trade show, now packed neatly in plywood boxes, boosted the signal:noise? Did you see anything remarkable at a workshop last week? 

Key technology trends in earth science

Yesterday, I went to the workshop entitled Grand challenges and research opportunities in geophysics, organized by Cengiz Esmersoy, Wafik Beydoun, Colin Sayers, and Yoram Shoham. I was curious if there'd be overlap with the Unsolved Problems Unsession we hosted in Calgary, and had reservations about it being an overly fluffy talking shop, but it was much better than I expected.

Ken Tubman, VP of Geosciences and Reservoir Engineering at ConocoPhillips, gave a splendid talk to open the session. But it was the third talk of the session, from Mark Koelmel, General Manager of Earth Sciences at Chevron, that resonated most with me. He highlighted 5 trends in applied earth science.

Data and information management

Data volumes are expanding with Moore's law. Chevron has more than 15 petabytes of data; by 2020 they will have more than 100 PB. Koelmel postulated that spatial metadata and tagging will become pervasive, and our data formats will have to evolve accordingly. Instead of managing ridiculously large amounts of data, a better solution may be to 'tag it and chuck it in the closet' — Google's approach to the web (and we know the company has been exploring the use of Hadoop). Beyond hardware, he stressed that new industry standards are needed now. The status quo is holding us back.

Full azimuth seismic data

Only recently have we been able to wield the computing power to deal with the kind of processes needed for full-waveform inversion. It's not only because of data volumes that new processing facilities will not be cheap — or small. He predicted processing centres that resemble small cities in terms of their power consumption. An interesting notion of energy for energy, and the reason for recent massive growth in Google's power production capability. (Renewables for power, oil for cooling... how funny would that be?)

Interpretive seismic processing and imaging

Interpretation and processing are actually the same thing. The segmentation of seismic technology will have to be stitched back together. Imagine the interpreter working on field data, with a mixing board to produce just the right image for today's work. How will service companies (who acquire data and make images) and operators (who interpret data and make prospects) merge their efforts? We may have to consider different business relationships.

Full-cycle interpretation systems

The current state of integration is sequential at best: each node in a workflow produces static inputs for the next step, with minimal iteration in between. Each component of the sequence typically ends with 'throwing things over the wall' to the next node. With this process, the uncertainties are cumulative throughout, which is unnerving because we don't often know what the uncertainties are. Koelmel's desired future state is one of seamless geophysical processing, static model-building, and dynamic reservoir simulation. It won't reduce uncertainties altogether, but by design it will make them easier to identify and address.

Intellectual property

The number of patents filed in this industry has more than tripled in the last decade. I assumed Koelmel was going to give a Big Oil lecture on secrecy and patents, touting them as a competitive advantage. He said just the opposite. He asserted that industries with excessive patenting (think technology, and Big Pharma) make innovation difficult. Chevron is no stranger to the patent process, filing 125 patents in each of 2011 and 2012, but this is peanuts compared to Schlumberger (462 in 2012) and IBM (6457 in 2012). 

The challenges geophysicists are facing are not our own. They stem from the biggest problems in the industry, which are of incredible importance to mankind. Perhaps expanding the value proposition to such heights is more essential than ever. Geophysics matters.

The future is uncertain

Image: Repsol, SEG. Click for the abstract.

SEG Day 2. In the session entitled Exploration and Uncertainty Analysis, I was underwhelmed with the few talks that I attended, except for the last one of the session, entitled Measuring time-map uncertainty.

Static uncertainty

It is commonly uttered that different data processing companies will produce different results; seismic processing is non-unique, and so on. But rarely do I get to see real examples of the kind of variance that can occur. Bruce Blake from Repsol showed seismic imaging results that came back from a number of contractors. The results were truly shocking. The example he showed was an extreme case of uncertainty, caused by inadequate static solutions for the large sand dunes in Libya. The key point for me is exemplified by the figure shown on the right: the image from one vendor suggests a syncline, the image from the other suggests an anticline. Beware!

A hole in the theory

In the borehole sonic session, Xinding Fang, a student from MIT, reinforced a subtle but profound idea: it is tricky to measure the speed of sound in a rock when you drill a hole into it. The hole changes the stress field, and induces an anisotropic stiffness around the circumference of the borehole where sonic tools make their measurements. And since waves take the shortest travel path from source to receiver, speeds that are measured in the presence of an artificial stress are wrong.

Image: Xinding Fang, SEG. Click for the abstract.

The bigger issue here that Xinding has elucidated is that we routinely use sonic logs to make time-depth relationships and tie wells, especially in the absence of a check-shot survey. If it works, it works, but if ever discrepancies exist between seismic and well, the interpreter applies a stretch or a squeeze without much thought. Some may blame the discrepancy on dispersion alone, but that's evidently too narrow. Indeed, we rarely bother to investigate the reasons.

There's a profound point here. We have to drop the assumption that logs are the 'geological' truth upon which to hang an interpretation. We have to realize that the act of making the measurement changes the very thing we want to measure. 

Past, present, future SEG

Today was the first day of the SEG Annual Meeting in Houston. 

Last night we wandered around the icebreaker, still buzzing from the hackathon. The contrast was crushing. The exhibition is gigantic — it's an almost overwhelming amount of marketing. My thoughts on what the exhibition hall is, and what it represents, are not fully formed and might be a bit... ranty, so I will save them for a more considered post. 

As usual, SEG kicked off with a general session — much better attended this year, but also much less ambitious. At least 300 members came to hear outgoing president David Monk's perspective on SEG's future. His address mostly looked backwards, however, at the trends over the last few years. I guess the idea is to extrapolate from there... But maybe we can do even better than recent years? We mustn't forget to do completely new and unexpected things too. 

At the end of his slot, Monk showed some animated renderings of SEG's new building in Tulsa. The movie was accompanied by an almost comically strident anthem — evidently it is a big deal. As well as having a smart new office, the real estate will turn in some smart new revenue from other tenants. Ground was broken on Friday, and the opening is expected to be in December 2014. As you see, the architects understood industrial geophysics quite well, opting for a large black box.

At the end of the day, Canada strode home to yet another SEG Challenge Bowl victory as the University of Manitoba fought off the Autonomous University of Mexico and Colorado School of Mines to prove that, while Texas might be the home of the industry, Canada is the home of exploration geophysics. 

Where's all the geophysics? Evan is compiling some technical highlights from the day as I type. Stay tuned for that. 

If you're at the conference, tell us what you've enjoyed most about the first 24 hours.

Looking forward to SEG 2013

The SEG Annual Meeting is coming! The program starts tomorrow with the DISC, and continues over the weekend with various other courses. It's not part of the conference, but we're looking forward to the Geophysics Hackathon, obviously. Curious? You're welcome to drop in.

The meeting boasts 124 technical sessions totalling over 1000 PowerPoint presentations. If you haven't looked at the list of expanded abstracts yet, I can't blame you: it's a massive amount of content and the website experience is, er, not optimal — and there's no helpful mobile app this year. [Update: The app came out today! Go get it, it's essential. Thank you Whitney at SEG for letting us know.] I've tried to pick out a few sessions below that seem really exciting.

Each day at 10:30 am, I will be doing a guest presentation at the Enthought booth, showing some novel geophysics tools that I've been making. They are powered by Python and Enthought's Canopy environment. Come by and I will show you that you can too! However, I need somebody to please go to this exhibition booth 'browser' and show me where the Enthought booth is. Worst. App. Ever.

And each day at 11 am, there's a 2-hour mini-wikithon. Stop by the Press Room for a quick tour of SEG Wiki, and find out how you can help make it better.

Monday

With no technical presentations on Monday morning, it is safe to assume that most delegates will be wandering around the exhibition hall. A few may trickle over to the unenticing Opening Session, which Matt and I found was horribly attended last year and the year before. Matt at least will be there, mostly out of morbid curiosity.

Continuing the Hackathon's theme on error and uncertainty, I will be diving into the session on Monday afternoon called

From 3 to 6 pm, be sure to check out the always popular SEG Student Challenge Bowl. The global finals are hosted by the crowd-pleasing past SEG president (and fellow Canadian) Peter Duncan. Top pairings from universities across the world duke it out in a button-pushing quiz show. Come out, cheer on the students, and test your own geophysics trivia from the audience.

Tuesday

The sessions that look appealing to me on Tuesday are

Wednesday

Agile's good friend Maitri Erwin is the instigator behind the Women's Networking Breakfast. All are welcome; consider yourself lucky to connect with Maitri. As for talks, I will try to make an appearance at

The first one I know quite a bit about, but can always use a refresher, and the second one I know very little about, but it's been a hot topic for 3 or 4 years now. If we aren't worn out at the end of the day, we might find some tickets to the Bayou Bash.

Thursday & Friday

There are over a dozen workshops on both Thursday and Friday. As far as I can tell, they are basically more talks, each around a central theme. Don't ask me how this is in any way distinguishable from the technical program, and there is still a full suite of technical sessions conflicting on Thursday morning. It's a shame because I'm curious to attend the session Fractures, shale and well-log characterization but I don't want to miss Workshop 2, Grand challenges, which takes place all day on Thursday. Then on Friday there's Characterizing fractures (Workshop 15).

There are many other events going on, so if you see something good, make sure you tweet the rest of us about it: @EvanBianco, @kwinkunks, @maitri, @toastar, and lots of others — follow hashtag #SEG13. (Not #SEG2013, that's all marketing wonks).

If you'll be at the Annual Meeting, do look out for us, we'd love to meet you. If you won't be there, tell us what you'd like to hear about. News from the exhibition? Our favourite talks? Detailed minutes from the committee meetings? Let us know in the comments.

Wiki world of geoscience

This weekend, I noticed that there was no Wikipedia article about Harry Wheeler, one of the founders of theoretical stratigraphy. So I started one. This brings the number of biographies I've started to 3:

  • Karl Zoeppritz — described waves almost perfectly, but died at the age of 26
  • Johannes Walther — started as a biologist, but later preferred rocks
  • Harry Wheeler — if anyone has a Wheeler diagram to share, please add it!

Many biographies of notable geoscientists are still missing (there are hundreds, but here are three): 

  • Larry Sloss — another pioneer of modern stratigraphy
  • Oz Yilmaz — prolific seismic theoretician and practitioner
  • Brian Russell — entrepreneur and champion of seismic analysis

It's funny, Wikipedia always seems so good — it has deep and wide content on everything imaginable. I think I must visit it 20 or 30 times a day. But when you look closely, especially at a subject you know a bit about, there are lots of gaps (I wonder if this is one of the reasons people sometimes deride it?). There is a notability requirement for biographies, but for some reason this doesn't seem to apply to athletes or celebrities. 

I was surprised the Wheeler page didn't exist, but once you start reading, there are lots of surprises:

I run a geoscience wiki, but this is intended for highly esoteric topics that probably don't really belong in Wikipedia, e.g. setting parameters for seismic autopickers, or critical reviews of subsurface software (both on my wish list). I am currently working on a wiki for AAPG — is that the place for 'deep' petroleum geoscience? I also spend time on SEG Wiki... With all these wikis, I worry that we risk spreading ourselves too thinly. What do you think?

In the meantime, can you give 10 minutes to improve a geoscience article in Wikipedia? Or perhaps you have a classful of students to unleash on an assignment?

Tomorrow, I'll tell you about an easy way to help improve some geophysics content.

Seismic quality traffic light

We like to think that our data are perfect and limitless, because experiments are expensive and scarce; only with perfect data can our interpretations hope to stand up to even our own scrutiny. It would be great if seismic data were a direct representation of geology, but they never are. Poor data doesn't necessarily mean poor acquisition or processing. Sometimes geology is complex!

In his book First Steps in Seismic Interpretation, Don Herron describes a QC technique of picking a pseudo horizon at three different elevations to correspond to poor, fair, and good data regions. I suppose that will do in a pinch, but I reckon it would take a long time, and it is rather subjective. Surely we can do better?

Computing seismic quality

Conceptually speaking, the ease of interpretation depends on things we can measure (and display), like coherency, bandwidth, amplitude strength, signal-to-noise, and so on. There is no magic combination of filters that will work for all data, but I am convinced that for every seismic dataset there is a weighted function of attributes that can be concocted to serve as a visual indicator of the data complexity:
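There's no universal recipe, but here's a minimal sketch of what such a weighted function might look like in Python. The attribute choices, weights, and traffic-light thresholds are entirely hypothetical — the point is only the shape of the computation:

```python
import numpy as np

def traffic_light(attributes, weights):
    """Combine attribute maps into a red/amber/green quality score.

    attributes: 2D arrays (e.g. coherency, bandwidth, signal:noise),
    each oriented so that bigger means easier to interpret.
    weights: one weight per attribute, reflecting your confidence in it.
    """
    def rescale(a):
        rng = a.max() - a.min()
        return (a - a.min()) / rng if rng else np.zeros_like(a, dtype=float)

    w = np.asarray(weights, dtype=float)
    stack = np.stack([rescale(a) for a in attributes])
    score = np.tensordot(w / w.sum(), stack, axes=1)  # weighted average, 0 to 1
    return np.digitize(score, [0.33, 0.66])           # 0 = red, 1 = amber, 2 = green
```

The result is a map of 0s, 1s, and 2s that can be corendered over the amplitude data as red, amber, and green.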

So one of the first things we do with new data at Agile is a semi-quantitative assessment of the likely ease and reliability of interpretation.

This traffic light display of seismic data quality, corendered here with amplitude, is not only a precursor to interpretation. It should accompany the interpretation, just like an experiment reporting its data with errors. The idea is to show, honestly and objectively, where we can trust eventual interpretations, and where they are not well constrained. A common practice is to cherry-pick specific segments or orientations that support our arguments, and quietly suppress those that don't. The traffic light display helps us be more honest about what we know and what we don't — where the evidence for our model is clear, and where we are relying more heavily on skill and experience to navigate a model through an area where the data is unclear or unconvincing.

Capturing uncertainty and communicating it in our data displays is not only a scientific endeavour, it is an ethical one. Does it change the way we look at geology if we display our confidence level alongside? 

Reference

Herron, D (2012). First Steps in Seismic Interpretation. Geophysical Monograph Series 16. Society of Exploration Geophysicists, Tulsa, OK.

The seismic profile shown in the figure is from the Kennetcook Basin, Nova Scotia. This work was part of a Geological Survey of Canada study, available in this Open File report.

Colouring maps

Over the last fortnight, I've shared five things, and then five more things, about colour. Some of the main points:

  • Our non-linear, heuristic-soaked brains are easily fooled by colour.
  • Lots of the most common colour bars (linear ramps, bright spectrums) are not good choices.
  • You can learn a lot by reading Robert Simmon, Matteo Niccoli, and others.

Last time I finished on two questions:

  1. How many attributes can a seismic interpreter show with colour in a single display?
  2. On thickness maps should the thicks be blue or red?

One attribute, two attributes

The answer to the first question may be a matter of personal preference. Doubtless we could show lots and lots, but the meaning would be lost. Combined red-green-blue displays are a nice way to cram more into a map, but they work best on very closely related attributes, such as the seismic amplitude at three particular frequencies.
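This sort of RGB blend is easy to prototype. Here's a minimal NumPy sketch — the function and the per-channel rescaling are my own illustration, not any particular vendor's method:

```python
import numpy as np

def rgb_blend(low, mid, high):
    """Blend three frequency-amplitude maps into a single RGB image.

    Each 2D array is rescaled to [0, 1] independently, then assigned
    to the red, green, and blue channels respectively.
    """
    def rescale(a):
        rng = a.max() - a.min()
        return (a - a.min()) / rng if rng else np.zeros_like(a, dtype=float)

    return np.dstack([rescale(low), rescale(mid), rescale(high)])  # (H, W, 3)
```

Areas where all three bands are strong blend towards white; areas where none is strong stay black.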

Here's some seismic reflection data — the open F3 dataset, offshore Netherlands, in OpendTect.

A horizon — just below the prominent clinoforms — is displayed (below, left) and coloured according to elevation, using one of Matteo's perceptual colour bars (now included in OpendTect!). A colour scale like this varies monotonically in hue and luminance.

Some of the luminance channel (sometimes called brightness or value) is showing elevation, and a little is being used up by the 3D shading on the surface, but not much. I think the brain processes this automatically because the 3D illusion is quite good, especially when the scene is moving. Elevation and shape are sort of the same thing, so we've still only really got one attribute. Adding contours is quite nice (above, middle), and only uses a narrow slice of the luminance channel... but again, it's the same data. Much better to add new data. Similarity (a member of the family that includes coherence, semblance, and so on) is a natural fit: it emphasizes a particular aspect of the shape of the surface, but one that was measured independently of the interpretation, directly from the data itself. And it looks awesome (above, right).

Three attributes, four

OK, we have elevation and/or shape, and similarity. What else can we add? Another intuitive attribute of seismic is amplitude (below, left) — closely related to the strength of the reflected energy. Two things: we don't trust amplitudes in areas with low fold — so we can mask those (below, middle). And we're only really interested in bright spots, so we can edit the opacity profile of the attribute and make low values transparent (below, right). Two more attributes — amplitude (with a cut-off that reflects my opinion of what's interesting — is that an attribute?) and fold.
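For what it's worth, this masking-and-opacity trick amounts to building an alpha channel. A rough NumPy sketch, with illustrative thresholds only (neither the fold cut-off nor the amplitude cut-off is a standard):

```python
import numpy as np

def amplitude_overlay(amplitude, fold, fold_min=10, amp_cutoff=0.5):
    """Build an RGBA overlay: bright spots opaque, everything else clear.

    amplitude: 2D array rescaled to [0, 1]; fold: 2D trace-fold map.
    Amplitudes below the cut-off fade to transparent, and low-fold
    areas are masked out entirely.
    """
    alpha = np.clip((amplitude - amp_cutoff) / (1.0 - amp_cutoff), 0, 1)
    alpha[fold < fold_min] = 0.0         # don't trust amplitudes at low fold
    rgba = np.zeros(amplitude.shape + (4,))
    rgba[..., 0] = amplitude             # a single hue: the red channel only
    rgba[..., 3] = alpha                 # opacity carries the editing
    return rgba
```

Because the overlay is transparent wherever it has nothing to say, it can sit on top of the elevation map without clobbering it.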

Since we have only used one hue for the amplitude, and it was not in Matteo's colour bar, we can layer it on the original map without clobbering anything. Unfortunately, there's no easy way for the low fold mask to modulate amplitude without interfering with elevation, because the elevation map needs to be almost completely opaque. What I need is a way to modulate a surface's opacity with an attribute it is not displaying with hue...

Thickness maps

The second question — what to colour thicks — is easy. Thicks should be towards the red end of the spectrum, sometimes not-necessarily-intuitively called 'warm' colours. (As I mentioned before in the comments, a quick Google image poll suggests that about 75% of people agree). If you colour your map otherwise, perhaps because you like the way it suggests palaeobathymetry in some depositional settings, be careful to make this very clear with labels and legends (which you always do anyway, right?). And think about just making a 'palaeobathymetry' map, not a thickness map.

I suspect there are lots of quite personal opinions out there. Like grammar, I do think much of this is a matter of taste. The only real test is clarity. Do you agree? Is there a right and wrong here? 

Five more things about colour

Last time I shared some colourful games, tools, and curiosities, including the weird chromostereopsis effect (right). Today, I've got links to much, much more 'further reading' on the subject of colour...


The provocation for this miniseries was Robert 'Blue Marble' Simmon's terrific blog series on colour, which he's right in the middle of. Robert is a data visualization pro at NASA Earth Observatory, so we should all listen to him. Here's his collection (updated after the original writing of this post):

Perception is everything! One of Agile's best friends is Matteo Niccoli, a quantitative geophysicist in Norway (for now). And one of his favourite subjects is colour — there are loads of great posts on his blog. He also has a fine collection of perceptual colour bars (left) for most seismic interpretation software. If you're still using Spectrum for maps, you need his help.

Dave Green is a physicist at the University of Cambridge. Like Matteo, he has written about the importance of using colour bars which have a linear increase in perceived brightness. His CUBEHELIX scheme (above) adapts easily to your needs — try out his colour bar creator. And if this level of geekiness gets you going, try David Dalrymple or Gregor Aisch.
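Green published the recipe in full, and it takes only a few lines of NumPy to reproduce. This is my paraphrase of the published formula, using its default parameters — treat it as a sketch rather than a reference implementation:

```python
import numpy as np

def cubehelix(n=256, start=0.5, rot=-1.5, hue=1.0, gamma=1.0):
    """Dave Green's CUBEHELIX colour scheme.

    Returns an (n, 3) array of RGB triples whose perceived brightness
    rises steadily from black to white, with a helix of hue winding
    around the grey diagonal of the RGB cube.
    """
    lam = np.linspace(0, 1, n) ** gamma                  # the brightness ramp
    phi = 2 * np.pi * (start / 3 + 1 + rot * np.linspace(0, 1, n))
    amp = hue * lam * (1 - lam) / 2                      # deviation from grey
    r = lam + amp * (-0.14861 * np.cos(phi) + 1.78277 * np.sin(phi))
    g = lam + amp * (-0.29227 * np.cos(phi) - 0.90649 * np.sin(phi))
    b = lam + amp * ( 1.97294 * np.cos(phi))
    return np.clip(np.stack([r, g, b], axis=-1), 0, 1)
```

The colour deviations are chosen so they cancel in the luma sum, which is why perceived brightness tracks the underlying linear ramp.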

ColorBrewer is a legendary web app and add-in for ArcGIS. It's worth playing with the various colour schemes, especially if you need a colour bar that is photocopy friendly, or that can still be used by colour blind people. The equally excellent, perhaps even slightly more excellent, i want hue is also worth playing with (thanks to Robert Simmon for that one). 

In scientific publishing, the Nature family of journals has arguably the finest graphics. Nature Methods carries a column called Points of View, which looks at scientific visualization. This mega-post on their Methagora blog links to them all, and covers everything from colour and 3D graphics to broader issues of design and typography. Wonderful stuff.

Since I don't seem to have exhausted the subject yet, we'll save a couple of practical topics for next time:

  1. A thought experiment: How many attributes can a seismic interpreter show with colour in a single display?
  2. Provoked by a reader via email, we'll think about that age old problem for thickness maps — should the thicks be blue or red?