Fibre optic seismology at #GeoCon14

We've been so busy this week, it's hard to take time to write. But for the record, here are two talks I liked yesterday at the Canada GeoConvention. Short version — Geophysics is awesome!

DAS good

Todd Bown from OptaSense gave an overview of the emerging applications for distributed acoustic sensing (DAS) technology. DAS works by shining laser pulses down a fibre optic cable, and measuring the amount of backscatter from impurities in the cable. Tiny variations in strain on the cable induced by a passing seismic wave, say, are detected as subtle time delays between light pulses. Amazing.

Fibre optic cables aren't as sensitive as standard geophone systems (yet?), but compared to conventional instrumentation, DAS systems have several advantages:

  • Deployment is easy: fibre is strapped to the outside of casing, and left in place for years.
  • You don't have to re-enter and interrupt well operations to collect data.
  • You can build ultra-long receiver arrays — as long as your spool of fibre.
  • They are sensitive to a very broad band of signals, from DC to kilohertz.

Strain fronts

Later in the same session, Paul Webster (Shell) showed results from an experiment that used DAS as a fracture diagnosis tool. That means you can record for minutes, hours, even days, if you can cope with all that data. Shell has accumulated over 300 TB of records from a handful of projects, and seems to be a leader in this area.

By placing a cable in one horizontal well to listen to the frac treatment in another, you can effectively record data similar to a conventional shot gather, except with a time axis of 30 minutes. On the gathers he drew attention to slow-moving arcuate events that he called strain fronts. He hypothesized a number of mechanisms that might cause these curious signals: the flood of fracking fluids finding their way into the wellbore, the settling and closing creep of rock around proppant, and so on. This work is novel and important because it offers insight into the mechanical behaviour of engineered reservoirs, not just during the treatment, but long after.

Why is geophysics awesome? We can measure sound with light. A mile underground. That's all.

Free the (seismic) data!

Yesterday afternoon Evan and I hosted the second unsession at the GeoConvention in Calgary. After last year's unsession exposed 'Free the data' as one of the unsolved problems in subsurface geoscience, we elected to explore the idea further. Besides, we're addicted to this kind of guided, recorded conversation.

Attendance was a little thin, but those who came spent the afternoon deep in conversation about open data, open software, and greater industry transparency. And we unearthed an exciting and potentially epic conclusion that I hope leads to a small revolution.

What happened?

Rather than leaving the floor completely open, we again brought some structure to the proceedings. I'll post the full version to the wiki page, but here's the overview:

  1. Group seismic interpretation: 5 interpreters in 5 minutes.
  2. Stories about openness: which of 26 short stories resonate with you most?
  3. Open/closed, accessible/inaccessible: a scorecard for petroleum geoscience.
  4. Where are the opportunities? What should we move from closed to open?

As you might expect, the last part was the real point. We wanted to find some high-value areas to poke, or at least gather evidence around. And one area—one data type—was identified as being (a) closed and inaccessible in Canada and (b) much more impactful if it were open and accessible. I gave the punchline away in the title, but that data type is seismic data.

Open, public seismic data is much too juicy a topic to do justice to in this post, so stay tuned for a review of some of the specifics of how that conversation went. Meanwhile, imagine a world with free, public seismic data...

Reflections on the 2nd edition

The afternoon went well, and the outcome was intriguing, but we were definitely disappointed by the turnout. We have multiple working hypotheses about it...

  • There may not be a strong appetite for this sort of session, especially on a 'soft' topic. Next time: seismic resolution?
  • The first day might not be the best time for it, because people are still in the mood for talks. Next time: Wednesday morning?
  • Maybe the programme didn't reflect what the unsession was about, and the timing was unclear. Next time: more visibility.
  • Three hours may be too much to ask from people, though you could say the same about any other session here.

We'd love to hear your thoughts too... Are we barking up completely the wrong tree? Does our community even want to have these conversations? Should we try again in 2015?

Looking forward to #GeoCon14

Agile is off to Calgary on Sunday. We have three things on our List of Things To Do:

  1. We're hosting another Unsession on Monday... If you're in Calgary, please come along! It's just like any other session at the conference, only a bit more awesome.
  2. We'll be blogging from GeoConvention 2014. If there's a talk you'd like to send us to, we take requests! Just drop us a line or tweet at us!
  3. Evan is teaching his Creative Geocomputing class. Interested? There are still places. A transformative experience, or your money back.

What's hot at GeoCon14

Here's a run-down of what we're looking forward to catching:

  • Monday: Maybe it's just me, but I always find seismic acquisition talks stimulating. In the afternoon, the Unsession is the place to be. Not Marco Perez's probably awesome talk about brittleness and stress. Definitely not. 
  • Tuesday: If it wasn't for the fear of thrombosis, it'd be tempting to go to Glen 206 and stay in Log Analysis sessions all day. In the afternoon, the conference is trying something new and interesting — Jen Russel-Houston (a bright spark if ever there was one) is hosting a PechaKucha — lightning versions of the best of GeoConvention 2013. 
  • Wednesday: This year's conference is unusually promising, because there is yet another session being given over to 'something different' — two actually. A career-focused track will run all day in Macleod D, called (slightly weirdly) ‘On Belay’: FOCUSing on the Climb that is a Career in Geoscience. Outside of that, I'd head for the Core Analysis sessions.
  • Friday: We won't be there this year, but the Core Conference is always worth going to. I haven't been to anything like it at any other conference. It's open on Thursday too, but go on the Friday for the barbeque (tix required).

The GeoConvention is always a good conference. It surprises me how few geoscientists come from outside of Canada to this event. Adventurous geophysicists especially should consider trying it one year — Calgary is really the epicentre of seismic geophysics, and perhaps of petrophysics too.

And the ski hills are still open.

How much rock was erupted from Mt St Helens?

One of the reasons we struggle when learning a new skill is not necessarily because this thing is inherently hard, or that we are dim. We just don't yet have enough context for all the connecting ideas to, well, connect. With this in mind I wrote this introductory demo for my Creative Geocomputing class, and tried it out in the garage attached to START Houston, when we ran the course there a few weeks ago.

I walked through the process of transforming USGS text files to data graphics. The motivation was to try to answer the question: How much rock was erupted from Mount St Helens?

This gorgeous data set can be reworked for a lot of programming and data manipulation practice, and it's just plain fun to solve problems with. My goal was to maintain a coherent stream of instructions, especially for folks who have never written a line of code before. The challenge, I found, is anticipating when words, phrases, and syntax are being heard like a foreign language (as indeed they are), and coping by augmenting with spoken narrative.

Text file to 3D plot

To start, we'll import a code library called NumPy that's great for crunching numbers, and we'll abbreviate it with the nickname np:

>>> import numpy as np

Then we can use one of its functions to load the text file into an array we'll call data:

>>> data = np.loadtxt('z_after.txt')

The variable data is a 2-dimensional array (matrix) of numbers. It has an attribute called shape that holds the number of elements in each dimension:

>>> data.shape
(1370, 949)

If we want to make a plot of this data, we should first take a look at the range of the elements in the array. We can do that by calling the peak-to-peak method on data:

>>> data.ptp()
41134.0

Whoa, something's not right: there's no surface on Earth with a min-to-max elevation range that large. Let's dig a little deeper. The highest point on the surface is:

>>> np.amax(data)
8367.0

That looks, to the adequately trained eye, like a reasonable elevation value in feet. Let's look at the minimum value of the array:

>>> np.amin(data)
-32767.0 

OK, here's the problem. GIS people might recognize this as a null value for elevation data, but since we aren't assuming any knowledge of GIS formats and data standards, we can simply replace the values in the array with not-a-number (NaN), so they won't contaminate our plot.

>>> data[data==-32767.0] = np.nan
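
To check that the fix worked, we can recompute the range with NaN-aware functions. (This quick sanity check is my addition, not part of the original walkthrough.)

>>> relief = np.nanmax(data) - np.nanmin(data)   # ignores the NaNs; should now be a few thousand feet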

To view this surface in 3D, we can import the mlab module from Mayavi:

>>> from mayavi import mlab

Finally, we call the surf function from mlab, passing in the data, a colormap keyword to activate a geographically inspired colour scheme, and a vertical scale coefficient:

>>> mlab.surf(data,
              colormap='gist_earth',
              warp_scale=0.05)
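
If you're following along in a script rather than an interactive session, you may also need to tell Mayavi to open the plot window (my note, not part of the live demo):

>>> mlab.show()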

After applying the same procedure to the pre-eruption surface, we're ready to subtract the two grids, calculate the volume of rock removed, and visualize the result. Read more in the IPython Notebook.
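
For the curious, here's a minimal sketch of that last step. The pre-eruption filename, the 10 m grid spacing, and the feet-to-metres conversion are my assumptions, not details from the notebook:

>>> before = np.loadtxt('z_before.txt')      # hypothetical filename for the pre-eruption grid
>>> before[before==-32767.0] = np.nan        # mask the nulls, exactly as before
>>> removed = before - data                  # drop in elevation at each node, in feet (negative where material was added)
>>> cell_area = 10.0 * 10.0                  # assumed grid spacing of 10 m, so m^2 per cell
>>> volume = np.nansum(removed) * 0.3048 * cell_area   # net volume change in m^3 (feet converted to metres)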

If this 10 minute introduction is compelling and you'd like to learn how to wrangle data like this, sign up for the two-day version of this course next week in Calgary. 

April linkfest

It's time for our regular linkfest!

There's a new book in town... Rob Simm and Mike Bacon have put together a great-looking text on seismic amplitude interpretation (Cambridge, 2014). Mine hasn't arrived yet, so I can't say much more — for now, you can preview it in Google Books. I should add it to my list.

Staying with new literature, I started editing a new column in SEG's magazine The Leading Edge in February. I wrote about the first instalment, and now the second is out, courtesy of Leo Uieda — check out his tutorial on Euler deconvolution, complete with code. Next up is Evan with a look at synthetics.

On a related note, Matteo Niccoli just put up a great blog post on his awesome perceptual colourmaps, showing how to port them to matplotlib, the MATLAB-like plotting environment lots of people use with the Python programming language. 

Dolf Seilacher, the German ichnologist and palaeontologist, died 4 days ago at the age of 89. For me at least, his name is associated with the mysterious trace fossil Palaeodictyon — easily one of the weirdest things on earth.

Geoscience mysteries just got a little easier to solve. As I mentioned the other day, there's a new place on the Internet for geoscientists to ask questions and help each other out. Stack Exchange, the epic Q&A site, has a new Earth Science site — check out this tricky question about hydrocarbon generation.

And finally, who would have thought that waiting 13 years for a drop of bitumen could be an anticlimax? But in the end, the long (if not eagerly) awaited 9th drop in the University of Queensland's epic experiment just didn't have far enough to fall...

If you can't get enough of this, you can wait for the 10th drop here. Or check back here in 2027.

A culture of asking questions

When I worked at ConocoPhillips, I was quite involved in their knowledge sharing efforts (and I still am). The most important part of the online component is a set of 100 or so open discussion forums. These are much like the ones you find all over the Internet (indeed, they're a big part of what made the Internet what it is — many of us remember Usenet, now Google Groups). But they're better because they're highly relevant, well moderated, and free of trolls. They are an important part of an 'asking' culture, which is an essential prerequisite for a learning organization.

Stack Exchange is awesome

Today, the Q&A site I use most is Stack Overflow. I read something on it almost every day. This is the place to get questions about programming answered fast. It is one of over 100 sites at Stack Exchange, all excellent — readers might especially like the GIS Stack Exchange. These are not your normal forums... Fields medallist Tim Gowers recognizes Math Overflow as an important research tool. The guy has a blog. He is awesome.

What's so great about the Stack Exchange family? A few things:

  • A simple system of up- and down-voting questions and answers that ensures good ones are easy to find.
  • A transparent system of user reputation that reflects engagement and expertise, and is not easy to game. 
  • A well defined path from proposal, to garnering support, to private testing, to public testing, to launch.
  • Like good waiters, the moderators keep a very low profile. I rarely notice them. 
  • There are lots of people there! This always helps.

The new site for earth science

The exciting news is that, two years after being proposed in Area 51, the Earth Science site has reached the minimum commitment, spent a week in beta, and is now open to all. What happens next is up to us — the community of geoscientists that want a well-run, well-populated place to ask and answer scientific questions.

You can sign in instantly with your Google or Facebook credentials. So go and take a look... Then take a deep breath and help someone. 

Private public data

Our recent trip to the AAPG Annual Convention in Houston was much enhanced by meeting some inspiring geoscientist–programmers. People like...

  • Our old friend Jacob Foshee hung out with us and built his customary awesomeness.
  • Wassim Benhallam, at the University of Utah, came to our Rock Hack and impressed everyone with his knowledge of clustering algorithms, and sedimentary geology.
  • Sebastian Good, of Palladium Consulting, is full of beans and big ideas — and is a much more accomplished programmer than most of us will ever be. If you're coding geoscience, you'll like his blog.
  • We had a laugh with Nick Thompson from Schlumberger, who we bumped into at a 100% geeky meet-up for Python programmers interested in web sockets. I cannot explain why we were there.

Perhaps the most animated person we met was Ted Kernan. A recent graduate of Colorado School of Mines, Ted has taught himself PHP, one of the most prevalent programming languages on the web (WordPress, Joomla, and MediaWiki are written in PHP). He's also up on all the important bits of web tech, like hosting and HTML frameworks.

But the really cool thing is what he's built: a search utility for public well data in the United States. You can go and check it out at publicwelldata.com — and if you like it, let Ted know!

Actually, that's not even the really cool thing. The really cool thing is how passionate he is about exposing this important public resource, and making it discoverable and accessible. He highlights the stark difference between Colorado's easy access to digital well data, complete with well logs, and the sorry state of affairs in North Dakota, where he can't even get well names into his app. 'Public data' can no longer mean "we'll sell you a paper printout for $40". It belongs on the web — machines can read too.

More than just wells

There's so much potential power here — not only for human geoscientists looking for well data, but also for geoscientist–programmers building tools that need well data. For example, I imagine being able to point modelr.io at any public well to grab its curves and make a quick synthetic. Ready access to open services like Ted's will free subsurface software from the deadweight of corporate databases filled with years of junk, and make us all a bit more nimble. 

We'll be discussing open data, and openness in general, at the Openness Unsession in Calgary on the afternoon of 12 May — part of GeoConvention 2014. Join us!

Can openness make us better? Help us find out!

Last year's Unsolved Problems Unsession identified two openness issues — Less secrecy, more sharing and Free the data — as the greatest unsolved problems in our community. This year, we'll dig into that problem. Here's the blurb:

At the Unsolved Problems Unsession last year, this community established that Too much secrecy is one of the top unsolved problems in our industry. This year, we will dig into this problem, and ask what kind of opportunities solving it could create. What forces cause closedness to persist? What are the advantages of being more open? Where is change happening today? Where can we effect change next?

We offer no agenda, no experts, no talks, and no answers. This is an open space for everyone to come and be their best and brightest self. So bring it.

GeoConvention Monday 12 May, afternoon in Telus 108 (ground floor on the north side)

No experts? No answers? What on earth are we up to? Well, we think bringing questions to a group of engaged professionals is more fun than bringing answers. The idea is to talk about our greatest aspirations for our discipline, and how we can find out if greater transparency and openness can help us achieve them.

If you know someone else who would enjoy this, please tell them about it or bring them along. I hope we see you there on 12 May!

More AAPG highlights

Here are some of our highlights from the second half of the AAPG Annual Convention in Houston.

Conceptual uncertainty in interpretation

Fold-thrust belt, offshore Nigeria. Virtual Seismic Atlas.

Rob Butler's research is concerned with the kinematic evolution of mountain ranges and fold-thrust belts, in order to understand the localization of deformation across many scales. Patterns of deformed rocks aren't adequately explained by stress fields alone; they are also controlled by the mechanical properties of the layers themselves. Given this fact, the definition of the layers becomes a doubly important part of the interpretation.

The biggest risk in structural interpretation is not geometrical accuracy but whether or not the concept is correct. This is not to say that we don't understand geologic processes. Rather, a section can always be described in more than one way. It is this risk in the first-order model that impacts everything we do. To deal with conceptual uncertainty we must first capture the range; otherwise it is useless to do any more refinement.

He showed a crowd-sourced compilation of 24 interpretations from the Virtual Seismic Atlas as a way to stack up a series of possible structural frameworks. Fifteen out of twenty-four interviewees interpreted a continuous, forward-propagating thrust fault as the main structure. The disagreements were around the existence and location of a back thrust, linkage between fore- and back-thrusts, the existence and location of a detachment surface, and its linkage to the fault planes above. Given such complexity, "it's rather daft," he said, "to get an interpretation from only one or two people."

CT scanning gravity flows

Mike Tilston and Bill Arnott gave a pair of talks about their research into sediment gravity flows in the lab. This wouldn't be newsworthy in itself, but their two key innovations caught our attention:

  1. A 3D velocity profiler capable of making 23 measurements a second
  2. The flume tank ran through a CT scanner, giving a hi-res cross-section view

These two methods sidestep the two major problems with even low-density (say 4% by weight) sediment gravity flows: they are acoustically attenuative, and optically opaque. Using this approach Tilston and Arnott investigated the effect of grain size on the internal grain distribution, finding that fine-grained turbidity currents sustain a plug-like wall of sediment, while coarse-grained flows have a more carpet-like distribution. Next, they plan to look at particle shape effects, finer grain sizes, and grain mixtures. Technology for the win!

Hypothesizing a martian ocean

Lorena Moscardelli showed topographic renderings of the Eberswalde delta on the planet Mars, hypothesizing that some martian sedimentary rocks were deposited by fluvial processes, an assertion that implies the red planet had a watery past. If there are sedimentary rocks formed by fluids, one of those fluids could have been water. If there has been water, who knows what else? Hydrocarbons? Imagine that! Her talk was in the afternoon session on Space and Energy Frontiers, sandwiched between less scientific talks raising issues around staking claims and models for governing mineral and energy resources away from Earth. The idea of tweaking earthly policies and state regulations to manage resources on other planets somehow doesn't align with my vision of an advanced civilization. But the idea of doing seismic on other planets? So cool.

Poster gorgeousness

Matt and I were both invigorated by the quality, not to mention the giant size, of the posters at the back of the exhibition hall. It was a place for the hardcore geoscientists to retreat from the bright lights, uniformed sales reps, and the my-carpet-is-cushier-than-your-carpet marketing festival. An oasis of authentic geoscience and applied research.

We both finally got to meet Brian Romans, a sedimentologist at Virginia Tech, amidst the poster-paneled walls. He said that this is his 10th year venturing to the channel deposits that crop out in the Magallanes Basin of southern Chile. He is now one of the three young, energetic profs behind the hugely popular Chile Slope Systems consortium.

Three years ago he joined forces with Lisa Stright (University of Utah) and Steve Hubbard (University of Calgary) to form a project investigating processes of sediment transfer across deepwater slopes exposed around Patagonia. It is a powerhouse of collaborative research, and the quality of graduate student work being pumped out is fantastic. Purposeful and intentional investigations carried out by passionate and tech-savvy scientists. What can be more exciting than that?

Do you have any highlights of your own? Please leave a note in the comments.