Laying it all out at the Core Conference

Bobbing in the wake of the talks, the Core Conference turned out to be even more exemplary of this year's theme, Integration. Best of all were the SAGD case studies, where multi-disciplinary experiments are the only way to make sense of the sticky stuff.

Coring through steam

Travis Shackleton from Cenovus gave a wonderful presentation showing the impact of bioturbation, facies boundaries, and sedimentary structures on steam chamber evolution in the McMurray Formation at the FCCL project. And because I had the chance to work on this project with ConocoPhillips a few years ago but didn't take it, this work induced both jealousy and awe. Their experiment design is best framed as a series of questions:

  • What if we drilled, logged, and instrumented two wells only 10 m apart? (Awesome.)
  • What if we collected core in both of them? (Double awesome.)
  • What if the wells were in the middle of a mature steam chamber? (Triple awesome.)
  • What if we collected 3D seismic after injecting all this steam and compared it with a 3D from before? (Quadruple awesome.)

It is the first public display of SAGD-depleted oil sand, made possible by an innovation in high-temperature core recovery. Travis pointed to a portion of core that had been rinsed by more than 5 years of steam circulating through it. It had a pale brown color and a residual oil saturation (So) of 15% (bottom sample in the figure). Then he pointed to a segment of core above the top of the steam chamber. It too was depleted, by essentially the same amount. You'd never know just by looking. It was sticky and black and largely unscathed. My eyes were fooled; direct observation deceived.

A bitumen core full of fractures

Jen Russel-Houston held up a half-tube of core riddled with high-density fractures throughout bitumen-saturated rock. The behemoth oil sands that require thermal recovery assistance have an equally promising but lesser-known carbonate cousin, still in its infancy: the bitumen-saturated Grosmont Formation, located to the west of the more mature in-situ projects in sand. The reservoir is entirely dolomite, hosting its own unique structures that affect the spreading of steam and the reduction of bitumen's viscosity to a flowable level.

Jen and her team at OSUM hope their pilot will demonstrate that these fractures serve as transport channels for the steam, allowing it to creep around tight spots in the reservoir that would otherwise stop it in its tracks. These are not the same troubling baffles and barriers caused by mud plugs or IHS, but permeability heterogeneities caused by the dolomitization process. A big question is the effective permeability at the length scales of production, which is phenomenologically different from measurements made on cut core. I overheard a spectator suggest to Jen that she try freezing a sleeve of core, soaking it with acid, then rinsing the dolomite out the bottom, leaving only a frozen sculpture of the bitumen. Crazy? Maybe. Intriguing? Indeed.

Let's do more science with rocks!

Two impressive experiments, unabashedly and literally laid out for all to see, equipped with clever geologists, and enriched by supplementary technology. Both are thoughtful initiatives—real scientific experiments—that not only make the operating companies more profitable, but also profoundly improve our understanding of a precious resource for society. Two role models for how comprehensive experiments can serve more than just those who conduct them. Integration at its very best, centered on core.

What are the best examples of integrated geoscience that you've seen?

Submitting assumptions for meaningful answers

The best talk of the conference was Ran Bachrach's on seismics for unconventionals. He described the physics to his audience with enthusiasm, conviction, and a sense of duty, and explained why they should care. Isotropic, VTI, and orthorhombic anisotropy models are used not because they are right, but because they are simple. If the assumptions you bring to the problem are reasonable, the answers can be considered meaningful. If you haven't considered and tested your assumptions, you haven't subscribed to reason. In a sense, you haven't held up your end of the bargain, and there will never be agreement. This talk should be mandatory viewing for anyone working on seismic for unconventionals. Advocacy for reason. Too bad it wasn't recorded.

I am both privileged and obliged to celebrate such nuggets of awesomeness. That's a big reason why I blog. And conversely, we should call out crappy talks when we see them, to raise the bar. Indeed, to quote Zen Faulkes, "...we should start creating more of an expectation that scientific talks will be reviewed and critiqued. And names will be named."

The talk from HEF Petrophysical entitled, Towards modelling three-dimensional oil sands permeability distribution using borehole image logs, drew me in. I was curious enough to show up. But as the talk unfolded, my curiosity was left unsatisfied. A potentially interesting workflow of transforming high-resolution resistivity measurements into flow permeability was obfuscated with a pointless upscaling step. The meat of anything like this is in the transform itself, but it was missing. It's also the most trivial bit; just cross-plot one property with another and show people. So I am guessing they didn't have any permeability data. If that was the case, how can you stand up and talk about permeability? It was a sandwich without the filling. The essential thing that defines a piece of work is the creativity. The thing you add that wasn't there before. I was disappointed. Disappointed that it was accepted, and that no one else piped up. 

I will paraphrase a conversation I had with Ran at the coffee break: Some are not aware, some choose to ignore, and some forget that works of geoscience are problems of extreme complexity. In fact, the only way we can cope with complexity is to make certain assumptions that make our problem solvable. If all you do is say "here is my solution", you suck. But if instead you ask, "Have I convinced you that my assumptions are reasonable?", it entirely changes the conversation. It entirely changes the specialist's role. Only when we understand your assumptions can we talk about whether the results are reasonable.

Have you ever felt conflicted on whether or not you should say something?

A really good conversation

Today was Day 2 of the Canada GeoConvention. But... all we had the energy for was the famous Unsolved Problems Unsession. So no real highlights today, just a report from the floor of Room 101.

Today was the day. We slept about as well as two 8-year-olds on Christmas Eve, having been up half the night obsessively micro-hacking our meeting design (right). The nervous anticipation was richly rewarded. About 50 of the most creative, inquisitive, daring geoscientists at the GeoConvention came to the Unsession — mostly on purpose. Together, the group surfaced over 100 pressing questions facing the upstream industry, then filtered this list to 4 wide-reaching problems of integration:

  • making the industry more open
  • coping with error and uncertainty
  • improving seismic resolution
  • improving the way our industry is perceived

We owe a massive debt of thanks to our heroic hosts: Greg Bennett, Tannis McCartney, Chris Chalcraft, Adrian Smith, Charlene Radons, Cale White, Jenson Tan, and Tooney Fink. Every one of them far exceeded their brief and brought 100× more clarity and continuity to the conversations than we could have had without them. Seriously awesome people.  

This process of waking our industry up to new ways of collaborating is just beginning. We will, you can be certain, write more about the unsession after we've had a little time to parse and digest what happened.

If you're at the conference, tell us what we missed today!

A revolution in seismic acquisition?

We're in warm, sunny Calgary for the GeoConvention 2013. The conference feels like it's really embracing geophysics this year — in the past it's always felt more geological somehow. Even the exhibition floor felt dominated by geophysics. Someone we spoke to speculated that companies were holding their geological cards close to their chests, but the service companies are still happy to talk about (ahem, promote) their geophysical advances.

Are you at the conference? What do you think? Let us know in the comments.

We caught about 15 talks of the 100 or so on offer today. A few of them ignited the old whines about half-cocked proofs of efficacy. Why is it still acceptable to say that a particular seismic volume or inversion result is 'higher resolution' or 'more geological' with nothing more than a couple of sections or timeslices as evidence?

People are excited about designing seismic acquisition expressly for wavefield reconstruction. In a whole session devoted to the subject, for example, Mauricio Sacchi showed how randomization helps with regularization in processing, allowing us either to get better image quality or to lower cost. It feels like the start of a new wave of innovation in acquisition, a field that already has more than its fair share of recent advances: multi-component, wide azimuth, dual-sensor, simultaneous source...

Is it a revolution? Or just the fallacy of new things looking revolutionary... until the next new thing? It's intriguing to the non-specialist. People are talking about 'beyond Nyquist' again, but this time without inducing howls of derision. We just spent an hour talking about it, and we think there's something deep going on... we're just not sure how to articulate it yet.

Unsolved problems

We were at the conference today, but really we are focused on the session we're hosting tomorrow morning. Along with a roomful of adventurous conference-goers (you're invited too!), we'll be looking for the most pressing questions in subsurface science. We start at 8 a.m. in Telus 101/102 on the main floor of the north building.

Here comes GeoConvention 2013

Next week Matt and I are heading to the petroleum capital of Canada for the 2013 GeoConvention. There will be 308 talks, 125 posters, over 4000 attendees, 100 exhibiting companies, and at least 2 guys blogging their highlights off.

My picks for Monday

Studying the technical abstracts ahead of time is the only way to make the most of your schedule. There are 9 sessions going on at any given time, so a deep sense of FOMO has already set in. These are the talks I have decided on for Monday:

Seismics for unconventionals

I watched Carl Reine from Nexen give a talk two years ago where he deduced a power-law relationship characterizing natural fracture networks in the Horn River shale. He will show how integrating such fracture intensity patterns with inversion models yields a powerful predictor of frackability, using microseismic to confirm it.

On a related note, and also from the Horn River Basin, Andreas Wuestefeld will show how fluid drainage patterns can be identified from microseismic data. Production simulation from an actual microseismic experiment: numerical modeling and physical experiment, inextricably linked. I already love it.

Forward models and experimental tests

One is a design case study for optimizing interpolation, another is a 3D seismic geometry experiment, and the third is a benchtop physical fracture model made of Plexiglass and resin.

Broadband seismic

It gets to the point of what hinders seismic resolution, and it does something about it through thoughtful design. This is just really nice-looking data. Two talks, same author: a step change, and the impact of broadband.

Best title award 

Goes to Forensic chemostratigraphy. Gimmicky name or revolutionary concept? You can't always judge a talk by the title, or the quality of the abstract. But it's hard not to. What talks are on your must-see list this year?

A really good conversation

Matt and I are hosting an unsession on the morning of Tuesday 7 May. It will be structured, interactive, and personal. The result: a ranked list of the most pressing problems facing upstream geoscientists, especially in those hard-to-reach places between the disciplines. This is not a session where you sit and listen. Everyone will participate. We will explore questions that matter, connect diverse perspectives, and, above all, capture our collective knowledge. It might be scary, it might be uncomfortable, it might not be for you. But if you think it is, bring your experience and individuality, and we will do that thing called integration. We can only host 60 people, so if you don't want to be turned away, arrive early to claim a spot. We start at 8 a.m. in Telus 101/102 on the main floor of the north building.

News headlines

Our old friend the News post... We fell off the wagon there for a bit. From now on we'll just post news when we collect a few stories, or as it happens. If you miss the old last-Friday-of-the-month missive, we are open to being convinced!

First release of Canopy

Back in November we mentioned Canopy, Austin-based Enthought's new Python programming environment, especially aimed at scientists. Think of it as Python (an easy-to-use language) in MATLAB form (with file management, plotting, etc.). Soon, Enthought plan to add a geophysical toolbox — SEGY read/write, trace display, and so on. We're very, very excited for the future of rapid geophysical problem-solving! More on the Enthought blog.

The $99 supercomputer

I recently got a Raspberry Pi — a $35 Linux machine a shade larger than a credit card. We're planning to use it at The HUB South Shore to help kids learn to code. These little machines are part of what we think could be an R&D revolution, as it gets cheaper and cheaper to experiment. Check out the University of Southampton's Raspberry Pi cluster!

If that's not awesome enough for you, how about Parallella, which ships this summer and packs 64 cores for under $100! If you're a software developer, you need to think about whether your tools are ready for parallel processing — not just on the desktop, but everywhere. What becomes possible?

Geophysics + 3D printing = awesome

Unless you have been living on a seismic boat for the last 3 years, you can't have failed to notice 3D printing. I get very excited when I think about the possibilities — making real 3D geomodels, printing replacement parts in the field, manifesting wavefields, geobodies, and so on. The best actual application we've heard of so far — these awesome little physical models in the Allied Geophysical Laboratories at the University of Houston (scroll down a bit).

Sugru

Nothing to do with geophysics, but continuing the hacker tech and maker theme... check out sugru.com — amazing stuff. Simple, cheap, practical. I am envisaging a maker lab for geophysics — who wants in?

Is Oasis the new Ocean?

Advanced Seismic is a Houston-based geophysical software startup that graduated from the Surge incubator in 2012. So far, they have attracted a large amount of venture capital, and I understand they're after tens of millions more. They make exciting noises about Oasis, a new class of web-aware, social-savvy software with freemium pricing. But so far there's not a lot to see — almost everything on their site says 'coming soon' and Evan and I have had no luck running the (Windows-only) demo tool. Watch this space.

Slow pitch

The world's longest-running lab experiment is a dripping flask of pitch, originally set up in 1927. The hydrocarbon has a viscosity of about 8 billion centipoise, which is 1000 times more viscous than Alberta bitumen. So far 8 drops have fallen, the last on 28 November 2000. The next? Looks like any day now! Or next year. 

Image: University of Queensland, licensed CC-BY-SA. 

Well-tie workflow

We've had a couple of emails recently about well ties. Ever since my days as a Landmark workflow consultant, I've thought the process of calibrating seismic data to well data was one of the rockiest parts of the interpretation workflow—and not just because of SynTool. One might almost call the variety of approaches an unsolved problem.

Tying wells usually involves forward modeling a synthetic seismogram from sonic and density logs, then matching that synthetic to the seismic reflection data, thus producing a relationship between the logs (measured in depth) and the seismic (measured in travel time). Problems arise for all sorts of reasons: the quality of the logs, the quality of the seismic, confusion about handling the shallow section, confusion about integrating checkshots, confusion about wavelets, and the usability of the software. Like much of the rest of interpretation, there is science and judgment in equal measure. 
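
To make the forward-modeling step concrete, here is a minimal sketch in Python (NumPy only) of how a synthetic might be built from sonic and density logs: impedance to reflectivity, depth-to-time conversion by integrating the sonic, and convolution with a zero-phase Ricker wavelet. The function names and the assumption of metric units (µs/m sonic, g/cc density) are mine for illustration, not any particular vendor's workflow; a real tie also needs despiking, checkshot calibration, and careful resampling.

```python
import numpy as np

def ricker(f, duration=0.128, dt=0.002):
    """Zero-phase Ricker wavelet with peak frequency f (Hz)."""
    t = np.arange(-duration / 2, duration / 2, dt)
    return (1 - 2 * (np.pi * f * t)**2) * np.exp(-(np.pi * f * t)**2)

def synthetic_from_logs(sonic, rhob, depth, f_peak=25.0, dt=0.002):
    """Crude synthetic from sonic (us/m) and density (g/cc) logs.

    Returns a regular two-way-time axis and the synthetic trace.
    """
    v = 1e6 / sonic                           # slowness (us/m) -> velocity (m/s)
    z = v * rhob                              # acoustic impedance
    rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])  # reflection coefficients in depth

    dz = np.diff(depth)
    twt = 2.0 * np.cumsum(dz / v[:-1])        # two-way time at each interface

    t = np.arange(0.0, twt[-1] + dt, dt)      # regular time axis
    rc_t = np.zeros_like(t)
    idx = np.clip(np.searchsorted(t, twt), 0, t.size - 1)
    np.add.at(rc_t, idx, rc)                  # bin reflectivity onto the time axis

    w = ricker(f_peak, dt=dt)                 # zero-phase wavelet (see tip 3 below)
    return t, np.convolve(rc_t, w, mode='same')
```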

Synthetic seismogram (right) from the reservoir section of the giant bitumen field Surmont, northern Alberta. The reservoir is only about 450 m deep, and about 70 m thick. From Hall (2009), Calgary GeoConvention. 

I'd go so far as to say that I think tying wells robustly is one of the unsolved problems of subsurface geoscience. How else can we explain the fact that any reasonably mature exploration project has at least 17 time-depth curves per well, with names like JLS_2002_fstk01_edit_cks_R24Hz_final?

My top tips

First, read up. White & Simm (2003) in First Break 21 (10) is excellent. Rachel Newrick's essays in 52 Things are essential. Next, think about the seismic volume you are trying to tie to. Keep it to the nears if possible (don't use a full-angle stack unless it's all you have). Use a volume with less filtering if you have it (and you should be asking for it). And get your datums straight, especially if you are on land: make certain your seismic datum is correct. Ask people, look at SEGY headers, but don't be satisfied with one data point.

Once that stuff is ironed out:

  1. Chop any casing velocities or other non-data off the top of your log.
  2. Edit as gently and objectively as possible. Some of those spikes might be geology.
  3. Look at the bandwidth of your seismic and make an equivalent zero-phase wavelet.
  4. Don't extract a wavelet till you have a few good ties with a zero-phase wavelet, then extract from several wells and average. Extracting wavelets is a whole other post...
  5. Bulk shift the synthetic (e.g. by varying the replacement velocity) to make a good shallow event tie (see the sketch after this list).
  6. Stretch (or, less commonly, squeeze) the bottom of the log to match the deepest event you can. 
  7. If possible, don't add any more tie points unless you really can't help yourself. Definitely no more than 5 tie points per well, and no closer than a couple of hundred milliseconds.
  8. Capture all the relevant data for every well as you go (screenshot, replacement velocity, cross-correlation coefficient, residual phase, apparent frequency content).
  9. Be careful with deviated wells; you might want to avoid tying the deviated section entirely and use verticals instead. If you go ahead, read your software's manual. Twice.
  10. Do not trust any checkshot data you find in your project — always go back to the original survey (they are almost always loaded incorrectly, mainly because the datums are really confusing).
  11. Get help before trying to load or interpret a VSP unless you really know what you are doing.
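
As promised in step 5, here is a minimal sketch (NumPy only, with made-up function and variable names) of how you might scan for the bulk shift that best aligns a synthetic with a seismic trace, and record the cross-correlation coefficient mentioned in step 8. It assumes both traces are already on the same regular time grid.

```python
import numpy as np

def best_bulk_shift(synthetic, seismic, dt, max_shift=0.100):
    """Return the bulk time shift (s) that maximizes the normalized
    cross-correlation between a synthetic and a seismic trace, plus
    the correlation coefficient at that shift."""
    n = min(synthetic.size, seismic.size)
    syn = synthetic[:n] - synthetic[:n].mean()
    trc = seismic[:n] - seismic[:n].mean()

    xcorr = np.correlate(trc, syn, mode='full')   # all possible lags
    lags = np.arange(-n + 1, n) * dt              # lag times in seconds

    window = np.abs(lags) <= max_shift            # only consider sensible shifts
    i_best = np.argmax(xcorr[window])
    shift = lags[window][i_best]
    coeff = xcorr[window][i_best] / (np.linalg.norm(syn) * np.linalg.norm(trc))
    return shift, coeff
```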

I could add some don'ts too...

  • Don't tie wells to 2D seismic lines you have not balanced yet, unless you're doing it as part of the process of deciding how to balance the seismic. 
  • Don't create multiple, undocumented, obscurely named copies or almost-copies of well logs and synthetics, unless you want your seismic interpretation project to look like every seismic interpretation project I've ever seen (you don't).

Well ties are one of those things that get in the way of 'real' (i.e. fun) interpretation so they sometimes get brushed aside, left till later, rushed, or otherwise glossed over. Resist at all costs. If you mess them up and don't find out till later, you will be very sad, but not as sad as your exploration manager.

Update

on 2013-04-27 13:25 by Matt Hall

Can't resist posting this most excellent well tie. Possibly the best you'll ever see.

Picture by Maitri, licensed CC-BY-NC-SA

Update

on 2014-07-04 13:53 by Matt Hall

Evan has written a deconstructed well-tie workflow, complete with IPython Notebook for you to follow along with, for The Leading Edge. Read Well-tie calculus here.

What is an unsession?

Yesterday I invited you (yes, you) to our Unsolved Problems Unsession on 7 May in Calgary. What exactly will be involved? We think we can accomplish two things:

  1. Brainstorm the top 10, or 20, or 50 most pressing problems in exploration geoscience today. Not limited to but focusing on those problems that affect how well we interface — with each other, with engineers, with financial people, with the public even. Integration problems.
  2. Select one or two of those problems and solve them! Well, not solve them, but explore ways to approach solving them. What might a solution be worth? How many disciplines does it touch? How long might it take? Where could we start? Who can help?

There are bright, energetic young people out there looking for relevant problems to work on towards a Master's or PhD. There are entrepreneurs looking for high-value problems to create a new business from. And software companies looking for ways to be more useful and relevant to their users. And there is more than one interpreter wishing that innovation would speed up a bit in our industry and make their work a little — or a lot — easier. 

We don't know where it will lead, but we think this unsession is one way to get some conversations going. This is not a session to dip in and out of — we need 4 hours of your time. Bring your experience, your uniqueness, and your curiosity.

Let's reboot our imaginations about what we can do in our science.

An invitation to a brainstorm

Who of us would not be glad to lift the veil behind which the future lies hidden; to cast a glance at the next advances of our science and at the secrets of its development during future centuries? What particular goals will there be toward which the leading [geoscientific] spirits of coming generations will strive? What new methods and new facts in the wide and rich field of [geoscientific] thought will the new centuries disclose?

— Adapted from David Hilbert (1902). Mathematical Problems, Bulletin of the American Mathematical Society 8 (10), p. 437–479. Originally appeared in Göttinger Nachrichten, 1900, pp. 253–297.

Back at the end of October, just before the SEG Annual Meeting, I did some whining about conferences: so many amazing, creative, energetic geoscientists, doing too much listening and not enough doing. The next day, I proposed some ways to make conferences more productive — for us as scientists, and for our science itself.

Evan and I are chairing a new kind of session at the Calgary GeoConvention this year. What does ‘new kind of session’ mean? Here’s the lowdown:

The Unsolved Problems Unsession at the 2013 GeoConvention will transform conference attendees, normally little more than spectators, into active participants and collaborators. We are gathering 60 of the brightest, sparkiest minds in exploration geoscience to debate the open questions in our field, and create new approaches to solving them. The nearly 4-hour session will look, feel, and function unlike any other session at the conference. The outcome will be a list of real problems that affect our daily work as subsurface professionals — especially those in the hard-to-reach spots between our disciplines. Come and help shed some light, room 101, Tuesday 7 May, 8:00 till 11:45.

What you can do

  • Where does your workflow stumble? Think up the most pressing unsolved problems in your workflows — especially ones that slow down collaboration between the disciplines. They might be organizational, they might be technological, they might be scientific.
  • Put 7 May in your calendar and come to our session! Better yet, bring a friend. We can accommodate about 60 people. Be one of the first to experience a new kind of session!
  • If you would like to help host the event, we're looking for 5 enthusiastic volunteers to play a slightly enlarged role, helping guide the brainstorming and capture the goodness. You know who you are. Email hello@agilegeoscience.com

Backwards and forwards reasoning

Most people, if you describe a train of events to them, will tell you what the result would be. There are few people, however, who, if you told them a result, would be able to evolve from their own inner consciousness what the steps were which led up to that result. This power is what I mean when I talk of reasoning backwards.

— Sherlock Holmes, A Study in Scarlet, Sir Arthur Conan Doyle (1887)

Reasoning backwards is the process of solving an inverse problem — estimating a physical system from indirect data. Straight-up reasoning, which we call the forward problem, is a kind of data collection: empiricism. It obeys a natural causality by which we relate model parameters to the data that we observe.

Modeling a measurement


Where are you headed? Every subsurface problem can be expressed as the arrow between two or more such panels.

Inverse problems exist for two reasons: we are incapable of measuring what we are actually interested in, and it is impossible to measure a subject in enough detail, and in all aspects that matter. If, for instance, I ask you to determine my weight, you will be troubled if the only tool I allow is a ruler. Even if you are incredibly accurate with your tool, at best you can construct only an estimation of the desired quantity. This estimation of reality is what we call a model. The process of estimation is called inversion.

Measuring a model

Forward problems are ways in which we acquire information about natural phenomena. Given a model (me, say), it is easy to measure some property (my height, say) accurately and precisely. But given my height as the starting point, it is impossible to estimate the me from which it came. This is an example of an ill-posed problem. In this case, there is an infinite number of models that share my measurements, though each model is described by one exact solution. 

Solving forward problems is necessary to determine whether a model fits a set of observations. So you'd expect it to be performed as a routine complement to interpretation: a way to validate our assumptions and train our intuition.

The math of reasoning

Forward and inverse problems can be cast in this seemingly simple equation.

Gm = d

where d is a vector containing N observations (the data), m is a vector of M model parameters (the model), and G is an N × M matrix operator that connects the two. The structure of G changes depending on the problem, but it is where 'the experiment' goes. Given a set of model parameters m, the forward problem is to predict the data d produced by the experiment. This is as simple as plugging values into a system of equations. The inverse problem is much more difficult: given a set of observations d, estimate the model parameters m.
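
To make the notation concrete, here is a toy numerical example in Python — nothing to do with seismic, just a two-parameter straight-line 'experiment' invented for illustration — showing the forward problem as a matrix multiplication and the inverse problem as a least-squares estimate.

```python
import numpy as np

np.random.seed(0)

# The 'experiment': N = 20 observations of a process governed by M = 2
# model parameters (an intercept and a gradient).
x = np.linspace(0, 10, 20)
G = np.column_stack([np.ones_like(x), x])   # N x M operator
m_true = np.array([2.0, 0.5])               # the model we pretend not to know

# Forward problem: given m, predict d (noise stands in for measurement error).
d = G @ m_true + 0.1 * np.random.randn(x.size)

# Inverse problem: given d, estimate m — here by least squares.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print("true:", m_true, "estimated:", m_est.round(2))
```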


I think interpreters should describe their work within the Gm = d framework. Doing so would safeguard against mixing up observations, which should be objective, and interpretations, which contain assumptions. Know the difference between m and d. Express it with an arrow on a diagram if you like, to make it clear which direction you are heading in.

Illustrations for this post were created using data from the Marmousi synthetic seismic data set. The blue seismic trace and its corresponding velocity profile are at location no. 250.