The Surmont Supermerge

In my recent Abstract horror post, I mentioned an interesting paper in passing, Durkin et al. (2017):

Paul R. Durkin, Ron L. Boyd, Stephen M. Hubbard, Albert W. Shultz, Michael D. Blum (2017). Three-Dimensional Reconstruction of Meander-Belt Evolution, Cretaceous McMurray Formation, Alberta Foreland Basin, Canada. Journal of Sedimentary Research 87 (10), p 1075–1099. doi: 10.2110/jsr.2017.59

I wanted to write about it, or rather about its dataset, because I spent about 3 years of my life working on the USD 75 million seismic volume featured in the paper. Not just on interpreting it, but also on acquiring and processing the data.

Let's start by feasting our eyes on a horizon slice, plus interpretation, of the Surmont 'Supermerge' 3D seismic volume:

Figure 1 from Durkin et al. (2017), showing a stratal slice from 10 ms below the top of the McMurray Formation (left), and its interpretation (right). © 2017, SEPM (Society for Sedimentary Geology) and licensed CC-BY.

A decade ago, I was 'geophysics advisor' on Surmont, which is jointly operated by ConocoPhillips Canada, where I worked, and Total E&P Canada. My line manager was a Total employee; his managers were ex-Gulf Canada. It was a fantastic, high-functioning team, and working on this project had a profound effect on me as a geoscientist. 

The Surmont bitumen field

The dataset covers most of the Surmont lease, in the giant Athabasca Oil Sands play of northern Alberta, Canada. The Surmont field alone contains something like 25 billion barrels of bitumen in place. It's ridiculously massive — you'd be delighted to find 300 million bbl offshore. Given that it's expensive and carbon-intensive to produce bitumen with today's methods — steam-assisted gravity drainage (SAGD, "sag-dee") in Surmont's case — it's understandable that there's a great deal of debate about producing the oil sands. One factoid: you have to burn about 1 Mscf or 30 m³ of natural gas, costing about USD 10–15, to make enough steam to produce 1 bbl of bitumen.
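
That factoid is easy to sanity-check with a back-of-envelope calculation. Here's a rough sketch in Python; the steam:oil ratio, steam enthalpy, generator efficiency, and gas heating value are round-number assumptions of mine, not Surmont figures:

```python
# Rough check of the "1 Mscf of gas per barrel of bitumen" factoid.
# Every input below is an assumed round number, not a Surmont figure.

SOR = 2.5          # steam:oil ratio, m3 steam (cold-water equivalent) per m3 bitumen
BBL = 0.159        # m3 per barrel
RHO_WATER = 1000   # kg/m3
ENTHALPY = 2.8e6   # J/kg to turn cold water into high-pressure steam (approx.)
EFFICIENCY = 0.8   # assumed steam-generator efficiency
HHV_GAS = 37e6     # J/m3, heating value of natural gas (approx.)
MSCF = 28.3        # m3 per thousand standard cubic feet

steam_kg = SOR * BBL * RHO_WATER                       # kg of steam per bbl of bitumen
gas_m3 = steam_kg * ENTHALPY / (EFFICIENCY * HHV_GAS)  # m3 of gas burned per bbl
print(f"{gas_m3:.0f} m³ of gas per bbl, or {gas_m3 / MSCF:.1f} Mscf")
# About 38 m³, or 1.3 Mscf, per barrel: the same ballpark as the factoid.
```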

Detail from Figure 12 from Durkin et al. (2017), showing a seismic section through the McMurray Formation. Most of the abandoned channels are filled with mudstone (really a siltstone). The dipping heterolithic strata of the point bars, so obvious in horizon slices, are quite subtle in section. © 2017, SEPM (Society for Sedimentary Geology) and licensed CC-BY.

The field is a geoscience wonderland. Apart from the 600 km² of beautiful 3D seismic, there are now about 1500 wells, most of which are on the 3D. In places there are more than 20 wells per section (1 sq mile, 2.6 km², 640 acres). Most of the wells have a full suite of logs, including FMI in about two-thirds of wells, often with shear sonic as well, and about 550 wells now have core through the entire reservoir interval — about 65–75 m across most of Surmont. Let that sink in for a minute.

What's so awesome about the seismic?

OK, I'm a bit biased, because I planned the acquisition of several pieces of this survey. There are some challenges to collecting great data at Surmont. The reservoir is only about 500 m below the surface. Much of the pay sand can barely be called 'rock' because it's unconsolidated sand, and the reservoir 'fluid' is a quasi-solid with a viscosity of 1 million cP. The surface has some decent topography, and the near surface is glacial till, with plenty of boulders and gravel-filled channels. There are surface lakes and the area is covered in dense forest. In short, it's a geophysical challenge.

Nonetheless, we did collect great data; here's how:

  • General information
    • The ca. 600 km² Supermerge consists of a dozen 3Ds recorded over about a decade starting in 2001.
    • The northern 60% or so of the dataset was recombined from field records into a single 3D volume, with pre- and post-stack time imaging.
    • The merge was performed by CGG Veritas, cost nearly USD 2 million, and took about 18 months.
  • Geometry
    • Most of the surveys had a 20 m shot and receiver spacing, giving the volume a 10 m by 10 m natural bin size (see the quick sketch after this list).
    • The original survey had parallel and coincident shot and receiver lines (Megabin); later surveys were orthogonal.
    • We varied the line spacing between 80 m and 160 m to get the trace density we needed in different areas.
  • Sources
    • Some surveys used 125 g dynamite at a depth of 6 m; others the IVI EnviroVibe sweeping 8–230 Hz.
    • We used an airgun on some of the lakes, but the data was terrible so we stopped doing it.
  • Receivers
    • Most of the surveys were recorded into single-point 3C digital MEMS receivers planted on the surface.
  • Bandwidth
    • Most of the datasets have data from about 8–10 Hz to about 180–200 Hz (and have a 1 ms sample interval).
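
Two of the numbers above are derived rather than measured, so here's a quick sketch of where they come from; the relationships are standard, and the parameters are taken straight from the list:

```python
# Derived quantities from the acquisition parameters listed above.

station_spacing = 20.0    # m, shot and receiver station spacing
sample_interval = 0.001   # s, i.e. 1 ms

# The natural bin is half the station spacing in each direction.
bin_size = station_spacing / 2
print(f"Natural bin size: {bin_size:.0f} m by {bin_size:.0f} m")  # 10 m by 10 m

# Nyquist frequency, the maximum unaliased frequency for this time sampling.
f_nyquist = 1 / (2 * sample_interval)
print(f"Nyquist frequency: {f_nyquist:.0f} Hz")  # 500 Hz, well above the ~200 Hz signal
```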

The planning of these surveys was quite a process. Because access in the muskeg is limited to 'freeze up' (late December until March), and often curtailed by wildlife concerns (moose and elk rutting), only about 6 weeks of shooting are possible each year. This means you have to plan ahead, then mobilize a fairly large crew with as many channels as possible. After acquisition, each volume spent about 6 months in processing — mostly at Veritas and then CGG Veritas, who did fantastic work on these datasets.

Kudos to ConocoPhillips and Total for letting people work on this dataset. And kudos to Paul Durkin for this fine piece of work, and for making it open access. I'm excited to see it in the open. I hope we see more papers based on Surmont, because it may be the world's finest subsurface dataset. I hope it is released some day; it would have a huge impact.


References & bibliography

Durkin, P, R Boyd, S Hubbard, A Shultz, and M Blum (2017). Three-Dimensional Reconstruction of Meander-Belt Evolution, Cretaceous McMurray Formation, Alberta Foreland Basin, Canada. Journal of Sedimentary Research 87 (10), p 1075–1099. doi: 10.2110/jsr.2017.59 (not live yet).

Hall, M (2007). Cost-effective, fit-for-purpose, lease-wide 3D seismic at Surmont. SEG Development and Production Forum, Edmonton, Canada, July 2007.

Hall, M (2009). Lithofacies prediction from seismic, one step at a time: An example from the McMurray Formation bitumen reservoir at Surmont. Canadian Society of Exploration Geophysicists National Convention, Calgary, Canada, May 2009. Oral paper.

Zhu, X, S Shaw, B Roy, M Hall, M Gurch, D Whitmore and P Anno (2008). Near-surface complexity masquerades as anisotropy. SEG Annual Convention, Las Vegas, USA, November 2008. Oral paper. doi: 10.1190/1.3063976.

Surmont SAGD Performance Review (2016), by ConocoPhillips and Total geoscientists and engineers. Submitted to AER, 258 pp. Available online [PDF] — and well worth looking at.

Trad, D, M Hall, and M Cotra (2008). Reshooting a survey by 5D interpolation. Canadian Society of Exploration Geophysicists National Convention, Calgary, Canada, May 2008. Oral paper.

Laying it all out at the Core Conference

Bobbing in the wake of the talks, the Core Conference turned out to be even more exemplary of this year's theme, Integration, than the talks themselves. Best of all were the SAGD case studies, where multi-disciplinary experiments are the only way to make sense of the sticky stuff.

Coring through steam

Travis Shackleton from Cenovus did a wonderful presentation showing the impact of bioturbation, facies boundaries, and sedimentary structures on steam chamber evolution in the McMurray Formation at the FCCL project. And because I had the chance to work on this project with ConocoPhillips a few years ago, but didn't, this work induced both jealousy and awe. Their experiment design is best framed as a series of questions:

  • What if we drilled, logged, and instrumented two wells only 10 m apart? (Awesome.)
  • What if we collected core in both of them? (Double awesome.)
  • What if the wells were in the middle of a mature steam chamber? (Triple awesome.)
  • What if we collected 3D seismic after injecting all this steam and compared it with a 3D from before? (Quadruple awesome.)

It is the first public display of SAGD-depleted oil sand, made possible by an innovation in high-temperature core recovery. Travis pointed to a portion of core that had been rinsed by more than 5 years of steam circulating through it. It had a pale brown colour and a residual oil saturation (So) of 15% (bottom sample in the figure). Then he pointed to a segment of core above the top of the steam chamber. It too was depleted, by essentially the same amount, but you'd never know just by looking: it was sticky and black and largely unscathed. My eyes were fooled; direct observation deceived.

A bitumen core full of fractures

Jen Russel-Houston held up a half-tube of core riddled with high-density fractures throughout bitumen-saturated rock. The behemoth oil sands that require thermal recovery assistance have an equally promising but lesser-known carbonate cousin, still in its infancy: the bitumen-saturated Grosmont Formation, located to the west of the more mature in-situ projects in sand. The reservoir is entirely dolomite, hosting its own unique structures affecting the spreading of steam and the reduction of bitumen's viscosity to a flowable level.

Jen and her team at OSUM hope their pilot will demonstrate that these fractures serve as transport channels for the steam, allowing it to creep around tight spots in the reservoir which would otherwise block the steam in its tracks. These are not the same troubling baffles and barriers caused by mud plugs or IHS, but permeability heterogeneities caused by the dolomitization process. A big question is the effective permeability at the length scales of production, which is phenomenologically different from measurements made on cut core. I overheard a spectator suggest to Jen that she try freezing a sleeve of core, soaking it with acid, then rinsing the dolomite out of the bottom, after which only a frozen sculpture of the bitumen would remain. Crazy? Maybe. Intriguing? Indeed.

Let's do more science with rocks!

Two impressive experiments, unabashedly and literally laid out for all to see, equipped with clever geologists, and enriched by supplementary technology. Both are thoughtful initiatives—real scientific experiments—that not only make the operating companies more profitable, but also profoundly improve our understanding of a precious resource for society. Two role models for how comprehensive experiments can serve more than just those who conduct them. Integration at its very best, centered on core.

What are the best examples of integrated geoscience that you've seen?

Seeing red

Temperature is not a rock property that gets much attention from geoscientists. Except in oil sands. Bitumen is a heavily biodegraded oil with a viscosity greater than 10 000 cP and a density less than 10 °API. It is a viscoelastic solid at room temperature, and flows only when sufficiently heated. Operators inject steam (through a process called SAGD), as opposed to hot water, because steam carries a large portion of its energy as latent heat. When steam condenses against the chamber walls, it transfers that heat into the surrounding reservoir. This is akin to the pain you'd feel if you placed your hand over a pot of boiling water.
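
It's worth putting numbers on that latent-heat claim. Here's a rough comparison using rounded steam-table values; the temperatures are illustrative, not actual operating conditions:

```python
# Heat delivered per kilogram at a ~200 °C chamber wall (rounded steam-table values).

c_water = 4200     # J/(kg·K), specific heat of liquid water (approx.)
latent = 1.9e6     # J/kg, latent heat of vaporization near SAGD pressures (approx.)

# Hot water injected at 250 °C, cooling to the 200 °C chamber wall:
q_water = c_water * (250 - 200)

# Saturated steam condensing at the chamber wall gives up all its latent heat:
q_steam = latent

print(f"Hot water: {q_water / 1e3:.0f} kJ/kg; steam: {q_steam / 1e3:.0f} kJ/kg")
# Roughly 210 vs 1900 kJ/kg: almost an order of magnitude more heat per kilogram.
```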

This image is a heat map across 3 well pairs (green dots) at the Underground Test Facility (UTF) in the Early Cretaceous McMurray Formation in the Athabasca oil sands of Alberta. The data comes from downhole thermocouple measurements, shown as white dots; the map was made by linear 2D interpolation between them.
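
A map like this is easy to reproduce. Here's a minimal sketch using SciPy's griddata with linear interpolation; the well positions, depths, and temperature field are entirely made up:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

# Hypothetical thermocouple data: 7 observation wells with 15 sensors each.
rng = np.random.default_rng(42)
well_x = np.linspace(0, 300, 7)      # m, observation well positions along the section
depths = np.linspace(100, 170, 15)   # m, thermocouple depths
X, Z = np.meshgrid(well_x, depths)
T = 20 + 180 * np.exp(-(X - 150)**2 / 8000 - (Z - 140)**2 / 300)  # fake steam chamber
T += rng.normal(0, 1.0, T.shape)     # each reading is good to a degree or so

# Linear 2D interpolation onto a regular grid; no smoothing.
XI, ZI = np.meshgrid(np.linspace(0, 300, 200), np.linspace(100, 170, 200))
TI = griddata((X.ravel(), Z.ravel()), T.ravel(), (XI, ZI), method='linear')

fig, ax = plt.subplots()
im = ax.pcolormesh(XI, ZI, TI, cmap='RdBu_r', shading='auto')      # blue cold, red hot
ax.contour(XI, ZI, TI, levels=[100], colors='k')  # one isotherm outlines the chamber
ax.scatter(X, Z, c='white', s=10, edgecolors='k', linewidths=0.3)  # show the samples
ax.invert_yaxis()                    # depth increases downwards
fig.colorbar(im, ax=ax, label='temperature (°C)')
plt.show()
```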

Rather than geek out on the physics and processes taking place, I'd rather talk about why I think this is a nifty graphic.

What I like about this figure

Colour is intuitive – Blue for cold, red for hot; it doesn't get much more intuitive than that. A single black contour line separates the zone of stable steam from a peripheral zone that is merely being heated.

Unadulterated interpolation – There are many ways of interpolating, or filling in, where there is no data. In this dataset, the precision of each measurement is high, within a degree or two, but the earth is sampled irregularly. There is much higher sampling in the vertical direction than in x and y, and this shows up, somewhat unattractively, as horizontal edges in the interpolated colours. But smoothing the interpolation, or rounding off its slightly jagged edges, would in my opinion degrade the information contained in the graphic. It's a display of the sparseness of the measurements.

Sampling is shown – You see exactly how many points make up the dataset: fifteen thermocouples in each of seven observation wells. It makes the irregularities in the contours okay, meaningful even. I wouldn't want to smooth it. I think map makers and technical specialists too readily forget where their data comes from. Recognize the difference between hard data and interpolation, and recognize the difference between observation and interpretation.

Sampling is scale – Imagine what this image would look like if we took the first, third, fifth, and seventh observation wells away. Our observations, and thus our physical interpretation, would be dramatically different. Every data point is accurate, but resolution depends on sample density.
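
Continuing the made-up sketch above, that thought experiment takes two lines:

```python
# Remove the 1st, 3rd, 5th, and 7th observation wells and re-interpolate.
Xs, Zs, Ts = X[:, 1::2], Z[:, 1::2], T[:, 1::2]
TIs = griddata((Xs.ravel(), Zs.ravel()), Ts.ravel(), (XI, ZI), method='linear')
# Every remaining measurement is just as accurate, but the map, and any
# interpretation of the steam chambers, changes dramatically.
```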

Layers of context – Visualizing data enables heightened interpretation. The heated zone is interpreted with nothing more than a temperature contour (an isotherm). Even though this is just a heat map, you can infer that one steam chamber is isolated, and the other two have joined into one. Surely more can be understood by adding context, by integrating other subsurface observations.

In commercial-scale oil sands operations, it is rare to place observation wells so close to each other. But if we did, and recorded the temperature continuously, would we even need time-lapse seismic at all?

If you are making a map or plot of any kind, I encourage you to display the source data: both its location and its value. It compels the viewer to ask questions like, Can we make fewer measurements in the next round? Do we need more? Can we drill fewer observation wells and still infer the same resolution? Will this cost reduction change how we monitor the depletion process?