The core of the conference

Andrew Couch of Statoil answering questions about his oil sands core, standing in front of a tiny fraction of the core collection at the ERCB.

Today was day 1 of the core conference at the CSPG CSEG CWLS convention. This (unique?) event is always well attended and much talked-about. The beautiful sunshine and industry-sponsored lunch today helped (thanks Weatherford!).

One reason for the good turn-out is the incredible core research facility here in Calgary. This is the core and cuttings storage warehouse and lab of the Energy Resources Conservation Board, Alberta's energy regulator. I haven't been to a huge number of core stores around the world, but this is easily the largest, cleanest, and most efficient one I have visited. The picture gives no real indication of the scale: there are over 1700 km of core here, and cuttings from about 80 000 km of drilling. If you're in Calgary and you've never been, find a way to visit. 

Ross Kukulski of the University of Calgary is one of Stephen Hubbard's current MSc students. Steve's students are consistently high performers, with excellent communication and drafting skills; you can usually spot their posters from a distance. Ross is no exception: his poster on the stratigraphic architecture of the Early Cretaceous Monach Formation of NW Alberta was a gem. Ross has integrated data from about 30 cores, 3300 (!) well logs, and outcrop around Grande Cache. While this is a fairly normal project for Alberta, I was impressed with the strong quantitative elements: his provenance assertions were backed up with Keegan Raines' zircon data, and channel width interpretation was underpinned by Bridge & Tye's empirical work (2000; AAPG Bulletin 84).

The point bar in Willapa Bay where Jesse did his coring. Image from Google Earth.

Jesse Schoengut is an MSc student of Murray Gingras, part of the ichnology powerhouse at the University of Alberta. The work is an extension of Murray's long-lived project in Willapa Bay, Washington, USA. Not only had the team collected vibracore along a large point bar, but they had also x-rayed these cores, collected seismic profiles across the tidal channel, and integrated everything into the regional dataset of more cores and profiles. The resulting three-dimensional earth model is helping solve problems in fields like the super-giant Athabasca bitumen field of northeast Alberta, where the McMurray Formation is widely interpreted to be a tidal estuary somewhat analogous to Willapa.

Greg Hu of Tarcore presented his niche business of photographing bitumen core, and applying image processing techniques to complement and enhance traditional core descriptions and analysis. Greg explained that unrecovered core and incomplete sampling programs result in gaps and depth misalignment: a 9 m core barrel can lose several metres of core, which makes integrating core information with other subsurface data difficult. To help solve this problem, much of Tarcore's work is depth-correcting images. He uses electrical logs and FMI images to set local datums on centimetre-scale beds, mud clasts, and siderite nodules. Through colour balancing, contrast stretching, and image analysis, shale volume (a key parameter in reservoir evaluation) can be computed from photographs. This approach is mostly independent of logs and offers much higher resolution.
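
Greg didn't share his algorithms, so the snippet below is only a minimal sketch of the general idea: turning core photo brightness into a shale-volume proxy. The file name, endmember brightnesses, depth registration, and the assumption that brighter pixels correspond to shalier intervals are all hypothetical.

```python
# A rough sketch (not Tarcore's actual workflow) of turning a core photograph
# into a shale-volume proxy. The file name, endmember brightnesses, and depth
# registration are hypothetical, as is the assumption that brighter pixels
# correspond to shalier intervals.
import numpy as np
from PIL import Image

img = np.asarray(Image.open('core_photo.png').convert('L'), dtype=float)

# Average brightness across the core width for each pixel row (i.e. each depth).
brightness = img.mean(axis=1)

# Hypothetical endmembers: brightness of clean bitumen-stained sand and of shale.
sand, shale = 40.0, 180.0
vsh = np.clip((brightness - sand) / (shale - sand), 0, 1)

# Convert pixel rows to depth, assuming a known top depth and image scale.
top_depth_m, metres_per_pixel = 250.0, 0.001    # both hypothetical
depth = top_depth_m + np.arange(vsh.size) * metres_per_pixel
```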

It's awesome how petroleum geologists are sharing so openly at this core workshop, and it got us thinking: what would a similar arena look like for geophysics or petrophysics? Imagine wandering through a maze of 3D seismic volumes, where you can touch, feel, ask, and learn.

Don't miss our posts from day 1 of the convention, and from days 2 and 3.

Cracks, energy, and nanoseismic

Following on from our post on Monday, here are some presentations that caught our attention on days 2 and 3 at the CSPG CSEG CWLS convention this week in Calgary. 

On Tuesday, Eric von Lunen of Nexen chose one of the more compelling titles of the conference: What do engineers need from geophysicists in shale resource plays? Describing some of the company's work in the Horn River sub-basin, he emphasized the value of large, multi-faceted teams of subsurface scientists, including geochemists, geologists, geophysicists, petrophysicists, and geomechanics specialists. One slightly controversial assertion: Nexen interprets less than 20% of the fractures as vertical, and up to 40% as horizontal. 

Jon Olson, Associate Professor at the University of Texas at Austin, shared some numerical modeling and physical experiments that emphasized the relevance of subcritical crack indices for unconventional reservoir exploitation. He presented the results of a benchtop hydrofracking experiment on a cubic foot of gyprock. By tinting frac fluids with red dye, Jon is able to study the fracture patterns directly by slicing the block and taking photographs. It would be interesting to perform micro-micro-seismic (is that nanoseismic?) experiments to make a more complete small-scale analog.

Shawn Maxwell of Schlumberger is Mr Microseismic. We're used to thinking of the spectrum of a seismic trace; he showed the spectrum of a different kind of time series: the well-head pressure during a fracture stimulation. Not surprisingly, most of the energy in this spectrum is below 1 Hz. What's more, if you sum the energy recorded by a typical microseismic array, it amounts to only one millionth of the total energy pumped into the ground. The deficit is probably aseismic, or at least outside the seismic band (about 5 Hz to 200 Hz on most jobs). Where does the rest of the pumped energy go? Some of the sinks are friction losses in the pipe, friction losses in the reservoir, and heat.
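
If you want to try this kind of bookkeeping yourself, here is a minimal sketch of the two calculations: the spectrum of a well-head pressure record, and the fraction of its energy falling inside the nominal 5–200 Hz seismic band. The pressure series below is synthetic and the sample rate is assumed; only the band edges come from the talk.

```python
# Minimal sketch: spectrum of a (synthetic) well-head pressure record, and the
# fraction of its energy inside the nominal 5-200 Hz seismic band. Sample rate
# and the pressure signal itself are assumptions.
import numpy as np

fs = 1000.0                                    # sample rate, Hz (assumed)
t = np.arange(0, 600, 1 / fs)                  # ten minutes of pumping

# Synthetic pressure: slow pumping trend plus weak high-frequency chatter (MPa).
pressure = 30 + 5 * np.sin(2 * np.pi * 0.05 * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))**2
freqs = np.fft.rfftfreq(pressure.size, d=1 / fs)

in_band = (freqs >= 5) & (freqs <= 200)
print(f"Fraction of energy in the 5-200 Hz band: {spectrum[in_band].sum() / spectrum.sum():.1e}")
```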

Image of Horn River shale is licensed CC-BY-SA, from Qyd on Wikimedia Commons. 

Noise, sampling, and the Horn River Basin

Some highlights from day 1 of GeoCon11, the CSPG CSEG CWLS annual convention in Calgary.

Malcolm Lansley of Sercel, with Peter Maxwell of CGGVeritas, presented a fascinating story of a seismic receiver test in a Maginot Line bunker in the Swiss Alps. The goal was to find one of the quietest places on earth to measure receiver sensitivity and noise performance at very low frequencies. The result: if signal is poor then analog geophones outperform MEMS accelerometers in the low frequency band, but MEMS are better in high signal-to-noise situations (for example, if geological contrasts are strong).

Warren Walsh and his co-authors presented their work mapping gas in place for the entire Horn River Basin of northeast British Columbia, Canada. They used a stochastic approach to simulate both free gas (held in the pore space) and adsorbed gas (bound to clays and organic matter). The mean volume: 78 Tcf, approximately the same size as the Hugoton Natural Gas Area in Kansas, Texas, and Oklahoma. Their report (right) is online.
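
Their report describes the actual method; the snippet below is only a toy Monte Carlo sketch of the general approach of summing free gas (a volumetric calculation) and adsorbed gas (bound to clays and organic matter). Every distribution, constant, and unit conversion here is illustrative, not taken from the study.

```python
# Toy stochastic gas-in-place sketch: free gas (pore space) plus adsorbed gas
# (bound to clays and organic matter). Every distribution, constant, and unit
# conversion below is illustrative, not taken from the Horn River study.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

area_m2  = rng.normal(1.0e10, 1.0e9, n)        # basin area (m2)
thick_m  = rng.normal(100.0, 15.0, n)          # net organic-rich shale thickness (m)
phi      = rng.normal(0.05, 0.01, n)           # porosity
sg       = rng.normal(0.70, 0.10, n)           # gas saturation
bg       = 0.004                               # gas formation volume factor (rm3/sm3)
rho_rock = 2500.0                              # bulk density (kg/m3)
gc       = rng.normal(2.0e-3, 5.0e-4, n)       # adsorbed gas content (sm3/kg)

free_gas_m3 = area_m2 * thick_m * phi * sg / bg
adsorbed_m3 = area_m2 * thick_m * rho_rock * gc

total_tcf = (free_gas_m3 + adsorbed_m3) * 35.31 / 1e12   # m3 -> Tcf
print(f"P50 gas in place: {np.percentile(total_tcf, 50):.0f} Tcf")
```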

RECON Petrotechnologies showed results from an interesting physical experiment to establish the importance of well-log sample rate in characterizing thin beds. They constructed a sandwich of gyprock between slices of aluminium and magnesium, then pulled a logging tool through a hole in the middle of the sandwich. An accurate density measurement in a 42-cm-thick slice of gyprock needed 66 samples per metre, much higher than the traditional 7 samples per metre, and double the so-called 'high resolution' rate of 33 samples per metre. Read their abstract.
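
As a rough illustration of why sample rate matters for thin beds, here is a sketch that logs a synthetic 42 cm gyprock slice with a smoothing tool response and then decimates the result to 7, 33, and 66 samples per metre. The tool aperture and densities are assumed, and the sandwich is simplified to a single surrounding metal.

```python
# Sketch of the sampling argument: a 42 cm gyprock 'bed' logged with a smoothing
# tool response, then decimated to 7, 33, and 66 samples per metre. The tool
# aperture and densities are assumed, and the sandwich is simplified to a single
# metal (the experiment used aluminium and magnesium).
import numpy as np

dz = 0.001                                     # model grid: 1 mm
z = np.arange(0, 2, dz)                        # 2 m of 'formation'
rho = np.where((z > 0.83) & (z < 0.83 + 0.42), 2.3, 2.7)   # gyprock in metal (g/cc)

aperture = int(0.40 / dz)                      # ~40 cm boxcar tool response (assumed)
log = np.convolve(rho, np.ones(aperture) / aperture, mode='same')
log = log[aperture // 2 : -(aperture // 2)]    # drop the convolution edge effects

for sps in (7, 33, 66):                        # samples per metre
    step = int(round(1 / (sps * dz)))
    # Worst case over all sampling phases, i.e. wherever the samples happen to fall.
    worst = max(log[p::step].min() for p in range(step))
    print(f"{sps:3d} samples/m -> worst-case bed reading {worst:.2f} g/cc (true 2.30)")
```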

Carl Reine at Nexen presented Weighing in on seismic scale, exploring the power law relationship of fracture lengths in Horn River shales. He showed that the fracture system has no characteristic scale, and fractures are present at all lengths. Carl used two independent seismic techniques for statistically characterizing fracture lengths and azimuths, which he called direct and indirect. Direct fault picking was aided by coherency (a seismic attribute) and spectral decomposition; indirect fault picking used 3D computations of positive and negative curvature. Integrating these interpretations with borehole and microseismic data allowed him to completely characterize fractures in a reservoir model. (See our post about crossing scales in interpretation.)
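
For readers who want to experiment with the statistics, here is a minimal sketch of characterizing a scale-free fracture population: draw synthetic power-law lengths, then recover the exponent with the standard maximum-likelihood estimator. The lengths, minimum cutoff, and exponent are invented, not Carl's data.

```python
# Sketch of characterizing a scale-free fracture population: draw synthetic
# power-law lengths, then recover the exponent with the standard maximum-
# likelihood estimator. Lengths, cutoff, and exponent are invented.
import numpy as np

rng = np.random.default_rng(1)
alpha_true, l_min = 2.2, 10.0                  # exponent and shortest length (m), hypothetical

# Inverse-transform sampling of a power-law (Pareto-type) distribution.
u = rng.uniform(size=5000)
lengths = l_min * (1 - u) ** (-1 / (alpha_true - 1))

# Maximum-likelihood estimate of the exponent (Hill estimator).
alpha_hat = 1 + lengths.size / np.log(lengths / l_min).sum()
print(f"Estimated exponent: {alpha_hat:.2f} (true value {alpha_true})")
```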

Evan and Matt are tweeting from the event, along with some other attendees; follow the #geocon11 hashtag to get the latest.

Seeing red

Temperature is a rock property that rarely gets much attention from geoscientists. Except in oil sands. Bitumen is a heavily biodegraded oil with a viscosity greater than 10 000 cP and an API gravity less than 10˚. It is a viscoelastic solid at room temperature, and flows only when sufficiently heated. Operators inject steam (through a process called steam-assisted gravity drainage, or SAGD), as opposed to hot water, because steam carries a large portion of its energy as latent heat. When steam condenses against the chamber walls, it transfers that heat into the surrounding reservoir. This is akin to the scalding you'd feel if you placed your hand over a pot of boiling water.

This image is a heat map across 3 well pairs (green dots) at the Underground Test Facility (UTF) in the Early Cretaceous McMurray Formation in the Athabasca oil sands of Alberta. The data come from downhole thermocouple measurements (white dots); the map was made by 2D linear interpolation.
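
For anyone curious how such a map is built, here is a minimal sketch of the method: scattered thermocouple temperatures interpolated onto a regular grid by 2D linear interpolation, with the raw sample points and a single isotherm plotted on top. The well spacing, depths, temperatures, and isotherm value are all invented; only the workflow mirrors the figure.

```python
# Sketch of the mapping workflow: scattered thermocouple temperatures
# interpolated onto a regular grid by 2D linear interpolation, with the raw
# sample points and a single isotherm plotted on top. All positions and
# temperatures here are invented; only the method mirrors the figure.
import numpy as np
from scipy.interpolate import griddata
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
well_x = np.linspace(0, 120, 7)                 # 7 observation wells (m, assumed spacing)
depth = np.linspace(0, 28, 15)                  # 15 thermocouples per well (m, assumed)
X, Z = np.meshgrid(well_x, depth)
temps = (20 + 180 * np.exp(-((X - 60)**2 / 800 + (Z - 14)**2 / 60))
         + rng.normal(0, 2, X.shape))           # synthetic temperatures (°C)

xi, zi = np.meshgrid(np.linspace(0, 120, 300), np.linspace(0, 28, 150))
grid = griddata((X.ravel(), Z.ravel()), temps.ravel(), (xi, zi), method='linear')

plt.pcolormesh(xi, zi, grid, cmap='jet', shading='auto')   # blue cold, red hot
plt.scatter(X, Z, c='white', s=8)                          # show the raw sample points
plt.contour(xi, zi, grid, levels=[100], colors='k')        # one isotherm (value assumed)
plt.gca().invert_yaxis()
plt.colorbar(label='Temperature (°C)')
plt.show()
```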

Rather than geek out on the physics and processes taking place, I'd rather talk about why I think this is a nifty graphic.

What I like about this figure

Colour is intuitive – Blue for cold, red for hot; it doesn't get much more intuitive than that. A single black contour line delineates the zone of stable steam and a peripheral zone being heated.

Unadulterated interpolation – There are many ways of interpolating, or filling in where there is no data. In this set, the precision of each measurement is high, within a degree or two, but the earth is sampled irregularly. There is much higher sampling in the vertical direction than in x and y, and this shows up, somewhat unattractively, as horizontal edges in the interpolated colours. To smooth the interpolation, or round off its slightly jagged edges, would, in my opinion, degrade the information contained in the graphic. It is a display of the sparseness of the measurements. 

Sampling is shown – You see exactly how many points make up the data set. Fifteen thermocouples in each of 7 observation wells. It makes the irregularities in the contours okay, meaningful even. I wouldn’t want to smooth it. I think map makers and technical specialists too readily forget about where their data comes from. Recognize the difference between hard data and interpolation, and recognize the difference between observation and interpretation.

Sampling is scale – Imagine what this image would look like if we took the first, third, fifth, and seventh observation well away. Our observations and thus physical interpretation would be dramatically different. Every data point is accurate, but resolution depends on sample density.

Layers of context – Visualizing data enables heightened interpretation. Interpreting the heated zone is simply a matter of picking a temperature contour (isotherm). Even though this is just a heat map, you can infer that one steam chamber is isolated, and two have joined into one. Surely more could be understood by adding context, by integrating other subsurface observations.

In commercial-scale oil sands operations, it is rare to place observation wells so close to each other. But if we did, and recorded the temperature continuously, would we even need time-lapse seismic at all? (see right) 

If you are making a map or plot of any kind, I encourage you to display the source data. Both its location and its value. It compels the viewer to ask questions like, Can we make fewer measurements in the next round? Do we need more? Can we drill fewer observation wells and still infer the same resolution? Will this cost reduction change how we monitor the depletion process?

What changes sea-level?

Relative sea-level is complicated. It is measured from some fixed point in the sediment pile, not a fixed point in the earth. So if, for example, global sea-level (eustasy) stays constant but there is local subsidence at a fault, say, then we can say that relative sea-level has increased. Another common cause is isostatic rebound during interglacials, causing a fall in relative sea-level and a seaward regression of the coastline. Because the system didn't build out into the sea by itself, this is sometimes called a forced regression. Here's a nice example of a raised beach formed this way, from Langerstone Point, near Prawle in Devon, UK:

Image: Tony Atkin, licensed under CC-BY-SA-2.0. From Wikimedia Commons

Two weeks ago I wrote about some of the factors affecting relative sea-level, and the scales on which those processes operate. Before that, I had mentioned my undergraduate fascination with Milankovitch cyclicity and its influence on a range of geological processes. Complexity and interaction were favourite subjects of mine, and I built on this a bit in my graduate studies. To try to visualize some of the connectedness of the controls on sea-level, I drew a geophantasmagram that I still refer to occasionally:

Accommodation refers to the underwater space available for sediment deposition; it is closely related to relative sea-level. The end of the story, at least as far as gross stratigraphy is concerned, is the development of a stratigraphic package, like a shelf-edge delta or a submarine fan. Systems tract is just a jargon term for such a package when it is explicitly related to changes in relative sea-level. 
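
For the record, the bookkeeping behind these terms is simple enough to write down. The sketch below states the usual relations (relative sea-level change as eustatic change plus subsidence, and accommodation created as relative rise minus sediment supplied); the function names and example numbers are mine, not something from the diagram.

```python
# A minimal sketch of the usual bookkeeping, with hypothetical example numbers:
# relative sea-level change = eustatic change + subsidence, and the accommodation
# created = relative sea-level rise minus the sediment supplied.
def relative_sea_level_change(eustatic_change_m, subsidence_m):
    """Both in metres over the same interval; subsidence is positive downward."""
    return eustatic_change_m + subsidence_m

def accommodation_change(eustatic_change_m, subsidence_m, sediment_supply_m):
    """Net space created (positive) or destroyed (negative) for sediment, in metres."""
    return relative_sea_level_change(eustatic_change_m, subsidence_m) - sediment_supply_m

# Example: static global sea level, 2 m of local fault-driven subsidence,
# 0.5 m of sediment delivered -> 1.5 m of new accommodation.
print(accommodation_change(0.0, 2.0, 0.5))
```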

I am drawn to making diagrams like this; I like mind-maps and other network-like graphs. They help me think about complex systems. But I'm not sure they always help anyone other than the creator; I know I find others' efforts harder to read than my own. But if you have suggestions or improvements to offer, I'd love to hear from you.

Shattering shale

In shale gas exploration, one of the most slippery attributes we are interested in is fracability. The problem is that the rocks we study have different compositions and burial histories, so it's hard to pin down the relative roles of intrinsic rock properties and extrinsic stress states. Glass could be considered an end member for brittleness, and it has fairly uniform elastic parameters and bulk composition (it's amorphous silica). Perhaps we can learn something about the role of stresses by looking more closely at how glass fractures. 

The mechanics of glass can be characterized by two aspects: how it's made, and how it breaks.

Annealed glass is made by pouring molten glass onto a bath of molten tin, which gives it two perfectly smooth and parallel surfaces. The glass is cooled slowly so that stress irregularities dissipate evenly throughout, reducing local weak points. This is ordinary glass, as you might find in a mirror.

Tempered glass is made by heating annealed glass to near its softening point, about 720˚C, and then cooling it quickly by quenching with air jets. The exterior surface shrinks, freezing it into compression, while the still-soft interior is pulled into tension as it solidifies (diagram). 

How glass is made is directly linked to how it breaks. Annealed glass is weaker, and breaks into sparse splinters. The surface of tempered glass is stronger, and when it breaks, it breaks catastrophically; the stored tensional energy in the interior drives cracks from the inside out.

A piece of tempered glass is 4-6 times stronger than a piece of annealed glass with the same elastic properties, composition, density and dimensions. This means it looks almost identical but requires much more stress to break. Visually and empirically, it is not easy to tell the difference between annealed and tempered glass. But when you break it, the difference is obvious. So here, for two very brittle materials, with all else being equal, the stress state plays the dominant role in determining the mode of failure.

Because natural permeability is so low in fine-grained rocks, production companies induce artificial fractures to connect flow pathways to the wellbore. The more surface area exposed, the more methane is liberated.

If we are trying to fracture-stimulate shale to get at the molecules trapped inside, we would clearly prefer shale that shatters like tempered glass. The big question is: how do we explore for shale like this?

One approach is to isolate parameters such as natural fractures, anisotropy, pore pressure, composition, and organic content and study their independent effects. In upcoming posts, we'll explore the tools and techniques for measuring these parameters across scale space for characterizing fracability. 

Scales of sea-level change

Relative sea-level curve for the Phanerozoic. Click to read about sea level on Wikipedia. Image prepared by Robert Rohde and licensed for public use under CC-BY-SA.

Sea level changes. It changes all the time, and always has (right). It's well known, and obvious, that levels of glaciation, especially at the polar ice-caps, are important controls on the rate and magnitude of changes in global sea level. Less intuitively, lots of other effects can play a part: changes in mid-ocean ridge spreading rates, the changing shape of the geoid, and local tectonics.

A recent paper in Science by Petersen et al (2010) showed evidence for small-scale mantle convection driving the cyclicity of sedimentary sequences. This would be a fairly local effect, on the order of tens to hundreds of kilometres. This is important because some geologists believe in the global correlatability of these sequences. A fanciful belief in my view, but that's another story.

The paper reminded me of an attempt I once made to catalogue the controls on sea level, from long-term global effects like greenhouse–icehouse periods, to short-term local effects like fault movement. I made the table below. I think most of the data, perhaps all of it, came from Emery and Aubrey (1991). It's hard to admit, because I don't feel that old, but this is a rather dated publication now; still, I think it's solid enough for the sort of high-level overview I am interested in. 

After last week's doodling, the table inspired me to try another scale-space cartoon. I put amplitude on the y-axis, rate on the x-axis. Effects with global reach are in bold, those that are dominantly local are not. The rather lurid colours represent different domains: magmatic, climatic, isostatic, and (in green) 'other'. The categories and the data correspond to the table.
Infographic: scales of sea level change.

It is interesting how many processes are competing for that top right-hand corner: rapid, high-amplitude sea level change. Clearly, those are the processes we care about most as sequence stratigraphers, but also as a society struggling with the consequences of our energy addiction.
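
If you would like to try a version of this cartoon yourself, the sketch below shows the mechanics: processes placed on logarithmic rate and amplitude axes with matplotlib. The two entries and their values are placeholders for illustration, not numbers from Emery and Aubrey (1991).

```python
# Sketch of the scale-space cartoon: processes placed on logarithmic axes of
# rate and amplitude. The two entries and their values are placeholders, not
# the numbers from Emery and Aubrey (1991).
import matplotlib.pyplot as plt

processes = {
    'glacio-eustasy (global)': (10.0, 100.0),   # (rate m/ka, amplitude m) - hypothetical
    'local fault movement':    (1000.0, 10.0),  # hypothetical
}

fig, ax = plt.subplots()
for name, (rate, amp) in processes.items():
    ax.scatter(rate, amp)
    ax.annotate(name, (rate, amp), xytext=(5, 5), textcoords='offset points')

ax.set_xscale('log')
ax.set_yscale('log')
ax.set_xlabel('Rate (m per 1000 yr)')
ax.set_ylabel('Amplitude (m)')
plt.show()
```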

References
Emery, K & D Aubrey (1991). Sea-levels, land levels and tide gauges. Springer-Verlag, New York, 237p.
Petersen, K, S Nielsen, O Clausen, R Stephenson & T Gerya (2010). Small-scale mantle convection produces stratigraphic sequences in sedimentary basins. Science 329 (5993) p 827–830, August 2010. DOI: 10.1126/science.1190115

The scales of geoscience

Helicopter at Mount St Helens in 2007. Image: USGS.

Geoscientists' brains are necessarily helicoptery. They can quickly climb and descend, hover or fly. This ability to zoom in and out, changing scale and range, develops with experience. Thinking and talking about scales, especially those outside your usual realm of thought, are good ways to develop your aptitude and intuition. Intuition especially is bound to the realms of your experience: millimetres to kilometres, seconds to decades. 

Being helicoptery is important because processes can manifest themselves in different ways at different scales. Currents, for example, can result in sorting and rounding of grains, but you can often only see this with a hand-lens (unless the grains are automobiles). The same environment might produce ripples at the centimetre scale, dunes at the decametre scale, channels at the kilometre scale, and an entire fluvial basin at another couple of orders of magnitude beyond that. In moments of true clarity, a geologist might think across 10 or 15 orders of magnitude in one thought, perhaps even more.

A couple of years ago, the brilliant web comic artist xkcd drew a couple of beautiful infographics depicting scale. Entitled height and depth (left), they showed the entire universe in a logarithmic scale space. More recently, a couple of amazing visualizations have offered different visions of the same theme: the wonderful Scale of the Universe, which looks at spatial scale, and the utterly magic ChronoZoom, which does a similar thing with geologic time. Wonderful.

These creations inspired me to try to map geological disciplines onto scale space. You can see how I did below. I do like the idea but I am not very keen on my execution. I think I will add a time dimension and have another go, but I thought I'd share it at this stage. I might even try drawing the next one freehand, but I ain't no Randall Munroe.

I'd be very happy to receive any feedback about improving this, or please post your own attempts!

What's hot in geophysics?

Two weeks ago I visited Long Beach, California, attending a conference called Mathematical and Computational Issues in the Geosciences, organized by the Society for Industrial and Applied Mathematics (SIAM). I wanted to exercise my cross-thinking skills. 

As expected, the week was very educational for me. Well, some of it was. Some of it was like being beaten about the head with a big bag of math. Anyone for quasi-monotone advection? What about semi-implicit, semi-Lagrangian, P-adaptive discontinuous Galerkin methods then?

Notwithstanding my apparent learning disability, I heard about some fascinating new things. Here are three highlights.

Great geophysicists #3

Today is a historic day for greatness: René Descartes was born exactly 415 years ago, and Isaac Newton died 284 years ago. They both contributed to our understanding of physical phenomena and the natural world and, while not exactly geophysicists, they changed how scientists think about waves in general, and light in particular.

Unweaving the rainbow

Scientists of the day recognized two types of colour. Apparent colours were those seen in prisms and rainbows, where light itself was refracted into colours. Real colours, on the other hand, were a property of bodies, disclosed by light but not produced by that light. Descartes studied refraction in raindrops and helped propagate Snell’s law in his 1637 essay, La Dioptrique. His work severed this apparent–real dichotomy: all colours are apparent, and the colour of an object depends on the light you shine on it.

Newton began to work seriously with crystalline prisms around 1666. He was the first to demonstrate that white light is a scrambled superposition of wavelengths: a visual cacophony of information. Not only does a ray bend in relation to the wave speed of the material it is entering (read the post on Snellius), but Newton made one more connection: the intrinsic wave speed of the material, in turn, depends on the frequency of the wave. This phenomenon is known as dispersion; different frequency components are slowed by different amounts, angling onto different paths.

What does all this mean for seismic data?

Seismic pulses, which strut and fret through the earth, reflecting and transmitting through its myriad contrasts, make for a more complicated type of prism-dispersion experiment. Compared to visible light, the effects of dispersion are subtle, negligible even, in the seismic band of 2–200 Hz. However, we may measure a rock to have a wave speed of 3000 m/s at 50 Hz, 3500 m/s at 20 kHz (logging frequencies), and 4000 m/s at 10 MHz (core laboratory frequencies). On the one hand, this should be incredibly disconcerting for subsurface scientists: it keeps us from bridging the integration gap empirically. On the other hand, it is also a reason why geophysicists get away with haphazardly stretching and squeezing travel time measurements taken at different scales to tie wells to seismic. Is dispersion the interpreters’ fudge-factor when our multi-scale data don’t corroborate?
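
Those three numbers are enough to sketch the dispersion trend. The snippet below plots the quoted wave speeds against log frequency and fits a straight line in log10(f); the log-linear form is just a convenient assumption for the plot, not a claim about the rock's true dispersion behaviour.

```python
# The three wave speeds quoted above, plotted against log frequency with a
# straight-line fit in log10(f). The log-linear form is only a convenient
# assumption for the plot, not a claim about the rock's true dispersion law.
import numpy as np
import matplotlib.pyplot as plt

f = np.array([50.0, 20e3, 10e6])               # Hz: seismic, logging, core lab
v = np.array([3000.0, 3500.0, 4000.0])         # m/s, from the text

slope, intercept = np.polyfit(np.log10(f), v, 1)
f_dense = np.logspace(1, 7.5, 200)

plt.semilogx(f, v, 'o', label='quoted measurements')
plt.semilogx(f_dense, slope * np.log10(f_dense) + intercept, label='log-linear fit')
plt.xlabel('Frequency (Hz)')
plt.ylabel('Wave speed (m/s)')
plt.legend()
plt.show()
```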

Chris Liner, blogging at Seismos, points out:

...so much of classical seismology and wave theory is nondispersive: basic theory of P and S waves, Rayleigh waves in a half-space, geometric spreading, reflection and transmission coefficients, head waves, etc. Yet when we look at real data, strong dispersion abounds. The development of spectral decomposition has served to highlight this fact.

We should think about studying dispersion more, not just as a nuisance for what is lost (as it has been traditionally viewed), but as a colourful, scale-dependent property of the earth whose stories we seek to hear.