This business of cards


At Recovery 2011, I enjoyed observing the idiosyncratic and almost compulsive act of eyes wandering from eyes, zeroing in on nametags, and the customary exchange of business cards. It is the default behavior when we are bombarded with too many strangers, all of whom deserve and desire our attention. To lessen the chore of post-processing all of these interactions, I think I will give Google Goggles a go to turn this pile of paper into digital contacts.

This got me wondering about the tools we have for making deeper connections. New digital business cards are more than geeky toys. Through cardcloud, card exchanges are paperless, with the added benefits of storing the geographic location of the encounter and embedding social networking usernames and links right onto your cards. If you like to jot down notes on the back of paper cards, here too you can flip them over and type your text in: "hire this guy", "he owes me a lunch next time". Not only does it sync across all your electronic devices, but I think a virtual business card can be a more inviting calling card for your profession.

Business cards may not be going paperless just yet, but there are many ways of bringing the digital world to these tangible bits of stationery. QR codes and RFIDs are linking technologies that let business cards carry digital content. Online profiles such as Facebook, Twitter, and About.me add depth, but how about functional or personalized business cards to really stand out?

Have you changed the scope of your business card to align with the way you work?

Why we should embrace openness

Openness—open ideas, open data, open teams—can help us build more competitive, higher-performing, more sustainable organizations in this industry.

Last week I took this message to the annual convention of the three big applied geoscience organizations in Canada: the Canadian Society of Petroleum Geologists (CSPG), the Canadian Society of Exploration Geophysicists (CSEG), and the Canadian Well Logging Society (CWLS). Evan and I attended the conference as scientists, but also experimented a bit with live tweeting and event blogging.

The talk was a generalization of the one I gave in March about open source software in geoscience. I wasn't at all sure how it would go over, and spent most of the morning sitting in technical talks fretting about how flaky and meta my talk would sound. But it went quite well, and at least served as some light relief from the erudition in the rest of the agenda. It was certainly fun to give an opinion-filled talk, and it started plenty of conversations afterwards.

You can access a PDF of the visuals, with commentary, from the thumbnail (left).

What do you think? Is a competitive, secretive industry like oil and gas capable of seeing value in openness? Might regulators eventually force us to share more as the resources society demands become scarcer? Or are we doomed to more mistrust and secrecy as oil and gas become more expensive to produce?


The core of the conference

Andrew Couch of Statoil answering questions about his oil sands core, standing in front of a tiny fraction of the core collection at the ERCB.

Today at the CSPG CSEG CWLS convention was day 1 of the core conference. This (unique?) event is always well attended and much talked-about. The beautiful sunshine and industry-sponsored lunch today helped (thanks Weatherford!).

One reason for the good turn-out is the incredible core research facility here in Calgary. This is the core and cuttings storage warehouse and lab of the Energy Resources Conservation Board, Alberta's energy regulator. I haven't been to a huge number of core stores around the world, but this is easily the largest, cleanest, and most efficient one I have visited. The picture gives no real indication of the scale: there are over 1700 km of core here, and cuttings from about 80 000 km of drilling. If you're in Calgary and you've never been, find a way to visit. 

Ross Kukulski of the University of Calgary is one of Stephen Hubbard's current MSc students. Steve's students are consistently high performers, with excellent communication and drafting skills; you can usually spot their posters from a distance. Ross is no exception: his poster on the stratigraphic architecture of the Early Cretaceous Monach Formation of NW Alberta was a gem. Ross has integrated data from about 30 cores, 3300 (!) well logs, and outcrop around Grande Cache. While this is a fairly normal project for Alberta, I was impressed with the strong quantitative elements: his provenance assertions were backed up with Keegan Raines' zircon data, and his channel width interpretation was underpinned by Bridge & Tye's empirical work (2000; AAPG Bulletin 84).

The point bar in Willapa Bay where Jesse did his coring. Image from Google Earth.

Jesse Schoengut is an MSc student of Murray Gingras, part of the ichnology powerhouse at the University of Alberta. The work is an extension of Murray's long-lived project in Willapa Bay, Washington, USA. Not only had the team collected vibracores along a large point bar, but they had also x-rayed these cores, collected seismic profiles across the tidal channel, and integrated everything into the regional dataset of cores and profiles. The resulting three-dimensional earth model is helping solve problems in fields like the super-giant Athabasca bitumen field of northeast Alberta, where the McMurray Formation is widely interpreted to be a tidal estuary somewhat analogous to Willapa.

Greg Hu of Tarcore presented his niche business of photographing bitumen core, and applying image processing techniques to complement and enhance traditional core descriptions and analysis. Greg explained that unrecovered core and incomplete sampling programs result in gaps and depth misalignment—a 9 m core barrel can have up to several metres of lost core, which can make integrating core information with other subsurface information intractable. To help solve this problem, much of Tarcore's work is depth-correcting images. He uses electrical logs and FMI images to set local datums on centimetre-scale beds, mud clasts, and siderite nodules. Through colour balancing, contrast stretching, and image analysis, shale volume (a key parameter in reservoir evaluation) can be computed from photographs. This approach is mostly independent of logs and offers much higher resolution.
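To give a flavour of that image-analysis step, here's a minimal sketch of turning a greyscale core photograph into a shale-volume proxy. Tarcore's actual workflow is far more sophisticated; the endmember intensities and the linear mixing assumption below are ours, purely for illustration.

```python
import numpy as np

def vshale_from_photo(image, sand_value=200.0, shale_value=60.0):
    """Estimate a shale-volume proxy from a greyscale core photograph.

    image: 2D array of pixel intensities (rows = depth, columns = across
    the core face). The endmember intensities are hypothetical; in
    practice you would calibrate them against described clean-sand and
    shale intervals.
    """
    profile = image.mean(axis=1)                         # one value per depth row
    profile = np.clip(profile, shale_value, sand_value)  # contrast stretch
    # Linear mixing assumption: darker pixels imply more shale
    return (sand_value - profile) / (sand_value - shale_value)

# Synthetic example: a dark (shaly) band within a bright (sandy) core
photo = np.full((100, 50), 190.0)
photo[40:60, :] = 80.0                     # the shaly interval
vsh = vshale_from_photo(photo)
print(vsh[0], vsh[50])                     # low in sand, high in the shaly band
```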

It's awesome how petroleum geologists are sharing so openly at this core workshop, and it got us thinking: what would a similar arena look like for geophysics or petrophysics? Imagine wandering through a maze of 3D seismic volumes, where you can touch, feel, ask, and learn.

Don't miss our posts from day 1 of the convention, and from days 2 and 3.

Cracks, energy, and nanoseismic

Following on from our post on Monday, here are some presentations that caught our attention on days 2 and 3 at the CSPG CSEG CWLS convention this week in Calgary. 

On Tuesday, Eric von Lunen of Nexen chose one of the more compelling titles of the conference: What do engineers need from geophysicists in shale resource plays? Describing some of the company's work in the Horn River sub-basin, he emphasized the value of large, multi-faceted teams of subsurface scientists: geochemists, geologists, geophysicists, petrophysicists, and geomechanics specialists. One slightly controversial assertion: Nexen interprets less than 20% of the fractures as vertical, and up to 40% as horizontal.

Jon Olson, Associate Professor at the University of Texas at Austin, shared some numerical modeling and physical experiments that emphasized the relevance of subcritical crack indices for unconventional reservoir exploitation. He presented the results of a benchtop hydrofracking experiment on a cubic foot of gyprock. By tinting frac fluids with red dye, Jon is able to study the fracture patterns directly by slicing the block and taking photographs. It would be interesting to perform micro-micro-seismic (is that nanoseismic?) experiments, to make a more complete small-scale analog.

Shawn Maxwell of Schlumberger is Mr Microseismic. We're used to thinking of the spectrum of a seismic trace; he showed the spectrum of a different kind of time series: the well-head pressure during a fracture stimulation. Not surprisingly, most of the energy in this spectrum is below 1 Hz. What's more, if you sum the energy recorded by a typical microseismic array, it amounts to only one millionth of the total energy pumped into the ground. The deficit is probably aseismic, or at least outside the seismic band (about 5 Hz to 200 Hz on most jobs). Where does the rest of the pumped energy go? Into sinks like friction losses in the pipe, friction losses in the reservoir, and heat.
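If you want to play with this idea, here's a minimal sketch of computing such a spectrum with NumPy. The pressure record below is synthetic — a made-up stand-in for real treatment data.

```python
import numpy as np

# Synthetic well-head pressure record: a slow oscillation plus noise.
# A real record would come from the treatment monitoring system.
dt = 0.01                                  # sample interval, s (100 Hz)
t = np.arange(0, 600, dt)                  # ten minutes of data
p = 30 + 5 * np.sin(2 * np.pi * 0.05 * t) + np.random.randn(t.size)

# Amplitude spectrum of the demeaned trace
spec = np.abs(np.fft.rfft(p - p.mean()))
freq = np.fft.rfftfreq(t.size, d=dt)

# Fraction of spectral energy below 1 Hz
frac = (spec[freq < 1.0]**2).sum() / (spec**2).sum()
print(f"Energy below 1 Hz: {100 * frac:.0f}%")
```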

Image of Horn River shale is licensed CC-BY-SA, from Qyd on Wikimedia Commons. 

Noise, sampling, and the Horn River Basin

Some highlights from day 1 of GeoCon11, the CSPG CSEG CWLS annual convention in Calgary.

Malcolm Lansley of Sercel, with Peter Maxwell of CGGVeritas, presented a fascinating story of a seismic receiver test in a Maginot Line bunker in the Swiss Alps. The goal was to find one of the quietest places on earth to measure sensitivity to noise at very low frequencies. The result: when signal is poor, analog geophones outperform MEMS accelerometers in the low-frequency band, but MEMS are better in high signal-to-noise situations (for example, where geological contrasts are strong).

Warren Walsh and his co-authors presented their work mapping gas in place for the entire Horn River Basin of northeast British Columbia, Canada. They used a stochastic approach to simulate both free gas (held in the pore space) and adsorbed gas (bound to clays and organic matter). The mean volume: 78 Tcf, approximately the same size as the Hugoton Natural Gas Area in Kansas, Texas, and Oklahoma. Their report is online.
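To show the spirit of a stochastic gas-in-place estimate, here's a minimal Monte Carlo sketch. Every distribution and constant below is invented for illustration — the study's actual inputs and methods are in their report.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                   # Monte Carlo trials

# Hypothetical inputs -- illustrative only, not the study's numbers.
grv = 4.0e12                                  # gross rock volume, ft3
rhob = 2.6                                    # bulk density, g/cc
phi = rng.normal(0.05, 0.01, n).clip(0.01)    # porosity, fraction
sw = rng.uniform(0.20, 0.40, n)               # water saturation
bg = rng.normal(0.004, 0.0005, n)             # gas FVF, rcf/scf
gc = rng.uniform(20.0, 60.0, n)               # adsorbed gas content, scf/ton

free = grv * phi * (1 - sw) / bg              # free gas, scf
tons = grv * rhob * 0.0312                    # rock mass (1 g/cc ~ 0.0312 ton/ft3)
adsorbed = tons * gc                          # adsorbed gas, scf

total_tcf = (free + adsorbed) / 1e12
p10, p50, p90 = np.percentile(total_tcf, [10, 50, 90])
print(f"mean {total_tcf.mean():.0f} Tcf, P10-P90: {p10:.0f}-{p90:.0f} Tcf")
```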

RECON Petrotechnologies showed results from an interesting physical experiment to establish the importance of well-log sample rate in characterizing thin beds. They constructed a sandwich of gyprock between slices of aluminium and magnesium, then pulled a logging tool through a hole in the middle of the sandwich. An accurate density measurement in a 42-cm-thick slice of gyprock needed 66 samples per metre—much higher than the traditional 7 samples per metre, and double the so-called 'high resolution' rate of 33 samples per metre. Read their abstract.
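Here's one way to get an intuition for the result — a synthetic sketch, not a reproduction of RECON's experiment. The densities, the 40 cm boxcar tool response, and the error metric are all our own idealizations.

```python
import numpy as np

# A 42 cm gyprock slice (taken here as pure gypsum, 2.32 g/cc) between
# magnesium (1.74) and aluminium (2.70), on a 1 mm depth grid.
z = np.arange(0.0, 2.0, 0.001)                     # depth, m
rho = np.where(z < 1.0, 1.74, np.where(z < 1.42, 2.32, 2.70))

# Crude 40 cm boxcar tool response to mimic the averaging of a real
# density tool (the response length is our guess).
tool = np.ones(400) / 400
measured = np.convolve(rho, tool, mode='same')

# Worst case over sample phase: how badly can each rate miss the
# true gyprock density?
for rate in (7, 33, 66):                           # samples per metre
    step = int(round(1000 / rate))                 # grid points per sample
    errs = []
    for off in range(step):
        s, zs = measured[off::step], z[off::step]
        keep = (zs > 0.7) & (zs < 1.7)             # avoid convolution edges
        errs.append(np.abs(s[keep] - 2.32).min())
    print(f"{rate:2d} samples/m: worst-case error = {1000 * max(errs):.0f} mg/cc")
```

Only the finest sampling is guaranteed to catch the narrow zone where the tool reads pure gyprock; a few hundredths of a g/cc matters, since it maps to roughly half a porosity unit in a density-porosity calculation.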

Carl Reine at Nexen presented Weighing in on seismic scale, exploring the power law relationship of fracture lengths in Horn River shales. He showed that the fracture system has no characteristic scale, and fractures are present at all lengths. Carl used two independent seismic techniques for statistically characterizing fracture lengths and azimuths, which he called direct and indirect. Direct fault picking was aided by coherency (a seismic attribute) and spectral decomposition; indirect fault picking used 3D computations of positive and negative curvature. Integrating these interpretations with borehole and microseismic data allowed him to completely characterize fractures in a reservoir model. (See our post about crossing scales in interpretation.)
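For the statistical flavour of the 'no characteristic scale' claim, here's a minimal sketch of fitting a power law to an exceedance plot of fracture lengths. The lengths are synthetic, and a simple log-log regression is cruder than the techniques Carl described.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fracture lengths from a classical Pareto distribution --
# stand-ins for lengths picked from seismic. The exponent is made up.
lengths = 50.0 * (1.0 + rng.pareto(1.8, size=2000))   # metres

# Exceedance frequency: how many fractures are at least this long?
L = np.sort(lengths)
exceed = np.arange(L.size, 0, -1)

# A straight line in log-log space means no characteristic scale;
# the slope of the fit estimates the power-law exponent.
slope, _ = np.polyfit(np.log10(L), np.log10(exceed), 1)
print(f"fitted exponent: {-slope:.2f}   (true value: 1.80)")
```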

Evan and Matt are tweeting from the event, along with some other attendees; follow the #geocon11 hashtag to get the latest.

 

Seeing red

Temperature is not a rock property that geoscientists usually pay much attention to. Except in oil sands. Bitumen is a heavily biodegraded oil with a viscosity greater than 10 000 cP and a density below 10˚API. It is a viscoelastic solid at room temperature, and flows only when sufficiently heated. Operators inject steam (in a process called steam-assisted gravity drainage, or SAGD), as opposed to hot water, because steam carries a large portion of its energy as latent heat. When steam condenses against the chamber walls, it transfers that heat into the surrounding reservoir. It's akin to the scalding you'd feel if you held your hand over a pot of boiling water.
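The latent-heat argument is easy to check with textbook numbers — a back-of-the-envelope comparison at atmospheric pressure (SAGD runs much hotter and at higher pressure, so treat it as order-of-magnitude only):

```python
# Heat delivered per kilogram: steam versus hot water.
latent = 2257.0                      # kJ/kg, condensation of steam at 100 C
c_water = 4.18                       # kJ/(kg.K), specific heat of liquid water
dT = 100.0 - 20.0                    # K, cooling to a cool reservoir

hot_water = c_water * dT             # sensible heat only: ~330 kJ/kg
steam = latent + c_water * dT        # latent + sensible: ~2590 kJ/kg
print(f"steam delivers {steam / hot_water:.1f}x the heat per kg")
```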

This image is a heat map across three well pairs (green dots) at the Underground Test Facility (UTF) in the Early Cretaceous McMurray Formation in the Athabasca oil sands of Alberta. The data come from downhole thermocouple measurements (white dots); the map was made by linear 2D interpolation between them.
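If you want to make this kind of map yourself, here's a minimal sketch with SciPy. The well positions and temperatures are made up; only the sampling pattern — dense in depth, sparse laterally — mimics the real data.

```python
import numpy as np
from scipy.interpolate import griddata

# Made-up thermocouple data: seven observation wells, fifteen sensors each.
x = np.repeat(np.linspace(0, 150, 7), 15)          # well positions, m
d = np.tile(np.linspace(420, 450, 15), 7)          # sensor depths, m
temp = 20 + 180 * np.exp(-((x - 75)**2 / 800.0 + (d - 435)**2 / 50.0))

# Linear interpolation onto a regular grid -- deliberately unsmoothed,
# so the sparse lateral sampling shows up as jagged edges, as discussed
# below.
xi, di = np.meshgrid(np.linspace(0, 150, 300), np.linspace(420, 450, 100))
grid = griddata((x, d), temp, (xi, di), method='linear')
print(grid.shape, np.nanmax(grid))                 # (100, 300), ~200 C
```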

Rather than geek out on the physics and processes taking place, I'd rather talk about why I think this is a nifty graphic.

What I like about this figure

Colour is intuitive – Blue for cold, red for hot; it doesn't get much more intuitive than that. A single black contour line separates the zone of stable steam from the peripheral zone that is being heated.

Unadulterated interpolation – There are many ways of interpolating, or filling in where there is no data. In this set, the precision of each measurement is high, within a degree or two, but the earth is sampled irregularly: much more densely in the vertical direction than in x and y. This shows up, somewhat unsightly, as horizontal edges in the interpolated colours. Smoothing the interpolation, or rounding its slightly jagged edges, would in my opinion degrade the information in the graphic. It's a display of the sparseness of the measurements.

Sampling is shown – You see exactly how many points make up the data set: fifteen thermocouples in each of seven observation wells. It makes the irregularities in the contours okay, meaningful even. I wouldn't want to smooth it. I think map makers and technical specialists too readily forget where their data come from. Recognize the difference between hard data and interpolation, and recognize the difference between observation and interpretation.

Sampling is scale – Imagine what this image would look like if we took the first, third, fifth, and seventh observation well away. Our observations and thus physical interpretation would be dramatically different. Every data point is accurate, but resolution depends on sample density.

Layers of context – Visualizing data enables heightened interpretation. Interpreting the heated zone is simply a matter of drawing a temperature contour (isotherm). Even though this is just a heat map, you can infer that one steam chamber is isolated, and two have coalesced. Surely more could be understood by adding context and integrating other subsurface observations.

In commercial-scale oil sands operations, it is rare to place observation wells so close to each other. But if we did, and recorded the temperature continuously, would we even need time-lapse seismic at all?

If you are making a map or plot of any kind, I encourage you to display the source data. Both its location and its value. It compels the viewer to ask questions like, Can we make fewer measurements in the next round? Do we need more? Can we drill fewer observation wells and still infer the same resolution? Will this cost reduction change how we monitor the depletion process?

Our next experiment: AgileWiki

We're excited to announce AgileWiki, a publicly editable encyclopedia of the subsurface, especially for the energy industry. You are unlikely to find how-to's on seismic interpretation in Wikipedia, or tutorials on basic concepts in Schlumberger's glossary, but this is exactly the sort of thing you'll find in AgileWiki.

If you're thinking: 'Great, but the last thing I need is another wiki...' then, fair enough, we sympathize. But here's why we think AgileWiki should have a role in your professional life:

  • It's only about the subsurface of the earth
  • It's only about the science and industry of energy
  • It's licensed CC-BY, which means you can re-purpose its content however you like, with due credit

We don't want to give the impression that AgileWiki is finished, or even very useful yet — it's very much early days and it has a long way to go. This is the Agile way: release early and update often. We think doing things like this in public is the best way to ensure it is relevant and useful. Evan and I are using the wiki already, so it is already growing organically.

A wiki is really just a website that anyone can edit (after you create an account; I have disabled anonymous edits). Please don't feel like you have to use it just as an encyclopedia. Here are some other ways to use AgileWiki:

  • As a way to share notes: next time you want to share something of general interest and email doesn't quite suit, feel free to document it in the wiki and simply send a URL.
  • As a personal and open notebook: You can store anything you want on your User page, which is automatically created when you create your account. 
  • As a source for non-proprietary content for an in-house or corporate wiki. Feel free to push generic content of your own back to us!

If you're not ready to jump into editing just yet but you'd like to see how this experiment can help you, why not try requesting an article? We're not promising to deliver overnight, but we want to help create content you want. So make a wish-list and let us know about it, on the wiki or in the comments below.

Last thing: please consider the ownership and possible confidentiality of whatever you want to share. By putting your material in the wiki, you agree to share it with others under the terms of its license.

News of the week

New site for seismic-related work

There is a new job site for the seismic and geophysical industry called Seismic Works. Search for jobs, post jobs, advertise, or find crews, vendors, or equipment. Though still very much in its early days, the site is nicely designed and full-featured. You can also connect with Seismic Works on Facebook. We look forward to seeing more sites and social connectors like this in our community.

SonicScope, a new real-time geophysical tool

Schlumberger has introduced a new logging-while-drilling tool called SonicScope that can be run directly behind the bit while drilling. The tool has potential to improve rock mechanics measurements and fracture identification by getting to the borehole wall immediately after penetration, minimizing the effects of washout and invasion. Real-time sonic measurements could enable pore pressure monitoring, time-to-depth information for seismic-well ties, and borehole damage assessment. We hope to see technology like this increase the relevance of rock physics. 

Geophysicist on last shuttle

Fellow geoscientist Andrew Feustel will be flying on space shuttle Endeavour's last trip into orbit. Andrew has an MSc in Geophysics from Purdue University in Indiana, USA, and a PhD in seismology from Queen's University in Ontario, Canada. He's even a fully paid-up member of the American Geophysical Union and the Society of Exploration Geophysicists. In addition to his scientific prowess, Andrew brings considerable mechanical tinkering skills, and will serve as the crew's repairman during the voyage. You can read more about him on his NASA profile page. Good luck, Andrew!

Nexen joins Marathon to explore shale gas in Poland

Canadian-based Nexen Inc. will acquire a 40% working interest in ten of Marathon Oil's land agreements in Poland's Palaeozoic shale. Nexen brings experience in unconventional drilling and completion technologies, and shale gas know-how from its assets in the Horn River shale in British Columbia, Canada. Europe's development of shale gas resources is complicated by population density, but activity in Poland's shale gas is evolving rapidly in the wake of successful shale plays in North America. Read more about the deal.

See all of our News of the week posts.

This regular news feature is for information only. We aren't connected with any of these organizations, and don't necessarily endorse their products or services. Image of Feustel is courtesy of NASA.

Geophysics apps FTW

I have used an Apple iPhone for several years. It's probably the loveliest technology I've ever owned. But now it's gone, it's over between us, and it will never come back. Because now I've found Android.

Last Wednesday I got a Google Nexus S, chosen for its relative purity: built by Samsung, it's a Google-branded phone, so it has less of the carrier's fingerprints on it, and it gets OS updates faster. But it's not the phone I love—it doesn't have the industrial beauty of the iPhone®. It's not even Google's Android™ operating system that I'm besotted with—though it is pretty fantastic. The thing I love is App Inventor. 

If you've never tried programming a computer, you really should give it a try. For me, learning to program transforms a computer from a mere tool into a workshop. Or if you prefer, from an instrument into an orchestra—sounds a bit less utilitarian that way. And I tentatively assert that you will never look at a problem, at least a technical one, in the same way again.

Google App Inventor™ is a programming environment for your phone. You do the programming in a web browser, but the thing you build runs on your phone, or anyone else's phone (as long as it's running Android, natch). Everything is free. And it's easy. Not 'quite easy'. Really easy. If this doesn't sound pretty amazing, you should probably stop reading now. 

A screenshot of the Volume* app.

Since last Friday, I have built four applications, three of which are geoscientiferous (a sketch of the arithmetic behind Elastic* follows the list):

  • Fold* computes fold and trace density, given a seismic acquisition geometry
  • Elastic* finds all of the elastic parameters, given VP, VS, and density
  • Volume* calculates oil in place, given some reservoir properties (shown)
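For flavour, here's roughly the kind of arithmetic inside Elastic*, expressed in Python rather than App Inventor blocks. These are the standard isotropic relations; the function name and interface are ours, not the app's.

```python
def elastic_parameters(vp, vs, rho):
    """Isotropic elastic parameters from Vp (m/s), Vs (m/s), rho (kg/m3)."""
    mu = rho * vs**2                                   # shear modulus, Pa
    k = rho * (vp**2 - (4.0 / 3.0) * vs**2)            # bulk modulus, Pa
    lam = rho * (vp**2 - 2 * vs**2)                    # Lame's first parameter
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))   # Poisson's ratio
    e = 2 * mu * (1 + nu)                              # Young's modulus, Pa
    return dict(mu=mu, k=k, lam=lam, nu=nu, e=e,
                ip=rho * vp, is_=rho * vs)             # P and S impedance

print(elastic_parameters(vp=2300.0, vs=1100.0, rho=2200.0))
```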

I had each app working, in a basic way, inside an hour. The only slightly tricky thing is setting up the logic to handle blank fields, weird oilfield units, and that sort of thing. Aesthetics can also be fiddly, especially if you are making custom graphics. But if you skip looks and error handling, perhaps because you don't intend to give the app to anyone else, then you can be done in under an hour. 

Evan and I have barely started to explore the tools available. The language inside App Inventor is based on Scratch, the building-block visual programming interface with a long history at MIT. The vocabulary is rich: there are math processes, logical constructs, and text handlers. You can access the phone, email, the GPS, and even the accelerometer (for instance, in our apps you can shake the phone to clear the parameters and start over). You can draw interactive graphics, scan barcodes, or build a persistent database.

The only problem we've run into so far is the final hurdle: you cannot (yet—App Inventor is still in beta) easily publish your finished app to the Android Market so that others can download it. There are non-easy ways, and we hope to have our apps up soon. They will be free, though we may experiment with freemium.

Next week I'll write a bit about Volume* and show you how the inside of it looks. In the meantime, give it a try... or if you prefer, let us know if there's a killer geoscience app you'd love to have on your phone. I'm on a roll!

Find out more

The Wikipedia articles on Android and App Inventor are very nice summaries. 

iPhone is a registered trademark of Apple, Inc. Android and App Inventor are trademarks of Google Inc. Agile is not connected in any way with any of these marks or companies.

Your competitive advantage is...

Two weeks ago I explained why I think competitive advantage is often attached to the wrong things. So corporations hide things they would get more benefit from sharing. This week, I offer one opinion on what competitive advantage is, and how to get more of it.

Data are often not even treated as an asset, never mind an advantage.

I recently happened on a blog by Rob Karel about business process. That sounds a bit dull perhaps, but I liked this:

The "data is an asset" rhetoric doesn't translate to [monetary value], because data in and of itself has no value! The only value data/information has to offer... is in the context of the business processes, decisions, customer experiences, and competitive differentiators it can enable.

So if data are not a competitive advantage, and neither are most of the software and industrial technology we use, nor industry training courses or consortium-based research programs, then what is? My list follows after the jump.
