More powertools, and a gobsmacking

Yesterday was the second day of the open geophysics software workshop I attended in Houston. After the first day (which I also wrote about), I already felt like there were a lot of great geophysical powertools to follow up on and ideas to chase up, but day two just kept adding to the pile. In fact, there might be two piles now.

First up, Nick Vlad from FusionGeo gave us another look at open source systems from a commercial processing shop's perspective. Along with Alex (on day 1) and Renée (later on), he gave plenty of evidence that open source is not only compatible with business, but good for business. FusionGeo firmly believe that no one package can support them exclusively, and showed us GeoPro, their proprietary framework for integrating SEPlib, SU, Madagascar, and CPSeis.

Yang Zhang from Stanford then showed us how reproducibility is central to SEPlib (as it is to Madagascar). When possible, researchers in the Stanford Exploration Project build figures with makefiles, which anyone can run to reproduce the figure. When this is not possible, a figure is labelled as non-reproducible; if there are some dependencies, on data for example, then it is called conditionally reproducible. (For the geeks out there, the full system for implementing this involves SEPlib, GNU make, Vplot, LaTeX, and SCons.)

Next up was a reproducibility system with ancestry in SEPlib: Madagascar, presented by the inimitable Sergey Fomel. While casually downloading and compiling Madagascar, he described how it allows for quick regeneration of figures, even from other sources like Mathematica. There are some nice usability features of Madagascar: you can easily interface with processes using Python (as well as Java, among other languages), and tools like OpendTect and BotoSeis can even provide a semi-graphical interface. Sergey also mentioned the importance of a phenomenon called dissertation procrastination, and why grad students sometimes spend weeks writing amazing code:

"Building code gives you good feelings: you can build something powerful, and you make connections with the people who use it"

After the lunch break, Joe Dellinger from BP explained how he thought some basic interactivity could be added to Vplot, SEP's plotting utility. The goal would not be to build an all-singing, all-dancing graphics tool, but to incrementally improve Vplot to support editing labels, changing scales, and removing elements. A good goal for a 1-day hack-fest?

The show-stopper of the day was Bjorn Olofsson of SeaBird Exploration. I think it's fair to say that everyone was gobsmacked by his description of SeaSeis, a seismic processing system that he has built with his own bare hands. This was the first time he had presented the system, but he started the project in 2005 and open-sourced it about 18 months ago. Bjorn's creation stemmed from an understandable (to me) frustration with other packages' apparent complexity and unease-of-use. He has built enough geophysical algorithms for SeaBird to use the software at sea, but the real power is in his interactive viewing tools. Working in Java, Bjorn has successfully exploited all the modern GUI libraries at his disposal. Due to constraints on his time, the project's future is uncertain. Message of the day: Help this man!

Renée Bourque of dGB also opened a lot of eyes with her overview of OpendTect and the Open Seismic Repository. dGB's tools are modern, user-friendly, and flexible. I think many people present realized that these tools—if combined with the depth and breadth of more fundamental pieces like SU, SEPlib and Madagascar—could offer the possibility of a robust, well-supported, well-documented, and rich environment that processors can use every day, without needing a lot of systems support or hacking skills. The paradigm already exists: Madagascar has an interface in OpendTect today.

As the group began to start thinking about the weekend, it was left to me, Matt Hall, to see if there was any more appetite for hearing about geophysics and computers. There was! Just enough for me to tell everyone a bit about mobile devices, the Android operating system, and the App Inventor programming canvas. More on this next week!

It was an inspiring and thought-provoking workshop. Thank you to Karl Schleicher and Robert Newsham for organizing, and Cheers! to the new friends and acquaintances. My own impression was that the greatest challenge ahead for this group is not so much computational, but more about integration and consolidation. I'm looking forward to the next one!

Open seismic processing, and dolphins

Today was the first day of the Petroleum Technology Transfer Council's workshop Open software for reproducible computational geophysics, being held at the Bureau of Economic Geology's Houston Research Center and organized skillfully by Karl Schleicher of the University of Texas at Austin. It was a full day of presentations (boo!), but all the presentations had live installation demos and even live coding (yay!). It was fantastic. 

Serial entrepreneur Alex Mihai Popovici, the CEO of Z-Terra, gave a great, very practical, overview of the relative merits of three major seismic processing packages: Seismic Unix (SU), Madagascar, and SEPlib. He has a very real need: delivering leading edge seismic processing services to clients all over the world. He more or less dismissed SEPlib on the grounds of its low development rate and difficulty of installation. SU is popular (about 3300 installs) and has the best documentation, but perhaps lacks some modern imaging algorithms. Madagascar, Alex's choice, has about 1100 installs, relatively terse self-documentation (it's all on the wiki), but is the most actively developed.

The legendary Dave Hale (I think that's fair), Colorado School of Mines, gave an overview of his Mines Java Toolkit (JTK). He's one of those rare people who can explain almost anything to almost anybody, so I learned a lot about how to manage parallelization in 2D and 3D arrays of data, and how to break it. Dave is excited about the programming language Scala, a sort of Java lookalike (to me) that handles parallelization beautifully. He also digs Jython, because it has the simplicity and fun of Python, but can incorporate Java classes. You can get his library from his web pages. Installing it on my Mac was a piece of cake, needing only three terminal commands: 

  • svn co http://boole.mines.edu/jtk
  • cd jtk/trunk
  • ant

Chuck Mosher of ConocoPhillips then gave us a look at JavaSeis, an open source project that makes handling prestack seismic data easy and very, very fast. It has parallelization built into it, and is perfect for large, modern 3D datasets and multi-dimensional processing algorithms. His take on open source in commerce: corporations are struggling with the concept, but "it's in their best interests to actively participate".

Eric Jones is CEO of Enthought, the innovators behind (among other things) NumPy/SciPy and the Enthought Python Distribution (or EPD). His take on the role of Python as an integrator and facilitator, handling data traffic and improving usability for the legacy software we all deal with, was practical and refreshing. He is not at all dogmatic about doing everything in Python. He also showed a live demo of building a widget with Traits and Chaco. Awesome.

After lunch, BP's Richard Clarke told us about the history and future of FreeUSP and FreeDDS, a powerful processing system. FreeDDS is being actively developed and released gradually by BP; indeed, a new release is due in the next few days. It will eventually replace FreeUSP. Richard and others also mentioned that Randy Selzler is actively developing PSeis, the next generation of this processing system (and he's looking for sponsors!).

German Garabito of the Federal University of Pará, Brazil, generated a lot of interest in BotoSeis, the GUI he has developed to help him teach SU. It allows one to build and manage processing flows visually, in a Java-built interface inspired by Focus, ProMax and other proprietary tools. The software is named after the Amazon river dolphin, or boto. Dave Hale described his efforts as the perfect example of the triumph of 'scratching your own itch'.

Continuing the usability theme, Karl Schleicher followed up with a nice look at how he is building scripts to pull field data from the USGS online repository, and perform SU and Madagascar processing flows on them. He hopes he can build a library of such scripts as part of Sergey Fomel's reproducible geophysics efforts. 

Finally, Bill Menger of Global Geophysical told the group a bit about two projects he open sourced when he was at ConocoPhillips: GeoCraft and CPSeis. His insight on what was required to get them into the open was worth sharing: 

  1. Get permission, using a standard open source license (and don't let lawyers change it!)
  2. Communicate the return on investment carefully: testing, bug reporting, goodwill, leverage, etc.
  3. Know what you want to get out of it, and have a plan for how to get there
  4. Pick a platform: compiler, dependencies, queueing, etc (unless you have a lot of time for support!)
  5. Know the issues: helping users, dealing with legacy code, dependency changes, etc.

I am looking forward to another awesome-packed day tomorrow. My own talk is the wafer-thin mint at the end!

What is commercial?

Just another beautiful geomorphological locality in Google's virtual globe software, a powerful teaching aid and just downright fun to play with.

At one of my past jobs, we were not allowed to use Google Earth: 'unlicensed business use is not permitted'. So to use it we had to get permission from a manager, then buy the $400 Professional license. This came about because an early End-User License Agreement (EULA) had stipulated 'not for business use'. However, by the time the company had figured out how to enforce this stipulation with an auto-delete from PCs every Tuesday, the EULA had changed. The free version was allowed to be used in a business context (my interpretation: for casual use, learning, or illustration), but not for direct commercial gain (like selling a service). Too late: it was verboten. A game-changing geoscience tool was neutered, all because of greyness around what commercial means.

Last week I was chastised for posting a note on a LinkedIn discussion about our AVO* mobile app. I posted it to an existing discussion in a highly relevant technical group, Rock Physics. Now, this app costs $2, in recognition of the fact that it is useful and worth something. It will not be profitable, simply because the total market is probably well under 500 people. The discussion was moved to Promotions, where it will likely never be seen. I can see that people don't want blatant commerciality in technical discussion groups. But maybe we need to apply some common sense occasionally: a $2 mobile app is different from a $20k software package being sold for real profit. Maybe that's too complicated and 'commercial means commercial'. What do you think?

But then again, really? Is everyone in applied science not ultimately acting for commercial gain? Is that not the whole point of applied science? Applied to real problems... more often than not for commercial gain, at some point and by somebody. It's hopelessly idealistic, or naïve, to think otherwise. Come to think of it, who of us can really say that what we do is purely academic? Even universities make substantial profits—from their students, licensing patents, or spinning off businesses. Certainly most research in our field (hydrocarbons and energy) is paid for by commercial interests in some way.

I'm not saying that the reason we do our work is for commercial gain. Most of us are lucky enough to love what we do. But more often than not, it's the reason we are gainfully employed to do it. It's when we try to draw that line dividing commercial from non-commercial that I, for one, only see greyness.

News of the week

A geoscience and technology news round-up. If you spot anything we can highlight next week, drop us a line!

Using meteorite impacts as seismic sources on Mars

On Earth and Mars alike, when earthquakes (or Marsquakes) occur, they send energy into the planet's interior that can be used for tomographic imaging. Because the positions of these natural events are never known directly, several recording stations are required to locate them by triangulation. The Earth has an amazing array of stations; Mars does not.

Nick Teanby and James Wookey, geophysicists at the University of Bristol, UK (@UOBEarthScience on Twitter), investigated whether meteorite impacts on Mars provide a potentially valuable seismic signal for seeing into the interior of the planet. Because new craters can be resolved precisely from orbital photographs, accurate source positions can be determined without triangulation, and thus used in imaging.

Investigation showed that seismicity induced by most meteorites is detectable, but only at short ranges, and good for investigating the near surface. Only the largest impacts, which only happen about once every ten years, are strong enough for deep imaging. Read more in their Physics of the Earth and Planetary Interiors paper here. Image credit: NASA/JPL.

Geomage acquires Petro Trace 

Seismic processing company Geomage has joined forces with Petro Trace Services in a move to become a full-workflow seismic processing service shop. The merging of these two companies will likely make them the largest geophysical service provider in Russia. Geomage has a proprietary processing technology called Multifocusing, and uses Paradigm's software for processing and interpretation. Click here to read more about the deal.

New bathymetric data for Google Earth

Google Earth now contains bathymetric data from more than two decades of seafloor scanning expeditions. The update was released on World Oceans Day, and represents 500 different surveys covering an area the size of North America. This new update will allow you to plan your next virtual underwater adventure or add more flair to your environmental impact assessment. Google might have to adapt its Street View name to... what, Fishview? Wired.com has a nice demo to get you started. Image: Google Earth.

Workshop: open source software in geophysics

The AAPG's Petroleum Technology Transfer Council, PTTC, is having a workshop on open source software next week. The two-day workshop is on open software tools and reproducibility in geophysics, and will take place at the Houston Research Center in west Houston. Matt will be attending, and is talking about mobile tools on the Friday afternoon. There are still places, and you can register on the University of Texas at Austin website; the price is only $300, or $25 for students. The organizer is Karl Schleicher of UT and BEG.

This regular news feature is for information only. We aren't connected with any of these organizations, and don't necessarily endorse their products or services. Image of Mars credit: NASA/JPL-Caltech/University of Arizona. Image of Earth: Google, TerraMetrics, DigitalGlobe, IBCAO.

F is for Frequency

Frequency is the number of times an event repeats per unit time. Periodic signals oscillate with a frequency expressed as cycles per second, or hertz: 1 Hz means that an event repeats once every second. The frequency of a light wave determines its color, while the frequency of a sound wave determines its pitch. One of the great discoveries of 19th century mathematics is that any signal can be decomposed into a set of simple sines and cosines oscillating at various strengths and frequencies.

I'll use four toy examples to illustrate some key points about frequency and where it rears its head in seismology. Each example has a time-series representation (on the left) and a frequency spectrum representation (right).

The same signal, served two ways

This sinusoid has a period of 20 ms, which means it oscillates with a frequency of 50 Hz (1/0.020 s = 50 Hz). A sinusoid is composed of a single frequency, and that component displays as a spike in the frequency spectrum. A side note: we won't think about wavelength here, because it is a spatial concept, equal to the product of the period and the velocity of the wave.
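
If you like seeing these things in code, here is a minimal NumPy sketch of the 'served two ways' idea (my own illustration, not part of the original figures): build a 50 Hz sinusoid and look at its amplitude spectrum.

    import numpy as np

    dt = 0.001                                # sample interval: 1 ms
    t = np.arange(0, 0.5, dt)                 # 500 ms of signal
    period = 0.020                            # 20 ms period
    signal = np.sin(2 * np.pi * t / period)   # a 50 Hz sinusoid (1 / 0.020 s)

    # The amplitude spectrum is a single spike at 50 Hz
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=dt)
    print(freqs[np.argmax(spectrum)])         # 50.0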

In reflection seismology, we don't want things that are of infinitely long duration, like sine curves. We need events to be localized in time, in order for them to be localized in space. For this reason, we like to think of seismic impulses as a wavelet.

The Ricker wavelet is a simple model wavelet, common in geophysics because it has a symmetric shape and it's a relatively easy function to build (it's the second derivative of a Gaussian function). However, the answer to the question "what's the frequency of a Ricker wavelet?" is not straightforward. Wavelets are composed of a range (or band) of frequencies, not one. To put it another way: if you added monochromatic sine waves together according to the relative amplitudes in the frequency spectrum on the right, you would produce the time-domain representation on the left. This particular one would be called a 50 Hz Ricker wavelet, because it has the highest spectral magnitude at the 50 Hz mark—the so-called peak frequency.
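
Here is a small sketch of that idea (the function name and window length are my own choices, not anything standard): build a Ricker wavelet directly from the second-derivative-of-a-Gaussian form, and confirm that its spectrum peaks near the nominal 50 Hz.

    import numpy as np

    def ricker(f_peak, dt=0.001, length=0.512):
        """Ricker wavelet with a given peak frequency (Hz)."""
        t = np.arange(-length / 2, length / 2, dt)
        a = (np.pi * f_peak * t) ** 2
        return t, (1 - 2 * a) * np.exp(-a)

    t, w = ricker(50.0)

    # The wavelet contains a band of frequencies; the spectrum peaks near 50 Hz
    spectrum = np.abs(np.fft.rfft(w))
    freqs = np.fft.rfftfreq(w.size, d=0.001)
    print(freqs[np.argmax(spectrum)])   # about 50 Hz: the peak frequency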

Bandwidth

For a signal even shorter in duration, the frequency band must increase, not just the dominant frequency. What makes this wavelet shorter in duration is not only that it has a higher dominant frequency, but also that it has a higher number of sine waves at the high end of the frequency spectrum. You can imagine that this shorter duration signal traveling through the earth would be sensitive to more changes than the previous one, and would therefore capture more detail, more resolution.

The extreme end member case of infinite resolution is known mathematically as a delta function. Composing a signal of essentially zero time duration (notwithstanding the sample rate of a digital signal) takes not only high frequencies, but all frequencies. This is the ultimate broadband signal, and although it is impossible to reproduce in real-world experiments, it is a useful mathematical construct.

What about seismic data?

Real seismic data, which is acquired by sending wavelets into the earth, also has a representation in the frequency domain. Just as we can look at seismic data in time, we can look at seismic data in frequency. As is typical of seismic data, the example below lacks low and high frequencies: it has a bandwidth of 8–80 Hz. Many geophysical processes and algorithms have been developed to boost or widen this frequency band (at both the high and low ends), to increase the time-domain resolution of the seismic data. Other methods, such as spectral decomposition, analyse local variations in frequency that may otherwise go unrecognized in the time domain.

High resolution signals are short in the time domain and wide, or broadband, in the frequency domain. Geoscientists often equate high resolution with high frequency, but that is not entirely true. The greater the frequency range, the larger the information-carrying capacity of the signal.

In future posts we'll elaborate on Fourier transforms, sampling, and frequency domain treatments of data that are useful for seismic interpreters.

For more in our Geophysics from A to Z series, click here.

To free, or not to free?

Yesterday, Evan and I published our fourth mobile app for geoscientists. It's called AVO*, it does reflectivity modeling, and it costs $2. 

Two bucks?? What's the point? Why isn't it free? Well, it went something like this...

- So, this new app: is it free?
- Well, all our apps are free. I guess it's free.
- Yeah, we don't want to stop it from spreading. If it wants to spread, that is...
- But does free look... worthless? I mean, 'you get what you pay for', right? Look at all the awesome stuff we pay for: Amazon web services, Squarespace web hosting, Hover domain hosting, awesome computers,...
- So what would we charge?
- What do other people charge? 
- There are no 'other people'... But there are technical apps for oil and gas out there. Most of them cost $1.99, some are $4.99, one or two are $9.99. Who knows how many downloads they get? 
- I bet the total revenue is constant: if you charge $1 and get 1000 downloads, then you'll get 100 at $10. But that's an experiment you can never do—once you've charged some amount, you can't really go up. Or down.
- How do other people decide what to charge?
- I guess traditionally you might use a cost-plus model: the cost of production, plus a profit margin.
- What's our cost of production?
- Well, a few days of time... let's call it $5000. If we wanted to make $10 000, and only expect 500 people to even be in the market... It doesn't work. No-one will pay $20 for a cell phone widget.
- Won't they just expense it?
- Maybe... I don't think the industry is quite there yet.
- Hmm... I downloaded an app for $20 once [a seismograph]. And another for $10 [a banjo tuner]. I don't even think about paying $1 or $2. That amount is basically free. $1 is free.
- But a buck... isn't it just a pain to even get your credit card out?
- Well, once you're set up in Google Checkout, or iTunes or whatever, it's essentially one click. And then we get a sense of the real user base. The hard core!
- Yeah... right now about 50% of people who install an app nuke it a few days later.
- At least if it's under $5 we probably won't have to deal with refunds and other nonsense.
- Arrgghhhh... why is this so hard? 
- Let's make it $2. 
- Let's make it free.
- But this app is awesome. Awesome shouldn't be free. Awesome is never free. Awesome costs. 
- But isn't this really just a thing that says "Agile is awesome, check us out, hire us"? It's marketing.
- Maybe... but it's useful too. It works. It does something. It has Science Inside™. People will get $1-worth out of it every time they use it. If this was a <insert energy software empire> app it would cost $10 000.
- Can we ask people to pay what they want? Like what Radiohead did with In Rainbows?
- No, because they're already huge. They invoke mass hysteria in grown men. We don't invoke mass hysteria. In anybody.
- Damn. OK. Let's make it nearly free. As-good-as-free. Free-ish. Pseudo-free. Free*.
- $2?
- $2.

So the app costs a toonie, and we promise you won't regret it for a second. If you can't afford it, email us and we'll send you a free one. If you really hate it, email us and we'll send you $3.

What is AVO?

I used to be a geologist (but I'm OK now). When I first met seismic data, I took the reflections and geometries quite literally. The reflections come from geology, so it seems reasonable to interpret them as geology. But the reflections are waves, and waves are slippery things: they have to travel through kilometres of imperfectly known geology; they can interfere and diffract; they emanate spherically from the source and get much weaker quickly. This section from the Rockall Basin in the east Atlantic shows this attenuation nicely, as well as spectacular echo reflections from the ocean floor called multiples:

Rockall seismic data from the Virtual Seismic Atlas, contributed by the British Geological Survey.

Despite the complexity of seismic reflections, all is not lost. Even geologists interpreting seismic know that the strength of seismic reflections can have real, quantitative, geological meaning. For example, amplitude is related to changes in acoustic impedance Z, which is equal to the product of bulk density ρ and P-wave velocity V, itself related to lithology, fluid, and porosity.
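
To make the arithmetic concrete, here is a tiny sketch with made-up rock properties (the numbers are purely illustrative, not from any real well):

    # Acoustic impedance Z = rho * V, and the zero-offset (normal incidence)
    # reflection coefficient R0 = (Z2 - Z1) / (Z2 + Z1).
    rho1, v1 = 2400.0, 2800.0   # upper layer: density (kg/m3), P-velocity (m/s)
    rho2, v2 = 2200.0, 3200.0   # lower layer (hypothetical values)

    z1, z2 = rho1 * v1, rho2 * v2
    r0 = (z2 - z1) / (z2 + z1)
    print(z1, z2, round(r0, 3))   # 6720000.0 7040000.0 0.023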

Flawed cartoon of a marine seismic survey. OU, CC-BY-SA-NC.

But when the amplitude versus offset (AVO) behaviour of seismic reflections gets mentioned, most non-geophysicists switch off. If that's your reaction too, don't be put off by the jargon; it's really not that complicated.

The idea that we collect data from different angles is not complicated or scary. Remember the classic cartoon of a seismic survey (right). It's clear that some of the ray paths bounce off the geological strata at relatively small incidence angles, closer to straight down-and-up. Others, arriving at receivers further away from the source, have greater angles of incidence. The distance between the source and an individual receiver is called offset, and is deducible from the seismic field data because the exact location of the source and receivers is always known.

The basic physics behind AVO analysis is that the strength of a reflection does not only depend on the acoustic impedance—it also depends on the angle of incidence. Only when this angle is 0 (a vertical, or zero-offset, ray) does the simple relationship above hold.

Total internal reflection underwater. Source: Mbz1 via Wikimedia Commons.

Though it may be unintuitive at first, angle-dependent reflectivity is an idea we all know well. Imagine an ordinary glass window: you can see through it perfectly well when you look straight through it, but when you move to a wide angle it suddenly becomes very reflective (at the so-called critical angle). The interface between water and air is similarly reflective at wide angles, as in this underwater view.

Karl Bernhard Zoeppritz (German, 1881–1908) was the first seismologist to describe the relationship between reflectivity and angle of incidence. In this context, describe means write down the equations for. Not two or three equations, lots of equations.

The Zoeppritz equations are a very good model for how seismic waves propagate in the earth. There are some unnatural assumptions about isotropy, total isolation of the interface, and other things, but they work well in many real situations. The problem is that the equations are unwieldy, especially if you are starting from seismic data and trying to extract rock properties—trying to solve the so-called inverse problem. Since we want to be able to do useful things quickly, and since seismic data are inherently approximate anyway, several geophysicists have devised much friendlier models of reflectivity with offset.
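
As a flavour of what those friendlier models look like, here is a sketch of the well-known two-term Shuey approximation (I'm not claiming this is exactly what's inside AVO*): reflectivity is approximated as an intercept plus a gradient times sin² of the incidence angle.

    import numpy as np

    def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
        """Two-term Shuey approximation: R(theta) ~ R0 + G * sin^2(theta).
        A sketch only: assumes small contrasts and sub-critical angles."""
        theta = np.radians(np.asarray(theta_deg, dtype=float))
        # Averages and differences across the interface
        vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
        dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
        r0 = 0.5 * (dvp / vp + drho / rho)                                      # intercept
        g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)   # gradient
        return r0 + g * np.sin(theta) ** 2

    # Hypothetical shale over gas sand, angles of incidence from 0 to 30 degrees
    print(shuey_two_term(2800, 1400, 2400, 3200, 1900, 2200, [0, 10, 20, 30]))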

I'll take a look at these friendlier models next time, because I want to tell you a bit about how we've implemented them in our soon-to-be-released mobile app, AVO*. No equations, I promise! Well, one or two...

News of the week

This week has been fairly quiet for geoscience and technology news, so we're hijacking our own post to highlight a couple of Agile* changes you might have missed. The first one is this very feature—our News of the week post. More or less every Friday we round up some geoscience news with an oil & gas or technology angle. If you spot something you think we should include, please scribble a quick note to hello at agilegeoscience dot com!

Another new feature on our site is subscription by email. Every blog post comes right to your email inbox, so you won't miss any Agile* goodness. Go to the SUBSCRIBE box on the right (under the tag cloud), enter your email address and hit Subscribe. It's that easy! No password to remember, 100% spam free, and you can unsubscribe any time. Powered by Google.

If you're a regular reader then you know all about our new mobile apps. At the moment, for completely practical reasons, they are only available for Android™ devices. We just upgraded our first app, Volume*, a prospect volumetrics tool—now you can save and recall prospects! In the next couple of days we will launch our first über-app, AVO*. Visit the wiki for a sneak peek.

The wiki? Yes, last month we launched AgileWiki, an experiment in sharing knowledge about the subsurface. We know that much of what we know and do as industrial, applied scientists is proprietary—that's what in-house corporate wikis and knowledge bases are for. But some of it at least is basic, foundational, and generic in nature. And that's what AgileWiki is for. Join in, share what you know!

You might have noticed we've started dabbling a bit with video, and have a nascent YouTube channel. Today, the focus is on our mobile apps, but we are planning features on seismic interpretation workflows and other fun things. We're open to feedback and suggestions on this effort, so please let us know what you think!

We work hard to be interesting and relevant, not self-promoting and commercial. But occasionally people ask us what we actually do. So we made a one-pager setting out our stall. If you need some help doing something weird and wonderful, or just tricky and time-consuming, keep us in mind! We love helping people.

This ends the public service announcement. Back to our regular news feature next week!

Pair picking

Even the Lone Ranger didn't work alone all of the time.

Imagine that you are totally entrained in what you are doing: focused, dedicated, and productive. If you've lost track of time, you are probably feeling flow. It's an awesome experience when one person gets it; imagine the power when teams get it. Because there are so many interruptions that can cause turbulence, it can be especially difficult to establish coherent flow for the subsurface team. But if you learn how to harness and hold onto it, it's totally worth it.

Seismic interpreters can seek out flow by partnering up and practising pair picking. Having a partner in the passenger seat is not only ideal for training, but it is a superior way to get real work done. In other industries, this has become routine because it works. Software developers sometimes code in pairs, and airline pilots share control of an aircraft. When one person is in charge of the controls, the other is monitoring, reviewing, and navigating. One person for tactical jobs, one for strategic surveillance.

Here are some reasons to try pair picking:

Solve problems efficiently — If you routinely have an affiliate, you will have someone to talk to when you run into a challenging problem. Mundane or sticky workarounds become less tenuous when you have a partner. You'll adopt more sensible solutions to your fit-for-purpose hacks.

Integrate smoothly — There's a time for hand-over, and there will be times when you must call upon other people's previous work to get your job done. 'No! Don't use Top_Cretaceous_candidate_final... use Evan_K_temp_DO-NOT-USE.' Pairing with the predecessors and successors of your role will get you better-aligned.

Minimize interruptionitis — If you have to run to a meeting, or the phone rings, your partner can keep plugging away. When you return, you can quickly rejoin the flow. It is best to get into a visualization room, or some other distraction-free room with a large screen, so as to keep your attention and minimize the effect of interruptions.

Mutual accountability — Build allies based on science, technology, and critical thinking, not gossip or politics. Your team will have no one to blame, and you'll feel more connected around the office. Is knowledge hoarded and privileged, or is it open and shared? If you pick in pairs, there is always someone who can vouch for your actions.

Mentoring and training — By pair picking, newcomers quickly get to watch the flow of work, not just a schematic flow-chart. Instead of just an end-product, they see the clicks, the indecision, the iteration, and the pace at which tasks unfold.

Practicing pair picking is not just about sharing tasks, it is about channeling our natural social energies in the pursuit of excellence. It may not be practical all of the time, and it may make you feel vulnerable, but pairing up for seismic interpretation might bring more flow to your workflow.

If you give it a try, please let us know how it goes!

Geophysical stamps 2: Sonic

Recently I bought some stamps on eBay. This isn't something I've done before, but when I saw these stamps I couldn't resist their pure geophysical goodness. They are East German stamps from 1980, and they are unusual because they aren't fanciful illustrations, but precise, technical drawings. Last week I described the gravimeter; today it's the turn of a borehole instrument, the sonic tool.

The 25 pfennig stamp in the series of four shows a sonic tool, complete with the logged data on the left, and a cross-section on the right. Bohrlochmessung means well-logging; Wassererkundung translates as water exploration. The actual size of the stamp is 43 × 26 mm.

The tool has two components: a transmitter and a receiver. It is lowered to the bottom of the target interval and logs data while being pulled up the hole. In its simplest form, an ultrasound pulse (typically 20–40 kHz) is emitted from the transmitter, travels through the formation, and is recorded at the receiver. The interval transit time is recorded continuously, giving the trace shown on the left-hand side of the stamp. Transit time is measured in µs/m (or µs/ft if you're old-school), and is generally between 160 µs/m and 550 µs/m (or, in terms of velocity, 1800 m/s to 6250 m/s). Geophysicists often use the transit time to estimate seismic velocities; it's important to correct for the phenomenon called dispersion: lower-frequency seismic waves travel more slowly than the high-frequency waves measured by these tools.
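
The velocity conversion is just a reciprocal, but the units trip people up, so here's a quick sketch (mine, not the stamp's):

    def slowness_to_velocity(dt_us_per_m):
        """Convert sonic transit time (microseconds per metre) to velocity (m/s)."""
        return 1e6 / dt_us_per_m

    print(slowness_to_velocity(550))   # about 1818 m/s: the slow end of the range
    print(slowness_to_velocity(160))   # 6250 m/s: the fast end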

Sonic logs are used for all sorts of other things, for example:

  • Predicting the seismic response (when combined with the bulk density log)
  • Predicting porosity, because of the large difference between velocity in fluids vs minerals
  • Predicting pore pressure, an important safety concern and reservoir property
  • Measuring anisotropy, especially due to oriented fractures (important for permeability)
  • Qualitatively predicting lithology, especially coals (slow), salt (4550 m/s), dolomite (fast)

Image credit: National Energy Technology Lab.

Modern tools are not all that different from early sonic tools. They measure the same thing, but with better electronics for improved vertical resolution and noise attenuation. The biggest innovations are dipole sonic tools for accurate shear-wave velocities, multi-azimuth tools for measuring anisotropy, high resolution tools, and high-pressure, high-temperature (HPHT) tools.

Another relatively recent advance is reliable sonic-while-drilling tools such as Schlumberger's sonicVISION™ system, the receiver array of which is shown here (for the 6¾" tool).

The sonic tool may be the most diversely useful of all the borehole logging tools. In a totally contrived scenario where I could only run a single tool, it would have to be the sonic, especially if I had seismic data... What would you choose?

Next time I'll look at the 35 pfennig stamp, which shows a surface geophone.