Graphics that repay careful study

The Visual Display of Quantitative Information by Edward Tufte (2nd ed., Graphics Press, 2001) celebrates communication through data graphics. The book provides a vocabulary and practical theory for data graphics, and Tufte pulls no punches — he suggests why some graphics are better than others, and even condemns failed ones as lost opportunities. The book outlines empirical measures of graphical performance, and describes the pursuit of graphic-making as one of sequential improvement through revision and editing. I see this book as a sort of moral authority on visualization, and as the reference book for developing graphical taste.

Through design, the graphic artist allows the viewer to enter into a transaction with the data. High-performance graphics, according to Tufte, 'repay careful study'. They support discovery, probing questions, and a deeper narrative. These kinds of graphics take a lot of work, but they do a lot of work in return. In later books Tufte writes, 'To clarify, add detail.'

A stochastic AVO crossplot

Consider this graphic from the stochastic AVO modeling section of modelr. Its elements are constructed with code, and since it is a program, it is completely reproducible.

Let's dissect some of the conceptual high points. This graphic shows all the data simultaneously across 3 domains, one in each panel. The data points are sampled from probability density estimates of the physical model. It is a large dataset from many calculations of angle-dependent reflectivity at an interface. The data is revealed with a semi-transparent overlay, so that areas of certainty are visually opaque, and areas of uncertainty are harder to see.

At the same time, you can still see every data point that makes up the graphic, giving both a broad overview (the range and additive intensity of the lines and points) and the finer structure. We place the two modeled dimensions against templates in the background, alongside the physical model histograms. We can see, for instance, how likely we are to see a phase reversal, or a Class 3 response, given the physical probability estimates. The statistical and site-specific nature of subsurface modeling is represented in spirit. All the data has context, and all the data has uncertainty.
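If you want to play with the overlay idea yourself, here is a minimal sketch in Python (not modelr's actual code). It draws the elastic properties of two layers from assumed normal distributions, computes a two-term Shuey reflectivity for each draw, and plots everything with a low alpha so that likely outcomes stack up dark while unlikely ones fade out. All the distributions and values are illustrative.

```python
# A minimal stochastic AVO overlay, with made-up rock property distributions.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
n = 500                                      # number of Monte Carlo draws
theta = np.radians(np.linspace(0, 40, 81))   # incidence angles

# Upper and lower layer properties: Vp (m/s), Vs (m/s), rho (g/cc).
vp1, vs1, rho1 = rng.normal(2400, 100, n), rng.normal(1200, 80, n), rng.normal(2.35, 0.05, n)
vp2, vs2, rho2 = rng.normal(2600, 150, n), rng.normal(1500, 120, n), rng.normal(2.20, 0.07, n)

# Two-term Shuey approximation: R(theta) = R0 + G * sin^2(theta).
vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
r0 = 0.5 * (dvp / vp + drho / rho)
g = 0.5 * dvp / vp - 2 * (vs / vp)**2 * (drho / rho + 2 * dvs / vs)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Semi-transparent overlay: certain regions appear opaque, uncertain ones faint.
for i in range(n):
    ax1.plot(np.degrees(theta), r0[i] + g[i] * np.sin(theta)**2, 'k', alpha=0.02)
ax1.axhline(0, color='gray', lw=0.5)
ax1.set_xlabel('Angle of incidence (deg)')
ax1.set_ylabel('Reflectivity')

# Intercept-gradient crossplot of the same draws.
ax2.scatter(r0, g, s=5, c='k', alpha=0.1)
ax2.set_xlabel('Intercept, R0')
ax2.set_ylabel('Gradient, G')
plt.tight_layout()
plt.show()
```

Five hundred draws are enough to see where the cloud sits relative to the AVO class boundaries; adding the background templates and histograms is just more of the same matplotlib.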

Rules for graphics that work

Tufte summarizes that excellent data graphics should:

  • Show all the data.
  • Provoke the viewer into thinking about meaning.
  • Avoid distorting what the data have to say.
  • Present many numbers in a small space.
  • Make large data sets coherent.
  • Encourage the eye to compare different pieces of the data.
  • Reveal the data at several levels of detail, from a broad overview to the fine structure.
  • Serve a reasonably clear purpose: description, exploration, tabulation, or decoration.
  • Be closely integrated with the statistical and verbal descriptions of a data set.

The data density and data-to-ink ratio look reasonably high in my crossplot, but they could likely still be optimized. What would you remove? What would you add? What elements need revision?

Whither technical books?

Leafing through our pile of new books on seismic analysis got me thinking about technical books and the future of technical publishing. In particular:

  • Why are these books so expensive? 
  • When will we start to see reproducibility?
  • Does all this stuff just belong on the web?

Why so expensive?

Should technical books really cost several times what ordinary books cost? Professors often ask us for discounts for modelr, our $9/mo seismic modeling tool. Students pay 10% of what pros pay in our geocomputing course. Yet academic books cost three times what consumer books cost. I know it's a volume game — but you're not going to sell many books at $100 a go! And unlike authors of consumer books, technical authors usually don't make any money — a star writer may score 6% of net sales... once 500 books have been sold (see Handbook for Academic Authors).

Where's the reproducibility?

Compared to the amazing level of reproducibility we saw at SciPy — where the code to reproduce virtually every tutorial, talk, and poster was downloadable — books are still rather black box. For example, the figures are often drafted, not generated. A notable (but incomplete) exception is Chris Liner's fantastic (but ridiculously expensive) volume, Elements of 3D Seismology, in which most of the figures seem to have been generated by Mathematica. The crucial final step is to share the code that generated them, and he's exploring this in recent blog posts.

I can think of three examples of more reproducible geophysics in print:

  1. Gary Mavko has shared a lot of MATLAB code associated with Quantitative Seismic Interpretation and The Rock Physics Handbook. The code to reproduce the figures is not provided, and MATLAB is not really open, but it's a start.
  2. William Ashcroft's excellent book, A Petroleum Geologist's Guide to Seismic Reflection, contains (proprietary, Windows-only) code on a CD, so you could in theory make some of the figures yourself. But it wouldn't be easy.
  3. The series of tutorials I'm coordinating for The Leading Edge has, so far, included all the code needed to reproduce the figures, written exclusively in open languages and using open or synthetic data. Kudos to SEG!

Will the web win?

None of this comes close to Sergey Fomel's brand of fully reproducible geophysics. He is a true pioneer in this space, up there with Jon Claerbout. (You should definitely read his blog!) One thing he's been experimenting with is 'live' reproducible documents in the cloud. If we don't see an easy way to publish live, interactive notebooks in the cloud this year, we'll see them next year for sure.

So imagine being able to read a technical document, a textbook say, with all the usual features you get online — links, hover-over, clickable images, etc. But then add the ability to not only see the code that produced each figure, but to edit and re-run that code. Or add slider widgets for parameters — "What happens to the gather if I change Poisson's ratio?" Now, since you're on the web, you can share your modification with your colleagues, or the world.
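In fact, a crude version of that slider is already possible in a Jupyter notebook today. Here is a toy sketch (all the numbers are invented) that uses ipywidgets to put a slider on the lower layer's Poisson's ratio and redraws a two-term Shuey reflectivity curve as you move it:

```python
# A toy interactive AVO curve: a slider on the lower layer's Poisson's ratio.
# Run in a Jupyter notebook with ipywidgets installed.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

theta = np.radians(np.linspace(0, 40, 81))
vp1, rho1, pr1 = 2400.0, 2.35, 0.30      # upper layer: Vp (m/s), rho (g/cc), Poisson's ratio
vp2, rho2 = 2600.0, 2.20                 # lower layer: Vp and rho

def avo_curve(pr2=0.25):
    """Plot R(theta) for a given Poisson's ratio in the lower layer."""
    # Vs from Poisson's ratio: Vs = Vp * sqrt((0.5 - pr) / (1 - pr)).
    vs1 = vp1 * np.sqrt((0.5 - pr1) / (1 - pr1))
    vs2 = vp2 * np.sqrt((0.5 - pr2) / (1 - pr2))
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    r0 = 0.5 * (dvp / vp + drho / rho)
    g = 0.5 * dvp / vp - 2 * (vs / vp)**2 * (drho / rho + 2 * dvs / vs)
    plt.plot(np.degrees(theta), r0 + g * np.sin(theta)**2)
    plt.axhline(0, color='gray', lw=0.5)
    plt.ylim(-0.3, 0.3)
    plt.xlabel('Angle of incidence (deg)')
    plt.ylabel('Reflectivity')
    plt.show()

interact(avo_curve, pr2=FloatSlider(min=0.05, max=0.45, step=0.01, value=0.25))
```

Hosting that notebook somewhere anyone can run it is the missing piece, and that is exactly what the 'live documents in the cloud' experiments are chasing.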

Now that's a book I'd be glad to pay double for.

Some questions for you

We'd love to know what you think of technical books. Leave a comment below, or get in touch.

  • Do you purchase technical books regularly? What prompts you to buy a book?
  • What book keeps getting pulled off your shelf, and which ones collect dust?
  • What's missing from the current offerings? Workflows, regional studies, atlases,...?
  • Would you rather just consume everything online? Do you care about reproducibility?

400 posts

The last post was our 400th on this blog. At an average of 500 words, that's about 200,000 words since we started at the end of 2010. Enough for a decent-sized novel, but slightly less likely to win a Pulitzer. In that time, according to Google, almost exactly 100,000 individuals have stopped by agilegeoscience.com — most of them lots of times — thank you readers for keeping us going! The most popular posts: Shale vs tight, Rock physics cheatsheet, and Well tie workflow. We hope you enjoy reading at least half as much as we enjoy writing.

Six books about seismic analysis

Last year, I did a round-up of six books about seismic interpretation. A raft of new geophysics books has arrived recently, mostly from Cambridge, prompting this look at six volumes on seismic analysis — the more quantitative side of interpretation. We seem to be a bit hopeless at full-blown book reviews, and I certainly haven't read all of these books from cover to cover, but I thought I could at least mention them, and give you my first impressions.

If you have read any of these books, I'd love to hear what you think of them! Please leave a comment. 

Observation: none of these volumes mention compressive sensing, borehole seismic, microseismic, tight gas, or source rock plays. So I guess we can look forward to another batch in a year or two, when Cambridge realizes that people will probably buy anything with 3 or more of those words in the title. Even at $75 a go.


Quantitative Seismic Interpretation

Per Avseth, Tapan Mukerji and Gary Mavko (2005). Cambridge University Press, 408 pages, ISBN 978-0-521-15135-1. List price USD 91, $81.90 at Amazon.com, £45.79 at Amazon.co.uk

You have this book, right?

Every seismic interpreter that's thinking about rock properties, AVO, inversion, or anything beyond pure basin-scale geological interpretation needs this book. And the MATLAB scripts.

Rock Physics Handbook

Gary Mavko, Tapan Mukerji & Jack Dvorkin (2009). Cambridge University Press, 511 pages, ISBN 978-0-521-19910-0. List price USD 100, $92.41 at Amazon.com, £40.50 at Amazon.co.uk

If QSI is the book for quantitative interpreters, this is the book for people helping those interpreters. It's the Aki & Richards of rock physics. So if you like sums, and QSI left you feeling unsatisfied, buy this too. It also has lots of MATLAB scripts.

Seismic Reflections of Rock Properties

Jack Dvorkin, Mario Gutierrez & Dario Grana (2014). Cambridge University Press, 365 pages, ISBN 978-0-521-89919-2. List price USD 75, $67.50 at Amazon.com, £40.50 at Amazon.co.uk

This book seems to be a companion to The Rock Physics Handbook. It feels quite academic, though it doesn't contain too much maths. Instead, it's more like a systematic catalog of log models — exploring the full range of seismic responses to rock properties.

Practical Seismic Data Analysis

Hua-Wei Zhou (2014). Cambridge University Press, 496 pages, ISBN 978-0-521-19910-0. List price USD 75, $67.50 at Amazon.com, £40.50 at Amazon.co.uk

Zhou is a professor at the University of Houston. His book leans towards imaging and velocity analysis — it's not really about interpretation. If you're into signal processing and tomography, this is the book for you. Mostly black and white, the book has lots of exercises (no solutions though).

Seismic Amplitude: An Interpreter's Handbook

Rob Simm & Mike Bacon (2014). Cambridge University Press, 279 pages, ISBN 978-1-107-01150-2 (hardback). List price USD 80, $72 at Amazon.com, £40.50 at Amazon.co.uk

Simm is a legend in quantitative interpretation and the similarly lauded Bacon is at Ikon, the pre-eminent rock physics company. These guys know their stuff, and they've filled this superbly illustrated book with the essentials. It belongs on every interpreter's desk.

Seismic Data Analysis Techniques...

Enwenode Onajite (2013). Elsevier, 256 pages, ISBN 978-0-12-420023-4. List price USD 130, $113.40 at Amazon.com, £74.91 at Amazon.co.uk

This is the only book of the collection I don't have. From the preview I'd say it's aimed at undergraduates. It starts with a petroleum geology primer, then covers seismic acquisition, and seems to focus on processing, with a little on interpretation. The figures look rather weak, compared to the other books here. Not recommended, not at this price.

NOTE These prices are Amazon's discounted prices and are subject to change. The links contain a tag that gets us commission, but does not change the price to you. You can almost certainly buy these books elsewhere. 

6 questions about seismic interpretation

This interview is part of a series of conversations between Satinder Chopra and the authors of the book 52 Things You Should Know About Geophysics (Agile Libre, 2012). The first three appeared in the October 2013 issue of the CSEG Recorder, the Canadian applied geophysics magazine, which graciously agreed to publish them under a CC-BY license.


Satinder Chopra: Seismic data contain massive amounts of information, which has to be extracted using the right tools and knowhow, a task usually entrusted to the seismic interpreter. This would entail isolating the anomalous patterns on the wiggles and understanding the implied subsurface properties, etc. What do you think are the challenges for a seismic interpreter?

Evan Bianco: The challenge is to not lose anything in the abstraction.

The notion that we take terabytes of prestack data, migrate it into gigabyte-sized cubes, and reduce that further to digitized surfaces that are hundreds of kilobytes in size, sounds like a dangerous discarding of information. That's at least 6 orders of magnitude! The challenge for the interpreter, then, is to be darn sure that this is all you need out of your data, and if it isn't (and it probably isn't), knowing how to go back for more.

SC: How do you think some of these challenges can be addressed?

EB: I have a big vision and a small vision. Both have to do with documentation and record keeping. If you imagine the entire seismic experiment laid out on a sort of conceptual mixing board, instead of as a linear sequence of steps, elements could be revisited and modified at any time. In theory nothing would be lost in translation. The connections between inputs and outputs could be maintained, even studied, all in place. In that view, the configuration of the mixing board itself becomes a comprehensive and complete history for the data — what's been done to it, and what has been extracted from it.

The smaller vision: there are plenty of data management solutions for geospatial information, but broadcasting the context that we bring to bear is a whole other challenge. Any tool that allows people to preserve the link between data and model should be used to transfer the implicit along with the explicit. Take auto-tracking a horizon as an example. It would be valuable if an interpreter could embed some context into an object while digitizing. Something that could later inform the geocellular modeler to proceed with caution or certainty.

SC: One of the important tasks that a seismic interpreter faces is the prediction about the location of the hydrocarbons in the subsurface.  Having come up with a hypothesis, how do you think this can be made more convincing and presented to fellow colleagues?

EB: Coming up with a hypothesis (that is, a model) is solving an inverse problem. So there is a lot of convincing power in completing the loop. If all you have done is the inverse problem, know that you could go further. There are a lot of service companies who are in the business of solving inverse problems, not so many completing the loop with the forward problem. It's the only way to test hypotheses without a drill bit, and gives a better handle on methodological and technological limitations.

SC: You mention "absolving us of responsibility" in your article.  Could you elaborate on this a little more? Do you think there is accountability of sorts practiced in our industry?

EB: I see accountability from a data-centric perspective. For example, think of all the ways that a digitized fault plane can be used. It could become a polygon cutting through a surface on a map. It could be a wall within a geocellular model. It could be a node in a drilling prognosis. Now, if the fault is mis-picked by even one bin, the error could show up hundreds of metres away from the prognosis, depending on the dip of the fault. Practically speaking, accounting for mismatches like this is hard, and is usually done in an ad hoc way, if at all. What caused the error? Was it the migration or was it the picking? Or what about error in the measurement of the drill-bit's position? I think accountability is loosely practised at best because we don't know how to reconcile all these competing errors.

Until data can have a memory, being accountable means being diligent with documentation. But it is time-consuming, and there aren’t as many standards as there are data formats.

SC: Declaring your work to be in progress could allow you to embrace iteration.  I like that. However, there is usually a finite time to complete a given interpretation task; but as more and more wells are drilled, the interpretation could be updated. Do you think this practice would suit small companies that need to ensure each new well is productive or they are doomed?

EB: The size of the company shouldn't have anything to do with it. Iteration is something that needs to happen after you get new information. The question is not, "do I need to iterate now that we have drilled a few more wells?", but "how does this new information change my previous work?" Perhaps the interpretation was too rigid — too precise — to begin with. If the interpreter sees her work as something that evolves towards a more complete picture, she needn't be afraid of changing her mind when new information proves it incorrect. Depth migration exemplifies this approach. Hopefully more conceptual and qualitative aspects of subsurface work can adopt it as well.

SC: The present day workflows for seismic interpretation for unconventional resources demand more than the usual practices followed for the conventional exploration and development.  Could you comment on how these are changing?

EB: With unconventionals, seismic interpreters are looking for different things. They aren't looking for reservoirs, they are looking for suitable locations to create reservoirs. Seismic technologies that estimate the state of stress will become increasingly important, and interpreters will need to work in close contact with geomechanics specialists. Also, microseismic monitoring and time-lapse technologies tend to push interpreters into the thick of operations, allowing them to study how the properties of the earth change in response to those operations. What a perfect place for iterative workflows.


You can read the other interviews and Evan's essay in the magazine, or buy the book! (You'll find it in Amazon's stores too.) It's a great introduction to who applied geophysicists are, and what sort of problems they work on. Read more about it. 

Join CSEG to catch more of these interviews as they come out. 

52 Things is out!

The new book is out! You can now order it from Amazon.com, Amazon.co.uk, Amazon in Europe, and it will be available soon from Amazon.ca and various other online bookstores.

What's it about? I sent Andrew Miall some chapters at the proof stage; here's what he said:

Geology is all about rocks, and rocks are all about detail and field context, and about actually being out there at that critical outcrop, right THERE, that proves our point. The new book 52 Things You Should Know About Geology is full of practical tips, commentary, and advice from real geologists who have been there and know what the science is all about.

Amazing authors

A massive thank you to my 41 amazing co-authors...

Eight of these people also wrote in 52 Things You Should Know About Geophysics; the other 34 are new to this project. Between them, this crowd has over 850 years of experience in geoscience (more remarkably, just two of them account for 100 years!). Half of the authors are primarily active in North America; others are in the UK, Germany, Indonesia, India, the Netherlands, and Norway. Ten are engaged in academia, four of them as students. The diversity is wonderful as far as it goes, but the group is overwhelmingly composed of white men; it's clear we still have work to do there. 

We have the globe mostly covered in the essays themselves too. Regrettably, we have gaping holes over South America and most of Africa. We will endeavour to fix this in future books. This map shows page numbers...

Giving back

Academic publishing is a fairly marginal business, because the volumes are so small. Furthermore, we are committed to offering books at consumer, not academic, prices. The 42 authors have shown remarkable generosity of time and spirit in drafting these essays for the community at large. If you enjoy their work, I'm certain they'd love to hear about it.

In part to recognize their efforts, and to give something back to the community that supports these projects (that's you!), we approached the AAPG Foundation and offered to donate $2 from every sale to the charity. They were thrilled about this — and we look forward to helping them bring geoscience to more young people.

These books are all about sharing — sharing knowledge and sharing stories. These are the things that make our profession the dynamic, sociable science that it is. If you would like to order 10 or more copies for your friends, students, or employees, do get in touch and we will save you some money.

52 Things... About Geology

Welcome to the new book from Agile Libre! The newest, friendliest, awesomest book about petroleum geoscience. 

The book will be out later in November, pending review of the proof, but you can pre-order it now from Amazon.com at their crazy offer price of only $13.54. When it comes out, the book will hit Amazon.ca, Amazon.co.uk, and other online booksellers.

63 weeks to mature

It's truly a privilege to publish these essays. When an author hands over a manuscript, they are trusting the publisher and editors to do justice not just to the words, but to the thoughts inside. And since it's impossible to pay dozens of authors, they did it all for nothing. To recognize their contributions to the community, we're donating $2 from every book sale to the AAPG Foundation. Perhaps the students that benefit from the Foundation will go on to share what they know. 

This book took a little stamina, compared to 52 Things... Geophysics. We started inviting authors on 1 July 2012, and it took 442 days to get all the essays. As before, the first one came almost immediately; this time it was from George Pemberton, maintaining the tradition of amazing people being great champions for these projects. Indeed, Tony Doré — another star contributor — was a big reason the book got finished.

What's inside?

To whet your appetite, here are the first few chapters from the table of contents:

  • Advice for a prospective geologist — Mark Myers, 14th Director of the USGS
  • As easy as 1D, 2D, 3D — Nicholas Holgate, Aruna Mannie, and Chris Jackson
  • Computational geology — Mark Dahl, exploration geologist at ConocoPhillips
  • Coping with uncertainty — Duncan Irving at TeraData
  • Geochemical alchemy — Richard Hardman, exploration legend
  • Geological inversion — Evan Bianco of Agile
  • Get a helicopter not a hammer — Alex Cullum of Statoil

Even this short list samples some of the breadth of topics, and the range of experience of the contributors. Nicholas and Aruna are PhD students of Chris Jackson at Imperial College London, and Richard Hardman is a legend on the UK exploration scene, with over 50 years of experience. Between them, the 42 authors have notched up over 850 career-years — the book is a small window into this epic span of geological thinking.

We're checking the proofs right now. The book should be out in about 2 weeks, just in time for St Barbara's day!

Pre-order now from Amazon.com 
Save more than 25% off the cover price!

It's $13.54 today, but Amazon sets the final price... I don't know how long the offer will last. 

Standards, streamers, and Sherlock

Yesterday afternoon at the SEG Annual Meeting I spent some time with Jill Lewis from Troika and Rune Hagelund, a consultant. They have both served on the SEG Standards Committee, helping define the SEG-D and SEG-Y standards for field data and processed data respectively. The SEG standards are — in my experience — almost laughably badly implemented by most purveyors of data. Why is this? Is the standard too inflexible? Are the field definitions unclear? The confusion can lead to real problems: I know I've inadvertently loaded a seismic survey back to front. If you feel passionately about it, the committee is always looking for feedback.

I wandered past a poster yesterday morning about land streamers. The last I'd seen of this idea — dragging a train of geophones behind a truck — was a U of C test at Priddis, Alberta, by Gabriella Suarez and Rob Stewart. I haven't been paying much attention, but this was the first commercial implementation I'd seen — a shallow aquifer study in Sweden, reported by Boiero et al. The truck has the minivibe source right behind it, and the streamer after that. Quicker than juggies!

I don't know if this is a function of my recent outlook on conferences, but this was the first conference I've been to where all of the best things have been off-site. Perhaps the best part of the week was last night — a 3-hour geek-fest with Evan, Ben Bougher, Sam Kaplan, and Bernal Manzanilla. The conversation covered compressive sensing, stochastic resonance, acoustic lenses, and the General Inverse-Problemness of All Things.

As I mentioned last week, I hung out every day in the press room, waiting for wiki-curious visitors. We didn't have many drop-ins (okay, four), but I had some great chats with SEG staff, my friend Mike Stone, and a couple of other enthusiasts. I also started some fun projects to move some quality content into SEG Wiki. If you're at all interested in seeing a vibrant, community-driven space for geophysical knowledge, do get involved.

I bought some books in the Book Mart yesterday: Planning Land 3D Seismic Surveys by Andreas Cordsen et al., 3D Seismic Survey Design by Gijs Vermeer, and Fundamentals of Geophysical Interpretation by Larry Lines and Rachel Newrick. We're increasingly interested in modeling, and acquisition is where it all begins; Lines and Newrick was mostly just a gap on the shelf, but it also covers some topics I have little familiarity with, such as electromagnetics. Jennifer Cobb, SEG publications manager, showed me the intriguing new monograph by geophysical legend Enders Robinson and TLE Editor Dean Clark — I can't quite remember the title but it had something to do with Sherlock Holmes and Albert Einstein — a story about scientific investigation and geophysics. Looking forward to picking that up.

That's it for me this year; I'm writing this on the flight home. Evan is staying for the workshops and will report on those. As usual, I'm leaving with a lot of things to follow up on over the next weeks and months. I'm also trying to sift my feelings about the Annual Meeting, and especially the Exhibition. So many people, so much time, so much marketing...

What is Creative Commons?

I just found myself typing a long email in reply to the question, "What is a Creative Commons license and how do I use it?" Instead, I thought I'd post it here. Note: I am not a lawyer, and this is not a comprehensive answer.

Creative Commons depends on copyright

There is no relinquishment of copyright. This is important. In the case of 52 Things, Agile Geoscience is the copyright holder. In the case of an article, it's the authors themselves, unless the publisher gets them to sign a form relinquishing it. Copyright is an automatic, moral right (under the Berne Convention), and boils down to the right to be identified as the authors of the work ('attribution').

Most copyright holders grant licenses to re-use their work. For instance, you can pay hundreds of dollars to reproduce a couple of pages from an SPE manual for a classroom of students (if you are insane). You can reprint material from a book or journal article — again, usually for a fee. These licenses are usually non-exclusive, non-transferable, and use-specific. And the licensee must (a) ask and (b) pay the licensor (that is, the copyright holder). This is 'the traditional model'.

Obscurity is a greater threat than piracy

Some copyright holders are even more awesome. They recognize that (a) it's a hassle to always have to ask, and (b) they'd rather have the work spread than charge for it and stop it spreading. They wish to publish 'open' content. It's exactly like open source software. Creative Commons is one, very widespread, license you can apply to your work that means (a) they don't have to ask to re-use it, and (b) they don't have to pay. You can impose various restrictions if you must.

Creative Commons licenses are everywhere. You can apply Creative Commons licenses at will, to anything you like. For example, you can CC-license your YouTube videos or Flickr photos. We CC-license our blog posts. Almost everything in Wikipedia is CC-licensed. You could CC-license a single article in a magazine (in fact, I somewhat sneakily did this last February). You could even CC-license a scientific journal (imagine!). Just look at all the open content in the world!

Creative Commons licenses are easy to use. Using the license is very easy: you just tell people. There is no cost or process. Look at the footer of this very page, for example. In print, you might just add the line This article is licensed under a Creative Commons Attribution license. You may re-use this work without permission. See http://creativecommons.org/licenses/by/3.0/ for details. (If you choose another license, you'd use different wording.)


I recommend CC-BY licenses. There are lots of open licenses, but CC-BY strikes a good balance between being well-documented and trusted, and being truly open (though it is not recognized as such, on a technicality, by copyfree.org). The main point is that they are very open, allowing anyone to use the work in any way, provided they attribute it to the author and copyright holder — it's just like scientific citation, in a way.

Do you openly license your work? Or do you wish more people did? Do you notice open licenses?

Creative Commons graphic by Flickr user Michael Porter. The cartoon is from Nerdson, and licensed CC-BY. 'Obscurity is a greater threat than piracy' is paraphrased from a quote by Tim O'Reilly, publishing 2.0 legend. 

Review: First Steps in Seismic Interpretation

Thomas Martin is a first-year graduate student at Scripps Institution of Oceanography. He got bored of waiting for us to review the seismic interpretation books (we are tectonically slow sometimes) and offered to review some for us. Thank you, Thomas! He's just about to set off on a research cruise to the Canadian Arctic on USCGC Healy to collect CHIRP seismic reflection data and sediment cores.


Herron, D (2012). First Steps in Seismic Interpretation. Geophysical Monograph Series 16. Society of Exploration Geophysicists, Tulsa, OK. Print ISBN 978-1-56080-280-8. Ebook DOI 10.1190/1.9781560802938. List price: USD 62. Member price: USD 49. Student price: USD 39.20

As a graduate student, I have found this book to be quite the resource. It gives a good handle on basic seismic properties, and provides a solid introduction. Some of the topics it covers are not typically discussed in geoscience journal papers that use seismic reflection data (migration comes to mind). The table of contents gives an idea of the breadth:

  1. Introduction
  2. Seismic response
  3. Seismic attributes — including subsections on amplitude, coherence, and inversion
  4. Velocity — sonic logs, well velocity surveys, seismic velocities, anisotropy, and depth conversion
  5. Migration
  6. Resolution
  7. Correlation concepts — horizons and faults, multiples, pitfalls, automatic vs manual picking
  8. Correlation procedures — loop tying, jump correlations, visualization
  9. Data quality and management — keeping track of everything
  10. Other considerations — e.g. 4D seismic, uncertainty and risk, and ergonomics

One of the great things about this book is that it's designed to be light on math, so all levels of geoscientists can pick it up. I have found this book invaluable because it is a great bridge from the 'pure' geophysicist to the seismic interpreter, and can improve the dialogue between these two camps. Chapter 10 is 'leftover' subjects, but it is one of the most helpful in the book as it covers approximations, career development, and a fantastic section on time spent and value added.

The book covers a lot of ground but, coming in at under 200 pages, nothing in great detail — this is not meant to be the ultimate text for seismic interpretation. At nearly $40 plus shipping for the student edition (the list price is over $60), it is a little expensive for a small paperback, but the content is worth it. I would recommend it to graduate students or early career scientists with an interest in seismic data, across the full range of geoscience disciplines.

If you liked this review please leave an encouraging comment for Thomas.

Connection through attribution

At EAGE I picked up Ikon Science's latest swag, a coil-bound notebook. On page 8, I found a modified version of our rock physics cheatsheet. "Hey, that's neat, someone thinks it's useful!" But then I thought, "Darn, I wish they'd mentioned us." Like nearly all of the work we do that's online, it's labelled CC-BY, meaning anyone can use it, however they want, but with attribution.

It's incredibly rewarding to see our stuff being used and spreading. That's why we put it out there. And by using a CC-BY license, we hope others will distribute, remix, tweak, and build upon our work, even commercially, as long as they credit us for the original creation. Creators have a choice when they are sharing, and because we want the maximum dissemination possible, we often choose the most accommodating license.

Why don't we completely relinquish our rights and opt out of copyright altogether? Because we want recognition for our work, and attribution brings us connection.

The best people I have met are the ones who are generous, connected, and open. Being diligent with attribution isn't easy, but it plays an important part in being awesome.