Why I don't flout copyright

Lots of people download movies illegally. Or spoof their IP addresses to get access to sports fixtures. Or use random images they found on the web in publications and presentations (I've even seen these with the watermark of the copyright owner on them!). Or download PDFs for people who aren't entitled to access (#icanhazpdf). Or use sketchy Russian paywall-crumbling hacks. It's kind of how the world works these days. And I realize that some of these things don't even sound illegal.

This might surprise some people, because I go on so much about sharing content, open geoscience, and so on. But I am an annoying stickler for copyright rules. I want people to be able to re-use any content they like, without breaking the law. And if people don't want to share their stuff, then I don't want to share it.

Maybe I'm just getting old and cranky, but FWIW here are my reasons:

  1. I'm a content producer. I would like to set some boundaries on how my stuff is shared. In my case, the boundaries amount to nothing more than attribution, which is only fair. But still, it's my call, and I think that's reasonable, at least until the material is, say, 5 years old. But some people don't understand that open is good, that shareable content is better than closed content, that this is the way the world wants it. And that leads to my second reason:
  2. I don't want to share closed stuff as if it were open. If someone doesn't openly license their stuff, they don't deserve the signal boost — they told the world to keep their stuff secret. Why would I give them the social and ethical benefits of open access while they enjoy the financial benefits of closed content? This monetary benefit comes from a different segment of the audience, obviously. At least half the people who download a movie illegally would not, I submit, have bought the movie at a fair price.

So make a stand for open content! Don't share stuff that the creator didn't give you permission to share. They don't deserve your gain filter.

Rockin' Around The Christmas Tree

I expect you know at least one geoscientist. Maybe you are one. Or you want to be one. Or you want one for Christmas. It doesn't matter. The point is, it'll soon be Christmas. If you're going to buy someone a present, you might as well get them something cool. So here's some cool stuff!

Gadgets

There isn't a single geologist alive who wouldn't think this was awesome. It's a freaking Geiger counter! It goes in your pocket! It only costs USD 60, or CAD 75, or less than 40 quid! Absurd: these things normally cost a lot more.

OK, if you didn't like that, you're not going to like this IR spectrometer. Yes, a pocket molecular sensor, for sensing molecules in pockets. It does cost USD 250 though, so make sure you really like that geologist!

Back down to earth, a little USB microscope ticks most of the geogeek boxes. This one looks awesome, and is only USD 40, but there are loads out there, so maybe do some research.

Specimens

You're going to need something to wave all that gadgetry at. If you go down the well-worn path of the rock & mineral set, make sure it's a good size, like this 100-sample monster (USD 70). Or go for the novelty value of fluorescent specimens (USD 45) — calcite, sphalerite, and the like.

If minerals seem passé for a geologist, then take the pure line with a tour of the elements. This set — the last of its kind, by the way — costs USD 565, but it looks amazing. Yet it can't hold a candle to this beauty, all USD 5000 of it — which I badly want but, let's face it, will never get.

Home

If you have a rock collection, maybe you want a mineralogical tray (USD 35) to put your specimens in? The same store has all sorts of printed fabrics by designers Elena Kulikova and Karina Eibitova. Or how about some bedding?

These steampunk light switch plates are brilliant and varied (USD 50). Not geological at all, just awesome.

I don't think they are for sale, but check out Ohio artist Alan Spencer's ceramic pieces reflecting each of the major geological periods. They're pretty amazing.

Lego

My kids are really into Lego at the moment. It turns out there are all sorts of sciencey kits you can get. I think the Arctic Base Camp (USD 90) is my favourite that's available at the moment, and it contains some kind of geological-looking type.

I don't condone the watching of television programmes, except Doctor Who obviously, but TV does sometimes inspire fun Lego sets. So there's the Doctor, naturally, and other things like The Big Bang Theory.

You can fiddle with these while you wait for the awesome HMS Beagle model to come out.

Books etc.

A proven success — winner of the Royal Society's prestigious Winton Prize for science books this year — is Adventures in the Anthropocene: A Journey to the Heart of the Planet We Made, by Gaia Vince, Milkweed Editions, September 2015. Available in hardback and paperback.

Lisa Randall's Dark Matter and the Dinosaurs: The Astounding Interconnectedness of the Universe (HarperCollins) just came out, and is doing remarkably well at the moment. It's getting decent reviews too. Randall is a cosmologist, and she reckons the dinosaurs were obliterated by a comet nudged out of orbit by mysteriousness. Hardback only.

If those don't do it for you, I reviewed some sciencey comic books recently... or there's always Randall Munroe.

Or you could try poking around in the giftological posts from 2011, 2012, 2013, or 2014.

Still nothing? OK, well, there's always chocolate :)


The images in this post are all someone else's copyright and are used here under fair use guidelines. I'm hoping the owners are cool with people helping them sell stuff!

The big data eye-roll

First, let's agree on one thing: 'big data' is a half-empty buzzword. It's shorthand for 'more data than you can look at', but really it's more than that: it branches off into other hazy territory like 'data science', 'analytics', 'deep learning', and 'machine intelligence'. In other words, it's not just 'large data'. 

Anyway, the buzzword doesn't bother me too much. What bothers me is that when I talk to people at geoscience conferences about 'big data', about half of them roll their eyes and proclaim something like this: "Big data? We've been doing big data since before these punks were born. Don't talk to me about big data."

This is pretty demonstrably a load of rubbish.

The 'big data' movement is not trying to acquire loads of data and then throw 99% of it away. Its practitioners are not processing data in a purely serial pipeline, making arbitrary decisions about parameters along the way. They are not losing most of it in farcical enterprise data management train-wrecks. They are not locking most of their data up in systems so closed that even they don't know they have it.

They are doing the opposite of all of these things.

If you think 'big data', 'data science', and 'machine learning' are old hat in geophysics, then you have some catching up to do. Sure, we've been toying with simple neural networks for years — e.g. probabilistic neural nets with one hidden layer — though this approach is very, very far from being mainstream in the subsurface. But today this is child's play. Over and over, and increasingly in the last 3 years, people are showing how new technology — built specifically to handle the special challenges that terabytes bring — can transform any quantitative endeavour: social media and online shopping, sure, but also astronomy, robotics, weather prediction, and transportation. These technologies will show up in petroleum geoscience and engineering. They will eat geostatistics for breakfast. They will change interpretation.

So when you read that Google has open sourced its TensorFlow deep learning library (9 November), or that Microsoft has too (yesterday), or that Facebook has too (months ago), or that Airbnb has too (in August), or that there are a bazillion other super easy-to-use packages out there for sophisticated statistical learning, you should pay a whole heap of attention! Because machine learning is coming to subsurface.
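Don't believe the 'super easy-to-use' part? Here's a minimal sketch with scikit-learn, one of those packages. The 'well log' data is invented — random numbers standing in for curves like GR and RHOB — but the workflow is the real thing:

```python
# Train and test a classifier on synthetic 'well log' data.
# All the data here is made up; the point is how little code this takes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))           # pretend GR, RHOB, NPHI, DT samples
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # an invented two-facies label

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))         # accuracy on held-out data
```

Ten lines, give or take. The hard parts — getting labelled data, and deciding whether to trust the answer — are still ours.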

Moving ahead with social interpretation

After quietly launching Pick This — our social image interpretation tool — in February, we've been busily improving it, and now we're moving into 2016 with a plan for world domination. I summed up the first year of development in one of the interpretation sessions at SEG 2015. Here's a 13-minute version of my talk:

In 2016 we'll be exploring ways to adapt the tool to in-house corporate use, mainly by adding encryption and private groups. This way, everyone with @awesome.com email addresses, say, would be connected to each other, and their stuff would only be shared among the group, not with the general public.

Some other functionality is on the list of things to do:

  • Types of interpretation other than points, lines, and polygons.
  • Ways to find content more easily, for example with tags like 'Seismic' or 'Outcrop'.
  • Ways to follow individuals, or get notifications of new interpretations on an image.
  • More ways to visualize and generally get at the data Pick This produces.

We're always open to suggestions. Please get in touch if you have a neat idea!

What now?

Times are rock hard in industry right now.

If you have a job, you're lucky — you have probably already survived one round of layoffs. There will likely be more, especially when the takeovers start, which they will. I hope you survive those too. 

If you don't have a job, you probably feel horrible, but of course that won't get you anywhere. I heard one person call it an 'involuntary sabbatical', and I love that: it's the best chance you'll get to re-invent, re-learn, and find new direction. 

If you're a student, you must be looking out over the wasteland and wondering what's in store for you. What on earth?

More than one person has asked me recently about Agile. "You got out," they say, "how did you do it?" So instead of bashing out another email, I thought I'd blog about it.

Consulting in 2015

I didn't really get out, of course, I just quit and moved to rural Nova Scotia.

Living out here does make it harder to make a living, and things on this side of the fence, so to speak, are pretty gross too, I'm afraid. Talking to others at SEG suggested that I'm not alone among small companies in this view. A few of the larger outfits seem to be doing well — IKON and GeoTeric, for instance — but they also have product, which at least offers some income diversity.

Agile started as a 100% bootstrapped effort to be a consulting firm that's more directly useful to individual professional geoscientists than anyone else. Most firms target corporate accounts, and hiring them requires permission, a complicated contract, an AFE, and 3 months of bureaucracy. It turns out that professionals are unable or unwilling to engage at that grass-roots level, though — almost everyone assumes you need the permission, the contract, and the AFE to hire anyone in any capacity, even for "Help me tie this well." So usually we are hired into larger, longer-term projects, just like anyone else.

I still think there's something in this original idea — the Uberification of consulting services, if you will — so maybe we'll try again in a few years.

But if you are out of work and were thinking of getting out there as a consultant... I'm an optimistic person, but unless you are very well known (for being awesome), it's hard for me to honestly recommend even trying. It's just not the reality right now. We've been lucky so far, because we work in geothermal and government as well as in petroleum, but oil & gas was over half our revenue last year. It will be about 0% of it this year, maybe slightly less.

The transformation of Agile

All of which is to explain why we are now, since January, consciously and deliberately turning ourselves into a software technology R&D company. The idea is to be less dependent on our dysfunctional industry, and less dependent on geotechnical work. We build new tools for hard problems — data problems, interpretation problems, knowledge sharing problems. And we're really good at it.

We hired another brilliant programmer in August, and we're all learning more every day about our playground of scientific computing and the web — machine learning, cloud services, JavaScript frameworks, etc. The first thing we built was modelr.io, which is still in active development. Our latest project is around our tool pickthis.io. I hope it works out because it's the most fun I've had on a project in ages. Maybe these projects spin out of Agile, maybe we keep them in-house.

So that's our survival plan: invent, diversify, and re-tool like crazy. And keep blogging.

F**k it

Some people are saying, "things will recover, sit it out" but I think that's awful — the very worst — advice. I honestly think your best bet right now* is to find an accomplice, set aside 6 months and some of your savings, push everything off your desk, and do something totally audacious. 

Something you can't believe no-one has thought of doing yet.

Whatever it was you just thought of — that's the thing.

You might as well get started.


* Unless you have just retired, are very well connected in industry, have some free time, and want to start a new, non-commercial project that will profoundly benefit the subsurface community for the next several decades at least. Because I'd like to talk to you about another audacious plan...

Notes from a hackathon

The spirit of invention is alive and well in exploration geophysics! Last weekend, Agile hosted the 3rd annual Geophysics Hackathon at Propeller, a large and very cool co-working space in New Orleans, Louisiana.

A community of creative scientists

Commensurate with the lower-than-usual turnout at the SEG Annual Meeting, which our event preceded, we had 15 hackers. Not all of them were competing; some were hanging out, self-teaching, or hacking around with code.

As in Denver, we had an amazing showing from Colorado School of Mines, with 6 participants. I don't know what's in the water over there in the Rockies, or what the profs have been feeding these students, but it works. Such smart, creative talent. But it can't stay this one-sided... one day we'll provoke Stanford into competitive geophysics programming.

Other than the Mines crew, we had one other student (Agile's Ben Bougher, who's at UBC) and the dynamic wiki duo from SEG; the rest were professional geoscientists from large and small companies. So it was pretty well balanced between academia and industry.

Thank you

As always, we are indebted to the sponsors and supporters of the hackathon. The event would be impossible without their financial support, and much less fun without their eager participation. This year we teamed up with three companies:

  • OpenGeoSolutions, a fantastic group of geophysicists based in Calgary. You won't find better advice on signal processing problems. Jamie Alison and Greg Partyka also regularly do us the honour of judging our hackathon demos, which is wonderful.
  • EMC, a huge cloud computing company, generously supported us through David Holmes, their representative for our industry, and a fellow Landmark alum. David also kindly joined us for much of the hackathon, including the judging, which was great for the teams.
  • Palladium Consulting, a Houston-based bespoke software house run by Sebastian Good, were a new sponsor this year. Sebastian reached out to a New Orleans friend and business partner of his, Graham Ganssle, to act as a judge, and he was beyond generous with his time and insight all weekend. He also acted as a rich source of local knowledge.

Although he craves no spotlight, I have to recognize the personal generosity of Karl Schleicher of UT Austin, who is one of the most valuable assets our community has. His tireless promotion of open data and open source software is an inspiration.

And finally, Maitri Erwin again visited to judge the demos on Sunday. She brings the perfect blend of a deep and rigorous expertise in exploration geoscience and a broad and futuristic view of technology in the service of humankind. 

I will do a round up of the projects in the next couple of weeks. Look out for that because all of the projects this year were 'different'. In a good way.


If this all sounds like fun, mark your calendars for 2016! I think we're going to try running it after SEG next year, so set aside 22 and 23 October 2016, and we'll see you there. Bring a team!

PS You can already sign up for the hackathon in Europe at EAGE next year!

More highlights from SEG

On Monday I wrote that this year's Annual Meeting seemed subdued. And so it does... but as SEG continued this week, I started hearing some positive things. Vendors seemed pleasantly surprised that they had made some good contacts, perhaps as many as usual. The technical program was as packed as ever. And of course the many students here seemed to be enjoying themselves as much as ever. (New Orleans might be the coolest US city I've been to; it reminds me of Montreal. Sorry Austin.)

Quieter acquisition

Pramik et al. (of Geokinetics) reported on new field tests of their AquaVib marine vibrator source. This instrument has been around for a while; indeed, it was first tested over 20 years ago by IVI and later Geco (e.g. see J Bird, TLE, June 2003). If perfected, it will allow for much quieter marine seismic acquisition, reducing harm to marine mammals, with no loss of quality. (The images they showed are from their abstract, and are their copyright, with SEG.)

Ben told me one of his favourite talks was Schostak & Jenkerson's report from a JIP (Shell, ExxonMobil, Total, and Texas A&M) trying to build a new marine vibrator. The current consortium is testing three designs: an electrical model from PGS, a mechanical piston from APS, and a bubble resonator from Teledyne.

In other news:

  • Talks at Dallas 2016 will only be 15 minutes long. Hopefully this is to allow room in the schedule for something else, not just more talks.
  • Dave Hale has retired from Colorado School of Mines, and apparently now 'writes software with Dean Witte'. So watch out for that!
  • A sure sign of industry austerity: "Would you like Bud Light, or Miller Light?"
  • Check out the awesome ribbons that some clever student thought of. I'm definitely pinching that idea.

That's all I have for now, and I'm flying home today so that's it for SEG 2015. I will be reporting on the hackathon soon I promise, and I'll try to get my paper on Pick This recorded next week (but here's a sneak peek). Stay tuned!


References

Pramik, Bill, M. Lee Bell, Adam Grier, and Allen Lindsay (2015). Field testing the AquaVib: an alternate marine seismic source. SEG Technical Program Expanded Abstracts 2015: pp. 181-185. doi: 10.1190/segam2015-5925758.1

Schostak, Brian, and Mike Jenkerson (2015). The Marine Vibrator Joint Industry Project. SEG Technical Program Expanded Abstracts 2015: pp. 4961-4962. doi: 10.1190/segam2015-6026289.1

Monday highlights from SEG

Ben and I are in New Orleans at the 2015 SEG Annual Meeting, a fittingly subdued affair, given the industry turmoil recently. Lots of people are looking for work, others are thankful to have it.

We ran our annual Geophysics Hackathon over the weekend; I'll write more about that later this week. In a nutshell: despite a low-ish turnout, we had 6 great projects, all of them quite different from anything we've seen before. Once again, Colorado School of Mines dominated.

Beautiful maps

One of the most effective ways to make a tight scientific argument is to imagine trying to convince the most skeptical person you know that your method works. When it comes to seismic attribute analysis, I am that skeptical person.

Some of the nicest images I saw today were in the 'Attributes for Stratigraphic Analysis' session, chaired by Rupert Cole and Yuefeng Sun. For example, Tao Zhao, one of Kurt Marfurt's students, showed some beautiful images from the Waka 3D offshore New Zealand (Zhao & Marfurt). He used 2D colourmaps to co-render two attributes, along with semblance mapped to opacity on a black layer, and the results were very nice to look at. However, I was left wondering, and not for the first time, how we can do a better job of calibrating those maps to geology. We (the interpretation community) need to stop side-stepping that issue; it's central to our credibility. Even if you have no wells, as in this study, you can still use forward models, analogs, or at least interpretation by a sedimentologist, preferably two.

© SEG and Zhao & Marfurt. Left to right: Peak spectral frequency and peak spectral magnitude; GLCM homogeneity; shape index and curvedness. All of the attributes are also corendered with Sobel edge detection.


Pavel Jilinski at GeoTeric gave a nice talk (Calazans Muniz et al.) about applying this sort of fancy display to a large 3D dataset in Brazil, in a collaboration with Petrobras. The RGB displays of spectral attributes were as expected, but I had not seen their cyan-magenta-yellow (CMY) discontinuity displays before. They map dip to the yellow channel, similarity to the magenta channel, and 'tensor discontinuity' to the cyan channel. No, I don't know what that means either, but the displays were pretty cool.
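Out of curiosity, I sketched how a blend like that might work. This is just my guess at the recipe — not GeoTeric's method — using NumPy and matplotlib, with random arrays standing in for dip, similarity, and tensor discontinuity, each normalized to [0, 1]:

```python
# A toy CMY co-rendering of three discontinuity-style attributes.
# The attributes are random stand-ins; normalization to [0, 1] is assumed.
import numpy as np
import matplotlib.pyplot as plt

def normalize(a):
    """Scale an attribute to the range [0, 1]."""
    return (a - a.min()) / (a.max() - a.min() or 1)

# Stand-ins for dip, similarity, and 'tensor discontinuity' horizon slices.
rng = np.random.default_rng(0)
dip, similarity, tensor = (normalize(rng.random((100, 100))) for _ in range(3))

# Map dip -> yellow, similarity -> magenta, tensor -> cyan, then convert
# CMY to RGB for display: R = 1 - C, G = 1 - M, B = 1 - Y.
cyan, magenta, yellow = tensor, similarity, dip
rgb = np.dstack([1 - cyan, 1 - magenta, 1 - yellow])

plt.imshow(rgb)
plt.axis('off')
plt.show()
```

One nice property of CMY space: a pixel that scores high on all three attributes blends towards black, which is arguably a more natural look for discontinuities than the white you'd get from an RGB blend.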

Publications news

This evening we enjoyed the Editor's Dinner (I coordinate a TLE column and review for Geophysics and Interpretation, so it's totally legit). Good things are coming to the publication world: adopted Canadian Mauricio Sacchi is now Editor-in-Chief, there are no more page charges for colour in Geophysics (up to 10 pages), and watch out for video abstracts next year. Also, Chris Liner mentioned that Interpretation gets 18% of its submissions from oil companies, compared to only 5% for Geophysics. And I heard, but haven't verified, that downturns result in more papers. So at least our journals are healthy. (You do read them, right?)

That's it for today (well, yesterday). More tomorrow!


References

Calazans Muniz, Moises, Thomas Proença, and Pavel Jilinski (2015). Use of Color Blend of seismic attributes in the Exploration and Production Development - Risk Reduction. SEG Technical Program Expanded Abstracts 2015: pp. 1638-1642. doi: 10.1190/segam2015-5916038.1

Zhao, Tao, and Kurt J. Marfurt (2015). Attribute assisted seismic facies classification on a turbidite system in Canterbury Basin, offshore New Zealand. SEG Technical Program Expanded Abstracts 2015: pp. 1623-1627. doi: 10.1190/segam2015-5925849.1

The Rock Property Catalog again

Do you like data? Data about rocks? Open, accessible data that you can use for any purpose without asking? Read on.

After writing about anisotropy back in February, and then experimenting with storing rock properties in SubSurfWiki later that month, a few things happened:

  • The server I run the wiki on — legacy Amazon AWS infrastructure — crashed, and my backup strategy turned out to be <cough> flawed. It's now running on state-of-the-art Amazon servers, but my earlier efforts were mostly wiped out... leaving the road clear for a new experiment!
  • I came across an amazing resource called Mudrock Anisotropy, or — more appealingly — Mr Anisotropy. Compiled by Steve Horne, it contains over 1000 records of rocks, gathered from the literature. It is also public domain and carries only a disclaimer. But it's a spreadsheet, and emailing a spreadsheet around is not sustainable.
  • The Common Ground database, built by John A. Scales, Hans Ecke, and Mike Batzle at Colorado School of Mines in the late 1990s, was officially discontinued about two weeks ago. It contains over 4000 records, and is public domain. The trouble is, you have to restore a SQLite database to use it.

All this was pointing towards a new experiment. I give you: the Rock Property Catalog again! This time it contains not 66 rocks, but 5095 rocks. Most of them have \(V_\mathrm{P}\), \(V_\mathrm{S}\) and \(\rho\). Many of them have Thomsen's parameters too. Most have a lithology, and they all have a reference. Looking for Cretaceous shales in North America to use as analogs on your crossplots? There's a rock for that.

As before, you can query the catalog in various ways, either via the wiki or via the web API. Let's say we want to find shales with a velocity over 5000 m/s. You have a few options:

  1. Go to the semantic search form on the wiki and type [[lithology::shale]][[vp::>5000]]
  2. Make a so-called inline query on your own wiki page (you need an account for this).
  3. Make a query via the web API with a rather long URL: http://www.subsurfwiki.org/api.php?action=ask&query=[[RPC:%2B]][[lithology::shale]][[Vp::>5000]]|%3FVp|%3FVs|%3FRho&format=jsonfm

I updated the Jupyter Notebook I published last time with a new query. It's pretty hacky. I'll work on this to produce a more robust method, with some error handling and cleaner code — stay tuned.
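For the Python-inclined, here's roughly what option 3 boils down to with the requests library. A minimal sketch: the query matches the URL above, but the parsing assumes the usual Semantic MediaWiki JSON layout, so inspect what you actually get back:

```python
# Query the Rock Property Catalog via the Semantic MediaWiki 'ask' API.
# A minimal sketch — no error handling, and the parsing below assumes
# the standard SMW JSON layout; adjust after inspecting the response.
import requests

url = "http://www.subsurfwiki.org/api.php"
params = {
    "action": "ask",
    "query": "[[RPC:+]][[lithology::shale]][[Vp::>5000]]|?Vp|?Vs|?Rho",
    "format": "json",
}

r = requests.get(url, params=params)
results = r.json()["query"]["results"]  # assumed SMW response structure

for name, record in results.items():
    print(name, record["printouts"])    # Vp, Vs, Rho for each matching rock
```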

The database supports lots of properties, including:

  • Citation and reference
  • Description, lithology, colour (you can have pictures if you want!)
  • Location, lat/lon, basin, age, depth
  • \(V_\mathrm{P}\), \(V_\mathrm{S}\), and \(\rho\), as well as \(\rho_\mathrm{dry}\) and \(\rho_\mathrm{grain}\)
  • Thomsen's \(\epsilon\), \(\delta\), and \(\gamma\)
  • Static and dynamic Young's modulus and Poisson ratio
  • Confining pressure, pore pressure, effective stress, axial stress
  • Frequency
  • Fluid, saturation type, saturation
  • Porosity, permeability, temperature
  • Composition

There is more from the Common Ground data to add, especially photographs. But for now, I'd love some feedback: is this the right set of properties? Do we need more? I want this to be useful — what kind of data and metadata would you like to see? 

I'll end with the usual appeal — I'm open to any kind of suggestions or help with this. Perhaps you can contribute new rocks, or a paper containing data? Or maybe you have some wiki skills, or can help write bots to improve the data? What can you bring? 

What is AVO-friendly processing?

It's the Geophysics Hackathon next month! Come down to Propeller in New Orleans on 17 and 18 October, and we'll feed you and give you space to build something cool. You might even win a prize. Sign up — it's free!

Thank you to the sponsors, OpenGeoSolutions and Palladium Consulting — both fantastic outfits. Hire them.

AVO-friendly processing gets called various things: true amplitude, amplitude-friendly, and controlled amplitude, controlled phase (or just 'CACP'). If you've been involved in any processing jobs, you'll have noticed that these phrases get thrown around a lot. But seismic geophysics has a dirty little secret... we don't know exactly what AVO-friendly means. Or, at least, we can't agree on it.

A LinkedIn discussion in the Seismic Data Processing group earlier this month prompted this post.

I can't compile a list of exactly which processes will harm your AVO analysis (can anyone? Has anyone??), but I think I can start a list of things that you need to approach with caution and skepticism:

  • Anything that is not surface consistent. What does that mean? According to Oliver Kuhn (now at Quantec in Toronto):
Surface consistent: a shot-related [process] affects all traces within a shot gather in the same way, independent of their receiver positions, and, a receiver-related [process] affects all traces within a receiver gather in the same way, independent of their shot positions.
(There's a toy numerical sketch of this idea after the list.)
  • Anything with a window — spatial or temporal. If you must use windows, make them larger or longer than your areas and zones of interest. In this way, relative effects should be preserved.
  • Anything that puts the flattening of gathers before the accuracy of the data (<cough> trim statics). Some flat gathers don't look flat. (The thumbnail image for this post is from Duncan Emsley's essay in 52 Things.)
  • Anything that is a sort of last resort, post hoc attempt to improve the data — what we might call 'cosmetic' treatments. Things like wavelet stretch correction and spectral shaping are good for structural interpreters, but not for seismic analysts. At the very least, get volumes without them, and convince yourself they did no harm.
  • Anything of which people say, "This should be fine!" but offer no evidence.
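As promised, here's a toy sketch of what surface consistent means in practice. Everything is invented — just shot and receiver terms, random 'amplitudes', and none of the offset or CDP terms a real implementation would carry — but it shows the core trick: solve for per-shot and per-receiver factors jointly, in the log domain, with least squares:

```python
# Toy surface-consistent amplitude decomposition: log(amp) = shot + receiver.
# All numbers invented; real implementations add offset and CDP terms.
import numpy as np

rng = np.random.default_rng(42)
n_shots, n_recs = 20, 30
true_s = rng.normal(0, 0.3, n_shots)  # per-shot log-amplitude factors
true_r = rng.normal(0, 0.3, n_recs)   # per-receiver log-amplitude factors

# One trace per shot-receiver pair, with a little noise.
shot_idx, rec_idx = [a.ravel() for a in np.meshgrid(np.arange(n_shots),
                                                    np.arange(n_recs),
                                                    indexing='ij')]
log_amp = true_s[shot_idx] + true_r[rec_idx] + rng.normal(0, 0.05, shot_idx.size)

# Design matrix: one column per shot term, one per receiver term.
G = np.zeros((shot_idx.size, n_shots + n_recs))
G[np.arange(shot_idx.size), shot_idx] = 1
G[np.arange(shot_idx.size), n_shots + rec_idx] = 1

# Least squares. Note the split between shot and receiver terms is only
# unique up to a constant; lstsq returns the minimum-norm solution.
m, *_ = np.linalg.lstsq(G, log_amp, rcond=None)
est_s, est_r = m[:n_shots], m[n_shots:]

# Each trace's correction depends only on its own shot and receiver —
# never on the other domain. That's the essence of surface consistency.
correction = -(est_s[shot_idx] + est_r[rec_idx])
```

A process that decomposes like this treats every trace in a shot gather the same way, independent of receiver position — which is exactly what Oliver's definition asks for.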

Back to my fourth point there... spectral shaping and wavelet stretch correction (e.g. this patented technique I was introduced to at ConocoPhillips) have been the subject of quite a bit of discussion, in my experience. I don't know why; both are fairly easy to model, on the face of it. The problem is that we start to get into the sticky question of what wavelets 'see' and what's a wavelet anyway, and hang on a minute why does seismic reflection even work? Personally, I'm skeptical, especially as we get more used to, and better at, looking at spectral decompositions of stacked and pre-stack data.

Divergent paths

I have seen people use seismic data with very different processing paths for structural interpretation and for AVO analysis. This can happen on long-term projects, where the structural framework depends on an old post-stack migration that was later reprocessed for AVO friendliness. This is a bad idea — you won't be able to put the quantitative results into the structural framework without introducing substantial error.

What we need is a clinical trial of processing algorithms, in which they are tested against a known model like Marmousi, and their effect on attributes is documented. If such studies exist, I'd love to hear about them. Come to think of it, this would make a good topic for a hackathon some day... Maybe Dallas 2016?