GeoConvention highlights

We were in Calgary last week at the Canada GeoConvention 2017. The quality of the talks seemed more variable than in previous years, but there were some gems in there too. Here are our highlights from the technical talks...

Filling in gaps

Mauricio Sacchi (University of Alberta) outlined a new reconstruction method for vector field data. In other words, filling in gaps in multicomponent seismic records. I've got a soft spot for Mauricio's relaxed speaking style and the simplicity with which he presents linear algebra, but there are two other reasons that make this talk worth a shout-out:

  1. He didn't just show equations in his talk, he used pseudocode to show the algorithm.
  2. He linked to his lab's seismic processing toolkit, SeismicJulia, on GitHub.

I am sure he'd be the first to admit that it is early days for this library and it is very much under construction. But what isn't? All the more reason to showcase it openly. We all need a lot more of that.

Update on 2017-06-07 13:45 by Evan Bianco: Mauricio has posted the slides from his talk.
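
In the spirit of that pseudocode, here is a toy of my own: not Mauricio's vector-field method, just the simplest scalar cousin of that family of algorithms. It fills gaps in a signal by iterative Fourier thresholding, a POCS-style loop, with a made-up sinusoid and an arbitrary threshold schedule:

```python
import numpy as np

def pocs_fill(data, mask, n_iter=100):
    """Fill gaps by projection onto convex sets (POCS):
    alternately threshold in the Fourier domain and re-insert
    the recorded samples."""
    x = data.copy()
    for i in range(n_iter):
        X = np.fft.fft(x)
        # Keep only the strongest coefficients; relax the
        # threshold as the iterations proceed.
        thresh = np.abs(X).max() * (1 - i / n_iter)
        X[np.abs(X) < thresh] = 0
        x = np.real(np.fft.ifft(X))
        x[mask] = data[mask]  # data-consistency step
    return x

# A gappy sinusoid as a toy example.
t = np.linspace(0, 1, 256, endpoint=False)
signal = np.sin(2 * np.pi * 12 * t)
mask = np.random.rand(t.size) > 0.4      # keep about 60% of samples
gappy = np.where(mask, signal, 0.0)

recovered = pocs_fill(gappy, mask)
print(np.max(np.abs(recovered - signal)))  # small if recovery worked
```

Real implementations work on multidimensional, multicomponent data with smarter transforms, but the project, threshold, and reinsert pattern is the core idea.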

Learning about errors

Anton Biryukov (University of Calgary & graduate intern at Nexen) gave a great talk in the induced seismicity session. It was a lovely mashing-together of three of our favourite topics: seismology, machine learning, and uncertainty. Anton is researching how to improve microseismic and earthquake event detection by framing it as a machine learning classification problem. He's using Monte Carlo methods to compute myriad synthetic seismic events by making small velocity variations, and then using those synthetic events to teach a model how to locate earthquakes more accurately.

Figure 2 from Anton Biryukov's abstract. An illustration of the signal classification concept. The signals originating from the locations on the grid (a) are then transformed into a feature space and labeled by the class containing the event origin. From Biryukov (2017). Event origin depth uncertainty - estimation and mitigation using waveform similarity. Canada GeoConvention, May 2017.
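
Here is roughly what that framing looks like in code. To be clear, this is a hedged sketch of the general idea, not Anton's workflow: the 'waveforms' are cartoon wavelets whose character varies with a depth class, the Monte Carlo perturbations are plain noise, and the classifier is a stock random forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def synthetic_event(depth_class, n_samples=200):
    """Cartoon stand-in for a modelled waveform: arrival time and
    frequency depend on the depth class; the added noise stands in
    for the small velocity-model perturbations."""
    t = np.linspace(0, 1, n_samples)
    t0 = 0.2 + 0.1 * depth_class + rng.normal(0, 0.01)
    f = 30 - 5 * depth_class
    wavelet = np.exp(-((t - t0) / 0.05)**2) * np.cos(2 * np.pi * f * (t - t0))
    return wavelet + rng.normal(0, 0.05, n_samples)

# Monte Carlo: many synthetic events for each location class.
X = np.array([synthetic_event(c) for c in range(4) for _ in range(250)])
y = np.repeat(np.arange(4), 250)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```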

The bright lights of geothermal energy
Matt Hall

Two interesting sessions clashed on Wednesday afternoon. I started off in the Value of Geophysics panel discussion, but left after James Lamb's report from the mysterious Chief Geophysicists' Forum. I had long wondered what went on in that secretive organization; it turns out they mostly worry about how to make important people like your CEO think geophysics is awesome. But the large room was a little dark, and — in keeping with the conference in general — so was the mood.

Feeling a little down, I went along to the Diversification of the Energy Industry session instead. The contrast was abrupt and profound. The bright room was totally packed with a conspicuously young audience numbering well over 100. The mood was hopeful, exuberant even. People were laughing, but not wistfully or ironically. I think I saw a rainbow over the stage.

If you missed this uplifting session but are interested in contributing to Canada's geothermal energy scene, which will certainly need geoscientists and reservoir engineers if it's going to get anywhere, there are plenty of ways to find out more or get involved. Start at cangea.ca and follow your nose.

We'll be writing more about the geothermal scene — and some of the other themes in this post — so stay tuned. 



Geothermal facies from seismic

Here is a condensed video of the talk I gave at the SEG IQ Earth Forum in Colorado. Much like the tea-towel mock-ups I blogged about in July, this method illuminates physical features in seismic by exposing hidden signals and textures. 

This approach is useful for photographs of rocks and core, for satellite imagery, or for any geophysical data set in which there is more information to be had than the isolated values of pixels on a rectangular grid.

Interpretation has become an empty word in geoscience. Like so many other buzzwords, instead of being descriptive and specific jargon, it seems that everyone has their own definition or (mis)use of the word. If interpretation is the art and process of making mindful leaps between unknowns in data, I say, let's quantify to the best of our ability the data we have. Your interpretation should be iterable, it should be systematic, and it should be cast as an algorithm. It should be verifiable, it should be reproducible. In a word, scientific.

You can download a copy of the presentation with speaking notes, and access the clustering and texture codes on GitHub.
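
For the real thing, use the GitHub codes. But as a rough illustration of the general recipe, compute local texture attributes and then cluster them into facies, here is the kind of thing I mean. The attributes, window size, and synthetic 'section' are arbitrary stand-ins, not what's in the talk:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.cluster import KMeans

def texture_attributes(img, size=9):
    """Local mean, variance, and gradient energy in a moving window."""
    mean = uniform_filter(img, size)
    var = uniform_filter(img**2, size) - mean**2
    gy, gx = np.gradient(img)
    grad = uniform_filter(np.hypot(gx, gy), size)
    return np.stack([mean, var, grad], axis=-1)

# A synthetic 'section' with two textures side by side.
rng = np.random.default_rng(0)
left = rng.normal(0, 1, (128, 64))                       # rough texture
right = np.cumsum(rng.normal(0, 1, (128, 64)), axis=0)   # smoother texture
section = np.hstack([left, right])

attrs = texture_attributes(section)
features = attrs.reshape(-1, attrs.shape[-1])
features = (features - features.mean(0)) / features.std(0)  # standardize

labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
facies = labels.reshape(section.shape)  # the 'texture facies' map
```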

The science of things we don't understand

I am at the EAGE Conference & Exhibition in Copenhagen. Yesterday I wrote up my highlights from Day 2. Today it's, yep, Day 3!

Amusingly, and depressingly, the highlight of the morning was the accidental five-minute gap between talks in the land seismic acquisition session. Ralf Ferber and Felix Herrmann began spontaneously debating the sparsity of seismic data (Ferber doubting it, Herrmann convinced of it), and there was a palpable energy in the room. I know from experience that it is difficult to start conversations like this on purpose, but conferences need more of them.

There was some good stuff in Ralf's two talks as well. I am getting out of my depth when it comes to non-uniform sampling (and the related concept of compressive sensing), but I am a closet signal analyst and I get a kick out of trying to follow along. The main idea is that you want to break aliasing, a harmful coherent-noise artifact that arises from regular sampling (see the brick image below). The way to break it is to introduce randomness and irregularity, essentially to deliberately introduce errors into the data. Ralf's paper suggested randomly reversing the polarity of receivers, but there are other ways. The trick is that we know what errors we introduced.
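
You can see the principle in a few lines of NumPy. This toy experiment is mine, not Ralf's: decimate a cosine on a regular grid and the spectrum grows sharp spurious peaks (the aliases); keep the same number of samples at random positions and the damage shows up instead as a weak, incoherent noise floor under a single clean peak.

```python
import numpy as np

n = 1024
t = np.arange(n)
signal = np.cos(2 * np.pi * 0.3 * t)  # above the decimated grid's Nyquist

# Regular sampling: keep every 4th sample, zeros elsewhere.
regular = np.zeros(n)
regular[::4] = signal[::4]

# Random sampling: the same number of samples at random positions.
idx = np.random.choice(n, n // 4, replace=False)
random = np.zeros(n)
random[idx] = signal[idx]

# The regular case shows several sharp peaks of similar strength
# (the aliases); the random case shows one dominant peak over a
# low noise floor.
for name, x in [("regular", regular), ("random", random)]:
    spec = np.abs(np.fft.rfft(x))
    print(name, np.sort(spec)[-4:] / spec.max())
```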

Geothermal in Canada. Image: GSC.

As Evan mentioned recently, we've been doing a lot of geothermal interpretation lately. And we both worked in the past on oil sands projects. Today I saw a new world of possibility open up as Simon Weides of GFZ Potsdam gave his paper, Geothermal exploration of Paleozoic formations in central Alberta, Canada. He has assessed two areas, the Edmonton and Peace River regions, but only described the former today. While not hot enough for electricity generation, the temperature in the Cambrian (81°–89°C) is suitable for so-called district heating projects, though the rock is so tight it would need fraccing. The Devonian is cooler, at 36°–59°C, but still potentially useful for greenhouses and domestic heat. The industrial applications in Alberta, where drilling is easy and inexpensive, are manifold.

I wandered in at the end of what seemed to be the most popular geophysics talk of the conference: Guus Berkhout's Full wavefield migration — utilization of multiples in seismic migration. While I missed the talk, I was in time to catch a remark of his that resonated with me:

Perhaps we don't need the science of signals, but the science of noise. The science of noise is the science of things we don't understand, and that is the best kind of science. 

Yes! We, as scientists in the service of man, must get better at thinking about, worrying about, and talking about the things we don't understand. If I were feeling provocative, I might even say this: the things we understand are boring.

The brick image shows spatial aliasing resulting from poor sampling. Source: Wikipedian cburnett, under GFDL.

Thermogeophysics, whuh?

Earlier this month I spent an enlightening week in Colorado at a peer review meeting hosted by the US Department of Energy. It was well attended, with about 300 people from organizations like Lawrence Livermore Labs, Berkeley, Stanford, Sandia National Labs, and, *ahem*, Agile. Delegates heard about a wide range of cost-shared projects in the Geothermal Technologies Program: approximately 170 projects were presented, representing a total US Department of Energy investment of $340 million.

I was at the meeting because we've been working on some geothermal projects in California's Imperial Valley since last October. It's fascinating, energizing work. Challenging too, as 3D seismic is not yet routine technology for geothermal, though it is emerging. What is clear is that geothermal exploration requires a range of technologies and knowledge. It pulls from all the tools you could dream up: active seismic, passive seismic, magnetotellurics, resistivity, LiDAR, hyperspectral imaging, not to mention borehole and drilling technologies. The industry has an incredible learning curve ahead of it if Enhanced Geothermal Systems (EGS) are going to be viable and scalable.

The highlights of the event for me were not the talks that I saw, but the people I met during coffee breaks:

John McLennan & Joseph Moore at the University of Utah have done some amazing laboratory experiments on large blocks of granite. They constructed a "proppant sandwich", pumped fluid through it, and applied polyaxial stress to study geochemical and stress effects on fracture development and permeability pathways. Hydrothermal fluids altered the proppant and gave rise to wormhole-like collapse structures, similar to those in the CHOPS process. They combined diagnostic imaging (CT scans, acoustic emission tomography, X-rays) with sophisticated numerical simulations. A sign that geothermal practitioners are working to keep science up to date with engineering.

Stephen Richards bumped into me in the corridor after lunch, after he overheard me talking about the geospatial work I did with the Nova Scotia Petroleum database. Not five minutes passed before he rolled up his sleeves, took over my laptop, and was hacking away. He connected the WMS extension that he built as part of the State Geothermal Data to QGIS on my machine, and showed me some of the common file formats and data interchange content models for curating geothermal data on a continental scale (there's a quick sketch of the WMS part after this list). The hard part isn't necessarily the implementation; the hard part is curating the data. And it was a thrill to see it thrown together, in minutes, on my machine. A sign that there is a huge amount of work to be done around opening data.

Dan Getman, Geospatial Section lead at NREL, gave a live demo of the fresh prospector interface he built, which is accessible through OpenEI. I mentioned OpenEI briefly in the poster presentation that I gave in Golden last year, and I can't believe how much it has improved since then. Dan once again confirmed the notion that the implementation wasn't rocket science (surely any geophysicist could figure it out), and in doing so renewed my motivation for extending the local petroleum database in my backyard. A sign that geospatial methods are at the core of exploration and discovery.
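
If you want to poke at services like the one Stephen showed me, here is a minimal sketch using OWSLib. The endpoint URL and layer name are placeholders I invented, not real State Geothermal Data addresses:

```python
from owslib.wms import WebMapService

# Placeholder endpoint -- substitute a real service URL; this
# address is invented for illustration and will not resolve.
URL = "https://example.org/geothermal/wms"

wms = WebMapService(URL, version="1.3.0")

# List the layers the service advertises.
for name, layer in wms.contents.items():
    print(name, "-", layer.title)

# Fetch a rendered map for one (hypothetical) layer.
img = wms.getmap(layers=["temperature_gradient"],
                 srs="EPSG:4326",
                 bbox=(-120, 49, -110, 60),
                 size=(512, 512),
                 format="image/png")
with open("map.png", "wb") as f:
    f.write(img.read())
```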

There was an undercurrent of openness surrounding this event. By and large, the US DOE is paying for half of the research, so full disclosure is practically one of the terms of service. Not surprisingly, it feels more like science going on here, where innovation is being subsidized and intentionally accelerated because there is a demand. Makes me think that activity is a necessary but not sufficient metric for innovation.