You own your brain

I met someone last week who said her employer — a large integrated oil & gas company — 'owned her'. She said she'd signed an employment agreement that unequivocally spelt this out. This person was certainly a professional on paper, with a graduate degree and plenty of experience. But the company had, perhaps unwittingly, robbed her of her professional independence and self-determination. What a thing to lose.

Agreements like this erode our profession. Do not sign them.

The idea that a corporation can own a person is obviously ludicrous — I'm certain she didn't mean it literally. But I think lots of people feel confined by their employment. For some reason, it's acceptable to gossip and whisper over coffee, but talking in any public way about our work is uncomfortable for some people. This needs to change.

Your employer owns your products. They pay you for concerted effort on things they need, and to have their socks knocked off occasionally. But they don't own your creativity, judgment, insight, and ideas — the things that make you a professional. They own their data, and their tools, and their processes, but they don't own the people or the intellects that created them. And they can't — or shouldn't be able to — stop you from going out into the world and being an active, engaged professional, free to exercise and discuss your science with whomever you like.

If you're asked to sign something saying you can't talk at meetings, write about your work, or contribute to open projects like SEGwiki — stop.

These contracts only exist because people sign them. Just say, 'No. I am a professional. I own my brain.'

Stop waiting for permission to knock someone's socks off

When I had a normal job, this was the time of year when we set our goals for the coming months. Actually, we sometimes didn't do it till March. Then we'd have the end-of-year review in October... Anyway, when I thought of this, it made me think about my own goals for the year, for Agile, and my career (if you can call it that). Here's my list:

1. Knock someone's socks off.

That's it. That's my goal. I know it's completely stupid. It's not SMART: specific, measurable, attainable, realistic, or timely. I don't believe in SMART. For a start, it's obviously a backronym. That's why there's attainable and realistic in there—what's the difference? They're equally depressing and uninspiring. Measurable, attainable goals are easy, and I'm going to do them anyway: it's called work. It's the corporate equivalent of saying my goals for the day are waking up, getting out of bed, having a shower, making a list of attainable goals... Maybe those are goals if you're in rehab, but if you're a person with a job or a family they're just part of being a person.

I don't mean we shouldn't make plans or share lists of tasks to help get stuff done. It's important to have everyone working in concert, at least occasionally. In my experience people tend to do this anyway, but there's no harm in writing the plans down for everyone to see. Managers can handle this, and everyone should read them.

Why do these goals seem so dry? You love geoscience or engineering or whatever you do. That's a given. (If you don't, for goodness' sake save yourself.) But people keep making you do boring stuff that you don't like or aren't much good at, and there's no time left for the awesomeness you are ready to unleash, if only there were more time, if only someone would ask.

Stop thinking like this. 

You are not paid to be at work, or really even to do your job. Your line manager might think this way, because that's how hierarchical management works: it's essentially a system for passing goals and responsibilities down to the workforce. A nameless, interchangeable workforce. But what the executives and shareholders of your company really want from you, what they really pay you for, is Something Amazing. They don't know what it is, or what you're capable of — that's your job. Your job is to systematically hunt and break and try and build until you find the golden insight, the new play, the better way. The real challenge is how you fit the boring stuff in alongside this, not the other way around.

Knock someone's socks off, then knock them back on again with these seismic beauties.

Few managers will ever come to you and say, "If you think there's something around here you can transform into the most awesome thing I've ever seen, go ahead and spend some time on it." You will never get permission to take risks, commit to something daring, and enjoy yourself. But secretly, everyone around you is dying to have their socks knocked right off. Every day they sadly go home with their socks firmly on: nothing awesome today.

I guarantee that, in the process of trying to do something no-one has ever done or thought of before, you will still get the boring bits of your job done. The irony is that no-one will notice, because they're blinded by the awesome thing no-one asked you for. And their socks have been knocked off.

Things not to think

  1. Some humans are scientists.
  2. No non-humans are scientists.
  3. Therefore, scientists are human.

That's how scientists think, right? Logical, deductive, objective, algorithmic. Put in such stark terms, this may seem over the top, but I think scientists do secretly think of themselves this way. Our skepticism makes us immune to the fanciful, emotional naïvetés that normal people believe. You can't fool a scientist!

Except of course you can. Just like everyone else's, scientists' intuition is flawed, infested with biases like subjectivity and the irresistible urge to seek confirmation of our hypotheses. I say 'everyone', but perhaps scientists are biased in obscure, profound ways that non-specialists are not. A scary thought.

But sometimes I hear scientists say things that are especially subtle in their wrongness. Don't get me wrong: I wholeheartedly believe these things too, until I stop for a moment and reflect. Here are some examples:

The scientific method

...as if there is but one method. To see how wrong this notion is, stop and try to write down how your own investigations proceed. The usual recipe is something like: question, hypothesis, experiment, adjust hypothesis, iterate, and conclude with a new theory. Now look at your list and ask yourself if that's really how it goes. Isn't it really full of false leads, failed experiments, random shots in the dark, and a brain fart or two? Or maybe that's just me.

If not thesis then antithesis

...as if there is no nuance or uncertainty in the world. We treat bipolar disorder in people, but seem to tolerate it and even promote it in society. Arguments quickly move to the extremes, becoming ludicrously over-simplified in the process. Example: we need to have an even-tempered, fact-based discussion about our exploitation of oil and gas, especially in places like the oil sands. This discussion is difficult to have because if you're not with 'em, you're against 'em. 

Nature follows laws

...as if nature is just a good citizen of science. Without wanting to fall into the abyss of epistemology here, I think it's important to know at all times that scientists are trying to describe and represent nature. Thinking that nature is following the laws that we derive on this quest seems to me to encourage an unrealistically deterministic view of the world, and smacks of hubris.

How vivid is the claret, pressing its existence into the consciousness that watches it! If our small minds, for some convenience, divide this glass of wine, this universe, into parts — physics, biology, geology, astronomy, psychology, and so on — remember that Nature does not know it!
Richard Feynman

Science is true

...as if knowledge consists of static and fundamental facts. It's that hubris again: our diamond-hard logic and 1024-node clusters are exposing true reality. A good argument with a pseudoscientist always convinces me of this. But it's rubbish—science isn't true. It's probably about right. It works most of the time. It's directionally true, and that's the way you want to be going. Just don't think there's a True Pole at the end of your journey.

There are probably more, but I read or hear an example of at least one of these every week. I think these fallacies are a class of cognitive bias peculiar to scientists. A kind of over-endowment of truth. Or perhaps they are examples of a rich medley of biases, each of us with our own recipe. Once you know your recipe and have learned its smell, be on your guard!

The simultaneity funnel

Is your brilliant idea really that valuable?

At Agile*, we don't place a lot of emphasis on ideas. Ideas are abundant, ideas are cheap. Ideas mean nothing without action, and it's impossible to act on every one. Funny, though: I seem to get enthralled whenever I come up with a new idea. It's conflicting because, it seems to me at least, a person with ideas is more valuable, and more interesting, than one without. Perhaps it takes a person rich with ideas to be able to execute. Execution and delivery are rare, and valuable.

Kevin Kelly describes the evolution of technology as a progression of the inevitable, citing examples such as the lightbulb and calculus. Throughout history, parallel invention has been the norm.

We can say the likelihood that the lightbulb will stick is 100 percent. The likelihood Edison's was the adopted bulb is, well, one in 10,000. Furthermore, each stage of the incarnation can recruit new people. Those toiling at the later stages may not have been among the early pioneers. Given the magnitude of the reduction, it is improbable that the first person to make an invention stick was also the first person to think of the idea.

Danny Hillis, founder of Applied Minds, describes this as an inverted pyramid of invention. It tells us that your brilliant idea will have co-parents. Even though the final design of the first marketable lightbulb could not have been anticipated by anyone, the concept itself was inevitable. All ideas start out abstract and become more specific toward their eventual execution.

Does this mean that it takes 10,000 independent tinkerers to bring about an innovation? We aren't all working on the same problems at the same time, and some ideas arrive too early. One example is how microseismic monitoring of reservoir stimulation has exploded recently with the commercialization of shale gas projects in North America. The technology came from earthquake detection methods that have been around for decades; only recently has the idea been taken up by the petroleum industry, thanks to an alignment of compelling market forces.

So is innovation merely a numbers game? Is 10,000 a critical mass that must be exceeded to bring about a single change? If so, the image of the lonely hero-inventor-genius is misguided. And if it is a numbers game, then subsurface oil and gas technology could be seriously challenged. The SPE has nearly 100,000 members worldwide, compared to our beloved SEG, which has a mere 33,000. Membership of a club or professional society does not equate to contribution, but if this figure is correct, I doubt our industry has the sustained manpower to feed this funnel.

This system has been observed since the start of recorded science, and the pace of invention is accelerating with population and knowledge growth. At the same time, specialization and diversification are increasing, which means we have fewer people working on more problems. Are knowledge sharing and crowd wisdom a natural supplement to this historical phenomenon? Are we augmenting this funnel, or connecting disparate funnels, when we embrace openness?

A crowded funnel might be compulsory for advancement and progress, even if it causes cutthroat competitiveness, hoarding, or dropping out altogether. But if those outcomes are no longer palatable for the future of our industry, we will have to modify our approach.

Great geophysicists #3

Today is a historic day for greatness: René Descartes was born exactly 415 years ago, and Isaac Newton died 284 years ago. They both contributed to our understanding of physical phenomena and the natural world and, while not exactly geophysicists, they changed how scientists think about waves in general, and light in particular.

Unweaving the rainbow

Scientists of the day recognized two types of colour. Apparent colours were those seen in prisms and rainbows, where light itself was refracted into colours. Real colours, on the other hand, were a property of bodies, disclosed by light but not produced by that light. Descartes studied refraction in raindrops and helped propagate Snell's law in his 1637 essay, La Dioptrique. His work severed this apparent–real dichotomy: all colours are apparent, and the colour of an object depends on the light you shine on it.

Newton began to work seriously with crystalline prisms around 1666. He was the first to demonstrate that white light is a scrambled superposition of wavelengths: a visual cacophony of information. Not only does a ray bend in relation to the wave speed of the material it is entering (read the post on Snellius), but Newton made one more connection: the intrinsic wave speed of the material, in turn, depends on the frequency of the wave. This phenomenon is known as dispersion: different frequency components are slowed by different amounts, angling onto different paths.
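To put numbers on that last idea, here is a minimal sketch of frequency-dependent refraction. It assumes a constant-Q power-law dispersion model (after Kjartansson), with entirely made-up velocities, Q, and frequencies; the point is only that rays of different frequency bend onto different paths.

import numpy as np

def v_constant_q(f, v_ref=2500.0, f_ref=50.0, q=30.0):
    """Phase velocity under a constant-Q power-law dispersion model:
    v(f) = v_ref * (f / f_ref)**(1 / (pi * Q)). All values hypothetical."""
    return v_ref * (f / f_ref) ** (1.0 / (np.pi * q))

def refraction_angle(theta1_deg, v1, v2):
    """Snell's law: sin(theta2) / v2 = sin(theta1) / v1."""
    return np.degrees(np.arcsin(np.sin(np.radians(theta1_deg)) * v2 / v1))

v1 = 2000.0                      # wave speed in the incidence medium (m/s)
for f in [10.0, 50.0, 200.0]:    # frequencies across the seismic band (Hz)
    v2 = v_constant_q(f)
    print(f"{f:5.0f} Hz: v2 = {v2:6.1f} m/s, "
          f"refracted at {refraction_angle(30.0, v1, v2):.2f} degrees")

The spread is small, a degree or two across the whole band, which is part of why dispersion in the seismic band is so easy to ignore.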

What does all this mean for seismic data?

Seismic pulses, which strut and fret through the earth, reflecting and transmitting through its myriad contrasts, make for a more complicated type of prism-dispersion experiment. Compared to visible light, the effects of dispersion are subtle, negligible even, in the seismic band of 2–200 Hz. However, we may measure a rock to have a wave speed of 3000 m/s at 50 Hz, 3500 m/s at 20 kHz (logging frequencies), and 4000 m/s at 10 MHz (core laboratory frequencies). On the one hand, this should be incredibly disconcerting for subsurface scientists: it keeps us from bridging the integration gap empirically. On the other, it is a reason why geophysicists get away with haphazardly stretching and squeezing travel time measurements taken at different scales to tie wells to seismic. Is dispersion the interpreters' fudge-factor when our multi-scale data don't corroborate?
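Those three wave speeds are illustrative, but they are enough to sketch how one might characterize the dispersion. Assuming, purely for the sake of argument, that a single constant-Q power law holds across all three bands (it almost certainly does not; different mechanisms dominate at different frequencies), a straight-line fit in log-log space yields an effective Q:

import numpy as np

# The wave speeds quoted above: seismic (50 Hz), sonic logging (20 kHz),
# and core laboratory (10 MHz) measurements of the same rock.
freq = np.array([50.0, 20e3, 10e6])        # Hz
vel = np.array([3000.0, 3500.0, 4000.0])   # m/s

# In a constant-Q medium, v(f) = v_ref * (f/f_ref)**(1/(pi*Q)),
# so ln(v) is linear in ln(f) with slope 1/(pi*Q).
slope, intercept = np.polyfit(np.log(freq), np.log(vel), 1)
q_eff = 1.0 / (np.pi * slope)
print(f"Effective Q across the bands: {q_eff:.1f}")  # about 13.5

# Extrapolate back into the seismic band, say to 100 Hz
v_100 = np.exp(intercept) * 100.0 ** slope
print(f"Predicted wave speed at 100 Hz: {v_100:.0f} m/s")

An effective Q of 13 or so is implausibly lossy for most rocks, which is itself a hint that no single mechanism connects the three bands.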

Chris Liner, blogging at Seismos, points out:

...so much of classical seismology and wave theory is nondispersive: basic theory of P and S waves, Rayleigh waves in a half-space, geometric spreading, reflection and transmission coefficients, head waves, etc. Yet when we look at real data, strong dispersion abounds. The development of spectral decomposition has served to highlight this fact.

We should think about studying dispersion more, not just as a nuisance for what is lost (as it has traditionally been viewed), but as a colourful, scale-dependent property of the earth whose stories we seek to hear.

Accretionary Wedge #31

This is my first contribution to the Accretionary Wedge; the theme this time is 'What geological concept or idea did you hear about that you had no notion of before (and likely surprised you in some way)?' Like most of the entries I've read so far, I could think of quite a few things fitting this description. I find lots of geological concepts surprising or counterintuitive. But in the end, I chose to write about the thing that obsessed me as an undergraduate, right at the beginning of my career:

The Devonian day was 22 hours long

In November I moved to the Atlantic coast of Canada. It's the first time I've lived right at the seaside, but I am originally from the tiny island of Great Britain, so I have never lived too far from the edge. There is a deeply maritime feel to this part of the continent, even in the sheltered Bay of Fundy. The famously macrotidal regime there permeates the culture: artists paint the tidal landscapes; musicians sing about the eerie currents; geologists crawl around on the mud-flats and cliffs. These are the profound consequences of a 17-metre tidal range and its heartbeat, regular as clockwork.

Tidal forces shape a bar-built estuary, Pamlico Sound, USA.

It's easy to see the effects of the tide in the geological record. Tidal successions are recognizable from some combination of pin-stripe lamination, mud-drapes, bi-directional ripples, proximity to shore, diagnostic fossils, brackish trace fossil assemblages, and other marvellous sedimentological tools. Less intuitively perhaps, at least for a non-biologist like me, marine animals also express these tidal frequencies in their growth patterns. So a coral, for example, might have a lunar breeding cycle. This periodicity results in growth rings just like a tree's, only they record not the seasons but the fortnightly beat of spring and neap tides. The tides are driven by the positions of the sun and moon relative to the earth. Celestial bodies created banded coral.

From Scrutton (1964): diurnal ridges and monthly bands.

Colin Scrutton, one of my professors at the University of Durham in the northeast of England, measured the growth ridges of rugose corals from Middle Devonian successions in Michigan, Ontario, and Belgium (Scrutton 1964). He was testing the result of a similar experiment by John Wells (1963). The conclusion: the Devonian year contained 13 lunar months, and each lunar month contained 30.6 days, so the year was 399 days long. According to what we know about planetary dynamics in the solar system, the year was approximately the same length then as now, so Devonian days were shorter by a couple of hours. The reason: the tides themselves, as they move westward around the eastward-spinning earth, are a simple frictional brake. The earth's rotation slows over time as the earth-moon system loses energy to heat, the ultimate entropy. Even more fascinatingly, the torque exerted by the sun is counteractive, introducing further cyclicities as these signals interfere. Day length, therefore, has probably not slowed monotonically through time.
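The arithmetic behind the 22-hour day is simple enough to check in a few lines of Python. The only assumption, as above, is that the duration of the year itself has hardly changed:

# If the year's duration (in hours) is essentially constant, but the
# Devonian year contained about 399 days, each day must have been shorter.
HOURS_PER_YEAR = 365.25 * 24   # the modern year, in hours
devonian_days = 399            # 13 lunar months of 30.6 days (Scrutton 1964)

day_length = HOURS_PER_YEAR / devonian_days
print(f"Devonian day: {day_length:.1f} hours")  # about 22.0 hours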

For me, this realization was bound up with an obsession with cyclicity. I could not read enough about Milankovitch cycles: wobbles and ellipticity in the earth's dance through space scratching their pulse into the groove of the stratigraphic record and even influencing sea-floor spreading rates, perhaps even mass extinctions. The implications are profound: terametre-scale mechanics of the universe control the timing of cellular neurochemical functions.

Why anyone needs astrology to connect with this awesome fact is beyond me. 

References

Pannella, G, et al (1968). Paleontological evidence of variations in length of synodic month since Late Cambrian. Science 162 (3855), p 792–796, doi: 10.1126/science.162.3855.792.
Scrutton, C (1964). Periodicity in Devonian coral growth. Palaeontology 7 (4), p 552–558, pl 86–87.
Wells, J (1963). Coral growth and geochronometry. Nature 197, p 948–950. doi: 10.1038/197948a0.