The dangers of default disdain
Those who create do so with an unspoken bias and for good reason. A right-handed can opener, the placement of letters on a QWERTY keyboard. Anything manufactured starts out as a guess about its end user and their motivations, and these motivations are carved into the design. Those who create technology have the awesome power of establishing standards—setting the presets—to steer their systems. The larger the scale of the system, the more assumptions the designer has to make. And, unless these presets can be modified, the system is limited.
If you think about it, defaults are an incredibly necessary invention, because they go to work when we do nothing. But doing nothing, relying on some stranger's arbitrary parameterization, is a bad choice. We often blur the concept of the default (where we have choice) with the preset (no choice). Your keyboard is preset. Your desktop wallpaper, your suite of mobile apps, the color scale on your contour map — these are defaults that can be changed. The shame is when they aren't changed; the triumph is when they are changed to add value. By fully exploiting defaults, we can be more agile in our work, but only if the systems are designed with such flexibility and freedom in mind.
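The distinction can be sketched in code. This is a hypothetical illustration (the names `render_map` and `HARDWARE_DPI` are invented for the purpose): a preset is baked in with no hook for change, while a default is an explicit parameter the user is free to override.

```python
# A preset: fixed at 'manufacture' time, no choice offered to the user.
HARDWARE_DPI = 96

def render_map(data, colormap="viridis"):
    """Render some data; 'colormap' is a default the caller can override."""
    return f"rendered {len(data)} points at {HARDWARE_DPI} dpi with {colormap}"

# Doing nothing accepts the designer's parameterization...
print(render_map([1, 2, 3]))
# ...but the triumph is overriding the default to add value.
print(render_map([1, 2, 3], colormap="seismic"))
```

The design choice matters: exposing `colormap` as a keyword argument costs the designer one line, and it is what separates a flexible tool from a constraining one.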
The visionary Kevin Kelly wrote a wonderful essay describing the default as a vehicle that designers use to drive the habits of consumers. But he notes,
...defaults are 'sticky'. Many psychological studies have shown that the tiny bit of extra effort needed to alter a default is enough to dissuade most people from bothering, so they stick to the default, despite their untapped freedom. Their camera's clock blinks at the default of 12:00, or their password remains whatever temporary one was issued them. The hard truth, as any engineer will tell you, is that most defaults are never altered. Pick up any device, and 98 out of 100 options will be the ones preset at the factory.
Is there an altogether better way to design goods and services—or software tools? Can the user community drive the design and utility of intelligent defaults?
Many geoscience applications (most applications, in fact) are designed with a preset architecture aimed at optimizing performance and accessibility. However, applications, like people, are not neutral. They have biases. Think about the software that you use most often. Does it provide you with flexibility, or is it constrained (thus constraining you) to doing only a few things well? Does it have units that you can't stand? Is it an impenetrable black box, or is it a hub for your creative expression? Does it allow you to be as agile as you want to be?
In the January 2011 Interpreter's Corner section of SEG's The Leading Edge, William Hammon examines the computational and psychological factors affecting 'intelligent default' design in geoscience software. He asked people to play with defaults in order to match a given, ideal processing target. Although a variety of interesting observations emerge, there are three take-home conclusions: 1) very few people actually start with the initial default settings (essentially defeating the purpose of intelligent design); 2) fewer parameter choices invoke bolder default excursions; 3) subjects are more experimental when working with faster processes.
I really like this paper, but my lasting thought after reading it was one of jealousy. We never actually have an idealised target to converge to. Sure, it's a construct of the experiment that allows for metrics and discussion, but it's just not fair! It's also all the more reason to start, to be bolder, and to be even more experimental with the defaults that are sitting in our inventories.