In March 1857, a full two decades before Thomas Edison invented the phonograph, the French patent office awarded a Parisian printer named Édouard-Léon Scott de Martinville a patent for a machine that recorded sound. Inspired by anatomical studies of the human ear and fascinated with the art of stenography, Scott had stumbled across a radical new idea: Instead of a human writing down words, a machine could write sound waves.
Scott's contraption funneled sound waves through a hornlike apparatus that ended with a membrane. Sound waves would trigger vibrations in the membrane, which would then be transmitted to a stylus made of a stiff bristle. The stylus would etch the waves onto a page darkened by the carbon of lampblack. He called his invention a phonautograph: the self-writing of sound.
In the annals of invention, there may be no more curious mix of farsightedness and myopia than the story of the phonautograph. On the one hand, Scott had managed to make a critical conceptual leap—the realization that sound waves could be pulled out of the air and etched onto a recording medium—long before others got around to it. (When you're two decades ahead of Edison, you're doing pretty well for yourself.) But Scott's invention was hamstrung by one crucial—even comical—limitation. He had produced the first sound-recording device. But he neglected to include playback.
It seems obvious to us now that a device for recording sound should include a feature that lets you hear the recording. But that's hindsight. The idea that machines could convey sound waves that originated elsewhere was anything but intuitive. It wasn't that Scott forgot or failed to make audio playback work; it was that the idea never even occurred to him. It was in his blind spot.
For understandable reasons, when we tell stories of technological innovation, we tend to focus on insight and even seeming clairvoyance—the people who can see the future before the rest of us. But there's a flip side to such farsightedness that shows up again and again in the history of innovation: the blind spots, the possibilities that somehow escaped our field of vision but that, in retrospect, seem glaringly obvious.
Perhaps the most familiar kind of blind spot is the assumption that some new device will never find a mass audience. A classic of this genre: the confident predictions about the (tiny) demand for computers at the dawn of the digital age. “There is no reason anyone would want a computer in their home,” Ken Olsen, the cofounder of Digital Equipment Corporation, is famously quoted as saying in 1977.
But the more interesting blind spots are about how a novel technology might be used. Strangely enough, working at the cutting edge of a field makes you more prone to these sorts of blind spots, because you're exploring new territory without conventional landmarks or guidelines. You design a tool with one specific use in mind, but that focus blinds you to others. Scott, for instance, was trying to build an automated stenographer. He assumed that humans would learn to “read” those squiggles the way they had learned to read the squiggles of shorthand. It wasn't that crazy an idea, looking back on it. Humans had proved to be adept at recognizing visual patterns; we can internalize an alphabet so well that we don't even have to think about reading once we've learned how. Why would sound waves, once you get them on the page, be any different? Sadly, the neural toolkit of human beings doesn't seem to include the capacity for reading sound waves by sight.
A similar myopia surrounded the invention of the laser in the postwar era. Science fiction writers had been speculating on the military uses of concentrated beams of light since at least H. G. Wells' The War of the Worlds. (The “heat ray” is a recurrent device throughout the sci-fi canon.) When researchers at Bell Labs and Hughes Aircraft actually began producing laser light in the 1960s, they never imagined that its first mainstream use would be scanning barcodes at checkout counters.
Another archetypal innovator blind spot: failing to anticipate how a new tool will be abused. The inventors of the foundational email standards—Post Office Protocol and Simple Mail Transfer Protocol—had a clear vision of the communications revolution their brainchild would unleash. Their system was designed to allow the maximum flow of messages with a minimum of filtering or barriers. The idea of hijacking the medium for spam seems not to have occurred to anyone until 1978, when a DEC marketer named Gary Thuerk sent out a bulk email to the entire Arpanet, inviting recipients to check out “the newest members of the DECSystem-20 family.” Today spam constitutes more than 70 percent of all email.
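To make that “minimum of barriers” concrete, here is a minimal Python sketch of a bulk send over classic SMTP. The server name and addresses are hypothetical placeholders; the point is that nothing in the protocol itself verifies who the sender claims to be:

```python
import smtplib

# Classic SMTP (RFC 821) moves mail on the sender's say-so: the server
# does not verify the claimed "from" address or ask whether recipients
# ever opted in -- the openness Thuerk's mass mailing exploited.
message = (
    "From: marketing@anywhere.example\r\n"
    "Subject: A bulk announcement\r\n"
    "\r\n"
    "Body of the unsolicited message.\r\n"
)

# "mail.example.com" is a placeholder host, used purely for illustration.
with smtplib.SMTP("mail.example.com") as server:
    server.sendmail(
        "marketing@anywhere.example",   # claimed sender, never verified
        ["someone@example.com"],        # any list of addresses will do
        message,
    )
```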
Many blind spots arise out of the constraints of governing metaphors, as Scott experienced with his stenography metaphor. Most of us failed to see the social media revolution coming, in part because the web's governing metaphor was drawn from the idea of the document: hypertext and pages, not people. World Wide Web inventor Tim Berners-Lee had explicitly drawn on literary metaphors when he built the web's HTML and HTTP standards; documents were clearly defined in the protocols—user identities were not. Consequently, most of the early experiments with the web drew on magazine or publishing models, not social networks.
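That document-centric design is visible in the protocol itself: every required part of an HTTP request names a resource, not a person. A small standard-library sketch illustrates this (the “From” header shown is a real but optional HTTP field, and nothing in the protocol verifies it):

```python
import http.client

# An HTTP request is assembled from document coordinates: a method, a
# path, and a host. No required field says who is asking; the optional
# "From" header is free text that the protocol never authenticates.
conn = http.client.HTTPConnection("example.com")
conn.request("GET", "/", headers={"From": "anyone@anywhere.example"})
response = conn.getresponse()
print(response.status, response.reason)  # the protocol hands back a document
conn.close()
```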
We often fail to perceive important developments or possibilities because we assume that recent trends will continue to follow their current trajectory. About a decade ago I wrote a book on the contemporary state and near future of videogames, which focused on their increasing complexity: an obvious and indisputable trend that could be seen in the evolution from Pac-Man to World of Warcraft. Despite the fact that I'd spent countless hours researching and ruminating on the gaming industry, I completely failed to anticipate the rise of microgames like FarmVille and Dots, whose simplicity made them perfect for Facebook or the iPhone.
Assuming that current trends will continue sometimes causes us to worry too much about a problem that ends up not being such a big deal. Two hundred years ago, Thomas Malthus predicted that population growth would lead to global famine. That turned out to be wrong—even though the population grew faster than he ever imagined—because he failed to account for increases in agricultural productivity.
Drawing on that lesson, advocates of today's “abundance” school of thought, led by people like Peter Diamandis, argue that emerging clean energy sources such as solar and nuclear power will make the dire energy forecasts of today look like Malthusian blunders in a few decades. But optimistic forecasters also inevitably have their blind spots. We could innovate our way out of dependency on fossil fuels only to be plunged into chaos and war when the world suddenly pivots away from Big Oil. The solution you confidently see often hides its own set of problems.
We can at least take comfort that the most embarrassing blind spots sometimes lead to constructive outcomes. Scott never made a penny from his invention and has been largely forgotten by history. But about 15 years after his first recordings, another inventor was tinkering with his phonautograph design when he came up with a new technique for capturing and transmitting sound. His name? Alexander Graham Bell.