What do Best Buy, the Blue Man Group and Google X have in common? They’ve all tapped Elliott Hedman and his sensors to help make their stuff better.
Hedman is the founder of the design consultancy mPath, where he’s pioneering a new approach to design research. It combines stress-sensing technology with traditional observational techniques. The idea is to uncover the tiny, often imperceptible emotional moments that shape our reactions to products and experiences. If a company were testing a new vacuum cleaner, for instance, the sensors could help pinpoint precisely when a test subject became frustrated with its design.
It’s a nascent, imperfect technique, but it has intriguing potential. In the future, sensors like these could help designers fine-tune user experiences to an unprecedented degree. Some day, they could even make for products that do the fine-tuning themselves.
A New Tool for Design Research
Hedman developed the technique over the last six years at the MIT Media Lab, as a doctoral candidate in the Affective Computing Group. It centers on skin conductance. Someone who is psychologically aroused—stressed, say, or excited—will begin to sweat. Those subtle variations in perspiration affect the electrical conductivity of their skin. Track the conductivity, and you can track the arousal.
With mPath, Hedman aims to offer a more granular, more empirical account of user experience. If you have a person walk through a retail space and ask how she felt about it, she might say she was overwhelmed by all the choices. By looking at skin conductance readings alongside video of the same encounter, Hedman can say things like: She felt overwhelmed when she looked at this particular sign, and again when she was trying to decide, based only on packaging, between these two products.
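To make the mechanics concrete, here is a minimal sketch of the kind of analysis this implies: flagging sharp jumps in a skin-conductance trace and matching each one to the nearest moment noted in a video log. It is purely illustrative; the sampling rate, threshold, and example data are assumptions, not details of mPath’s actual tooling.

```python
# Illustrative sketch only: flag spikes in a skin-conductance trace and
# line them up with timestamped observations from a video log.
# The sampling rate, threshold, and data below are assumptions.

SAMPLE_RATE_HZ = 4  # assumed sensor sampling rate (samples per second)

# Skin conductance in microsiemens, one value per sample.
conductance = [2.1, 2.1, 2.2, 2.2, 2.9, 3.4, 3.3, 2.5, 2.4, 2.4, 3.6, 3.8, 3.0]

# Hand-annotated moments from the video, in seconds from the start.
video_events = [(1.0, "looks at signage"), (2.5, "compares two packages")]

def arousal_spikes(samples, threshold=0.4):
    """Return the times (in seconds) where conductance jumps sharply."""
    spikes = []
    for i in range(1, len(samples)):
        if samples[i] - samples[i - 1] >= threshold:
            spikes.append(i / SAMPLE_RATE_HZ)
    return spikes

def nearest_event(t, events):
    """Find the annotated video event closest in time to a spike."""
    return min(events, key=lambda e: abs(e[0] - t))

for t in arousal_spikes(conductance):
    when, what = nearest_event(t, video_events)
    print(f"Spike at {t:.2f}s, closest observation: '{what}' at {when:.1f}s")
```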
Skin conductance has a history of being used in exacting laboratory settings—think of your typical scene of someone wired up at a table, responding to beeps and tones. Hedman wants to bring it into the wider world for loftier aims. “I don’t care about people responding to beeps and tones,” he says. “I want to use these tools to actually understand people better.” A huge amount of his PhD work was focused on methodology, figuring out how to get meaningful readings when his subjects were playing with toys in their living rooms, or sitting through symphonies at concert halls. He spent a year at the renowned design firm Ideo seeing how his work might complement their process.
Hugh Dubberly, a veteran designer well-versed in the prevailing approaches to research, says Hedman’s method could be a powerful resource. “In traditional research, we observe, and we ask users to tell us what they’re feeling. We can even ask them to think aloud during a task. But we still only know what we can see or what the users report,” he says. “By adding sensors, we can get actual data. I think that gives us a huge new tool.”
A Less Stressful Lego
Lego was among the first companies to take note of Hedman’s work. Most recently, it hired him to help develop a tablet app for Lego Technic. The plan was to make a souped-up version of the standard paper instructions, with animations showing various pieces zooming together at each stage. The company had built a prototype and conducted a traditional observational study with a handful of kids. It was confident the app was an upgrade.
But Hedman’s research uncovered potential problems with the new product. When Hedman tested a number of youngsters playing with Lego Technic at home, his sensors spiked whenever a kid turned a page of the instructions. This wasn’t entirely surprising: a new page means new information to process, new pieces to look for. But Hedman concluded that adding whizbang animations to the app’s virtual page-turns risked making these moments too overwhelming. “It’s very small and very subtle, but it has a huge effect,” he says. “If kids are even more stressed turning the pages on the app, they won’t use it.”
The most stressful moments for the kids, however, were when they realized they had to go back to correct a mistake. In these instances sensor readings “skyrocketed,” Hedman says. In response, he proposed that the app have a series of checkpoints where kids can ensure each new component is functioning properly.
Hedman also discovered an unexpected opportunity in the course of his research: parents. He found they had a huge effect on kids’ emotional states throughout the building process. “There was this one kid who was struggling so much that his mother came in and started helping him,” Hedman recalls. “The sensor graphs were really high when he was building by himself—very stressed, very overwhelmed—but when his mother came in there was a complete shift. He was much more calm and much more relaxed when his mother was there.” Lego hadn’t asked Hedman to evaluate parents’ effect on the building process, but he found the readings too significant to ignore. Now, he’s trying to get the company to consider making a Lego set specifically designed to foster collaborative building between parents and kids.
The Future: Continuous, Invisible User Feedback
Most of Hedman’s clients forbid him from discussing the specifics of his work. Google hired him to look at the experience of riding in a self-driving car. He’s worked with Best Buy on their retail environment, Hasbro on a board game, and the Blue Man Group on, well, whatever it is that the Blue Man Group does. Bentley and Colgate recently got in touch. The point: All sorts of different companies are intrigued by what these signals could mean for their products.
Dubberly, who started his own design firm in San Francisco after a long stint at Apple, first encountered Hedman’s work at a design research conference a few years ago. He sees it as part of a larger trend of data seeping into design practice. He points to the way companies like Amazon track user behavior en masse to help optimize the design of their websites. Brick-and-mortar stores will likely do the same soon, perhaps using cameras or Bluetooth beacons.
In the not-too-distant future, Dubberly says, we could even see individual products doing the same sort of tracking. As more stuff becomes outfitted with sensors and connectivity, products will learn to understand when you’re using them—and when you’re not. “With the internet of things, those devices are becoming like websites, in the sense that we’ll instrument everything you do with them,” Dubberly says. He thinks designers of physical products will increasingly face questions about how to build in this sort of data collection.
Beyond that, there are the sensors you might soon be wearing yourself. Jawbone’s new activity-tracking bracelet will purportedly be capable of the same sort of skin measurements Hedman relies on for his work. Even if it takes a few generations before consumer wearables are capable of taking useful readings of this sort, it’s not hard to see where these trajectories could meet. It’s a future in which usability testing is happening continuously, invisibly, all around us. And, perhaps more interestingly, one in which products could find ways to adjust themselves moment to moment, based on our emotional state.
For now, though, all this new data is most useful for identifying problems. It’s up to designers to figure out how to solve them. With mPath’s clients, Hedman makes suggestions based on the insights gleaned from his research. He likes to use a process he calls “emotional prototyping”—essentially rapid, iterative user testing, sometimes with the help of the sensors, sometimes without. Still, while these sensors and other technologies will continue to make design research more exact, there’s plenty of room left for creativity in what follows. “Identifying the problems is the easy part,” Hedman says. “Coming up with solutions is hard.”