Online Security Is a Total Pain, But That May Soon Change


Image: Getty



Staying secure online is a pain. If you really want to protect yourself, you have to create unique passwords for every web service you use, turn on two-factor authentication at every site that supports it, and then encrypt all your files, e-mails, and instant messages.
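
As a rough illustration of the first of those chores, here is a minimal Python sketch of generating a distinct random password for each site. The service names and the 20-character length are arbitrary choices for the example, not recommendations; in practice a password manager would generate and store these for you.

```python
import secrets
import string

# Characters each password is drawn from; adjust to match a site's rules.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def unique_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One distinct password per service (hypothetical hostnames).
for service in ("mail.example.com", "bank.example.com", "social.example.com"):
    print(service, unique_password())
```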


At the very least, these are tedious tasks. But sometimes they’re worse than tedious. In 1999, researchers at Carnegie Mellon University found that most users couldn’t figure out how to sign and encrypt messages with PGP, the gold standard in e-mail encryption. In fact, many accidentally sent unencrypted messages that they thought were secured. And follow-up research in 2006 found that the situation hadn’t improved all that much.


As many internet users seek to improve their security in the wake of ex-government contractor Edward Snowden exposing the NSA’s online surveillance programs, these difficulties remain a huge issue. And it’s hard to understand why. Do we really have to sacrifice convenience for security? Is it that security software designers don’t think hard enough about making things easy to use—or is security just inherently a pain? It’s a bit of both, says Lorrie Cranor, an expert in both security and usability and the director of Carnegie Mellon’s CyLab Usable Privacy and Security Laboratory, or CUPS for short. “There isn’t a magic bullet for how to make security usable,” she says. “It’s very much an open research project.”


How to Make Things Usable


A big part of the problem, she says, is that security experts haven’t paid enough attention to the human side of things over the years. “There’s a lot of focus on getting the encryption right and not enough investment in looking at the usability side,” she says. Many security researchers will show her papers on topics like e-mail encryption or secure file transfer and tell her their tools are “usable” because their friends say they’re easy to use. “But they haven’t done any testing,” she says. “They don’t know how to do testing and there’s no criteria for knowing if these types of things are usable.”


Security tools are notoriously hard to evaluate. For example, the Electronic Frontier Foundation is looking into sponsoring a crypto usability prize to promote the development of more user-friendly tools. But before it can offer a prize, the organization is conducting research into how to measure the usability of nominated projects. With a normal application, such as a word processor, a usability tester can just make a list of core tasks and verify whether the user can figure out how to do them in a reasonable amount of time. But with security tools, you need to test whether users make mistakes that undermine security, and what the user experience is like when someone is actively trying to trick them into handing over data.


That often means the interface design needs to be considered from the very beginning of a project. “It’s not the sort of thing you can have the crypto guys build something and then throw it over the fence to the usability people and say ‘make it work,’” Cranor says.


This is especially clear in the case of e-mail. A big part of why PGP is so hard to use is that the earliest e-mail systems weren’t designed with encryption and privacy in mind, and now, software developers are trying to bolt security onto existing systems through plugins. Today, open source teams like Mailpile are trying to create new e-mail clients that are built from the ground up to support PGP, but e-mail remains limited in other ways. For example, even if you encrypt your e-mail it will still be possible for someone who intercepts a message—or seizes an e-mail server—to see who you’ve been sending mail to and receiving mail from.
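
To make that last point concrete, here is a minimal Python sketch of what an intercepted message looks like when only the body is encrypted. The addresses are hypothetical and the ciphertext is just a placeholder standing in for real PGP output, but the headers that remain readable are exactly the metadata described above.

```python
from email.message import EmailMessage

# Placeholder standing in for the output of a real PGP tool such as GnuPG;
# the point is only that the body is opaque while the headers are not.
CIPHERTEXT = (
    "-----BEGIN PGP MESSAGE-----\n"
    "(encrypted body would appear here)\n"
    "-----END PGP MESSAGE-----\n"
)

msg = EmailMessage()
msg["From"] = "alice@example.com"      # visible to anyone who intercepts the message
msg["To"] = "bob@example.com"          # visible as well
msg["Subject"] = "still readable"      # PGP does not encrypt the subject line
msg.set_content(CIPHERTEXT)

# Everything a network eavesdropper, or a seized mail server, gets to see:
print(msg.as_string())
```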


That has led to a few projects that aim to reinvent private messaging from scratch, such as Darkmail, a collaboration between Ladar Levison of Lavabit, the e-mail service Edward Snowden used, and PGP creator Phil Zimmermann. But even if we start from a clean slate, Cranor says, there’s no clear way of making secure communications software usable, largely because the field has been neglected for so long.


Why Change Is on the Way


But the problem isn’t just that crypto geeks don’t prioritize usability. There hasn’t been a strong demand for usable security software in the past, Cranor says. One of the best examples of usable security is SSL/TLS, the protocol used to encrypt web traffic. There was a strong business incentive on the part of banks and e-commerce companies to make encryption work well, and work as seamlessly as possible. But in other areas, such as personal e-mail encryption, there’s been much less investment. That’s because, until very recently, the primary market for most security software has been the IT departments of large corporations and governments. “[IT professionals] care about usability but not to the extent that users do,” she says. “So they’ll buy something even if it isn’t very usable.”
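
To see what that seamlessness looks like in practice, here is a small Python sketch of a TLS connection using only the standard library; example.com is an arbitrary host, and the point is simply that the handshake, certificate checks, and cipher negotiation all happen without asking the user for anything.

```python
import socket
import ssl

# Default context: uses the system's trusted certificate authorities and
# enables hostname checking -- no decisions are pushed onto the user.
context = ssl.create_default_context()

with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls:
        # By the time we get here, the handshake and certificate validation
        # have already happened silently.
        print("Protocol:", tls.version())
        print("Cipher:", tls.cipher()[0])
        print("Certificate subject:", tls.getpeercert()["subject"])
```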


That’s starting to change in the post-Snowden era, as average users start to worry more about privacy. Startups like Virtru are raising venture capital to make communications both more secure and easier to use. And earlier this month, Google released a preview of End-to-End, a PGP plugin for Chrome. But it’s not necessarily in the company’s best interest to have all of its users’ e-mail pass through its servers encrypted, because it makes money by scanning e-mails for ad targeting. Therein lies another problem with the security technology market: consumers are often willing to trade privacy not only for convenience, but also to avoid paying for services.


But with major privacy breaches becoming more common, just about every tech company is at least paying lip service to the idea of privacy. Gone are the days of Facebook CEO Mark Zuckerberg proclaiming that privacy is no longer a social norm and Google chairman Eric Schmidt declaring, “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.”


Facebook has actually been working with CUPS to find ways to encourage users to be more careful about what they post online, including the company’s “Privacy Dinosaur” tool, which prompts users to think about their privacy settings. The collaboration grew out of CUPS research, which found that minor tweaks can help users make better privacy decisions, such as making them wait 10 seconds after writing something before they can post it, or showing them photos of five of their friends chosen at random to remind them who is likely to see the post.


These sorts of collaborations between web companies, usability experts, and privacy and security researchers are exactly what we need more of. It may always be at least a little cumbersome to use encryption or secure passwords, but there’s plenty that can be done to make it easier on us.


