A billboard displays the Snapchat logo in New York, March 12, 2015. Lucas Jackson/Reuters/Corbis
Social media is always updating to give people more. More features like video and picture sharing. More freedom to use third-party apps. More capacity to store more data and make more connections. More platforms, so we can use one service while logged into another.
Paradoxically, the future of social media is also about providing less. Sometimes the best social media design will constrain invasive and harmful practices. If we want online social interaction to be safe and sustainable, we should embrace the limitations.
Last week, the ephemeral media service Snapchat announced that it would render its API inaccessible to third-party applications. This is an important step for a company whose core promise is images that disappear within seconds. After all, it was third-party Snapchat apps that allowed people to save the hundreds of thousands of snaps that were posted online last year in an event known as “The Snappening.” I previously criticized Snapchat in WIRED for blaming users for the Snappening when it knew it was vulnerable to third-party apps and failed to ensure that only approved software could access its API.
Now, Snapchat is working to regain our trust. We should encourage all social media companies to be similarly vigilant and responsive.
Designing for a Purpose
Will Snapchat’s moves constrain users? Of course. Snapchat has essentially disabled whole categories of popular software, and the app stores are already riddled with complaints. Third-party apps can be a great way to use a social media service; think of all the popular third-party clients for Twitter. But Snapchat set itself apart with the promise of ephemerality. It could not pretend it was just another service. And that’s a good thing.
Exceptionalism among apps should be welcomed. People are diverse. Social media should be, too. Different relationships require different tools. Friends, intimate partners, journalists, professionals, political dissidents, and others all use social media in different ways. Given the different needs within these communities, one size will not fit all.
This leads us to the issue of design. Software design forces choices that ultimately shape the tone of a community and the actions of its users. Those design choices reflect a company’s values and the software’s purpose. Want to lower the penalties for speaking out? Facilitate anonymity. Want to cut down on online harassment? Simplify abuse reporting. Want to make your users more visible? Set notoriously sticky default privacy settings to “public.”
Each of these design choices leverages transaction costs to influence behavior. Design makes certain behaviors easier or more difficult and as a result, more or less likely to occur. Technological constraints thus help shape our reality, for good or ill.
Design is often overlooked by users and regulators because it is not a panacea. Structural constraints often only mitigate harmful behavior without preventing it. For example, the Snapchat ban on third-party apps will not completely prevent people from saving snaps. People can still capture images through their phone’s screen-shot function or by using another camera. But without third-party apps, saving snaps becomes harder to scale because it is dispersed and, in the aggregate, labor intensive.
While design is no cure-all, it can be more effective than laws, terms of service, or organizational policies because design affects every user. People don’t read terms of use and they may not be aware of privacy laws, but every single person who uses an app must reckon with the constraints of its technology.
We Should Encourage Protective Design
As social media mature, we must realize that constraints can be as useful as options.
We should pay attention when companies provide innovative design solutions to privacy problems and we should balk at designs that reflect carelessness with our personal information, reputation, and relationships.
For example, users of the social app Yik Yak are largely anonymous. Anonymity can facilitate harassment and lead to a toxic online community, but it can also foster honest discussion free from reprisal. As WIRED has noted, Yik Yak has tried to mitigate the dangers of anonymity with a number of innovative, protective design features that constrain users. Yik Yak facilitates geolocation “dead zones” for schools, features a voting function that removes posts receiving five “downvotes,” and uses filters to prevent full names from being posted. When people attempt to post certain threatening language, they receive a “pump the brakes” warning that encourages caution, reflection, and sensitivity.
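To make those mechanics concrete, here is a minimal sketch of how constraints like these might be wired together. It is illustrative only: the downvote threshold, the filter patterns, and every function name are assumptions for the sake of the example, not Yik Yak’s actual implementation.

```python
# Illustrative sketch of protective design constraints like those described
# above. All names, patterns, and thresholds are hypothetical.
import re

DOWNVOTE_LIMIT = 5  # a post is removed once it receives five downvotes
FULL_NAME = re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b")  # crude full-name filter
FLAGGED_TERMS = {"kill", "hurt", "bomb"}  # stand-in list of threatening language

def review_post(text: str) -> str:
    """Decide whether a post is blocked, warned, or allowed."""
    if FULL_NAME.search(text):
        return "blocked: full names are filtered"
    if any(term in text.lower() for term in FLAGGED_TERMS):
        return "warn: show a 'pump the brakes' prompt before posting"
    return "allowed"

def register_downvote(post: dict) -> dict:
    """Increment a post's downvote count and remove it at the limit."""
    post["downvotes"] += 1
    if post["downvotes"] >= DOWNVOTE_LIMIT:
        post["removed"] = True
    return post
```

The point of the sketch is the design principle, not the code: each small check makes a harmful behavior slightly harder, without pretending to make it impossible.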
While abuse may still occur on Yik Yak, the app’s protective features should be lauded. They prove that there is still plenty of opportunity for Silicon Valley’s trademark innovation in privacy-protective design. Yik Yak is currently testing a photo feature, and it has preemptively banned photos that contain faces, nudity, or illegal behavior. Yik Yak might consider using facial and object recognition technology to automatically filter some of these prohibited images.
People should also be on the lookout for privacy-protective features that let users help themselves and others, like Facebook’s Privacy Dinosaur or YouTube’s face-blurring tool. Platforms can also better protect people through design. App developers are ultimately limited by the features a platform provides. Apple could make it easier for apps like Snapchat running on iOS to disable a phone’s screen-capture feature, an option that already exists for enterprise software.
Technological constraints are a defining characteristic of modern social media. Twitter limits posts to 140 characters. Snapchat makes pictures visible for up to 10 seconds. Yik Yak limits who can see posts to those within a 1.5-mile radius. We are beginning to see the same principles in the design of these companies’ privacy protections. Safe and sustainable online communities require a regularly recalibrated balance between choices and constraints. More is not always better. Thus companies, users, and even regulators must all recognize that in mediated environments, a person’s options can be just as important as their actions.
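For the geographic constraint in particular, the enforcement logic is simple enough to sketch. What follows is a minimal, hypothetical illustration of a radius check using the standard haversine formula; the 1.5-mile figure comes from the article, but the function names and API shape are assumptions, not Yik Yak’s code.

```python
# Sketch of a visibility-radius constraint like the one described above.
# The haversine formula is standard; everything else here is hypothetical.
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8
VISIBILITY_RADIUS_MILES = 1.5

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def post_is_visible(post_loc, viewer_loc):
    """A post is shown only to viewers within the visibility radius."""
    return haversine_miles(*post_loc, *viewer_loc) <= VISIBILITY_RADIUS_MILES
```

Trivial as it is, a check like this embodies the thesis: a few lines of constraint decide who can see what, before any law or policy ever enters the picture.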