Our Apathy Toward Privacy Will Destroy Us. Designers Can Help

The loss of security and privacy online may seem inevitable, but designers can help the public help themselves.

[Photo: Kelvin Murray/Getty Images]

Privacy has never felt more futile. In the last year, we’ve seen Yahoo and the Democratic Party hacked and some of the biggest sites on the internet taken offline by an attack on connected appliances like—no joke—smart toasters. On the one hand, we’re told to only browse the web inside the secure browser Tor, and, on the other, we hear from Edward Snowden that the NSA is spying on pretty much everything, anyway, so why bother?

These headlines are driving apathy among users, many of whom now just agree to any terms presented by any random app. But Simply Secure, a nonprofit founded by Google and Ideo alumni that focuses on making software more transparent, useful, and secure, believes there’s another way—that designers can make websites, apps, and platforms safer and more transparent for consumers, all while actually benefiting data-hungry companies in the long run.

"We want designers to see security and privacy advocacy are part of their job," says Sara "Scout" Sinclair Brody, Simply Secure's executive director. "They are often the voice of the user." Simply Secure insists that designers really can make a difference. Here’s how.

Explain Security As A Gradient, Not An On/Off Feature

The very idea of something online being "secure" in 2017 is almost an oxymoron. Anything encrypted might one day be decoded, much like an armored car can stop bullets but not a nuclear missile. Privacy is complex; we want anonymity, but at the same time, we appreciate Google’s calendar reminders that keep us from missing meetings, right?

That’s because security and privacy are not binary concepts, but gradients. And these gradients should be clearly presented in design.

Brody points to the "philosophical problem" of Signal. Signal is considered the world’s most secure messaging app. So experts in security recommend using Signal, and nothing else. Yet Facebook Messenger, along with the Facebook-owned WhatsApp, actually uses much of Signal’s secure protocol, though the company collects and stores some additional metadata about your conversations on its servers.

"Yes . . . Signal offers more protections of your data than WhatsApp," says Brody, "but if it’s a question of your mom using Whatsapp verses a messenger without any security guarantees, WhatsApp is a win."

Simply Secure recommends that companies design around "threat models," which vary from person to person—actually altering the UI and capabilities in response to individual concerns. Maybe one person wants anonymity from corporations, but they’re alright with the government knowing what they’re up to. Another person might not mind Facebook knowing anything about their life, but is terrified by the NSA.
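
To make that concrete, here is a minimal sketch in Swift of how a threat model might drive an app’s capabilities. The adversary categories and feature trade-offs below are illustrative assumptions, not Simply Secure’s actual guidance:

    // Hypothetical adversaries a user might worry about.
    enum Adversary {
        case corporations, government
    }

    // A user's individual threat model: who they want protection from.
    struct ThreatModel {
        var concerns: Set<Adversary>
    }

    // The capabilities the app exposes for a given threat model.
    struct FeaturePolicy {
        let analyticsEnabled: Bool
        let cloudBackupEnabled: Bool
    }

    func policy(for model: ThreatModel) -> FeaturePolicy {
        FeaturePolicy(
            // Worried about corporate tracking? Analytics goes away.
            analyticsEnabled: !model.concerns.contains(.corporations),
            // Worried about subpoenas? Nothing is backed up off-device.
            cloudBackupEnabled: !model.concerns.contains(.government)
        )
    }

    // A user who doesn't mind Facebook but is terrified of the NSA:
    let nsaWary = ThreatModel(concerns: [.government])
    let features = policy(for: nsaWary)  // analytics on, cloud backup off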

The private browser Tor handles this well, offering users one of four security levels with which to browse the web. At the most secure level, these protections will actually break the browsing experience on much of the internet, but that’s a trade-off that is clearly explained and built into the software’s design.
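
In code, that kind of gradient could reduce to something as simple as the following sketch, where the level names and the features sacrificed at each step are assumptions for illustration (Tor’s real slider differs in its details):

    // Four illustrative security levels, from most permissive to most locked down.
    enum SecurityLevel: Int, Comparable {
        case standard = 0, cautious, strict, paranoid

        static func < (lhs: SecurityLevel, rhs: SecurityLevel) -> Bool {
            lhs.rawValue < rhs.rawValue
        }
    }

    // What the browser still allows at a given level.
    struct BrowsingCapabilities {
        let javascript: Bool
        let webFonts: Bool
        let mediaAutoplay: Bool
    }

    func capabilities(at level: SecurityLevel) -> BrowsingCapabilities {
        BrowsingCapabilities(
            javascript: level < .strict,   // the strictest levels break many sites
            webFonts: level < .paranoid,
            mediaAutoplay: level == .standard
        )
    }

    let locked = capabilities(at: .paranoid)  // everything off; much of the web breaks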

The language of security is built around fear, but design director Ame Elliott actually believes there’s an opportunity here to reframe security as a positive feature of a product. "This whole area is due for a reimagining, away from the language of no—a lot of popular security language is scolding, ‘you mustn’t mustn’t!,'" she says. Instead, she points to the security branding of Tunnel Bear, a Canadian VPN, as an antidote. You may have no clue what a "VPN" even is, but cartoony pictures of a bear knocking over street signs and hanging out with you on vacation under an umbrella help explain network security in a nonthreatening way. "This kind of tone is so different than cyber militaristic language. It’s easy to understand if you’re a company why you want to have those positive connotations."

Kill Dark Patterns By Listening To Users, Not Developers

More often than not, consumers can’t afford to push back to protect their rights, simply because they’re ostensibly forced into handing them over in exchange for use of an app. We see this with app permissions on platforms like iOS. If you want to use Waze, Uber, or similar apps, you need to agree to always share your location upon request. There’s no longer a UI option for sharing your location only while the app is running. The same can be true of sharing your contact information or your entire contact list.
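
The developer’s side of that choice is strikingly small. Here is a minimal Core Location sketch showing that asking for "always" rather than "while in use" location access is a one-line decision made by the app, not the user (it assumes the matching usage-description strings have been added to the app’s Info.plist):

    import CoreLocation

    final class LocationRequester: NSObject, CLLocationManagerDelegate {
        private let manager = CLLocationManager()

        override init() {
            super.init()
            manager.delegate = self
        }

        // Location only while the app is on screen: the privacy-friendly option.
        func requestForegroundOnly() {
            manager.requestWhenInUseAuthorization()
        }

        // Background location too: the option apps like Uber have moved to.
        func requestAlwaysOn() {
            manager.requestAlwaysAuthorization()
        }
    }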

Consumers are being strong-armed into handing over excess data through dark patterns, and while app companies might keep pushing in this direction, the fact of the matter is that platform holders like Apple and Google can restrict these behaviors across the board on iOS and Android—if they so choose.

So why don’t they? "The organizations they hear the most from are the app developers. If you’re Apple, Google, or Microsoft, who are you talking to? You’re not having big conferences where you invite big users and ask what they think of your platform! You’re having developer conferences," says Brody. "If you have Uber, Lyft, and 15 other ride-sharing apps telling you, ‘I need this location permission set up this way,' and you don’t have a counterbalancing voice, that’s a real challenge," Brody continues.

There is no simple solution for making companies listen more to users than to their business interests, but Elliott suggests that’s a battle designers need to fight all the same. That means product designers need to be more vocal on development teams, and try to take on product manager roles whenever possible. "At the end of the day, there’s a real need for designers to come up and lead," she says. Because who else at an organization is primarily concerned with user experience?

[Photo: Nastco/iStock]

Build Smart Things That Aren’t Connected To The Internet

Toasters are taking down websites. TVs are being held hostage by ransomware. Internet baby monitors offer hackers a way to break into your home and talk to children. "My thinking is that the internet of things brings the internet to our life at a new degree of intimacy," says Brody. "That additional intimacy brings additional peril."

We're used to receiving push notifications about events and news on our home screens. And soon, potential security vulnerabilities may be communicated in the same way—through an avalanche of alerts streamed to users' screens. It's untenable. "In the IoT and privacy space, there’s a lot of fairly naive thinking that you’re going to get an alert on your phone in your pocket that a drone is capturing your picture," says Elliott. "What there isn’t an appreciation of is the scale of the problem. We’re talking about a globally connected network, and it’s going to be your phone is just blowing up [with security push notifications]."

Elliott admits that it will likely take machine learning, data visualization, and every other buzzword you can think up to tackle the problem of your light switch telling you it’s been hacked. The design community alone doesn’t have the skills or techniques to funnel countless devices into a digestible UI that makes IoT security reasonable for a layperson to manage. And to make matters even worse, the IoT represents the most vulnerable aspect of our connected lives. Apple can barely keep a routinely refreshed $800 iPhone secure; what company can afford to constantly recode a $50 smart anything to defeat hackers? "People aren’t changing [home] hardware on 18-month cycles," says Elliott. "These things are going to be sleeper cells whenever there’s some new vulnerability."

Yet the solution may be staring us right in the face: Just unplug all these "smart" gadgets from the internet. Design the experience without connectivity. Realize it’s more important for a toaster oven to be safe from hacking than to get firmware updates to allow a few new features now and again. "It’s ridiculous, why am I connecting lightbulbs to the public internet," says Brody. "They don’t need that." Obviously without internet, some "smart" functions would be limited. But Brody suggests a local network—allowing your refrigerator to talk to your washing machine—could create all sorts of useful functions even without internet access.
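
As a sketch of what that local-only design might look like, here is a toy Swift example using Apple’s Network framework: an appliance advertises itself over Bonjour (mDNS), which is visible only on the local network, and nothing it does ever touches the public internet. The service name and type are made up for illustration:

    import Foundation
    import Network

    // Advertise a "fridge" service over Bonjour, discoverable only on the LAN.
    let listener = try! NWListener(using: .tcp, on: 8123)
    listener.service = NWListener.Service(name: "fridge", type: "_appliance._tcp")

    listener.newConnectionHandler = { connection in
        // e.g., the washing machine asking whether a rinse cycle can start
        connection.start(queue: .main)
    }

    listener.start(queue: .main)
    dispatchMain()  // keep the process alive; no traffic ever leaves the local network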

"As someone who has a PhD in computer security, I can tell you that I have zero smart devices in my own home," she adds a moment later.

Realize That Storing User Data Is Actually A Liability

When Yahoo divulged it had been hacked and its user data compromised, Verizon allegedly requested a $1 billion credit on the sale of the company. Likewise, Ashley Madison was worth nothing overnight once its anonymous user data was leaked and the company was revealed to be one big chatbot sham.

User data is valuable to companies for a multitude of reasons. But insecure data is also a huge liability, so much so that it might make more sense for companies to stop keeping it altogether.

"As someone who has spent a lot of time in Silicon Valley, the VC ecosystem is about encouraging companies to having liquidity events, because that’s how they get paid. And there’s this sense that customer data is on the black side of the balance sheet," says Elliott. "Now we’re seeing an early swing of the pendulum of data being on the red side, where it’s unattractive."

On one hand, it’s heresy to think that a company might not keep all of its user data. How would it train new algorithms? Why give up the potential revenue streams of selling user data to third parties? And yet, startups may lessen the risk—for themselves and for the public—by keeping less data by design. A company without user data can’t leak user data to hackers. It also can’t hand over data to authorities (local or federal)—and as a result, it could actually save on legal fees if subpoenaed for user data. Take Signal: It doesn’t need to employ a massive legal team to assess requests for user data because it simply doesn’t have any.
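
Here is one hypothetical sketch, in Swift, of what "less data by design" could look like in practice, with retention treated as an explicit design decision whose default is to keep nothing at all:

    import Foundation

    // How long a record may live; nil means "never store it at all."
    struct RetentionPolicy {
        let maxAge: TimeInterval?
    }

    struct EventLog {
        let policy: RetentionPolicy
        private(set) var events: [(date: Date, payload: String)] = []

        init(policy: RetentionPolicy) {
            self.policy = policy
        }

        mutating func record(_ payload: String) {
            // Data that was never kept can't be breached or subpoenaed.
            guard policy.maxAge != nil else { return }
            events.append((Date(), payload))
        }

        mutating func purgeExpired(now: Date = Date()) {
            guard let maxAge = policy.maxAge else { return }
            events.removeAll { now.timeIntervalSince($0.date) > maxAge }
        }
    }

    // The safest default: keep nothing.
    var log = EventLog(policy: RetentionPolicy(maxAge: nil))
    log.record("user opened the app")  // dropped on the floor by design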

"It’s like, yes, make the future better, but right now delete everything . . . so you’re not going to have it when people ask [for it]," says Elliott. And designers can have a proactive role in deleting all this saved data; namely, they can weigh the utility of certain features against their privacy and security implications, rather than mere convenience. Because in a world where a free app can track everything you do and say, user-centered design has to be about more than a few neat capabilities. First and foremost, it has to fundamentally keep those users safe.
