
What Comes After User-Friendly Design?

The design industry needs a new way to talk to users–one that isn’t just friendly, but respectful.

[Source Images: Jason Jaroslav Cook/Getty Images (pattern), winterling/iStock (photo)]

“User-friendly” was coined in the late 1970s, when software developers were first designing interfaces that amateurs could use. In those early days, a friendly machine might mean one you could use without having to code.


Forty years later, technology is hyper-optimized to increase the amount of time you spend with it, to collect data about how you use it, and to adapt so it engages you even more. Meanwhile, other aspects of our tech remain difficult to use, from long, confusing privacy policies to a lack of explanation about why and when your data is being collected, much less how it’s being protected. While some aspects of apps and platforms have become almost too easy to use–consider how often Facebook invites you to check out a friend’s latest update or new comment–others remain frustratingly opaque, like, say, the way Facebook tailors advertising to your behavior.

The discussion around privacy, security, and transparency underscores a broader transformation in the typical role of the designer, as Khoi Vinh, principal designer at Adobe and frequent design writer on his own site, Subtraction, points out. “Over the past decade or so, businesses and companies have come to realize that design is valuable, and they’ve been aggressive about pursuing design talent and putting design at the forefront of their process,” he explains. It’s an opportunity for designers to rethink the human implications of products: “What happens when we start thinking about the long-term impact of the work that we do?”

So what does it mean to be friendly to users–er, people–today? Do we need a new way to talk about design that isn’t necessarily friendly, but respectful? I talked to a range of designers about how we got here, and what comes next.

A few years ago, one Google exec claimed that after it tested 41 shades of blue for its Gmail ads in 2009, the company saw a $200 million windfall. It’s a well-worn anecdote now, but it underscores the fact that data analytics are still a new tool in the design world. “Working with data is still relatively new to designers,” says Vinh. “Analytics have been around since the beginning, but they started getting much richer and much more consumable by product designers and the product team 10 years ago or so.”

Being able to make fine-grained observations about the way people use a product ties design directly to engagement. At the same time, insights from psychology have helped designers use behavioral science to increase engagement, too. It’s easy to see the invisible hand of these new tools everywhere, once you start looking. Don’t overwhelm people with settings and menus. Don’t expect them to spend a lot of time reading a privacy policy. Make interfaces slick and fast and usable. Reduce friction. Send regular notifications with rewards. Turn your product into a game. Be their friend.  

Yet, as a culture, we seem to be growing slowly more wary–both of the time we spend with technology, and of the data with which we supply it. A string of high-profile data breaches–most recently of the personal information of 143 million U.S. consumers by Equifax–is making consumers more cognizant of data security (if not necessarily more cautious). New products, like Amazon’s AI-powered, camera-equipped Echo Look, and Google’s initiative to screen users for depression, are also raising serious questions about privacy. Meanwhile, “dark UX,” which deceptively tricks or forces user behavior, has surfaced more publicly; for instance, Sonos recently admitted that its speakers may “cease to function” if users don’t accept its new privacy policy. Even here on Co.Design, founding editor Cliff Kuang has questioned the ethos of user-friendly design: “Modern user experience is a black box that has made it easier and easier to consume things, and harder and harder to remake them,” he wrote after the 2016 election. “We should not give in. We should make better things.”


Meanwhile, a new vanguard of technologists is advocating for designers to step into the fold and lead–including Simply Secure, a three-year-old nonprofit that helps designers, developers, and companies build products that are more secure and more transparent. “For years there was such a huge UX trend toward seamlessness and concealing as much as possible in the interest of making things user-friendly,” says Ame Elliott, Simply Secure’s design director. “Now, as a discipline, interaction designers and UX experts have a lot of hard work to do to think about how to expose those seams in appropriate ways.”

In part, Simply Secure’s approach focuses on educating designers themselves about best practices (some of which you can read about here). That means convincing the design community that privacy and security are part of their ambit–that these issues aren’t boring or impossibly complex, but rather are design problems that demand elegant design solutions. For instance, how do you communicate when and how a voice assistant is collecting data about a person? How can design foster trust in an e-commerce site’s security? How can design help people understand the way their products work, and give them the agency to control their own experiences?

Elliott points to an example of transparent UX from WhatsApp: the app’s automatic delivered and read receipts, communicated through blue checkmarks. Love them or hate them, they give you information about the app’s behavior, and they also change your behavior, Elliott points out. That makes them a great model for clear, transparent UX design. “How do you take the simplicity of that check mark for read receipts, and apply that to voice, and apply that to smart cities, and apply that to autonomous vehicles–all these other emerging technologies?” she asks. “How do we give people immediate feedback about how the system is working?”

Simply Secure hosts workshops, offers design support, and offers an online database of best practices for transparency, security, and privacy–but in a broader sense, it wants to push designers at large to think about these issues as fundamental to their jobs. “I want to come forward and say, ‘Hey, UX designers, you have a leadership opportunity here–to step into a deeper role unpacking some of these privacy and ethical issues,'” Elliott adds.

Elsewhere in the industry, designers are grappling with similar ethical questions about engagement and optimization. Emily Ryan, a UX designer who tweets under the handle @UXIsEverywhere, has experienced the dilemma both as a designer and a user. Ryan recently found herself trying out a mobile game–and immediately wasted two hours (and $1.99) playing it on her couch without a second thought. “It was a very strange moment where a light went off,” she remembers. It’s a balance any designer with a brief to design an effective, engaging experience has to strike: “You want people to spend money on your game and you want them to spend time in it, but there comes a point where that can become detrimental to what’s good for them and what’s healthy for them.”

Tristan Harris, Google’s former design ethicist, has become a prominent voice in this area. Harris frequently points out that invasive and addictive UX pops up well outside of the gaming world–for instance, in notifications and menus that ping you dozens of times a day, demanding your attention and engagement. Interactions like “pull to refresh,” seen on Instagram, Facebook, and dozens of other apps, are based on the proven psychological principle of “intermittent variable rewards,” where a user action is followed by a random “reward,” like a new message or photo. It’s the same principle that makes slot machines so incredibly addictive. Harris has argued that companies must take responsibility for these addictive patterns, and give users more control over them. “For example, they could empower people to set predictable times during the day or week for when they want to check ‘slot machine’ apps, and correspondingly adjust when new messages are delivered to align with those times,” he wrote in a recent essay on Medium.


For designers at large technology companies, wanting to do the right thing can present a complex dilemma. Without walking away from a job or a client, how do you reconcile the client’s wishes with your own definition of what “good” design really is?

It’s an uncomfortable position to be in as a UX designer when, as Ryan puts it, “the clock is ticking, and the client is paying, and the product manager needs something done.” In her experience, the best way to shift a client’s perspective is to get specific about what it could cost them. Ryan comes from the cybersecurity world, but after becoming UX competency lead at Deloitte Digital in 2016, she has been developing an idea she calls “strategic UX.” It’s a method of proving to a client why a dark UX pattern should be avoided, even if it seems like the right call from their perspective. The key? Making a monetary case against it. “Part of getting a business to make the right decision is to tie that back to money,” she explains. “So instead of saying ‘this is morally wrong,’ it’s ‘Hey, here’s what you should be doing, and it’s just good business to do this. And here’s all of the times when people haven’t done it, and this is what it cost them.'”

Consider the paginated listicle. You’ve seen them across the internet, forcing you to click through a series of slides to read an article. Ryan points to one recent example on the site IHeartDogs.com, “The 10 Least Obedient Dog Breeds,” which makes you click through a series of slides, each with its own ad, to read the list. On Facebook, the first comment is a reader listing the entire content of the story so no one else has to click through all of the ads–followed, of course, by other readers thanking them profusely for saving them the trouble. “At the end of the day, if you have a user who’s not happy, they’re going to find this workaround,” she says. “And all of a sudden, they’re hacking their own experience, and your UX is going right out the window–a wasted effort that does nothing good for the client.”

The listicle might seem like an effective and sticky bit of UX for publishers who want to juice traffic and ad impressions. “But until you say, ‘here’s the amount of money you’re losing doing that,’ they’re not going to change it,” Ryan says. She admits that this approach takes longer and is harder than taking the path of least resistance. Making the case still falls on the designer’s shoulders.

What we’re seeing now is just the beginning of a discussion around the ethics of UX–and the uneasy balance between what’s good for a company and what’s good for people will surely evolve alongside technology itself. But one thing seems certain: “Friendly” no longer seems like the right word for describing good digital design that’s transparent, ethical, and respectful of users.

About the author

Kelsey Campbell-Dollaghan is Co.Design's deputy editor.
