
How A Subtle UX Tweak Reduced Racial Profiling On A Site Notorious For It

The company made a small shift that’s changing the way people talk on social media in a big way.

[Slideshow: Nextdoor's first attempt to re-engineer posts to fight racial profiling, followed by three later attempts]

Nextdoor has a problem. The hyperlocal social network was designed to help neighbors privately share information like garage sale listings and handyman recommendations, but it has also become a hotbed for racial profiling. According to an East Bay Express report last year, white residents in Oakland regularly labeled black residents "suspicious" for innocuous activities such as walking down the street and driving cars. In Seattle, some Nextdoor message boards have taken on an air of "paranoid hysteria," as Mayor Ed Murray put it.

San Francisco-based Nextdoor has decided to do something about it in a new update that just rolled out nationwide. "If you’re posting about an individual and providing race, you have to provide a full description," says Nextdoor CEO Nirav Tolia. "Not just ‘a black guy broke into my car’—race and sex don’t count for our definition—someone has to say ‘a black man wearing a red shirt and glasses, 6’ 2", broke into my car.’" Inspired by how authorities handle 911 calls, it's a small UI tweak that Nextdoor hopes will help reduce bias.

The Rationale
Ideally, by being forced to describe someone in more ways than just skin color, neighbors will be nudged into conscientiousness, and the service will fill with less racially charged conversation. As a result, authorities would get richer descriptions of real suspects who actually could have committed crimes. (The second big change is that the service no longer lets you report generalized suspicious activity. Since walking down the street is not a crime, reports have to describe criminal, or potentially criminal, behavior.)

Social media has become a safe haven for prejudiced speech—whether it’s your uncle on Facebook sharing Trump memes, or anonymous Gamergaters harassing women on Twitter. What's intriguing about Nextdoor is that it is treating such racism—and even the less overt unconscious bias that invisibly changes the way we talk about and treat certain groups—not as an unavoidable innate instinct, but as a design problem to be solved.

"It wasn’t that people were racist and we had to figure out how to change them from racist to not racist," says Tolia. "There are cases that people are just racist, but those cases are 1% of 1%. A majority of cases are, people are trying to be helpful."

Inspired By 911 Calls
Facing criticism, Nextdoor met with Oakland community groups and the city's police department. There, the company learned about racial profiling and, specifically, how 911 dispatchers are trained to avoid it.

"We were struck by, if you called 911 rather than posted on Nextdoor, how would this be different? Social networks are almost like voicemail boxes. You can leave whatever you want, and I just listen," says Tolia. "But if you call 911 and say, ‘I see someone breaking into a car,’ they’d say, ‘Describe the person. If I say, ‘The person is dark-skinned,’ they’d say, ‘What else can you see? What are they wearing? How tall are they? Can you see their shoes? That’s what happens." Not only does the discussion move the caller beyond stereotypes, the police gather more useful, identifying information on the suspect. "Then the challenge is, how do we proxy human conversation with a form?" Tolia continues. "That’s the user interface challenge."

What Nextdoor developed is, at first glance, less elegant than the clean, blank chat boxes that line Facebook and Twitter today. Tolia admits it looks like one of those step-by-step wizards out of TurboTax. When people report a crime, they’re asked to describe the incident in step one. Then in step two, they describe the person. It asks for hairstyle and clothing first. Race is buried toward the bottom. If users don't specify race at all, they can move on. But if they specify race and nothing else, the form will stop them, forcing them to fill out at least two descriptive identifiers if they want to submit the tip. In other words, you don’t have to describe a suspect at all, but if you describe them simply by the color of their skin, the system is going to intervene.
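Nextdoor hasn’t published its implementation, but the gating logic described above is straightforward to express. Here is a minimal TypeScript sketch of a validation check along those lines; the field names are hypothetical, and whether race counts toward the two-identifier threshold is an assumption drawn from the description above, not Nextdoor’s actual code:

```typescript
// Hypothetical suspect-description form fields; illustrative only.
interface SuspectDescription {
  hairstyle?: string;
  clothing?: string;
  height?: string;
  shoes?: string;
  race?: string;
}

// Count the non-race identifiers the user actually filled in.
function countNonRaceIdentifiers(d: SuspectDescription): number {
  return [d.hairstyle, d.clothing, d.height, d.shoes].filter(
    (field) => field !== undefined && field.trim() !== ""
  ).length;
}

// A description may be submitted if race is omitted entirely,
// or if race is accompanied by at least two other identifiers.
function canSubmit(d: SuspectDescription): boolean {
  const race = d.race?.trim();
  if (!race) return true; // No race given: describing a suspect is optional.
  return countNonRaceIdentifiers(d) >= 2;
}

// Race alone is blocked; race plus clothing and height passes.
console.log(canSubmit({ race: "black" })); // false
console.log(canSubmit({ race: "black", clothing: "red shirt", height: "6'2\"" })); // true
```

The notable design choice is that a check like this never blocks a report outright; it only demands more detail once race enters the description.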

Honestly, it all looks a little clunky for modern web design, but that’s the point. "Most social networks, from an interface standpoint, are trying to remove most friction," says Tolia. "Our approach is counterintuitive because it adds friction, but we think it reduces racial profiling and creates better content."

The Results
In preliminary testing, Nextdoor claims the changes have reduced racial profiling on the service by 75%, a figure based on a blind study in which internal and external judges screened posts for solitary profiling descriptors like "black guy." But the changes have also come at a cost in quantifiable engagement, the gold standard for social networks: Nextdoor has seen a 50% increase in abandonment rate, the share of people who start a crime report but never finish it.

"That’s something we knew would happen. The more forms you add, the less likely it is people will get from start to finish," says Toila. "The intuitive thing for us is, ‘Oh geeze, we don’t want fewer posts!’ But as we talked to police departments . . . we learned more of certain types of posts aren’t constructive."

[Cover Photo: Chris Saulit/Getty Images]
