
This MIT Tool Enlists Your Squad To Stop Toxic Internet Harassers

The most effective anti-harassment tool might be the most obvious: your friends.

Tech companies like Facebook, Google, and Twitter employ thousands of people to moderate violent and disturbing content on their websites. But neither legions of moderators nor sophisticated machine learning algorithms can curb the rampant harassment on their platforms. Abuse is a particular problem at Twitter, one that Fast Company’s new cover story explores in detail.


Amy Zhang, a Ph.D. student at MIT’s Computer Science and Artificial Intelligence Laboratory, has studied online harassment in detail. She’s found that many victims take an old-fashioned approach to surviving the flood of violent, racist, and sexist comments that come pouring into their inboxes: they ask their friends for help.

That’s because their friends often understand the context behind the harassment. For instance, an algorithm or a human moderator might not flag a word that’s recently emerged as an epithet against a marginalized group–but a friend, who is likely part of the victim’s community, would know it. Zhang interviewed one victim whose ex-partner would harass her with inflammatory messages during important business meetings–again, something that a friend would be far more attuned to. Zhang found that some people who are harassed will give their friends their account passwords and ask them to clean their inbox of the worst messages; others will ask that a friend read the positive comments on a video aloud to them and skip over the negative ones; some will request that friends report particular users to Twitter.

Zhang wondered–could she somehow formalize that behavior, making it easier for people to handle the onslaught of harassment and simpler for their friends to help them do so?

Amy Zhang [Photo: Jason Dorfman/MIT CSAIL]
That’s the idea behind Squadbox, an open-source web application designed to turn helpful friends into more effective moderators. It can work in one of two ways. If the harassed person wants to remain in the public sphere and let people contact them easily, Squadbox will provide them with an email address. Then, when someone reaches out to them, the message goes straight to their Squadbox account, where one of their friend-moderators will review it. According to what the harassed person has specified beforehand, the moderator can delete abusive messages, forward clean ones, or pass messages along with tags that say what’s inside. Alternatively, Squadbox can help the harassed person set up filters that detect certain words or usernames and send those messages to their friend-moderators first.
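
As a rough sketch in Python (purely illustrative, not Squadbox’s actual code; the filter lists and helper functions are made-up stand-ins for whatever the account owner configures), that filter-first routing boils down to a simple decision:

    # Purely illustrative sketch of filter-first routing; not Squadbox's code.
    # FLAGGED_WORDS, FLAGGED_SENDERS, forward() and queue_for_moderation()
    # are hypothetical stand-ins for whatever the owner configures.
    FLAGGED_WORDS = {"example-slur", "example-threat"}
    FLAGGED_SENDERS = {"harasser@example.com"}

    def route_message(sender, subject, body, forward, queue_for_moderation):
        """Forward clean mail to the owner; hold suspect mail for a friend-moderator."""
        text = f"{subject} {body}".lower()
        if sender in FLAGGED_SENDERS or any(word in text for word in FLAGGED_WORDS):
            queue_for_moderation(sender, subject, body)  # a friend reviews it first
        else:
            forward(sender, subject, body)               # goes straight to the owner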

The system is designed to work best with a group of moderators who all tag-team the task. As emails come in, they’re distributed evenly to all the moderators, who get notifications saying there’s something in their queue. Then, when they feel like tackling the hurtful things someone is saying about their friend, they can go to Squadbox’s site and review the messages. Squadbox is also meant to be flexible, accommodating anyone who might want to adapt the basic idea of friend-moderators into something that makes handling harassment more manageable.
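
A minimal sketch of that round-robin handoff might look like the following (again illustrative Python, not the project’s own code; the notify callback is a stand-in for however a moderator gets alerted):

    # Illustrative round-robin assignment of incoming messages to moderators.
    from itertools import cycle

    class ModerationQueue:
        def __init__(self, moderators, notify):
            self.assignments = {m: [] for m in moderators}  # each moderator's pending items
            self._order = cycle(moderators)                 # rotate through the squad evenly
            self._notify = notify                           # e.g. "there's something in your queue"

        def add(self, message):
            moderator = next(self._order)                   # next friend in the rotation
            self.assignments[moderator].append(message)
            self._notify(moderator, message)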

“So far we’ve demoed it to several people who have been harassed, including some really high-profile harassment targets,” Zhang says. “They’ve been overwhelmingly positive.”


[Image: Amy Zhang/MIT CSAIL]
Zhang and her colleagues, MIT professor David Karger and former MIT student and software engineer Kaitlin Mahar, are presenting a paper detailing their research at ACM’s CHI Conference on Human Factors in Computing Systems later this month, but Squadbox is already available to the general public. They’re also partnering with Jigsaw, the Alphabet-owned incubator that focuses on problems like extremism and harassment, to integrate one of Jigsaw’s harassment-prediction APIs into Squadbox.
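
If that API is Jigsaw’s Perspective API (the article doesn’t name it, so treat that as an assumption), a toxicity check on an incoming message could be wired in roughly like this, with the 0.8 threshold chosen arbitrarily for illustration:

    # Sketch of scoring a message with the Perspective API before routing it.
    # Assumes the Jigsaw API in question is Perspective; the threshold is arbitrary.
    import requests

    PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

    def looks_harassing(text, api_key, threshold=0.8):
        """Return True if Perspective's toxicity score exceeds the threshold."""
        payload = {
            "comment": {"text": text},
            "requestedAttributes": {"TOXICITY": {}},
        }
        resp = requests.post(PERSPECTIVE_URL, params={"key": api_key}, json=payload)
        resp.raise_for_status()
        score = resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
        return score >= threshold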

For the time being, Zhang is focused on building out the platform. Though people get harassed across all kinds of platforms, right now Squadbox only handles email. She says the structure may also work for Facebook and Twitter, depending on their APIs; the next step is to build it.

About the author

Katharine Schwab is an associate editor at Co.Design based in New York who covers technology, design, and culture.
