
Facebook Has Some Tips To Help You Spot The Fake News It Serves You

To slow the spread of fake news, the company is putting the onus on its users.


In Facebook’s latest step to curb fake news, the company wants to teach its users to stop believing fake news in the first place. It is launching a media literacy campaign called “Tips To Spot Fake News” that, for the next few days, will be placed at the top of every user’s News Feed in 14 countries.


The intervention was developed in a partnership with First Draft, a non-profit dedicated to “improving skills and standards in the reporting and sharing of information that emerges online.” And while it all makes for a very nice press release, it’s yet another example of Facebook refusing to embrace its role as the world’s largest media publisher, responsible for the content that it actively places in your heavily curated feed.

Take this new “tool” itself. It’s really just a list of tips for recognizing fake news for what it is. And it’s worth noting that the tips themselves are so vague that they will likely help no one. “Be skeptical of headlines,” advises tip number one, offering exactly the sort of blanket advice that has led people to question the foundational practice of journalism itself. No, readers shouldn’t be reflexively skeptical of headlines from reliable publications, like the New York Times, which over the course of more than 150 years has proven worthy of our trust. But yes, readers should be skeptical of the claims of a random blog or YouTube clip. That’s the distinction Facebook needs to make, or it simply condemns all media.

Facebook deals with this sourcing matter in point number three: “Investigate the source. Ensure that the story is written by a source that you trust with a reputation for accuracy.” Because people believe things from sources they don’t trust? No, the problem is that people trust the wrong sources, the very sources so often shared by, oh, right: Facebook!

There’s a certain irony in Facebook telling us to be skeptical of exactly what it’s showing us. The social media company that uses algorithms to serve very specific content is telling its own users not to believe it.

“We cannot become arbiters of truth ourselves–It’s not feasible given our scale and it’s not our role,” writes the company in a press release. “Instead, we’re working on better ways to hear from our community and work with third parties to identify false news and prevent it from spreading on our platform.”

Facebook doesn’t want to be an arbiter of truth, and yet it has already chosen to be an arbiter of engagement, at least. Through algorithms that operate in completely opaque ways, the service shows us things that we’re meant to read and click on, and it enables these mistruths to be easily shared, garnering even more engagement and allowing Facebook to serve us more ads.


This approach to fake news, which puts the blame on users rather than the product, is inherently disingenuous. It’s akin to Facebook funding literacy programs while continuing to publish a bunch of backwards-written books.

If Facebook wants to cure fake news, it’s really not that hard. It just needs to stop serving so much fake news. But the problem for Facebook is that engagement seems to be a better business than truth. Otherwise, the journalism industry would be as hot a prospect as social media.

About the author

Mark Wilson is a senior writer at Fast Company. He started Philanthroper.com, a simple way to give back every day.
