I’m the son of Taiwanese immigrants, of a father who came to America for an education, and a mother who never made it much past middle school. Whatever I have today—a good job, a good education, an amazing wife, and the expectation that our children will have much the same—I owe to the opportunities afforded by America's singular genius for cultural integration. As economists will tell you, our stunning history of growth has been sown by immigrants realizing their potential.
Just as millions of others must have done when they woke to the election results on Wednesday, I asked myself whether America no longer believes in the story that I’ve lived. I didn’t find any good answers, not in the data about who voted for Clinton or Trump, and not in any of the stories I read about those data. They rang hollow because they didn't shed light on what I could do about any of it. So it was with a mix of shock and recognition that I read Max Read’s article for New York, "Donald Trump Won Because of Facebook."
Read’s argument is nuanced and devastating. You should read it. But one of his central insights is that Facebook doesn’t spread information so much as it spreads affirmation. Thanks to algorithms that track "engagement," we are cocooned in beliefs that neatly match our own. The post falsely claiming that the Pope endorsed Trump has more than 868,000 Facebook shares, while the story debunking it has 33,000. Lies spread far better than truth, because a lie that we can believe in is so much easier to share than a truth that’s more complicated than a click. As my colleague Kelsey Campbell-Dollaghan points out, on Facebook we’ve created the 21st-century equivalent of the suburban tract developments of Levittown: a place of homogeneity rather than diversity, where the only voices we hear are those of virtual neighbors who think exactly like us.
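The mechanics here are simple enough to sketch in a few lines. The toy ranking below is my own illustration, not Facebook's actual algorithm: if a feed is sorted purely by a single engagement signal (here, share counts), the viral falsehood is guaranteed to sit above its correction, no matter how well reported the correction is.

```python
# A toy, engagement-only feed ranker (illustrative; NOT Facebook's real system).
# The point: a ranking driven purely by shares always surfaces the 868,000-share
# lie above the 33,000-share debunking.

from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    shares: int  # a single crude proxy for "engagement"

def rank_feed(posts):
    """Order posts by engagement alone, highest first."""
    return sorted(posts, key=lambda p: p.shares, reverse=True)

feed = rank_feed([
    Post("Pope endorses Trump (false)", 868_000),
    Post("No, the Pope did not endorse Trump", 33_000),
])
# The false story ranks first; the correction is buried beneath it.
```

Truthfulness never enters the objective function, so the ranking can't distinguish affirmation from information. That, in miniature, is Read's design problem.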
There have been millions of words written about the dangers of the way that Facebook lets us all reinforce what we believe, while ignoring everyone else. But in Read’s piece, the part that sickened me most was this:
Facebook connected those supporters to each other and to the candidate, gave them platforms far beyond what even the largest Establishment media organizations might have imagined, and allowed them to effectively self-organize outside the party structure . . . Even better, Facebook allowed Trump to directly combat the hugely negative media coverage directed at him, simply by giving his campaign and its supporters another host of channels to distribute counterprogramming. This, precisely, is why more good journalism would have been unlikely to change anyone’s mind: The Post and the Times no longer have a monopoly on information about a candidate. Endless reports of corruption, venality, misogyny, and incompetence merely settle in a Facebook feed next to a hundred other articles from pro-Trump sources (if they settle into a Trump supporter’s feed at all) disputing or ignoring the deeply reported claims, or, as is often the case, just making up new and different stories.
Read isn’t just describing a technology problem. He’s describing a design problem.
I’ve reported and written hundreds of stories about digital design, and I've designed digital products myself. All that time, I assumed that "making things frictionless" was an unalloyed good, right up there with science, efficient markets, and trustworthy courts. But Read’s essay made my stomach heave, because it made me ask: Is a fully user-friendly world actually the best world we can create?
The end goal of consumer technology has always been to buff and round every corner, so that each detail is so alluringly simple that it seems "inevitable." That ethos expresses itself both in the cleanliness of an interface and in the way that software remakes itself around who it believes you are. But its beliefs about you are prone to oversimplification and caricature—which itself is a far cry from every design maxim about "knowing your users." Shouldn’t there be an algorithm and an interface that encourages something more in us than cozy self-gratification—that might even encourage something like uncomfortable self-recognition?
Good interaction design hinges upon the principles of navigability, feedback, and consistency. But in hiding complex calculations behind soothing buttons, we also lose the ability to control how things work, to take them apart, and to question the assumptions that guided their creation. A world of instantaneous, dead-simple interactions is also a world devoid of the opportunity to challenge what lies behind them. Modern user experience is a black box that has made it easier and easier to consume things, and harder and harder to remake them.
That's all the more surprising because the power and promise of the personal computer didn’t arrive fully formed. It was born of the fact that a bunch of hackers like Steve Wozniak could break machines apart and assemble their own, better machines. But as our machines have become more elegant, our ability to bend them hasn’t nearly kept pace. As easy as it is to change the preferences on your smartphone, it’s all but impossible to make a different smartphone.
The most optimistic thinkers in Silicon Valley probably believe that the answer is for all of us to be able to code. That’s why today there are so many beautifully designed products aimed at teaching kids the basics. But why should coding remain a barrier to remaking our digital world? Why isn’t it easier for all of us to peer under the hood of an algorithm, much as in a previous era we might have tinkered with our cars?
As I reread that last sentence, part of me thinks: Are you really so naive? Do you really think people would want to lift the veil on their nice clean interfaces, so that they can be challenged or better informed? And of course it’s ridiculous to think that designers could ever solve problems of demographics, economics, or resentment—the problems that seem to have left 47.4% of the country completely unable to understand what another 47.8% of the country is thinking.
But designers don’t have to roll over, either. The things we make reflect the things we value. If you value truth over falsehood, then isn’t it your responsibility to design an interface that can help distinguish the two? Should a link from hategroup.net really look the same as one from whitehouse.gov? The desire to create a service that turns news into entertainment is one motivation to build a product. But is it impossible to create a service in which the things we don’t believe are every bit as shareable as the things we do? Maybe it is. Maybe Facebook's thought ghettos are an inevitable end state for how our monkey brains are wired. But the ugliness of that idea implies a fatalism that we implicitly reject every time we dream about traveling somewhere new, or make a new friend in some unlikely place.
We should not give in. We should make better things.