Open source. Web 2.0. Government as a platform. These are the memes of Internet evangelist Tim O’Reilly. They’re words we’ve all heard but few people can specifically define, and that’s for good reason: They’re cultural Rorschach tests, bold headlines waiting for society to fill in the copy. And after reading this essay by Evgeny Morozov, you’ll never want any part of them again.
It’s a long read, but a necessary one. Over several thousand words, Morozov closely scrutinizes the thinness of O’Reilly’s buzz-phrase philosophy, which is attempting to permeate health care, government and even warfare. It’s a takedown so brutal that, by the end of the essay, you may actually feel bad for O’Reilly, as if Morozov were still bare-knuckling an opponent who was knocked out long ago. From the piece:
And soon Web 2.0 became the preferred way to explain any changes that were happening in Silicon Valley and far beyond it. Most technology analysts simply borrowed the label to explain whatever needed explaining, taking its utility and objectivity for granted. “Open source” gave us “the Internet,” “the Internet” gave us “Web 2.0,” “Web 2.0” gave us “Enterprise 2.0”: In this version of history, Tim O’Reilly is more important than the European Union. Everything needed to be rethought and redone: enterprises, governments, health care, finance, factory production. For O’Reilly, there were few problems that could not be solved with Web 2.0: “Our world is fraught with problems … from roiling financial markets to global warming, failing healthcare systems to intractable religious wars … many of our most complex systems are reaching their limits. It strikes us that the Web might teach us new ways to address these limits.” Web 2.0 was a source of didactic wisdom, and O’Reilly had the right tools to interpret what it wanted to tell us—in each and every context, be it financial markets or global warming. All those contexts belonged to the Internet now. Internet-centrism won…
…In a fascinating essay published in 2000, O’Reilly sheds some light on his modus operandi. The thinker who emerges there is very much at odds with the spirit of objectivity that O’Reilly seeks to cultivate in public. That essay, in fact, is a revealing ode to what O’Reilly dubs “meme-engineering”: “Just as gene engineering allows us to artificially shape genes, meme-engineering lets us organize and shape ideas so that they can be transmitted more effectively, and have the desired effect once they are transmitted.” In a move worthy of Frank Luntz, O’Reilly meme-engineers a nice euphemism—“meme-engineering”—to describe what has previously been known as “propaganda.”
But what’s the real harm of meme-engineering, you might ask. The harm comes when these ideas find their way into our government (“government 2.0,” “open government” or “government as a platform”; the meme’s name is still in flux). Morozov points to a future, one O’Reilly very much seems to want, in which the government is a metaphorical app store, and it’s up to citizens to code and debug their own future.
So what are we to make of O’Reilly’s exhortation that “it’s a trap for outsiders to think that Government 2.0 is a way to use new technology to amplify the voices of citizens to influence those in power”? We might think that the hallmark of successful participatory reforms would be enabling citizens to “influence those in power.” There’s a very explicit depoliticization of participation at work here. O’Reilly wants to redefine participation from something that arises from shared grievances and aims at structural reforms to something that arises from individual frustration with bureaucracies and usually ends with citizens using or building apps to solve their own problems.
While citizen action seems like a perfectly nice idea on paper, the truth of the matter is that we have government-sponsored programs specifically to run the oft-unprofitable but much-needed “apps” in our lives (like Social Security, food stamps, or housing subsidies), and those who need these programs most will always, always, always be the least equipped to build their own solutions.
Slate also ran a back-and-forth on the topic, culminating here.