Would Steve Jobs Have Become Steve Jobs Using A Computer Designed By Steve Jobs?

Multitasking, facilitated by modern computers, is considered a significant threat to our ability to focus. But a new study says we’re adapting just fine.


In 1975, Steve Jobs and Steve Wozniak took part in their first great collaboration, designing the Atari game Breakout by working four straight days and nights. It was an impressive feat of technical ability, to be sure, but it was an even more extraordinary display of concentration. The Steves of would-be Apple fame were so exhausted afterward that they both got mono.

Writing at the New Yorker last month, Tim Wu made the provocative (and rather ironic) suggestion that Jobs and Wozniak might have failed in that same task today, because they would have been working on modern computers. “Today’s machines don’t just allow distraction; they promote it,” writes Wu. By making it simple to shift between so many tasks, new computers may also make it impossible to focus on a single one, he says: “In short: we have built a generation of ‘distraction machines’ that make great feats of concentrated effort harder instead of easier.”

You need not be a Luddite to appreciate Wu’s coinage. A MacBook does, at times, feel like a machine engineered to distract; its appeal isn’t just that you can run Gchat and TweetDeck and iTunes and Excel and 25 Firefox tabs at once, but that you can toggle among them merely by swiping four fingers upward on the trackpad. So the question is: Would Steve Jobs have become Steve Jobs using a computer designed by Steve Jobs?


On one hand, there’s plenty of scientific reason to fear that our “distraction machines” will corrupt human attention. In a 2009 study with 32 participants, a group of Stanford researchers tested whether multitaskers were truly better at handling multiple tasks at once. They set up a basic task-switching test: participants saw a cue (“letter” or “number”) followed by a number-letter pair (e.g., “2b”), then decided, based on the cue, whether to classify the “2” (as even or odd) or the “b” (as vowel or consonant). The test measures how quickly people respond when the cue stays the same versus when it switches on the fly.
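The trial structure described above is simple enough to sketch in code. The snippet below is a rough Python simulation of the test’s logic, not the researchers’ actual materials; the function names, the stimulus alphabet, and the switch-cost calculation are my own illustrative choices.

```python
import random

def make_trials(n, seed=0):
    """Generate n trials of a cued task-switching test.

    Each trial pairs a cue ('letter' or 'number') with a number-letter
    stimulus like '2b'. A trial counts as a 'switch' when its cue
    differs from the previous trial's cue."""
    rng = random.Random(seed)
    digits = "23456789"       # classified as even or odd
    letters = "aeioubcdfg"    # classified as vowel or consonant
    trials, prev_cue = [], None
    for _ in range(n):
        cue = rng.choice(["letter", "number"])
        stim = rng.choice(digits) + rng.choice(letters)
        trials.append({
            "cue": cue,
            "stimulus": stim,
            "switch": prev_cue is not None and cue != prev_cue,
        })
        prev_cue = cue
    return trials

def correct_response(cue, stimulus):
    """The classification a participant should give on one trial."""
    digit, letter = stimulus[0], stimulus[1]
    if cue == "number":
        return "even" if int(digit) % 2 == 0 else "odd"
    return "vowel" if letter in "aeiou" else "consonant"

def switch_cost(reaction_times, trials):
    """Mean reaction time on switch trials minus mean reaction time on
    repeat trials (the first trial is excluded, since it is neither).
    This difference is the quantity both studies compared across heavy
    and light multitaskers."""
    paired = list(zip(reaction_times, trials))
    sw = [rt for rt, t in paired if t["switch"]]
    rp = [rt for rt, t in paired[1:] if not t["switch"]]
    return sum(sw) / len(sw) - sum(rp) / len(rp)
```

A larger switch cost means slower responses when the task changes on the fly, which is how the 2009 study operationalized being “worse” at shifting gears.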

What the researchers found came as a major surprise at the time: people who were heavy media multitaskers took longer to switch gears than those who were light multitaskers. It looked like chronic multitaskers had trained their brains not to stay focused, says Clifford Nass of Stanford, a collaborator on that work. Their default brain setting had been switched to “distracted”–as if their distraction machines had turned them into distraction machines.

“There’s been a lot of talk of designing software that makes it difficult or impossible to multitask, but the truth is people won’t buy that software, so we have a real problem,” Nass tells Co.Design. “Heavy multitaskers are a larger and growing percentage of the population. If we don’t design for them we’re going to be in huge trouble.”


On the other hand, new evidence calls some of these earlier conclusions about multitasking and attention into question. A study published this month, written by psychologists Reem Alzahabi and Mark W. Becker of Michigan State University, attempted to replicate the task-switching experiment done in 2009. There was one glaring difference this time around: the researchers reached the exact opposite conclusion.

To their own surprise, Alzahabi and Becker found that heavy media multitaskers in the study–which had 92 participants–switched tasks more quickly than light multitaskers. When they ran the test again, just to be sure, this time with 58 participants, they got the same results. They had no choice but to conclude that heavy multitaskers had acquired “abundant practice” switching tasks and learned to use this new skill to their advantage. In other words, multitasking might not interfere with attention after all, and could even make some people more efficient.

“At a general level, these data suggest that multitasking with media may not be as potentially detrimental to cognitive functioning as previous reports suggest,” Alzahabi and Becker wrote. If that’s the case, then one would expect over time that people who prize distraction will use modern computers to scatter their focus, and people who don’t will find software that keeps them on task. Perhaps rather than succumb to our distraction machines, we can adapt and conquer them.

The scientific discussion will continue. Nass says methodological differences explain the discrepancy; participants in the new study used two hands during the test rather than one, which brings different parts of the brain into play and therefore gives heavy multitaskers an advantage. Several attempts to reach Becker or Alzahabi failed, but in their paper they consider the possibility that “the nature of media multitasking has changed so dramatically in the few years between studies that its impact on cognition has changed,” too.

The debate over whether Jobs and Wozniak would flourish or perish using today’s distraction machines will go on even longer. Wu gleans one simple design lesson from this thought experiment: “our computers should never make us stupider.” That’s not quite the right framework. The fear isn’t that we’ll become stupid, but that we’ll use our time foolishly. In that sense, it’s important not to forget that what Jobs and Wozniak produced with their laser focus was an Atari game. You might be a light multitasker or heavy one, but we all have to distract ourselves somehow.

About the author

Eric Jaffe is an editor at CityLab, where he writes about transportation, history, and behavioral science, among other topics, through the lens of urban life. He's also the author of The King's Best Highway (2010) and A Curious Madness (2014).