On Friday afternoon, a jury of nine delivered a swift verdict in the Apple-Samsung patent infringement case, ruling overwhelmingly in Apple’s favor. Samsung, they said, had copied Apple’s designs, infringed upon its patents, and would have to pay $1 billion for the trouble. The verdict didn’t come as a complete shock, but it was more extreme than many had figured it would be, both in terms of the dollar amount of the damages and the extent to which the jury thought Apple’s patents—of which there were several types, covering both hardware and software design—had been infringed upon. Now, everyone is trying to figure out what the case means for the future of smartphone design and mobile innovation. A number of people are arguing that innovation has won—that because companies will have to make sure their products are different (lest they get dragged into court), we consumers will see a new diversity in the devices available to us. But there’s one problem with that thinking: Differentiation isn’t innovation.
It’s obvious that innovation—true innovation—is good for consumers. It’s what has delivered us from the flip-phone dark ages into the mobile renaissance we’re enjoying today, where nearly half of adult Americans carry a truly functional mobile computer in their pocket everywhere they go. It’s what gives us fresh visions of how we should interact with our devices. But true innovation is something—a new phone, a new function, a new feature—that’s different and smart. In the wake of Apple-Samsung, we’re at risk of seeing a lot of different and dumb. It will all depend on how broadly Apple’s patents are applied, and we still don’t know enough at this point.
One of the many effects of the unexpected success of the original iPhone, in 2007, was the establishment of a prevailing user interface language for smartphones. Apple did an extraordinary job, on a very basic level, of making a computer with a 3.5-inch touch-screen display that had practically zero learning curve. After the first time you used pinch to zoom to enlarge a map, you got it. You never had to go back to the manual.
And this is precisely where the pitfalls of differentiation arise (and where, it would seem, the existing patent system totally sucks): It’s very much possible that pinch-to-zoom is the best way of zooming in on a map. Forcing smartphone developers to come up with a different way of doing things for the sake of coming up with a different way of doing things is outrageously stupid. And, needless to say, it would be a headache for users.
When Android first came out, pinch-to-zoom was nowhere to be found. There were various reports of why this was—there was some evidence of an informal agreement between then-chummy Apple and Google dictating that the latter wouldn’t use certain multitouch features on its own mobile devices—but eventually, as Android users complained and as the Apple-Google relationship deteriorated, pinch-to-zoom was introduced. It became a sort of universal part of the mobile user experience, the language we use to communicate with our phones and tablets. And, at this point, it’s even more universal than that: Now you can use the same pinch gesture on a Mac’s trackpad and it’ll zoom in on a map or website, just like it does on your phone.
As a thought experiment, though, imagine if Google, wary of impinging upon Apple’s pinch-to-zoom patents, had decided to go another route for enlarging maps or websites on Android handsets. Maybe a two-finger-rotate type of gesture, something akin to turning a camera lens or the eyepieces on a pair of binoculars. Different, sure. But would users really benefit from having two options for how to zoom in on a touch screen? It’s hard to see how they would.
That doesn’t mean that the iPhone got everything right. It leaves a ton of areas for other smartphone makers to do something different and better. Take the very basic and hugely important matter of text input. The QWERTY keyboard was a good, familiar starting place for touch-screen typing, but it’s clear that touch-screen keyboards are suboptimal, especially on the iPhone. I often think about how, in 2006, if you had kidnapped me, tied my arms behind my back, and blindfolded me, I could have pulled my BlackBerry Pearl out of my pocket, navigated to my messages app, and tapped out a perfectly punctuated "send help" message with one hand, sight unseen. If you kidnapped me today and I tried to do the same on my iPhone, I’d probably end up sending out friend requests in Game Center. I’ve seen some people who can do the no-look typing thing on the iPhone, but the point is that, in many ways, the touch-screen keyboard was a step backward.
The buffet of alternative keyboards available for Android offers a few examples of how things could be different and better. Take Swype, which gives you the standard QWERTY keyboard but lets you trace out words by dragging your finger from letter to letter. It’s not only faster than the iPhone’s keyboard, it’s also kind of fun to use. Or consider SwiftKey, a keyboard app that scans through your emails, texts, and tweets to learn your personal vocabulary and intelligently serves up those commonly used words as auto-complete options. Apple knows that input isn’t everything it could be, which is why, with its latest update, it baked dictation directly into the OS. Dictation, as it exists today, is different but not particularly good, and I can’t imagine it has become a serious alternative to typing for many iPhone users.
For further proof that different is not always good, just take a look at the freak show of iPhone and iPad prototypes that came out as the Apple-Samsung trial unfolded.
The prototypes were proof that Jony Ive doesn’t just sit on a white beanbag chair for a few months and then birth a new iPhone design, fully formed. They show that Apple, like every other company, comes up with some really wacky designs along the way. In the collection made public, there were octagonal iPhones, slender iPhones, iPhones without home buttons. They’re all different from the iPhone that Apple eventually released, but they’re definitely not better. And it’s similarly hard to say that customers would be better off if they were given the choice between buying the iPhone 4, or an iPhone 4 Retro Edition (with the 3G design), or an eight-sided Octo-iPhone.
What the prototypes do show is that Apple is pretty damn good at singling out a good design and concentrating ruthlessly on refining it in every detail. And because it doesn’t offer a variety of iPhones, the design of the iPhone it does offer is imbued with a sort of self-confidence. The iPhone is, unequivocally, the best possible design for a smartphone, as Apple sees it. If there were a better way of delivering a mobile computing experience, that’s the one Apple would offer. Samsung, by contrast, offers the Captivate, the Continuum, the Charge, the Epic, the Exhibit, the Fascinate, the Indulge, the Infuse, the Mesmerize, the Replenish, the Vibrant, and a whole bunch of things with Galaxy in the name. Is each one of these an innovation upon the last? No, they’re merely different.
But I should be clear: I’m not saying that smartphone makers should take the iPhone and build on top of it. I’m saying they need to come up with their own guiding principles and build an amazing product that fits them. I agree with many others that iOS is starting to look dated, and I welcome something different. But it can’t just be different, it has to be different and smart. Different, but every bit as intuitive as the iPhone. Different, but not ignorant of the language the iPhone has established, a language that people have come to know and understand and enjoy using. Ideally, companies need to create something different that they can put out there with the full-throated confidence Apple shows in its own mobile vision.
Actually, this thing exists. It’s the Windows Phone.
Though it’s a distant third in market share, Microsoft’s new OS represents mobile innovation at its most thoughtful and exciting. It chucks out the corny skeuomorphism and tired faux 3-D design elements Apple has been relying on in favor of a clean, geometric aesthetic. Instead of giving users a grid of application icons—shortcuts into discrete mobile experiences of their own—it presents a grid of information, using dynamically updated tiles that give more data, at a glance, than iOS or Android could ever hope to show. Whereas Apple’s idea of personalization was an almost insultingly simplistic folders feature it introduced with iOS 4, Microsoft’s next update will allow users to resize and reposition their dynamic tiles as they see fit. The biggest innovation, here, might be conceding that sometimes the user knows best what works for her.
The new platform is both utilitarian and beautiful, and Microsoft seems to believe in it enough to stake its entire future on it; the aesthetic is being carried over into Windows 8, the company’s radically redesigned desktop operating system. With the last few revisions of OS X, it’s clear that Apple is attempting the same sort of mobile/desktop cross-pollination, though so far its version hasn’t been quite as convincing. Apple, which jumped so far ahead in the mobile world with iOS, now has the difficult task of retrofitting OS X with some of those philosophies; Microsoft, starting late, had the advantage of building both at once.
And in terms of hardware, the flagship Windows Phone device, the Nokia Lumia 900, shows that it’s possible to build a lust-worthy smartphone that in no conceivable universe would be confused with the iPhone 4. In the recent trial, Samsung short-sightedly argued that Apple’s minimalist black rectangle was some sort of logical conclusion for smartphone design. The Lumia 900—boxy on top, round on the sides, and bright blue all over—proves that this simply isn’t the case.
Windows Phone has a lot of catching up to do, but Microsoft seems willing to put it out there, stick with it, and give it time to do that catching up. In the wake of the Apple-Samsung trial, consumers should hope that companies follow Microsoft’s lead. If the Samsungs of the world start making stuff that’s different just to be different, we’re all in trouble.
[Image: Nataliya Horas/Shutterstock]