At first glance, it makes little sense. It’s a black sphere with a thin strip of pulsating rainbow LEDs. Maybe it’s something crafted in the deepest dungeons of Sony without the suits finding out, like the Rolly—conceptual masturbation more than a consumer good, something made mostly because it could be.
In reality, it’s the Nexus Q, Google’s remarkably practical spherical media streamer. It has the guts of a tablet—in fact, the same internal horsepower you’ll find in the Nexus 7, along with an amplifier—and the body of a sci-fi bocce ball. Look closer, and it does have “buttons” of a sort. Twisting the unit changes the volume, and tapping it mutes the machine. Its only design nod to the year 2012 is a slew of cords pouring out its back for speakers, home automation, HD videos, and even “general hackability,” as Google mentioned at their press conference.
With so few identifiable features, the Q becomes something of a monument, an obsidian obelisk merely referencing some higher power. That power, specifically, is the cloud, from which the Q leverages Google Play to stream music, movies, and YouTube clips, all controlled by people around the device sporting Android phones. The “Q” is really a shared “queue,” and friends can swap songs in and out of the playlist.
It’s the cloud in physical form, social networking in person. It’s some sort of digital Rosetta Stone, connecting all these parts of our life—and that’s the most important trend (and design challenge) in media sharing today. How do we pull the collaborative interfaces of Facebook into the shared physical gadgets in our living rooms? Q is doing it through Android phones and touch screens, connecting an unfamiliar sphere with the familiar phones in our pockets. They’re not using voice or gestures, but the Q’s general form is so devoid of functional cues that Google doesn’t seem to be making a statement either way. You control the Q with your smartphone today. But could you control a Q V2 with a wave of the hand or a confident vocalization? Why not? If sci-fi has taught us anything, it’s that a sphere has endless capability. The form itself is input agnostic.
But did I mention that it’s $300? And it’s proprietary through Google Play? And for whatever design flair, it’s still just a media streamer? What did Google do in designing the Nexus Q? They didn’t make something to sell. They couldn’t have, as it’s a product without a market. Google built something to make a statement.
To understand the design and purpose of the Nexus Q, you need only understand one thing: This is the first piece of hardware Google has ever built. They’re approaching two decades of dominance in Internet technology, and proving that they won’t let up during the smartphone revolution, either. But for all this time, Google has farmed out their hardware like Microsoft farms out PC manufacturing. They made Android, but they didn’t make Samsung phones. They made Google TV, but they didn’t make Logitech boxes. Consider Android fragmentation. Consider the mega QWERTY controllers of Google TV (even if Logitech’s controllers were technically in Google’s platform spec). And all of a sudden, by building its own hardware, Google has a quip for every complaint a naysayer could have.
And it’s $300. (Because it’s built nicely by people who aren’t jumping off buildings!) And it’s proprietary. (If Apple can do it, we can, too—plus we’re telling you to hack it!) And—wait—it’s actually just a media streamer, right? (What, because you really want a web browser on your TV? Idiots.)
Google won’t sell many Nexus Qs, not because it’s a bad product (like Google TV) but because it’s too expensive for whatever exactly it is. And the same might be said for Project Glass, if it ends up costing anywhere near the developer preview price of $1,500. But for now, Google isn’t designing for the mundane scale of a McDonald’s cheeseburger. They’re designing obsidian obelisks—beacons for our future. And even if the market isn’t ready for them today, don’t be surprised when we finally meet Google there tomorrow.