Google's Project Glass: Inside The Problem Solving And Prototyping

We speak with Steve Lee about solving the need for ubiquitous computing—while making the tech as unobtrusive as possible.

At Google X, the company’s now-not-so-top-secret R&D lab, engineers and neuroscientists and artificial-intelligence experts dream up a future without the pressure of market deadlines: driverless cars, robots, space elevators. But for lead product manager Steve Lee, his X pursuits are anything but an exercise in the fantastical: Project Glass, the futuristic eyewear he’s developing with an interactive heads-up display, might just hit market in the near future alongside products like Gmail and Android.

For Lee, it’s a matter of wrangling a sci-fi idea into a practical product. Whereas Apple and Microsoft have grounded their mobile future in the belief that the Post-PC World will revolve around the pillars of smartphones and tablets, Google is adding a late, left-field entry into the mobile space that’s as much a technical feat as it is a fashion statement. In the first part of our interview, published earlier this week, Lee told Fast Company "something like this has never been created before." Today, he tells us what goes into designing a product like Google Glass.

The Problem: How Do You Keep People Connected, But Still Present In Meatspace?

Lee calls much modern-day technology a distraction. "If you walk around the streets of New York, people have their smartphones out and they’re looking down. They do that while they’re standing around, waiting for a bus or a taxi, or while they’re walking. Even if you go out to dinner with a friend or a date, the technology is taking them away," he says. "But at the same time, people clearly have a desire to be connected to the Internet. We thought that was a really interesting problem to solve: trying to get technology out of the way while allowing people to still be connected out in the real world."

Lee ticks off some annoying use cases of traditional tech: holding a device up for an extended period to record a video, which he calls "fatiguing," or yanking out your phone to look up a map or share a photo. "Let’s say you’re meeting a friend at a bar you haven’t been to before. You just want to quickly check a map to make sure you’re going in the right direction. Even with my phone, it still takes a long time today to pull it out of my pocket, unlock it, open up Google Maps, and zoom into my location," he says. "A design goal for Project Glass is to make that much faster and much easier. What may take 30 to 60 seconds on a phone will instead take two to four seconds on Glass. Making that substantial of an improvement on speed and access will hopefully prove to be a game changer. I mean, you can capture moments with something like our Project Glass prototype that are impossible or just inconvenient or awkward with a camera phone. If you can do those things in a few seconds, that’s going to be really meaningful to people."

It’s actually why the team called the device Google Glass, as opposed to, say, Google Glasses. "Glass has a lot of connotations, but certainly one is how to view this technology: being transparent and getting out of the way," he says. "Past wearable computer projects that people have seen likely conjure up something that gets in your way and blocks your vision or senses. That’s actually counter to our project goal. Everything around our design is exactly the opposite of that."

Another Problem: How Do Users Interact With It?

We’ve gone from mouse and keyboard controls on PCs, to touch screens on smartphones and tablets, to hand gestures on Xboxes. Now with Glass, Google wants to move to the next level of interactivity. But how? Lee’s team hasn’t settled on specifics, but he takes me through the experimentation process.

"The input to this device is a real challenge because there is no physical keyboard—just like a phone doesn’t have a physical keyboard—and there is no touch screen either. How do you input? We’ve dabbled and experimented with lots of different types of input including using your voice, using some type of touch interface on the side of the device itself, as well as using your head," Lee says. "So using your head as input, for example, we’ve tried dozens and dozens of different types of head gestures. As you can imagine, some are more extreme than others. It creates a pretty funny experience. In fact, we created a game internally, both to exercise and test things out, but also to demonstrate the absurdity of using your head. It’s kind of like DDR but with your head instead of your feet."

"We created some pretty funny videos," he adds, with a chuckle. "I think there will likely be some way to move your head, which is comfortable and natural for a user, as well as not make them look odd and strange. But there’s many, many [gestures] that would definitely make you look strange to observers."
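Head-gesture input of the sort Lee describes is usually recognized from inertial sensor data. As a purely illustrative sketch (the thresholds, window size, and function names here are invented for this example and say nothing about how Project Glass actually works), a nod might be spotted by scanning gyroscope pitch-rate samples for a sharp downward swing followed by an upward one within a short window:

```python
# Hypothetical head-nod detector over gyroscope pitch-rate samples.
# All numbers are made up for illustration; a real system would tune
# them against recorded gesture data.

def detect_nod(pitch_rates, threshold=1.5, window=10):
    """Return True if any `window` consecutive samples (rad/s) contain
    a strong downward rate followed by a strong upward rate."""
    for start in range(len(pitch_rates) - window + 1):
        chunk = pitch_rates[start:start + window]
        # Find the first sharply negative (head-down) sample.
        down = next((i for i, r in enumerate(chunk) if r < -threshold), None)
        if down is None:
            continue
        # A nod completes if a sharply positive (head-up) sample follows.
        if any(r > threshold for r in chunk[down + 1:]):
            return True
    return False

# Synthetic stream: idle noise, then a quick down/up nod.
samples = [0.1, -0.05, 0.0, -2.0, -1.8, 0.3, 2.1, 1.9, 0.1, 0.0]
print(detect_nod(samples))  # True
```

The "extreme" gestures Lee mentions map naturally onto the tuning knobs here: a lower threshold accepts subtler, more comfortable motions at the cost of false triggers, which is exactly the trade-off a game like their internal head-DDR would exercise.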

Trying To Solve The Dork Factor

Designing Google Glass isn’t like designing a smartphone app that only its user will see. All factors must be considered—the comfort, the style, the ergonomics, the societal acceptance—because, as Lee puts it, "You care about how it looks on your face. Unlike software or even hardware like a phone which you can sort of sneak into your pocket, this is quite visible—you can’t hide it."

Steve Lee. Photo by Christophe Wu for Google.

It was a lesson Lee learned while playing around with early prototypes—such as a wearable computer he housed in a backpack to power the eyewear. "As you can imagine: not super comfortable. Setting aside style issues, it was simply not comfortable. Another thing that you really start to appreciate when you put things on your head is how important weight is—every single gram. I’ve never thought about 0.1 grams before as much as I have on this project," Lee says. "Societal acceptance and style are also extremely important. Because we can create the coolest Google technology and functionality, but if it’s embarrassing to wear around people, then it’s not going to get adoption."

Lee acknowledges that because of all the style, functionality, and ergonomic issues involved, it is "unlikely that we’ll be able to service everyone, though that is our ultimate aim." When asked whether Google might outsource the product’s hardware to fashion designers such as Gucci and Prada, as we have suggested—and just as Android does with smartphone makers—Lee seems open to the idea.

"I think it’s TBD," he says. "There’s just all these different connotations and permutations—like eyeglasses and sunglasses—so it’s really hard to address everyone. Certainly we’re going to consider partnering with various folks to accelerate that. But it’s really early on, and I don’t know how exactly it’s going to unfold. But we’re certainly considering it."

The Prototyping Breakthrough: Getting The Thing Out Of The Way

We’ve seen the current prototype now in design renderings, concept videos, on Charlie Rose, and even on Sergey Brin. Lee takes us through the design behind the current iteration of Project Glass. (For insight into early iterations, check out the first part of our interview with Lee.)

"There are a number of attractive things about this form factor: It’s actually quite sleek and has a nice profile," he says. "One key aspect of the prototype that we’re showing right now is that the display piece is actually up and out of the way. That was a key insight that we learned while developing this. Again, it gets back to the point of trying to free up people’s senses, and get out of the way. Putting lenses and things in the way of your eyes, we think is certainly a real challenge and has drawbacks. Putting the display up and out of the way, I feel very comfortable having a face-to-face conversation with someone, who doesn’t feel weird or odd, because I can make eye contact with them."

Comments


  • Mlavary

    What if I already wear glasses?  Will I be discriminated against and denied my Google Glasses?!

  • Mhuntm

    Watch out for the smoke and mirrors. Looks great, right? Where are the batteries? Unless this is wireless (which means look for larger batteries and a phone somewhere), also look for the wire to the computing device which gives you web access. Interesting vision but not reality.

  • Kevin Mako, Mako Invent

    Amazing. As soon as a new technology concept is released (a while ago for this one), at our invention development firm we get inundated with home inventors coming up with brilliant applications for it, long before the base technology is even released to market. It's almost as if the innovation lifecycle of a product is quickly compressing as information exchange and collaborative efforts increase in society.

  • Bennet

    I'm hoping that eventually this will have eye and environment tracking to implement the interface as something that appears to exist in the real environment, i.e., have your information and notes floating next to the person you're having a conversation with, rather than passing in front of them. True augmented reality.

  • Linda Varone

    I don't get it. Several months ago this blog slammed a Microsoft video that showed a future with handheld and tablet systems that did your thinking for you, calling MS's vision of the future "dystopian" (I agree). The only difference between what MS predicted and what Google Glass is striving for is that the Google technology is hands-free.

  • Stephen Garr

    One, it's Google Glass, and two, you forgot to mention that it has the capability to record and stream all your activities to wherever they want to.

  • Viktor Popovski

    Great idea. I love how Google is into innovations that drive us forward to a more sophisticated way of life.

  • Cameron

    Progress: taking another step towards a complete virtual barrier between you and reality and tangible interaction...


  • Mark Rojas

    Nice piece on Google Glass. I especially thought the insights on how to interact with the product were funny. I can imagine a bunch of testers shaking their heads frantically in every direction, like the scene from A Night at the Roxbury. I'm looking forward to more information about this interesting product.