
Google Lens Sucks, For Now

My most anticipated feature of the Pixel 2 is barely there.


Five months ago, Google’s CEO Sundar Pichai showed off a compelling vision of the future of search. Called Google Lens, the feature let you aim a smartphone’s camera at a restaurant to see its reviews, or at a Wi-Fi router’s barcode to connect to a local network. As I wrote at the time, “Alone, each of these ideas is a bit of a novelty. But all built into one platform . . . you can see how Google is imagining breaking free of the Search box to be even more integrated with our lives, living intimately in front of our retinas.”


Google Lens is now available for the first time with the launch of Google’s Pixel 2 smartphone, where it’s labeled as “beta.” But testing it out for myself, it’s clear that the Lens that Google delivered couldn’t be further from its onstage demo five months ago. That’s not to say the platform is dead or that visual search is doomed–just that Google is rolling out Lens incredibly conservatively.

For one, Lens is buried within the phone. As a point of priorities, note that you can activate the Google Assistant with a tap, a squeeze, or even an “okay Google” if you have the phone in always-listening mode. I couldn’t even find Google Lens until I searched through the press materials Google provided. It turns out that Lens isn’t in Google’s camera app, search app, or even the Assistant (though I’m told Lens is coming to Assistant in the coming weeks). It’s in the Google Photos app. When viewing photos in your library, you can tap its icon to activate Lens. And since you can’t actually take photos in the Google Photos app, Lens is several steps removed from “snap to search.” It’s more like: take a photo, open Google Photos, tap the Lens icon, and see if it surfaces anything useful. Positioned this way, there’s just no way anyone would use it as a habit in their everyday life.

Furthermore, a Google spokesperson clarified that this incarnation of Lens–as a feature in Google’s existing apps, rather than an app unto itself–is how Lens will roll out for the foreseeable future. So any vision you had of Google-powered augmented-reality search, like holding up your phone to identify a plant, is a long way off.

But how does Lens perform when you do dig into Google Photos to use it? Again, it’s nowhere near the stage demo–and truthfully, its existing capabilities call into question the value of visual search altogether. Taking a photo of my black Nikes, I see . . . some similar black-and-white shoes in Google image search, and links to several DSW pairs that look almost nothing like my own. A photo of a bottle of mustard pulls up the product and its website (the web address was actually printed on the bottle, which Lens highlighted to show me). A photo of my address on a package turns the address into copy-and-pastable text, which is only slightly harder than just typing it. A photo of a restaurant brings up nothing at all–no reviews, no map information. Google will be adding this feature to Lens later, I’m told.


Suffice it to say, there’s no great reason to use Lens right now. And it makes me wonder, if the best-case scenario is that I can use Lens to take a photo of something to get more photos of said thing, is it really useful at all?

It’s not, which demonstrates what a design challenge it will be to turn Lens into something as meaningful as Google’s longstanding text search. A picture may be worth a thousand words–but frankly, when it comes to search, maybe that’s 990 too many.

If I search “shoe deals,” Google has a pretty good idea that I want to buy shoes. But with a photo, all it has is an image. Lens needs to be able to understand what I’m photographing, sure. But it also has to guess why I’m taking that photo, and that intent will always be changing. I might take a photo of my shoe to order more of that specific shoe, or to find new Flyknit Nikes on sale, or to get the review of the shoe in Runner’s World, or to find clothes that match that shoe for a new wardrobe. I might photograph a restaurant to see its Yelp rating, to learn its phone number, to make a reservation, or to see its specials. I might photograph a condiment because I want to buy it, to ensure it doesn’t contain peanuts or shellfish, to learn good mustard recipes, or to figure out whether it’s paleo because I started a new diet that week.

Of course, Google can throw a wall of varying results at me with Lens, but it won’t be as elegant as the top link or card that Google gives me now in Search, which is so often right. For visual search to feel both wondrous and essential, Lens doesn’t only have to do a whole lot more with a photo I take than it does today. It has to know why I’m taking that photo in the first place.

About the author

Mark Wilson is a senior writer at Fast Company. He started Philanthroper.com, a simple way to give back every day.
