Google Knows If You’ll Like A Photo Before You Even See It

And on a scale of 1 to 10!

[Photos: Oliver Wendel/Unsplash, Modestas Urbonas/Unsplash, Jeremy Bishop/Unsplash, Peter Hammer/Unsplash]

Sunsets. Solitary trees. Silky, long exposure water. They’re photographic clichés, and yet, we love them.


Now? Google has quantified that love.

With its new neural net NIMA (Neural Image Assessment), Google has trained software to critique a photo on a scale of 1 to 10 just as a human would. To build the system, Google referenced 255,000 photos that were taken by amateur photographers and graded by humans in photography contests. That data set, of photos and their respective scores, was fed into NIMA to create a system so good at identifying the general likability of photos that its scoring is more or less indistinguishable from that of humans.

[Image: Google]
“After training, the aesthetic ranking of these photos by NIMA closely matches the mean scores given by human raters,” Google researchers write. “We find that NIMA performs equally well on other data sets, with predicted quality scores close to human ratings.”
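The key phrase there is "mean scores": rather than spitting out a single number, a NIMA-style model predicts a probability for each of the ten score buckets, and the headline 1-to-10 rating is just the mean of that distribution. Here is a minimal sketch of that final step, with a made-up distribution standing in for an actual model's output:

```python
import numpy as np

# Hypothetical model output: probability of a human rating the photo
# 1, 2, ..., 10. These values are invented for illustration.
predicted_distribution = np.array(
    [0.01, 0.02, 0.05, 0.10, 0.20, 0.25, 0.20, 0.10, 0.05, 0.02]
)

scores = np.arange(1, 11)  # the ten score buckets, 1 through 10

# The reported rating is the mean of the distribution; the spread
# hints at how much human raters would disagree about the photo.
mean_score = float(np.dot(scores, predicted_distribution))
std_dev = float(np.sqrt(np.dot((scores - mean_score) ** 2, predicted_distribution)))

print(f"predicted aesthetic score: {mean_score:.2f} (spread {std_dev:.2f})")
```

Predicting a full distribution, rather than one number, is what lets the system mimic a crowd of human judges instead of a single opinionated critic.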

Looking at the photos next to their NIMA scores, it’s easy to agree with the machine. It’s also easy to spot what it is that we like inside the photos: sharpness, HDR-level contrast, gradients, high saturation, and, seemingly, compositions that respect long-standing art principles like the rule of thirds.

That likability has practical application. Researchers demonstrated that, when feeding NIMA several photos of the same sailboat, it would pick the one with the least distortion. NIMA can also be used to judge just how far photo filters should take their enhancements, from contrast to sharpness. Further afield, you could imagine a NIMA-like system presorting your iPhone Camera Roll to pick the best photo for you to Instagram: the one most likely to get all of those addictive hearts. Or perhaps NIMA chews its way through your Google Photos library, surfacing only the most likable photos and burying all those blurry, dizzy shots somewhere else (the paper makes no mention of Google Photos integration, though clearly Google is doing a lot of algorithmic photo assessment on the platform already).

If NIMA gives you a weird feeling in the pit of your stomach, you’re not alone. A machine that can quantify artistic taste so accurately suggests that maybe people aren’t so unique and creative after all, and that’s an unsettling thought. But like so many of these conveniences that modern AIs make possible, that uncanny feeling will likely disappear the first time you save a few minutes by letting the machine do the work for you.

About the author

Mark Wilson is a senior writer at Fast Company. He started, a simple way to give back every day.