Why Google's Deep Dream A.I. Hallucinates In Dog Faces

It turns out Google's neural network is obsessed with canines for a reason.

We've all been getting a kick out of what artists and developers have been doing with Google's Deep Dream, the neural-network-powered hallucination AI. Now you can play with it yourself, thanks to the Dreamscope web app. Just upload an image, pick one of 16 different filters, and turn it into a hallucinogenic nightmare of infinitely repeating dog eyes.

Which probably has you wondering: what's up with all those dog eyes, anyway? Why does every single image Deep Dream coughs up look, to a greater or lesser extent, like Seth Brundle from The Fly crammed his teleport pod full of canines and flipped the switch? As it turns out, there's a pretty simple answer.

As you may know, Google's Deep Dream runs on the same type of neural network that powers Google Photos' ability to identify images by their content. Essentially, the network emulates the neurons in the human brain, with individual nodes of the network 'firing' every time they see a part of the image they think they recognize. Deep Dream's trippy effects come from giving it an initial image, then initiating a feedback loop, so that the network begins trying to recognize what it recognizes in what it recognizes. It's the equivalent of asking Deep Dream to draw a picture of what it thinks a cloud looks like, then draw a picture of what its picture looks like, ad infinitum.
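The feedback loop described above can be sketched in miniature. This is not Google's actual implementation, just an illustrative toy: a single linear "neuron" that responds to a fixed pattern, with the image repeatedly nudged in whatever direction makes the neuron fire harder (the same gradient-ascent idea Deep Dream applies to a real network). All names here are made up for the example.

```python
import numpy as np

# A toy "neuron": its activation is the dot product of the image with a
# fixed pattern it was "trained" to detect (purely illustrative).
rng = np.random.default_rng(0)
pattern = rng.standard_normal(64)   # the feature the neuron responds to
image = rng.standard_normal(64)     # the starting image

def activation(img):
    """How strongly the neuron 'sees' its pattern in the image."""
    return float(pattern @ img)

# The Deep-Dream-style feedback loop: nudge the image in the direction
# that increases the activation, then feed the result back in. For this
# linear neuron, that gradient is just `pattern` itself.
history = []
step = 0.1
for _ in range(20):
    history.append(activation(image))
    image = image + step * pattern  # gradient ascent on the activation
```

Each pass makes the image look a little more like the pattern the neuron is hunting for, so the recorded activations climb with every iteration: exactly why Deep Dream's dog-detecting nodes gradually paint dog faces into everything.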

Where do the dogs come in? This Reddit thread provides some insight. A neural network's ability to recognize what's in an image comes from being trained on an initial data set. In Deep Dream's case, that data set comes from ImageNet, a database of 14 million human-labeled images created by researchers at Stanford and Princeton. But Google didn't use the whole database. Instead, it used a smaller subset of ImageNet released in 2012 for use in a contest... a subset which contained "fine-grained classification of 120 dog sub-classes."

In other words, Google's Deep Dream sees dog faces everywhere because it was literally trained to see dog faces everywhere. Just imagine how much the Internet would be freaking out about Deep Dream right now if it had been trained on a database with fine-grained classification of LOLCATS instead.

[via Rhizome]