
The Impossibly Complex Art Of Designing Eyes

Today's video game characters look amazing, with one exception: the eyes. And perfection could still be decades away.


Jimmy Butler looks downright perfect. His statuesque cheekbones appear chiseled out of marble. His hair seems like it really grew from his head. Sweat trickles down his body with the tantalizing shimmer of a softcore porn video as his frame, 6'7" of pure core strength, barrels toward the hoop with Godzilla-like footfalls, while handling the ball as delicately as a glass sphere.

I’m not watching TV. I’m playing NBA2K17, the latest and greatest in basketball sports simulation, in which character modelers have given my favorite player the all-star treatment he deserves—except for one tiny detail that’s game-breakingly big.

His eyes.

[Image: via 2k Sports]

Jimmy Butler has the ripped figure of a Greek god and the dead, lost eyes of a zombie—as does every other player, announcer, and personality in NBA2K17, as well as just about every other video game on the market today.

[Image: via 2k Sports]

Where you looking, Shaq?

[Image: via 2k Sports]

Ernie, stop trying to possess me through the camera! STOP TRYING TO POSSESS ME THROUGH THE CAMERA!

This isn't a problem that's unique to 2K17. Across the video game industry, engineers are working to make eyes more realistic—or at least less creepy. Despite all of our advancements in CGI and real-time physics, we’ve hit a wall. Call it the uncanny valley. Call it something else. But the eyes in games are terrible, and it’s holding back the believability of every character on the screen. And unfortunately, thanks to a combination of challenges including physics, limitations in processor power, and our own nuanced ability to read eyes, we’re nowhere close to being able to solve it.

XCOM 2 [Image: 2K Games]

The Uniquely Complicated Eye

"It’s hundreds of times, if not an order of magnitude or two more [away]," says Brian Karis, senior graphics programmer at Epic Games, speaking about the processing power it would require to render the perfect, convincing eye in real time.

Epic Games’s big business is licensing its Unreal Engine, which is among the most popular "game engines"—or core technologies—used by developers today (you can see it in use in the screenshots of Hellblade below). The Unreal Engine is like a giant rule set, which dictates how things like light and shadow work to render convincing characters and worlds with the limited processor budgets of computers, consoles, and even cell phones. So if you’re building a brand-new game, you don’t need to code all of this information from scratch. Instead, you license that tech from Epic, and you can focus on the story, game mechanics, and art design.

Karis’s main job is translating the unique characteristics of the human eye to the Unreal Engine.

[Image: Epic Games]

The initial problem with rendering eyes is simply one of light and structure. While the eye looks simple to, um, the naked eye, when you actually examine its structures, you realize it’s mostly a clear object, built from layers that each manipulate light differently, and in reaction to one another, through a spherical structure (but notably, not a perfect sphere!). On top is the cornea. It’s not just a transparent lens. It’s a transparent lens that bulges out from the eyeball. It might reflect light like a mirror, or refract light, warping it like a water droplet on a windshield. Indeed, every structure you see within someone’s eye—like the colorful iris—has been distorted by their cornea.

"The transitions of each of these things, from one to the next, needs to be handled properly," says Karis. "How light interacts with all those things has to be handled."

[Image: Epic Games]

The white of the eye is particularly tricky. Known as the sclera, it’s actually the layer that wraps around most of your eye like an orange peel. Light "scatters" from the sclera through the clear gel that comprises most of your eye—which is the same phenomenon that gives a glass of milk its particular glow.
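Epic hasn't published its sclera shader, but a classic low-cost stand-in for this kind of scattering in real-time graphics is "wrap" lighting, which lets light bleed past the hard shadow edge the way a translucent material would. A minimal sketch, with an illustrative wrap factor:

```cpp
#include <algorithm>
#include <cstdio>

// "Wrap" diffuse lighting: a classic low-cost stand-in for subsurface
// scattering. Light is allowed to "wrap" past the 90-degree terminator,
// softening the shadow edge the way scattering media (milk, or a sclera)
// actually behave. wrap = 0 gives plain Lambert shading.
float wrapDiffuse(float NdotL, float wrap) {
    // NdotL: cosine of the angle between surface normal and light.
    return std::max(0.0f, (NdotL + wrap) / (1.0f + wrap));
}

int main() {
    // Just past the shadow line (NdotL = -0.2): Lambert goes black,
    // wrapped lighting still glows faintly, hinting at scattering.
    std::printf("lambert: %.2f  wrapped: %.2f\n",
                wrapDiffuse(-0.2f, 0.0f), wrapDiffuse(-0.2f, 0.5f));
    return 0;
}
```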

Assuming all of this is rendered correctly, there’s one final problem remaining: caustics, or the envelopes of refracted light we see frequently in the real world. "Imagine you’re at a fancy restaurant. You have a glass of wine in front of you, and the light hits it," says Epic Games CTO Kim Libreri. "You’ll see on the tablecloth a little red, a little white." Caustics occur within the eye, too.

Deus Ex: Mankind Divided [Image: Square Enix Ltd.]

"It’s a subtle little effect you might not notice if it wasn't there," continues Karis. "But it won’t seem like there was as much complexity happening in the eyeballs."

Yet if Epic can so clearly describe everything going on inside the eye, and could surely rebuild these microstructures in a simulation, why can’t it properly render them? The problem is largely one of horsepower.

The Witcher 3 [Image: Warner Bros. Interactive]

Hollywood studios, which can spend several hours rendering a single frame of a film, can use a technique called ray tracing. Ray tracing simulates individual rays of light passing through and bouncing off objects, essentially duplicating the physics of how light interacts with objects in our physical world. But video games don’t have several hours to render a frame, since players demand 30 to 60 frames per second for it to feel smooth. This means the Unreal Engine has roughly 16 milliseconds to visualize everything in a frame.
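That budget is simple arithmetic: the reciprocal of the target frame rate, shared by every character, prop, and light on screen.

```cpp
#include <cstdio>

int main() {
    // A film renderer can take hours on one frame; a game engine gets
    // the reciprocal of its target frame rate, and that slice has to
    // cover every character, prop, and light in the scene.
    const int rates[] = {30, 60};
    for (int fps : rates) {
        double budget_ms = 1000.0 / fps;
        std::printf("%d fps -> %.1f ms per frame for the entire scene\n",
                    fps, budget_ms);
    }
    return 0;
}
```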

Truth be told, because eyes are so small in terms of the overall image—depending on your view, they might be but a few pixels on the screen—eyes are only given the majority of the engine’s rendering budget in something like a close-up. Even then, eyes are treated as more of a cheat than a true physics simulation. While developing its engine, Epic ray traced various effects in slow-rendered eyes, then translated these to vastly more simplified code that lives in its engine. This allows it to mock up all those transparent 3D structures via something more like a 2D YouTube clip playing in a character’s eye.
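Epic's exact shortcuts aren't public, but one widely used real-time cheat for corneal refraction is a parallax offset: rather than tracing rays through a transparent lens, the shader slides a flat 2D iris texture lookup according to the viewing angle, so the iris appears to sit at depth behind the cornea. A hedged sketch, with illustrative names and constants:

```cpp
#include <cstdio>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Parallax-offset iris lookup: shift the 2D iris texture coordinate by
// the view direction (expressed in the eye's tangent space, with z
// pointing out of the cornea), scaled by an artistic "iris depth". A
// flat texture plus this shift reads as a refracting 3D structure at
// normal viewing distances; no rays traced.
Vec2 irisUV(Vec2 baseUV, Vec3 viewDir, float irisDepth) {
    float scale = irisDepth / (viewDir.z + 1e-4f); // guard grazing angles
    return { baseUV.x - viewDir.x * scale,
             baseUV.y - viewDir.y * scale };
}

int main() {
    // Looking at the eye slightly from the side: the iris lookup slides
    // opposite the view, faking the refraction a real cornea produces.
    Vec2 uv = irisUV({0.5f, 0.5f}, {0.3f, 0.0f, 0.95f}, 0.02f);
    std::printf("shifted iris UV: (%.3f, %.3f)\n", uv.x, uv.y);
    return 0;
}
```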

"We think our eyes look really good," says Karis. "They’re not perfect because of what we can do in milliseconds."

Indeed, because eyes (and faces) are so difficult to get right, many games simply slap sunglasses or a helmet on their protagonists. Franchises like Deus Ex, Halo, and even Call of Duty make a habit of this, no doubt because it saves a lot of effort.

Kara by Quantic Dream, which renders some of the best eye work in the business [Image: Quantic Dream via YouTube]

Building An AI Eyeball

While the eye is a difficult object to model in terms of its microstructures, Karis insists the eye presents an even larger challenge to animate convincingly. "We as humans focus on and convey a large amount of information through our eyes," he says. "This means the accuracy in how we render them has to be much higher than most other things in order to be convincing."

In other words, because humans have evolved to read and communicate so much through the subtlest movements of eyes, we’re hyperaware of their behavior. And when a game like NBA2K17 handles the behavior of eyes sloppily, it can undo the realism constructed everywhere else across the scene.

"That's a constant battle," said 2K's Anthony Tominia in a recent interview with Evening Standard. "There's a certain amount of eye animation being driven by the game itself. The game is saying, 'What is the target?' Well, the target's usually the ball—so it's doing everything it can to keep the head and the eye centered on the ball. But at the same time we've realized there's this throwback to like Polar Express or something, where there's those dead MOCAP eyes."

Indeed, even in 2K17’s launch trailer, filled with the most cinematic moments of the game possible, watching the slow-mo walk sequences of players on the court, when they don’t have a ball to follow, is like some scene from The Walking Dead or Invasion of the Body Snatchers. The bodies are moving, but the soul is gone, lobotomized by the eyes.

Such a scene exemplifies the deeper challenge of real-time eye animation: that we need AI to consistently mimic the behavior and appearance of real eyes. Having characters track a ball makes it easier. Choreographing a character's eyes in a pre-rendered scene—maybe glancing sympathetically at a loved one who is crying, or glancing away in guilt—is a bit more challenging, but game studios can use motion capture to record the performance of actors in real time, including their eyes. In these prerecorded scenes, characters' movements can look more or less perfect.

Games, however, are ultimately dynamic experiences, filled with unpredictable stimuli. And dynamic logic gets tough.

"I’d say in the case of eyes, replicating the behavior is probably fairly simplistic. It’s a matter of tuning it just right," says Karis. "But this delves into other things for human performance. If we want humans responding in believable ways, there’s a very deep and challenging AI problem to solve there, to get digital humans not replaying a human experience but reacting like a virtual one would. That’s going to be very deep and take a lot of computational power."

When characters are removed from their carefully pre-choreographed contexts, the true limits of our understanding of eye logic float to the surface. For instance, we make several saccades—rapid, darting eye movements—voluntarily and involuntarily each second, for reasons that can be hard for scientists to quantify. Suddenly, for a character to be convincing on screen, it needs to calculate and perform this confusing visual logic for all other humans to judge.

"You need an AI eyeball. You need an AI to do saccading with the correct speed," says Karis. "Even if you’re not going to speak to a character, having it sit in a chair, acting like a human, shifting its weight, looking like a believable passive person, that’s a challenge!"

[Image: via 2k Sports]

