The Future Of Interaction Design? Personality And Body Language

I dare you not to like this sassy industrial robot arm just a little.

Universal Robots’ $35,000 UR5 model robot arm is designed for automation in industrial applications, from welding to palletizing to injection molding. It doesn’t have a cartoon face, a British accent, or a little butler costume. It doesn’t have eyes, or a voice, or hands with which to gesticulate. But you’d have to have a heart of stone not to feel a twinge of sadness, watching it shrink away as a group of kids approaches it.

This experiment, called Mimic, comes courtesy of Design I/O, which was tapped to build the installation as part of the Toronto International Film Festival’s digiPlaySpace interactive playground. Based in Cambridge, Design I/O specializes in interactive installations, from immersive environments at the New York Hall of Science to an interactive app about the life of John Lennon. But for Mimic, the studio focused on how personality and behavior could tell a story. “The project came out of this desire to take something mechanical and give it a personality that makes it feel alive,” explains Design I/O’s Theo Watson.

Using the UR5 and a series of Kinects, Creative Applications explains, the designers programmed the arm to interact with people nearby–but more importantly, they designed a precise model for its personality. It’s a mixture of metrics that include trust, interest, and curiosity, along with a taxonomy of body language that corresponds to what the UR5 is feeling. “We realized that these three feelings could define so much in how the robot responds to visitors,” Watson says, “and in some ways these are some of the most primary metrics we lean on in our daily interactions, so much so that they aren’t immediately obvious.”
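Design I/O hasn’t published Mimic’s code, but the model Watson describes–a few scalar feelings driving a small taxonomy of gestures–is easy to sketch. Everything below (the PersonalityState class, the gesture names, the thresholds) is a hypothetical illustration of the idea, not the studio’s implementation.

```python
from dataclasses import dataclass

@dataclass
class PersonalityState:
    """The three metrics Watson describes, each normalized to 0..1."""
    trust: float = 0.2      # familiarity with the current visitor
    interest: float = 0.5   # how engaged the arm is right now
    curiosity: float = 0.5  # willingness to investigate the unfamiliar

def choose_gesture(state: PersonalityState) -> str:
    """Map the feeling metrics onto a small body-language taxonomy.
    Gesture names and thresholds are invented for illustration."""
    if state.trust < 0.3:
        # Doesn't know this person yet: cautious and reserved.
        return "cock_head" if state.curiosity > 0.5 else "shrink_away"
    if state.interest > 0.7:
        return "lean_in_and_follow"
    return "idle_sway"
```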

When a person runs toward the arm for the first time, it might shy away or cock its head in curiosity like a dog. “Because it doesn’t know them it acts more cautious and reserved,” Watson says. But its trust increases the longer that person stays nearby, and the robot gets more curious and playful. Make a sudden movement, and it might curl back to hide its head behind its arm joints in a mock game of peek-a-boo.
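In the same hypothetical sketch, those dynamics–trust building with sustained proximity, a sudden movement triggering a retreat–reduce to a simple per-frame update. The sensor values and constants here are again invented; in the real installation they would come from the Kinect tracking data.

```python
from __future__ import annotations

def update_trust(trust: float, distance_m: float, speed_m_s: float,
                 dt: float) -> tuple[float, str | None]:
    """Hypothetical per-frame update of the trust metric.
    Returns the new trust value and an optional reflex gesture."""
    SUDDEN_MOVE = 2.0  # m/s: movement faster than this startles the arm
    NEARBY = 1.5       # m: a visitor closer than this builds familiarity

    if speed_m_s > SUDDEN_MOVE:
        # A sudden movement costs some trust and triggers a hide--
        # the mock peek-a-boo Watson describes.
        return max(0.0, trust - 0.1), "hide_head_behind_joint"
    if distance_m < NEARBY:
        # Lingering nearby slowly builds trust toward 1.0.
        trust = min(1.0, trust + 0.05 * dt)
    return trust, None
```

In practice the two pieces would run in a loop: update the feelings from the latest skeleton frame, then pick the gesture the arm performs next.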

It’s a form of design that’s closest in spirit to the Disney animators of the 1930s, who proposed principles for animating inanimate objects with what they called “the illusion of life.” Mimic comes from a similar place, where personality is expressed in movement and physicality rather than words. “We spend a lot of time bringing characters and creatures to life with our immersive environments, and we started to think about what gives something personality,” Watson says over email. “Is it the appearance or is it how it moves, how it reacts?”

It’s also a glimpse at an emerging form of interaction design, one where movement and personality become a form of interface. Just like Siri telling a joke, the UR5 uses human body language to create “understanding,” or an interface between hardware and human. Last year, Co.Design asked industry experts to imagine the jobs of the future. Teague’s head of design Matt Schoenholz described one he called an “embodied interactions designer”:

Screens have demanded a lot of attention from designers over the past 30 years. After all, they have been the source of so much content and so many interactions. They still require our thoughtful attention, but we will also see the rise of software that only rarely manifests on a screen . . .

Therefore, this role is expert in interface pattern languages and touch-points that have largely been considered as alternative or merely subservient to screen-based GUIs. This designer will borrow practices from industrial design and architecture, so that she can model interactions that are oriented in space.

Mimic is a glimpse of that future, when designers won’t just be coding the behavior of a 2D image on a screen, but complex interactions in the fourth dimension. It’s UI as body language.

About the author

Kelsey Campbell-Dollaghan is Co.Design's deputy editor.
