The Communication Of The Future Is So Real You Can Touch It

Companies are starting to integrate haptic feedback into their technology. Welcome to the future you can reach out and touch.

Early one morning last year, Hiroshi Ishii, associate director of the MIT Media Lab, placed his hand on one of two identical assemblies of wooden bars. I put my hand on the other, a few feet away. As he rolled his hand over his assembly, I felt his movements in the bars under mine. The experiment, dubbed InTouch, translates touch into a form of communication that typically requires sharing the same physical space. But we aren’t sharing it; it just feels as if we are.

Seeing the technology is amazing. But feeling is believing. At the lab, I couldn’t help but imagine how it would change the hundreds of digital interactions I have every day. Take texting: absent cues like voice and body language, I sometimes struggle to understand what a text truly means. All I can interpret is the text itself, and perhaps the ellipsis that pops up on my iPhone while someone is typing. Ishii wants to unpack all the emotive information hidden in those ellipses. “I’m a calligrapher,” Ishii says. “Before the ink drops on paper, [there’s] the study, the motion.”

Ishii and his team maintain that their prototypes are designed to expand our ability to express ourselves remotely. Today, our notion of telepresence–the ability to feel present in another location–has generally been confined to sight and sound. Breakthroughs like virtual reality hint at a more cinematic experience, but you can’t feel someone’s heartbeat through an Oculus Rift. “You can see [its] surface,” says 58-year-old Ishii, who combines a Japanese accent with an unweathered, wide-eyed curiosity. “But you can’t feel it.”

A slew of new companies–including Lumo BodyTech, MEMI, Ringly and Artefact–are integrating haptic feedback into products that correct our posture and alert us to important calls. Apple’s forthcoming smartwatch will let us physically feel one another tap a screen and feel each other’s heartbeats. And if you think getting poked on Facebook is annoying now, it’s only a matter of time before moms use these products to invisibly nudge their kids (of all ages) to stand up straight.

“The two components of telepresence are interpersonal space and shared information,” Ishii says as he walks me through the Tangible Media group he founded within the MIT Media Lab. In MirrorFugue, one of more than 140 experiments credited to Ishii’s group, PhD candidate Xiao Xiao combines the haptic feedback concept behind InTouch with visual and audio elements to create what she dubs “ghost presence.” In the experiment, a young girl was recorded playing a grand piano. A recording of her hands is later superimposed on the piano keys, which are triggered to play by her original movements. Live music emanates from the piano, the pianist at once present and absent. Later, a woman sits at the same piano and plays alongside the ghost hands, seeing and feeling the young girl’s hands physically trigger sound even as she sees and hears her own on the same instrument.

Watching this unfold in front of me, I instantly imagined the scenarios MirrorFugue could lead to–from an instructor remotely teaching a pianist how to play to a faraway friend’s footsteps falling alongside mine as we share an afternoon stroll. “Different streams of interface broaden our meaning of a physical world,” Ishii says.

Take working remotely: “It’s not just the presence of the person; there’s the space and the tangible objects you interact with,” says Sean Follmer, another PhD candidate on Ishii’s team. Those objects can be anything from a car that two mechanics repair together to a ball that soccer players “bend” to a collaboratively designed product, like a handbag.

Last year, Follmer and fellow PhD candidate Daniel Leithinger unveiled InForm, a project that lets videoconference participants interact physically (and that won Fast Company’s Innovation By Design award in the experimental category in October). Using InForm, I can copy the shape of my hand and its movements into a 3-D rendering on a remote workspace. The prototype consists of identical workspaces equipped with depth-sensing cameras and motorized pegs that rise and fall in response to a user’s movements.

As I wave my hands through the air above one of these workspaces, the pegs on both workspaces quickly mirror my movements; I feel like a conductor before an orchestra. I can then use the rendering of my hands on my partner’s workspace to manipulate an object in their physical space. I test this by curling my hands around a rubber ball that sits on my partner’s workspace. When I uncurl them, the ball is released. Using this technology, you can “literally move an object [from] thousands of miles away,” Follmer says. Even as I do something as simple as roll a ball around in my hands, my mind flicks ahead to a doctor comfortably treating an Ebola patient without ever placing himself in harm’s way.

The MIT team is focused first and foremost on communication. “A lot of the time when we think of telepresence, the person who is remotely located is at a disadvantage,” Follmer says. “They are at a lower resolution, you can’t hear them very well, they’re flat, etc. But now we’re starting to think about what we can do to help them be better than the person who’s there.” InForm’s shape display can give me extra hands, or transform my hands into whatever object or shape is needed. I can turn my hands into a bucket, for instance, to scoop up something like a large collection of golf balls.

How will this new tangible means of communication shape how we interpret more emotional messages, like the nuances of someone’s smile? “Any time you abstract [something], you lose the context,” Ishii admits. “But it’s an interesting tradeoff, as sometimes abstractness [lets] you learn more about a person’s point of view.” He points out that an old film’s original cinematography can convey a director’s intent better than a high-definition version of the same film. Then again, feeling someone’s fidgeting may prove more distracting than illuminating. As with other breakthroughs in communication technology, we’ll just have to feel it out.

About the author

Maya Baratz is a freelance writer.