We Talk With Body Language. So Should Our Cellphones

A few ideas for creating cellphones that pick up our non-verbal cues.

Admit it. Even though you know the other person can’t see you, you still nod in agreement while talking on the phone. You gesture. You gesticulate. You communicate in a medium destined never to be transmitted. That extra layer of meaning dies a local, meaningless death.

We gesture because it’s part of our lexicon. It’s involuntary. And those gestures matter; they really do convey information. Used wisely in the classroom, for instance, they increase both the rate of learning and the retention of the learned material. On the phone, though, we do without, or we use carbon-heavy transit to bridge the physical gap.

Yet with technology, perhaps we can return this funny quirk-of-behavior to its rightful place as a communicator.


As I sat recently, nodding along to a friend’s story, I realized that my iPhone knew full well what I was doing. It just sat in beautiful-but-ineffectual silence. Given the accelerometer and the right programming, it could convey my nod tactilely across the ether: if I nod, the other phone vibrates a pattern conveying the vigor of the nod and, on screen, shows an animated representation of me nodding; if I shake my head, a different set of vibrations goes over the line, and my representation shakes its head sadly (or curtly, or slowly in disbelief, or…?).
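The detection half of that idea could be sketched roughly like this. Everything here is speculative: the axis conventions, the thresholds, and the assumption that we get gravity-compensated (x, y, z) samples are all illustration, not real iPhone API code. The intuition is simply that a nod moves the phone mostly along one axis and a head shake mostly along another:

```python
# Hypothetical sketch: classify a gesture from a window of accelerometer
# samples. Assumes each sample is an (x, y, z) tuple in g, with gravity
# removed; nods are taken to show up as vertical motion (z), head shakes
# as side-to-side motion (x). Axes and thresholds are guesses.

def classify_gesture(samples):
    """Return ('nod' | 'shake' | None, vigor in [0, 1])."""
    if not samples:
        return None, 0.0
    # Peak absolute deflection on each axis across the window.
    peak_x = max(abs(s[0]) for s in samples)  # side-to-side
    peak_z = max(abs(s[2]) for s in samples)  # up-and-down
    THRESHOLD = 0.3  # minimum motion (in g) to count as a deliberate gesture
    strongest = max(peak_x, peak_z)
    if strongest < THRESHOLD:
        return None, 0.0
    vigor = min(strongest / 2.0, 1.0)  # scale peak motion into [0, 1]
    return ('shake' if peak_x > peak_z else 'nod'), vigor
```

A real implementation would look at the motion over time (frequency and repetition, not just peak amplitude), but even this crude peak-per-axis comparison captures the nod-versus-shake distinction the paragraph imagines.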

Ideally, there would be a natural mapping between my head’s actions and the feeling on the receiving phone — a nod should feel like a nod, a shake should feel like a shake. I’m not sure if that’s possible with the current iPhone, but given three shake-and-shimmy locations, creating a facsimile becomes feasible.
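On the receiving side, even a single vibration motor can approximate that mapping with timing. One way to sketch it (all durations invented for illustration; the on/off-milliseconds convention is borrowed from Android's Vibrator API, not anything iPhone-specific):

```python
# Hypothetical sketch: turn a detected gesture into a vibration pattern,
# expressed as alternating on/off durations in milliseconds. A nod becomes
# a few short, even pulses; a shake becomes longer, uneven buzzes. More
# vigor lengthens the pulses. All numbers are guesses.

def gesture_to_pattern(gesture, vigor):
    """Map ('nod' | 'shake', vigor in [0, 1]) to [on, off, on, off, ...] ms."""
    if gesture == 'nod':
        pulse = int(40 + 60 * vigor)   # stronger nod -> longer pulse
        return [pulse, 120] * 3        # three even taps, like three nods
    if gesture == 'shake':
        buzz = int(80 + 120 * vigor)   # stronger shake -> longer buzz
        return [buzz, 60, buzz // 2, 60] * 2  # lopsided, back-and-forth feel
    return []
```

With three shake-and-shimmy locations, the same idea extends spatially: a nod could sweep the pulses top to bottom, a shake left to right, which is much closer to the "a nod should feel like a nod" goal.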


Even if we can’t create a natural mapping, the brain’s plasticity comes to the rescue. When you use a tool, your brain actually represents that tool as an extension of your body. Similarly, when your brain gets consistent input from a real-world sensor, it quickly learns to map that input into a sort of sixth sense. As long as there were a different feeling for each type of nod and head shake, and a way to learn which means what (say, by looking at the screen), your brain would soon process the vibrations into their gestural meaning subconsciously.

You’d just know that the other person was nodding agreement.

A nod-aware phone would make my daily communications that much more humane, and technology that much more human.

Of course, I’d still feel silly gesturing with my hands.

[Top image by Mo Riza]

About the author

Aza is the founder of Massive Health, and was until recently Creative Lead for Firefox. Previously, he was Head of User Experience for Mozilla Labs.