Concept, shmoncept: if your "product" demo is all CGI'd screenshots and foamcore mockups, the Big Idea behind it had better fricking wow us. Well, the idea behind Thimble certainly does -- it's a Braille-powered mobile computing interface that uses Bluetooth, optical character recognition, and voice commands to create an always-on, web-connected heads-up display for blind people. (I'm sorry, can you hold? The TED people are calling on Line 2.)
Conceived by Artefact and the Industrial Design Department at the University of Washington, the Thimble would use Braille to recreate the effortlessly connected mobile-computing experiences that sighted folks take for granted -- like navigating a strange city with Google Maps, or scanning tweets and news headlines in a coffee shop.
So how would it work? Well, it all starts with an "electro-tactile grid array"...
This kind of mumbo-jumbo is exactly how most design concepts lose us. But get past the first few jargon-clogged seconds of the demo video, and you'll see the real genius of the Thimble concept emerge. Here's their visualization of how the Thimble would allow a visually impaired person to surf and scan news headlines, via a web-enabled "ticker" that courses under her fingertip in Braille:
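If you're curious what that "ticker" would actually be pushing to your fingertip, here's a minimal sketch of the text-to-Braille step, assuming uncontracted Grade 1 Braille and using the Unicode Braille Patterns block (U+2800) as a stand-in for the electro-tactile grid. Everything here is illustrative -- the function names are made up, and a real device would refresh a tactile array rather than print characters:

```python
# Sketch: translate a headline into Grade 1 (uncontracted) Braille cells,
# rendered as Unicode Braille characters. In the Unicode scheme, raised
# dot n of a cell sets bit n-1 above the base codepoint U+2800.

# Standard dot numbers (1-6) for each lowercase letter in Grade 1 Braille.
LETTER_DOTS = {
    "a": "1", "b": "12", "c": "14", "d": "145", "e": "15",
    "f": "124", "g": "1245", "h": "125", "i": "24", "j": "245",
    "k": "13", "l": "123", "m": "134", "n": "1345", "o": "135",
    "p": "1234", "q": "12345", "r": "1235", "s": "234", "t": "2345",
    "u": "136", "v": "1236", "w": "2456", "x": "1346", "y": "13456",
    "z": "1356",
}

def cell(dots: str) -> str:
    """Map a dot pattern like '145' to its Unicode Braille character."""
    mask = 0
    for d in dots:
        mask |= 1 << (int(d) - 1)  # dot n sets bit n-1
    return chr(0x2800 + mask)

def to_braille(text: str) -> str:
    """Letters become Braille cells; spaces become the blank cell.
    Anything else (digits, punctuation) is skipped for simplicity."""
    out = []
    for ch in text.lower():
        if ch == " ":
            out.append(chr(0x2800))  # empty cell
        elif ch in LETTER_DOTS:
            out.append(cell(LETTER_DOTS[ch]))
    return "".join(out)

print(to_braille("cab"))  # -> ⠉⠁⠃
```

Real-world Braille is messier than this -- practiced readers use contracted Grade 2 Braille, plus number and capital signs -- but the letter-to-dot mapping above is the standard one, and it's enough to see how a headline becomes a stream of cells coursing under a fingertip.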
But wait, it gets cooler. Voice commands to the Thimble would cue it to download information like movie times or train schedules and "display" them in Braille to the wearer's fingertip. But what about the serendipitous, non-digital discoveries that the visual world offers, like spotting a flyer for a cool-sounding band playing at the club down the street? The Thimble can handle that too: just point your finger at the flyer, and a tiny scanner uses optical character recognition to translate the text into another discreet Braille feed.
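Both features boil down to the same pipeline: some trigger (a voice command, or OCR'd text from the fingertip scanner) produces a string, and the device streams it to the tactile grid one cell at a time. A rough sketch of that dispatch loop, with everything hypothetical -- the command names, the canned responses, and the pass-through "cell" conversion all stand in for real speech recognition, web APIs, and an OCR engine:

```python
from typing import Callable, Dict, Iterator

# Hypothetical data sources a spoken command might trigger; a real
# device would hit web APIs (or hand off to an OCR engine) here.
FETCHERS: Dict[str, Callable[[], str]] = {
    "movie times": lambda: "dune at nine",
    "train schedule": lambda: "line two at noon",
}

def to_cells(text: str) -> Iterator[str]:
    # Stub: on the actual hardware, each character would become an
    # electro-tactile Braille cell; here we pass characters through.
    yield from text

def handle_command(command: str) -> Iterator[str]:
    """Fetch text for a spoken command and yield it one cell at a
    time, the way a fingertip ticker would present it."""
    fetcher = FETCHERS.get(command.lower())
    if fetcher is None:
        return  # unrecognized command: present nothing
    yield from to_cells(fetcher())

print("".join(handle_command("movie times")))  # -> dune at nine
```

The generator shape matters more than the stubs: a tactile display can only show one cell (or a short row of cells) at once, so the natural interface is a stream the firmware pulls from at whatever refresh rate a finger can comfortably read.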
I'm not blind, but if the Thimble ever actually gets put into production, I may want to learn Braille just for the invisible cyborg powers.