FingerSense Could Finally Make Touch Screens As Versatile As Desktops

Chris Harrison’s software can recognize knuckle-taps and other kinds of gestural input, unlocking new ways to interact with our mobile devices.

Ever since Apple popularized the modern touch screen five years ago with the first iPhone, smartphone input has been very Mac-like: Tapping the screen equals a “click,” and there is no equivalent of the more PC-like “right click” for directly accessing secondary menus. The “long press” (where you tap the screen and hold the tip of your finger down for an extra beat) comes close, but Chris Harrison didn’t think that was good enough–so he invented software that can distinguish different kinds of taps, such as those made with a knuckle or a fingernail. In the year since we originally wrote about that software, Harrison and his colleagues have improved the UI and spun it off into a product they’re calling FingerSense. Check it out:

FingerSense is the inaugural product of Harrison’s startup, Qeexo, and he tells Co.Design that his team is in talks with Android handset manufacturers to integrate FingerSense into their phones. “[We] see this as a system-wide functionality,” he says. “We hope to say good-bye to tap-and-hold as the unwieldy mechanism for triggering extra options.”

The catch is that FingerSense requires an extra bit of hardware in order to work–an acoustic sensor that can recognize the unique vibration patterns that distinguish among fingertip, fingernail, and knuckle taps. Which means you can’t just download FingerSense from Google Play and magically give your Galaxy Nexus a next-generation user interface–yet. “We are looking to partner with device makers to integrate this sensor, which our software needs,” Harrison explains. (In the meantime, developers can email Qeexo for an Android SDK to tinker around with.)
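The basic idea–classifying a tap by the vibration signature it leaves in an acoustic sensor–can be illustrated with a toy sketch. The feature choices, centroid values, and synthetic waveforms below are purely hypothetical stand-ins, not Qeexo’s actual method: a sharp fingernail click is assumed to ring at a higher frequency than a soft fingertip pad, and a knuckle knock to be low-frequency but loud.

```python
import math

def tap_features(samples):
    """Two crude features of a tap waveform: zero-crossing rate
    (a proxy for dominant frequency) and RMS energy (loudness)."""
    n = len(samples)
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / (n - 1)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    return (zcr, rms)

# Hypothetical (zcr, rms) centroids for each tap type -- illustrative
# values only, chosen so the toy waveforms below separate cleanly.
CENTROIDS = {
    "fingernail": (0.60, 0.12),  # high frequency, moderate energy
    "fingertip":  (0.08, 0.07),  # low frequency, quiet
    "knuckle":    (0.15, 0.29),  # low frequency, loud
}

def classify_tap(samples):
    """Nearest-centroid classification in the toy feature space."""
    zcr, rms = tap_features(samples)
    return min(CENTROIDS, key=lambda k: (CENTROIDS[k][0] - zcr) ** 2
                                        + (CENTROIDS[k][1] - rms) ** 2)

def toy_tap(freq, amp, n=200, phase=0.5):
    """Synthesize a decaying sinusoid standing in for a tap's vibration."""
    return [amp * math.exp(-3 * i / n) * math.sin(2 * math.pi * freq * i / n + phase)
            for i in range(n)]

print(classify_tap(toy_tap(freq=60, amp=0.4)))   # → fingernail
print(classify_tap(toy_tap(freq=8,  amp=0.25)))  # → fingertip
print(classify_tap(toy_tap(freq=10, amp=1.0)))   # → knuckle
```

A production system would presumably learn its decision boundaries from labeled recordings rather than hand-picked centroids, but the pipeline shape–sensor waveform, feature extraction, classifier–is the same.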

The FingerSense demo video is certainly slick. But as touch screens lose their last vestiges of futurism and become as mundane as keyboards and mice, there’s something ever-so-slightly “jetpack”-esque about the extra interactions that FingerSense provides. Like the initially-awesome-but-now-mostly-just-a-pain-in-the-ass “pinch to zoom” gesture, holding my phone in one hand while twisting my other wrist to knock on it with a fingernail or knuckle doesn’t seem any less “unwieldy” than a long press, which I can do with one thumb while walking down the street. This is why smart gestural apps like Clear offer one-thumb “fallbacks” for all of their fancier input methods: Sometimes, they’re just plain easier.


FingerSense’s two-handed touch screen input gestures seem much more useful for tablets, whose form factor assumes a more relaxed, mostly two-handed kind of interaction. The software demo even shows a stylus being used, which you’d never bother with on a phone. Then again, the flagship Android phones that Qeexo wants to integrate FingerSense into are becoming so waffle-sized that they’re more like small tablets than phones at this point anyway, so maybe it makes sense.

In any case, “a next-generation touch screen”–as Harrison bills FingerSense–will have to offer more than one way to accept input. Are fingernail-taps and knuckle-knocks going to catch on? Who knows. Interface innovations like FingerSense will, at least, give us the chance to find out.

[Read more about FingerSense.]

About the author

John Pavlus is a writer and filmmaker focusing on science, tech, and design topics. His writing has appeared in Wired, New York, Scientific American, Technology Review, BBC Future, and other outlets.
