The Next Great Display Technology? Water

An MIT prototype turns the water flowing from the faucet into a digital display.

For the past half century, displays have been made out of layers of glass, transistors, and circuitry. But as screen-based interaction consumes more and more of our daily lives, some designers are looking for ways to embed interactive displays more seamlessly into the world around us. Many of these experimental displays use organic materials to communicate information—and a growing number of them use the most fundamental material on Earth: water.


Last week at the ACM Tangible, Embedded, and Embodied Interaction conference in the Netherlands, a team from MIT’s Tangible Media Group presented a prototype display called HydroMorph, which exploits the same basic physics you’ve experienced if you’ve ever run a spoon under a faucet. Using a series of actuators and sensors placed under a stream of water, they’re able to manipulate the shapes that result when water hits the surface of the device—creating what they call a water “membrane” that can shift from a flower to a bird in an instant.

According to the authors, water is an underutilized medium in interaction design and an overlooked material for creating rich ambient interactions. Unlike traditional screens, water-based displays are ubiquitous and recyclable, and they demand far less attention from a user. Like other ambient interfaces, the idea is to create digital displays that are useful without requiring intensive focus. “We envision a world filled with living water that conveys information, supports daily life, and captivates us,” they write.

Think of the prototype as a really souped-up version of that spoon; it sits under a running tap using 10 “blockers,” or narrow pieces of plastic, to manipulate the way water bounces off the device’s surface. Below each blocker, Arduino-controlled servo motors control the movement of the individual plastic pieces. The software, built using Processing, generates shapes based on the way water reacts to the height of each blocker. Right now, the shapes the device can create are fairly crude, including stars, flowers, birds, and spheres, but the MIT team has also rigged up a camera that makes the prototype interactive, morphing based on how users play with it.
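To make the blocker mechanism concrete, here is a minimal sketch, in Python rather than the team's actual Processing code, of how a ring of blocker heights might be translated into servo angles. The function name, the servo range, and the linear height-to-radius assumption are all hypothetical illustrations, not details from the paper.

```python
# Hypothetical sketch of HydroMorph-style blocker control (not the team's
# actual software). Each of the 10 blockers sits under the water stream;
# raising a blocker deflects the membrane further outward at that angular
# position, so a target silhouette can be approximated by a ring of heights.

SERVO_MIN_DEG, SERVO_MAX_DEG = 0, 90   # assumed servo travel
NUM_BLOCKERS = 10

def heights_to_servo_angles(target_radii, r_min=1.0, r_max=5.0):
    """Map desired membrane radii (one per blocker) to servo angles.

    Assumes, purely for illustration, that membrane radius grows
    linearly with blocker height over the servo's travel.
    """
    angles = []
    for r in target_radii:
        r = min(max(r, r_min), r_max)        # clamp to the reachable range
        t = (r - r_min) / (r_max - r_min)    # normalize to 0..1
        angles.append(SERVO_MIN_DEG + t * (SERVO_MAX_DEG - SERVO_MIN_DEG))
    return angles

# A crude "star": alternate long and short radii around the ring.
star = [5.0 if i % 2 == 0 else 2.0 for i in range(NUM_BLOCKERS)]
print(heights_to_servo_angles(star))  # alternates 90.0 and 22.5
```

In the real device, each angle would be sent to an Arduino-driven servo; here the mapping is printed so the alternating pattern is visible.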

So, is this just a really innovative new fountain technology? Yes and no. While the obvious applications at this point are in public spaces, the team says that they see longer-term implications in the idea of making an organic substance, such as water, interactive.

They aren’t alone in seeing water as a viable material for interaction design, as they point out in their paper. At the same conference several years back, EMedia Research Lab presented a prototype video game controller that replaced traditional buttons and toggle sticks with a bowl containing an ounce or two of water.

And another recent project, by the Japanese studio Atelier Omoya, demonstrated a screen made out of hydrophobic material—think NeverWet—that served as a backdrop to words and images made out of carefully released water droplets.


Other prototypes skew more literal: The Japanese interaction designer Yasushi Matoba developed a device called AquaTop that uses a film of white water as an interactive screen, practically begging users to interact with the media in an extremely visceral way.

The MIT team envisions similarly screen-like applications for their prototype. “We can imagine the device working as an interactive piece of furniture in our home,” they write, communicating low-level information in an unobtrusive way, such as the weather or an alert. Or maybe it’s creative, “a playful, dynamic sculpture,” where “users may touch the water membrane with their hands or tools, blocking water flow and changing the shape.”

There are dozens of other water prototypes out there, and all of them suggest that interaction designers are only just beginning to look beyond traditional hardware materials to readily available organic substances. Alongside soft robotics, it’s an emerging genre of human-computer interaction, and these projects are just an opening salvo.


About the author

Kelsey Campbell-Dollaghan is Co.Design's deputy editor.