City Fireflies, An Urban Video Game Whose Screen Is A Building

Victor Diaz and his collaborators needed a game controller that everyone could use without requiring instructions. Answer: smartphones.

Getting computers to recognize and dynamically track moving objects is more complicated than it seems. Microsoft’s Kinect system is powered by incredibly sophisticated computer vision technology–which creative technologists can bend to myriad other purposes. But what if you’re designing a motion-centric video game that’s meant to be played in public by large groups of people who constantly come and go? It’s not practical to hand out specialized game controllers to all of them. Nor is it practical to gin up a computer-vision engine that can handle the constantly shifting, real-time computational load of identifying and tracking all the players. The creators of one such urban video game called City Fireflies cleverly sidestepped these design problems by harnessing a piece of equipment that everyone likely to play the game already carried on their person: a touchscreen smartphone.

City Fireflies is a simple game that looks like tons of fun: Players cluster in a plaza in front of a large videoscreen showing a grid of 8-bit-looking “enemy” characters, which is superimposed on a live video image of the physical plaza itself. The goal of the game is for the players to physically move around the plaza to sweep the “enemies” off screen. (Or catch them like fireflies, if you prefer.) The game was meant to be casual: easily intuited rules, instant visual feedback, and a porous structure so that players could enter and leave the gamespace at will without disrupting the gameplay. Which meant that the interaction design had to be ultra-casual as well–so transparent that the energy barrier of joining the game was reduced to nearly zero.

Victor Diaz and Sergio Galán–the creators behind City Fireflies–could have handed out Wii-like controllers to anyone who wanted to play, but that would have required some sort of instructions on how to use them (not to mention some way of making sure people didn’t walk away with them). Nor could they do without controllers entirely: without some sort of unique proxy signal for the game’s algorithms to visually track, it would be too difficult for the system to react to the physical multiplayer action in real time. “Tracking the user with cameras tends to be complicated when lots of users are playing in a reduced space and light condition varies,” Diaz writes.

Their solution was ingenious: use the players’ smartphones as the game controllers. Everybody’s got one, and everyone already knows how to use it. But even asking players to install a simple app on their phones would be too much trouble. Instead, the game designers programmed City Fireflies to track a much simpler indicator for each player: the glowing rectangle of light emitted by the smartphone screen itself.

It’s the perfect solution: No apps to install or procedures to explain. City Fireflies simply directs players to point their phones at the giant game screen and start moving around. The instant visual feedback is clear: The phone becomes your “net” for catching and removing the digital avatars onscreen. And behind the scenes, a much simpler computer vision algorithm can track a bunch of bright points (and sync them to the action onscreen), instead of struggling to separate human figures from the background, distinguish players from non-players, and the like. It’s a beautifully elegant bit of interaction design and a dastardly smart engineering hack at the same time. What more could you want from a piece of public digital art?
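For the curious, here’s a rough sketch of how that kind of bright-point tracking might look–written in Python with OpenCV, not the actual City Fireflies code. The camera index, brightness threshold, and blob-size bounds are illustrative assumptions, but the idea is the same: a phone screen held up at night is far brighter than anything else in the frame, so a simple threshold plus blob detection is enough to get one (x, y) point per player.

```python
# Hypothetical sketch of bright-screen tracking (not the City Fireflies source).
# Threshold each camera frame for very bright regions, then treat each blob's
# centroid as a player's "net" position for the game to consume.

import cv2

cap = cv2.VideoCapture(0)  # camera watching the plaza (index is an assumption)

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Phone screens are much brighter than the ambient night-time scene,
    # so a simple global threshold isolates them. 220 is an illustrative value.
    _, bright = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(bright, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    players = []
    for c in contours:
        area = cv2.contourArea(c)
        if 30 < area < 5000:  # ignore sensor noise and large light sources
            m = cv2.moments(c)
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            players.append((cx, cy))

    # `players` now holds one point per visible phone screen, ready to be
    # mapped onto the game's coordinate space and matched against the "enemies".

cap.release()
```

No per-player identification, no skeleton tracking, no calibration of individual devices–just a handful of bright dots moving across a dark plaza, which is all the game needs.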

[Read more about City Fireflies]

About the author

John Pavlus is a writer and filmmaker focusing on science, tech, and design topics. His writing has appeared in Wired, New York, Scientific American, Technology Review, BBC Future, and other outlets.
