Motion-Capture Invention Promises Weta Digital Effects on a YouTube Budget

Motion capture is a staple of movie-making. It's also extremely expensive: the required equipment can cost hundreds of thousands of dollars. But an MIT professor has invented an entirely new system that costs a fraction as much.

Usually, a person being captured has to wear reflective markers, such as little white ping-pong balls, which let a camera record the light they bounce back. Second Skin, created by a team led by Ramesh Raskar, flips the process: the person wears tiny photosensors that pick up near-infrared light beamed from off-the-shelf projectors costing about $50 each. Because the projected light pattern differs slightly at every point in space, each sensor can compute its precise position roughly 500 times per second and transmit the data to a computer via Bluetooth. Each sensor costs just $2, and the entire system can be built for as little as $1,000.
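
To make that concrete, here's a minimal sketch of how a single photosensor could recover its own position from temporally coded light. The article doesn't spell out the encoding, so this assumes the projector cycles through Gray-code patterns, a standard structured-light technique; the function names are illustrative, not taken from the actual Second Skin hardware.

```python
# Sketch: a photosensor recovering its position from temporally coded
# light. Assumes the projector sweeps Gray-code patterns (a common
# structured-light scheme); nothing here is from the real hardware.

def gray_to_binary(gray: int) -> int:
    """Convert a Gray-coded integer to an ordinary binary index."""
    binary = gray
    while gray:
        gray >>= 1
        binary ^= gray
    return binary

def decode_position(samples: list[bool]) -> int:
    """Each sample is one light/dark reading; together they spell out
    the sensor's position index in Gray code, most significant bit first."""
    gray = 0
    for bit in samples:
        gray = (gray << 1) | int(bit)
    return gray_to_binary(gray)

# Example: 10 light/dark readings distinguish 2**10 = 1024 positions.
readings = [True, False, True, True, False, False, True, False, True, False]
print(decode_position(readings))  # -> position index along one projector axis
```

Two such sweeps, one per projector axis, would pin down a sensor in 2D; repeating the whole cycle hundreds of times a second yields the update rate the researchers describe.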

The applications of this technology are vast. For one, the movie industry usually has to break up its production cycle and move to a special set whenever motion capture is called for. Second Skin, by contrast, works in broad daylight as well as at night, because it operates in the near-infrared spectrum, so it could fold into production schedules seamlessly. Video game creators, who are using mo-cap in more and more of their productions, stand to benefit as well.

But Raskar has already moved beyond film production: he has outfitted the sensors with small vibrating buzz packs and created a tai chi training program. When a move is off, the errant sensors buzz, letting the wearer know a correction is in order. Now imagine this technology in the hands of Nintendo, or a gestural-interface designer like Jeff Han.
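
Under the hood, that feedback loop could be as simple as comparing each sensor's live position against a pre-recorded reference pose. Here's a sketch of the idea; the 50 mm tolerance and the buzz output are invented placeholders, not details from Raskar's demo.

```python
# Sketch: a buzz-feedback training loop of the kind the tai chi demo
# suggests. Flags any sensor that drifts too far from a reference pose.
# The threshold and the "buzz" action are hypothetical placeholders.

import math

THRESHOLD_MM = 50.0  # assumed tolerance before feedback fires

def check_pose(live: dict[str, tuple], reference: dict[str, tuple]) -> list[str]:
    """Return the names of sensors that have drifted past the tolerance."""
    return [name for name, pos in live.items()
            if math.dist(pos, reference[name]) > THRESHOLD_MM]

# Example frame: the left wrist has drifted 80 mm from the reference.
live = {"left_wrist": (100.0, 250.0, 80.0), "right_wrist": (400.0, 250.0, 80.0)}
ref  = {"left_wrist": (100.0, 330.0, 80.0), "right_wrist": (400.0, 252.0, 80.0)}

for sensor in check_pose(live, ref):
    print(f"buzz {sensor}")  # in hardware, this would fire the buzz pack
```

A real trainer would presumably compare whole trajectories over time rather than single frames, but a per-sensor threshold like this captures the basic mechanic.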

Amateur filmmakers should be agog. It's already possible to create a convincing knock-off prequel to Peter Jackson's The Lord of the Rings for just a few thousand dollars. Imagine what could be done with Second Skin. Indie movies, meanwhile, might now have the option of full-blown Hollywood-style effects.

[Via MIT Tech Review]
