Wearable cameras like GoPro and Google Glass will only become more common in years to come. The only problem? Latched to our dumb bodies, they make everything look like it was shot during the Running of the Bulls, just without any of the excitement of, you know, running with bulls. Microsoft’s got some wild new software to fix that.
A team of researchers recently demoed an application that smooths out jerky, first-person video into fantastically smooth hyperlapse clips. The results look like something you’d expect from a camera attached to a drone, not a clumsy biped.
As the researchers show in this clip, standard image stabilization doesn’t work when you speed up first-person video. The new approach is much more sophisticated. First, an algorithm analyzes the frames of the original video and reconstructs the photographer’s path through three-dimensional space. Then it computes a new, smooth camera path that stays close to the original one. Finally, by warping the perspective of some of the original frames and stitching and blending them together, it produces a fluid, fast-paced new clip. The researchers say they’re working on packaging the technique as a Windows application for the public.
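To give a rough feel for the second step, here is a deliberately simplified sketch. The actual Microsoft system reconstructs full camera poses and solves an optimization problem for the new path; this toy version just smooths a jittery sequence of 3D camera positions with a moving average, which illustrates the idea of deriving a calmer path from a shaky one. The function name and data are invented for the example.

```python
def smooth_path(points, window=5):
    """Smooth a list of (x, y, z) camera positions with a moving average.

    Toy stand-in for the path-optimization step: the real hyperlapse
    pipeline solves for a smooth 6-DoF camera trajectory, not just
    averaged positions.
    """
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo = max(0, i - half)
        hi = min(len(points), i + half + 1)
        chunk = points[lo:hi]
        # Average each coordinate over the local window of frames.
        smoothed.append(tuple(
            sum(p[k] for p in chunk) / len(chunk) for k in range(3)
        ))
    return smoothed

# Simulated shaky walk: steady progress along x, head-bob jitter in y.
raw = [(float(i), (-1) ** i * 0.5, 0.0) for i in range(10)]
smooth = smooth_path(raw)
```

Running this on the simulated walk, the smoothed path keeps the forward motion along x but the up-and-down jitter in y shrinks dramatically, which is exactly the effect you see in the demo footage.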
It’s all very impressive stuff. But it’s also kind of strange, in the sense that these algorithms are transforming a very real, first-person document of an experience into something wholly artificial. The results may feel more “natural” than the original, unstabilized videos, but they’re not. In fact, they’re the opposite. Here we’re not seeing our world so much as a rigorous reconstruction of it, with familiar sights stretched around and stitched back together as subjects for a virtual camera. The software algorithmically crunches ho-hum GoPro footage into dreams that we can send to our friends and upload to YouTube.