ACADEMIA
NYU researchers reveal how aquatic Olympic gold is captured
Computer scientists have isolated the movements of Olympic swimmers and divers through a cutting-edge technique that reveals their motions above and below the water’s surface.
The work, conducted by Manhattan Mocap LLC together with New York University’s Movement Laboratory and The New York Times, analyzes Dana Vollmer, who won three gold medals at the 2012 Summer Olympics in London, as well as Abby Johnston, who won a silver medal in synchronized diving, and Nicholas McCrory, a bronze medalist in synchronized diving.
The research team, headed by Chris Bregler, a professor in NYU’s Courant Institute of Mathematical Sciences, followed these athletes during their training in pools across the United States this spring, deploying groundbreaking motion-capture techniques to capture their movements above and below the water’s surface.
Of particular note is the team’s creation of AquaCap™, a system that captures underwater motion. It was used to display Vollmer’s butterfly stroke and underwater dolphin kick, breaking down the technique the swimmer used to win the gold medal in the 100-meter butterfly in world-record time. Through a comparison of motions, the video illustrates how closely Vollmer’s kick resembles that of a dolphin swimming through the water.
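As a rough illustration of what “a comparison of motions” can mean computationally, the sketch below scores two kick traces with a normalized cross-correlation. Everything in it is an assumption made for illustration: the signals are synthetic sine waves standing in for vertical displacement over time, and the source does not describe how the AquaCap comparison was actually performed.

```python
# Illustrative sketch: compare two motion traces with normalized
# cross-correlation. NOT the AquaCap method, which is unpublished here.
import numpy as np

def similarity(kick_a: np.ndarray, kick_b: np.ndarray) -> float:
    """Zero-lag normalized cross-correlation of two equal-length signals.

    Returns a value in [-1, 1]; values near 1 mean the two motions
    follow nearly the same waveform.
    """
    a = (kick_a - kick_a.mean()) / kick_a.std()
    b = (kick_b - kick_b.mean()) / kick_b.std()
    return float(np.dot(a, b) / len(a))

# Hypothetical example: a swimmer's dolphin kick vs. a dolphin's tail
# stroke, both resampled to 200 frames of vertical displacement.
t = np.linspace(0, 4 * np.pi, 200)
swimmer = np.sin(t)
dolphin = np.sin(t + 0.1)  # nearly in phase with the swimmer's kick
print(f"similarity: {similarity(swimmer, dolphin):.3f}")
```

A score near 1 indicates two nearly identical waveforms, which is the intuition behind saying a swimmer’s kick “resembles” a dolphin’s.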
Subsequent work analyzed Johnston and McCrory, showing from previously unseen angles their somersaults from the 3-meter springboard and 10-meter platform and marking another technical breakthrough in motion capture.
Motion capture traditionally records the movements of performers wearing suits covered in light-reflecting markers, then translates those movements into the digital models used for 3D animation in video games and movies such as “Avatar” and “Iron Man.” Bregler and his team used a more sophisticated computer-vision technology that tracks and records these movements directly from video, without the use of motion-capture suits.
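To make the markerless idea concrete, here is a minimal sketch of tracking motion straight from ordinary video using optical flow. It relies on the open-source OpenCV library; the input file name and tracking parameters are illustrative assumptions, and this generic technique stands in for, but is not, the team’s more sophisticated system.

```python
# A minimal markerless-tracking sketch: follow feature points across
# video frames with Lucas-Kanade optical flow. This is a generic
# computer-vision technique, NOT the NYU team's actual method.
import cv2
import numpy as np

cap = cv2.VideoCapture("swimmer.mp4")  # hypothetical input clip
ok, first = cap.read()
prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)

# Pick strong corner-like points (e.g., on the athlete's body outline).
points = cv2.goodFeaturesToTrack(
    prev_gray, maxCorners=100, qualityLevel=0.3, minDistance=7)
trajectories = [[tuple(p)] for p in points.reshape(-1, 2)]

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Estimate where each tracked point moved in the new frame.
    new_points, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, gray, points, None)
    for traj, p, found in zip(trajectories,
                              new_points.reshape(-1, 2),
                              status.ravel()):
        if found:  # keep only points the tracker re-found
            traj.append(tuple(p))
    prev_gray, points = gray, new_points

cap.release()
# Each entry of 'trajectories' is now a per-point motion path that a
# downstream system could lift into a 3D skeletal model.
```

In practice, research systems layer pose models and multi-camera geometry on top of raw point tracks like these, but the core idea of recovering motion directly from pixels is the same.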
Manhattan Mocap LLC is a new spinoff of the NYU Movement Lab, a leader in motion capture for sports, entertainment, and scientific research. Previous projects include an examination of baseball pitcher Mariano Rivera’s delivery and a gesture analysis of New York Philharmonic conductor Alan Gilbert. The lab has also developed a method, titled “GreenDot,” for identifying and comparing the body language of different speakers, known as “body signatures”; the project employs motion capture, pattern recognition, and “Intrinsic Biometrics” techniques. In 2008, the NYU Movement Lab’s results showed that actress Tina Fey, widely praised for imitating Republican vice-presidential nominee Sarah Palin’s voice and appearance, also effectively channeled the former Alaska governor’s body language.
For more about Manhattan Mocap LLC, go to http://manhattanmocap.com/; for more about the NYU Movement Lab, go to http://movement.nyu.edu/.