How ILM Used the Clauss RODEON VR for Marvel’s The Avengers

One of the biggest summer blockbuster movies of 2012 was Marvel’s The Avengers, produced by Marvel Studios and distributed by Walt Disney Studios Motion Pictures.

Although roughly a third of "The Avengers" is set in New York City, precious little of it was actually filmed there. Industrial Light & Magic (ILM) crew members, led by visual effects supervisor Jeff White, photographed seven miles of city streets (an area roughly ten blocks by four blocks) from a variety of heights and at different times of day, capturing 250,000 images in total. The crew then digitally recreated roughly 20 square blocks of midtown Manhattan using a variety of techniques. Live-action shoots were held primarily on sound stages in New Mexico and in the streets of Cleveland.

The Clauss RODEON VR panohead was used to capture and assemble some 1,800 360-degree pano-spheres. The images above (captured from the video below at the 1:10 mark) show the Clauss RODEON VR and a laptop running RODEONpreview.

To rebuild the streets, ILM sent a team into New York to photograph the city. “It was the ultimate culmination of building on the virtual background technology (at ILM),” says White. “We started with the biggest photography shoot I know we have done here, which was 8 weeks with four photographers out in the streets of New York.” The team shot some 1,800 360-degree pano-spheres of New York using the Canon 1D with a 50mm lens – tiled, and as HDR bracketed sets.
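To get a feel for why each pano-sphere took so long to shoot, it helps to estimate how many tiles a 50mm lens needs to cover a full sphere. The sketch below is an illustrative back-of-the-envelope calculation, not ILM's actual pipeline: the sensor dimensions (27.9 × 18.6 mm, the APS-H format used by the 1D series of that era), the 25% tile overlap, and the function names are all assumptions.

```python
import math

def fov_deg(sensor_mm, focal_mm):
    """Angular field of view along one sensor axis, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

def tile_counts(h_fov_deg, v_fov_deg, overlap=0.25):
    """Columns x rows of tiles needed to cover a full 360 x 180 degree sphere,
    assuming each neighbouring tile overlaps by the given fraction."""
    step_h = h_fov_deg * (1 - overlap)
    step_v = v_fov_deg * (1 - overlap)
    cols = math.ceil(360 / step_h)
    rows = math.ceil(180 / step_v)
    return cols, rows

# Assumed APS-H sensor (27.9 x 18.6 mm) behind a 50mm lens:
cols, rows = tile_counts(fov_deg(27.9, 50), fov_deg(18.6, 50))
tiles = cols * rows              # frames per exposure bracket
frames = tiles * 5               # e.g. a hypothetical 5-stop HDR bracket
print(cols, rows, tiles, frames)  # → 16 12 192 960
```

Nearly a thousand frames per position (under these assumptions) makes it clear why wind and shifting sunlight were real constraints on each capture.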

The ILM team worked their way down the streets at street level, shooting every 100 feet, then covered the same streets again from a man lift at a height of 120 ft, and finally shot from every building rooftop. But given how long it takes to capture a bracketed, tiled set of images, especially in sometimes high wind, it was not as simple as just moving down a street. By the time the team reached the end of a street, the sun would have shifted completely, leaving one end of the street lit by morning sun and the other end lit from the opposite side by afternoon light. To allow for this, the team had to zigzag across New York and try to capture most of the streets at roughly the same time of day, but on different days. Even this was greatly complicated by changing weather.

Once all the images were captured, tiled and combined, another team set to removing (painting out) all the ground-level people, cars and objects. A third team then re-populated the streets with digital assets ready to be seen in perspective or perhaps blown up. ILM came up with complex algorithmic traffic scripts to populate the streets and create the sort of traffic gridlock any real-world incident like this would naturally cause.

– quote from fxguide article by Mike Seymour
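The core idea behind a procedural gridlock script like the one the quote describes can be sketched very simply: place cars along each street with gaps that shrink as you approach the incident, plus a little randomness so the rows don't look synthetic. The sketch below is a minimal illustration of that idea under assumed parameters; the function name, gap values, and jitter range are all hypothetical, not ILM's tool.

```python
import random

def place_cars(street_len_ft, incident_pos_ft,
               min_gap=8.0, max_gap=60.0, seed=1):
    """Place cars along one lane of a street.

    Gaps between cars grow with distance from the incident, so traffic
    bunches into gridlock near it and thins out farther away.
    Returns car positions (feet) in increasing order.
    """
    rng = random.Random(seed)
    cars, pos = [], 0.0
    while pos < street_len_ft:
        # 0.0 at the incident, approaching 1.0 at the far end
        dist = min(abs(pos - incident_pos_ft) / street_len_ft, 1.0)
        gap = min_gap + (max_gap - min_gap) * dist
        pos += gap * rng.uniform(0.8, 1.2)  # jitter breaks up regular spacing
        cars.append(pos)
    return cars

# Incident at the near end of a 1000 ft street: dense at 0, sparse at 1000.
cars = place_cars(1000.0, 0.0)
```

Run per lane over a whole street grid, a few lines like this are enough to fill blocks with plausibly jammed traffic radiating out from the event.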