The Microsoft Mixed Reality Capture Studios allow human performances to be recorded from every angle using an array of cameras on a stage. Once a performance is captured, it can be "played back" in realtime as an animated 3D mesh and viewed from any angle, using a game engine like Unity or Unreal. Virtual and augmented reality applications help deliver this volumetric data in even more engaging ways.

To put these performances into a real-world setting with the same level of realism, we built a 3D environment that provides true parallax on foreground and midground elements and blends seamlessly with a 360 background video.
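Splitting the scene this way works because apparent parallax shrinks with distance: a small head movement shifts nearby geometry noticeably but barely moves anything far away, so distant scenery can live on a flat video sphere. A minimal sketch of that falloff (the distances and head-movement figure are illustrative assumptions, not measurements from this shoot):

```python
import math

def parallax_deg(viewer_shift_m: float, distance_m: float) -> float:
    """Angular shift (degrees) of an object at distance_m when the
    viewer moves sideways by viewer_shift_m."""
    return math.degrees(math.atan2(viewer_shift_m, distance_m))

# Hypothetical 0.5 m head movement inside a room-scale play space.
for d in (2, 10, 100):  # foreground, midground, distant background (metres)
    print(f"{d:>4} m: {parallax_deg(0.5, d):5.2f} deg")
```

At 100 metres the shift is a fraction of a degree, which is why only the foreground and midground need real 3D geometry.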

Seattle's Great Wheel boardwalk was chosen as the setting where a previous stage capture of two breakdancers would perform their moves. To capture the background plate without occlusion issues (note the potted trees covering the wheel, for example), we lofted a 360 camera rig 20 feet in the air on a light stand.

The output from the camera rig after stitching and color correction. This equirectangular video served as the distant background, where parallax was minimal within the viewer's range of movement.
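An equirectangular image maps longitude and latitude linearly to pixel coordinates, which is how a game engine samples the video when it is wrapped onto a sky sphere. A minimal sketch of that lookup, assuming a common +y-up, -z-forward axis convention (not necessarily the one used in this project):

```python
import math

def dir_to_equirect_uv(x: float, y: float, z: float) -> tuple[float, float]:
    """Map a unit view direction to (u, v) texture coordinates in an
    equirectangular image. u wraps longitude (0..1 around the horizon);
    v runs from 0 at the top pole to 1 at the bottom."""
    lon = math.atan2(x, -z)                   # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2
    u = lon / (2 * math.pi) + 0.5
    v = 0.5 - lat / math.pi
    return u, v

# Looking straight ahead samples the centre of the image.
print(dir_to_equirect_uv(0.0, 0.0, -1.0))  # → (0.5, 0.5)
```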

For the foreground and midground elements, a full 3D digital set was built using photogrammetry and custom-built software tools.

The 3D set gets a pass in Mari. Source photography from the location shoot is used for the base texture.

The breakdancers and environment are assembled in Unity. The wireframe view shows the breakdancers' 3D geometry as it runs in a virtual reality headset.

The final result as viewed from the HTC Vive headset. The breakdancers are re-lit using high dynamic range photography from the set shoot. Cast and contact shadows are added to the ground of the digital set.

© 2019 Glyph Software, LLC. All rights reserved.