Virtual reality is usually an isolated individual experience, very different from the shared group experience of a movie screen or even a living room TV. But those worlds of entertainment are more closely intertwined than most audiences realize. Video game engines have been playing a growing role behind the scenes in film and television production, and now they're stepping out in front of the camera in a big way in the making of The Mandalorian TV series.
Big in this case is a three-quarters cylindrical LED array 75 ft (23 m) in diameter and 20 ft (6 m) high. But the LEDs covering its walls and ceiling aren't pointing outward like some installation in Times Square. This setup, called the Volume, points inward to display background images for the camera and crew working within it. It's an immersive LED backdrop and stage environment.
Incorporating projected imagery on a stage is a technique going back at least to 1933's King Kong, but it is very limited: lighting and camera motion have to be heavily constrained to avoid breaking the fragile illusion. More recently, productions have favored green screens replaced with computer imagery in post-production. That removes most camera-motion and lighting constraints, but it costs a lot of money and time, and it is harder for actors to perform convincingly against big blank slabs of green. The Volume solves all of those problems by putting computer-generated imagery on set, rendered in real time by the Unreal game engine.
Lighting is adjusted to blend with the physical set pieces within, taking advantage of dynamic lighting capabilities recently developed for realistic games. 3D position trackers, conceptually similar to those on a VR headset, are attached to the primary camera. By tracking the camera's motion precisely, Unreal Engine ensures the part of the Volume seen by the camera (the viewing frustum) is rendered with the perspective necessary to maintain the illusion no matter how the camera is aimed. It is an effect best seen in motion, starting with The Virtual Production of The Mandalorian, a short four-minute clip ILMVFX released on YouTube. (Embedded below.) The Volume is also the star of the 22-minute episode 4: Technology of Disney Gallery: Star Wars The Mandalorian. (Disney Plus subscription required.)
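Neither ILM nor Epic has published the actual code, but the core idea, rendering an off-axis (asymmetric) frustum whose apex sits at the tracked camera, is the same math long used for head-tracked projection and CAVE displays. Here's a rough numpy sketch of that projection; the function name, panel corners, and numbers are purely illustrative assumptions, not anything from StageCraft:

```python
# A minimal sketch (not ILM's code) of the perspective trick behind the Volume:
# given a tracked camera position and the corners of one flat LED panel, compute
# the asymmetric (off-axis) frustum that keeps the rendered background in
# correct perspective for that camera.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def off_axis_projection(eye, pa, pb, pc, near=0.1, far=1000.0):
    """eye: tracked camera position; pa, pb, pc: panel lower-left,
    lower-right, and upper-left corners, all in world coordinates (meters)."""
    vr = normalize(pb - pa)            # panel right axis
    vu = normalize(pc - pa)            # panel up axis
    vn = normalize(np.cross(vr, vu))   # panel normal, pointing toward the eye

    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                # distance from eye to panel plane

    # Frustum extents on the near plane (asymmetric whenever the camera is off-center)
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix
    proj = np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,            -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,            -1,                     0]])

    # Rotate the world into the panel's frame, then move the eye to the origin
    rot = np.eye(4)
    rot[0, :3], rot[1, :3], rot[2, :3] = vr, vu, vn
    trans = np.eye(4)
    trans[:3, 3] = -eye
    return proj @ rot @ trans

# Illustrative example: a 5 m wide, 3 m tall panel with the camera 4 m away, 1 m off-center
view_proj = off_axis_projection(eye=np.array([1.0, 1.5, 4.0]),
                                pa=np.array([-2.5, 0.0, 0.0]),
                                pb=np.array([ 2.5, 0.0, 0.0]),
                                pc=np.array([-2.5, 3.0, 0.0]))
```

Recomputing this frustum every frame from the tracker data is what keeps the background's perspective locked to the camera no matter where it moves.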
The amount of money spent to develop and build the Volume isn't discussed, but the expensive up-front cost is presumably expected to be paid back in the form of faster and cheaper production over the long run, making this research project consistent with others under the ILM StageCraft umbrella. That makes sense for a company running a streaming service that requires a constant feed of fresh content to keep subscribers on board. By taking an engine built for realistic VR games and adapting it to television production, the Volume opens up options that were previously the exclusive domain of big-budget blockbusters. And while the Volume itself required deep Disney pockets, the technique is accessible to far lower budgets. A demonstration clip released by Unreal Engine illustrates a much smaller-scale application for a hypothetical motorcycle commercial.
But as great as it looks, and however many constraints it removes, the Volume still has constraints of its own. For one, its LED array's resolution is not high enough to take center stage in today's 4K HDR production flow, relegating it to out-of-focus mid and long distances to avoid moire effects. The people who built the Volume say they expect it to be only the first version in a long evolution, and they invite others to experiment with this new idea and move the industry forward together. We anticipate there are indie filmmakers already working out how to implement this concept on their smaller-than-Disney budgets, and they'll need to recruit hacker friends familiar with LEDs and 3D tracking electronics to make it happen. We can't wait to see the results.
VR is cool and all, but all I expect from it currently is cool video games
Everything else is a bonus
Honestly, I hope it eventually removes the need to take a trip to work. Imagine the future: you're a miner, and instead of having to take a plane trip every couple of weeks on a fly-in fly-out basis, you pop on the VR headset, put on some haptic gloves, and remote-control some humanoid drones, or sit in a virtual crew cabin on the other side of the country. You could also do stuff in low Earth orbit as well; there may be no need to send people up there if they can just remote-control things from the ground.
Interesting timing on this article; I am currently in the Unreal Engine training fellowship. It's pretty intense, even considering I already have some familiarity with CGI digital creation programs.
Regarding using Unreal Engine for low-budget independent filmmakers, it is quite possible. I found this presentation interesting, from a company that is already doing it in a limited way:
https://www.youtube.com/watch?v=qFGbgmmLRgQ
That’s a great video for showing how the concept works in a smaller-scale production, thank you!
As a bonus, the video thumbnail/preview image illustrates the (usually undesired) moire effect I mentioned in the final paragraph.
When two dot grids interfere,
That’s a moire!
I have imagined something similar, though it would use rear/short-throw projection to build a cube-shaped room that becomes whatever you want to see. The floor would probably have to be a large LED array with a good diffuser to remove the grid lines, but projection for the walls and ceiling. Cameras, either looking through the rear projection screens or through very small holes in the corners, would look for an IR LED worn on a hat. This would allow the imagery to move with the user’s changing perspective as they moved about the room. While I guess you could use it for filming, my intention was a place of relaxation and contemplation, an extension of a quiet place to think in your home.
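The IR-LED tracking piece of that idea is the easy part with off-the-shelf tools. As a purely hypothetical sketch (my own assumption of how one might do it with OpenCV, not something from the comment above), each camera just has to find the bright spot in an IR-filtered frame:

```python
# Hypothetical sketch of the IR-LED head-tracking step: threshold an IR camera
# frame and take the centroid of the bright region where the LED appears.
import cv2

def track_ir_led(frame_gray):
    """frame_gray: 8-bit grayscale frame from an IR-filtered camera.
    Returns (x, y) pixel coordinates of the LED, or None if it isn't visible."""
    _, mask = cv2.threshold(frame_gray, 220, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return None  # no bright blob in view
    return (moments["m10"] / moments["m00"], moments["m01"] / moments["m00"])
```

With two or more cameras seeing the same LED, those pixel coordinates can be triangulated into a 3D head position, which would then drive the same kind of off-axis projection the Volume uses for its tracked camera.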
I imagined something very similar to what you describe, a few years back before the Oculus was a thing and everyone was focusing on head tracking for VR (thanks to the backwards Wiimote thing). Four back projected screens, tracking markers on the user’s head, and stereoscopic projection with circular polarisation or LCD shutter glasses = basically a holodeck.
It actually already looks like quite a peaceful place to spend time, at least when the background is an open landscape (as in the image at the top of the page).
I suspect that when the rendered view is very ‘far away’, you probably wouldn’t notice any parallax issues and your brain would feel that it was real.
In this case, I think the floor is ‘real’, in that it’s conventionally created scenery (eg a layer of sand).
I had no idea something like this was actually being used in real production. That is super wild!
It’s already been done for Oblivion in 2013: https://www.youtube.com/watch?v=KmNxwWdG4m0
Now that’s a big CAVE! https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment
…and now imagine if the screens and system could do 120 fps, so that they could generate alternating views at 60 fps, genlocked to left and right cameras, ending up with a 3D stereo feed.
That’s been a thing for almost 20 years already: https://www.youtube.com/watch?v=-Sf6bJjwSCE
Why even bother writing text for this visual masterpiece? Just include the video.
I am wondering: those LED screens generate a lot of heat, and many of the LED walls you find in stores or on buildings produce a lot of fan noise to keep them cool.
How do they cool such an enormous screen without the sound of a jet airliner coming from the screens?
You can use large fans that move a large volume of air without having to spin very quickly, or a forced-air or liquid cooling system to move the cooling problem elsewhere. Even so, it may be that all the audio is dubbed in post anyway.
I wonder if using the Unreal Game Engine to film TV will have the same negative effects on some folks (migraines, nausea, seizures, etc.) that it does when used in video games.
Source: Me.
In my own experience, most games built on the Unreal Engine play havoc with my epileptic brain.
The technology used to generate the images isn’t what makes you sick. Rather, it is the selection of images themselves, and that is up to the director.