An old adage says that out of cheap, fast, and good, you can choose two. So if you’re like [Philip Moss] and trying to make a comedy series rapidly on a limited budget, you’ll have to take some shortcuts for it to still be good. One shortcut [Philip] took was to do away with the set and make it all virtual.
If you’ve heard about the production of a certain western-style space cowboy show that uses a virtual set, you probably know what [Philip] did. But for those who haven’t been following along, the idea is to have a massive LED wall and to track where the camera is. By creating a 3D set, you can render it to the LED wall so that the perspective is correct for the camera. While a giant LED wall was a little out of [Philip]’s budget, good old green screen fabric wasn’t. The idea was to set up a large green screen backdrop, put in some props, grab some assets online, and film the different shots needed. The camera’s position in the virtual room is tracked, so things like calculating perspective are easy. They also had large ArUco tags to help Unreal know where objects are. You can put a virtual wall right where the actors think there’s a wall, or a table exactly where you placed a table covered in green cloth.
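For the curious, here’s a minimal sketch of the kind of tag detection involved, using OpenCV’s ArUco module. The dictionary choice, marker size, and calibration values below are illustrative assumptions, not details from [Philip]’s pipeline (which handles this inside Unreal):

```python
# Minimal sketch: find ArUco tags in a frame and estimate where each tag
# sits in camera space. Assumes OpenCV 4.7+ with the aruco module.
import cv2
import numpy as np

MARKER_SIZE = 0.20  # printed tag edge length in metres (assumption)

# 3D corners of a square tag centred on its own origin, in the corner
# order ArUco reports them (top-left, top-right, bottom-right, bottom-left)
object_points = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0],
], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("set_frame.png")  # one frame from the studio camera
corners, ids, _rejected = detector.detectMarkers(frame)

# Placeholder intrinsics -- use your real camera calibration here
camera_matrix = np.array([[1000.0, 0.0, 640.0],
                          [0.0, 1000.0, 360.0],
                          [0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)

if ids is not None:
    for tag_corners, tag_id in zip(corners, ids.flatten()):
        # solvePnP gives the tag's pose in camera coordinates; this is the
        # transform you'd hand to the engine to pin a virtual wall or table
        # to where the green-draped prop actually stands.
        ok, rvec, tvec = cv2.solvePnP(
            object_points, tag_corners.reshape(4, 2).astype(np.float32),
            camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            print(f"tag {tag_id}: position in camera space = {tvec.ravel()}")
```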
Initially, the camera was tracked using a Vive tracker and LiveLink, though the tracking wasn’t smooth enough while moving to be used for anything but static shots. This wasn’t a huge setback, as they could move the camera, start a new shot, and not have to change the set in Unreal or fiddle with the compositing. Later on, they switched from the Vive to a RealSense tracking camera and found it much smoother, though it did tend to drift.
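For reference, reading a 6-DoF pose out of a RealSense tracking camera (such as a T265) looks roughly like this with the official pyrealsense2 bindings. The exponential smoothing is an illustrative assumption on my part, not [Philip]’s actual filtering:

```python
# Minimal sketch: stream camera pose from a RealSense tracking camera
# and apply light exponential smoothing to the position.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.pose)  # 6-DoF pose stream
pipeline.start(config)

alpha = 0.2      # smoothing weight (assumption; tune to taste)
smoothed = None

try:
    while True:
        frames = pipeline.wait_for_frames()
        pose_frame = frames.get_pose_frame()
        if not pose_frame:
            continue
        data = pose_frame.get_pose_data()
        raw = (data.translation.x, data.translation.y, data.translation.z)
        # Smoothing hides high-frequency jitter, but (as noted above)
        # does nothing about slow drift.
        if smoothed is None:
            smoothed = raw
        else:
            smoothed = tuple(alpha * r + (1 - alpha) * s
                             for r, s in zip(raw, smoothed))
        print(f"camera position: {smoothed}")
finally:
    pipeline.stop()
```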
The end result, called ‘Age of Outrage’, was pretty darn good. Sure, it’s not perfect, but it doesn’t jump out and scream “rendered set!” the way CGI TV shows of the ’90s did. Not too shabby, considering the hardware and software used to create it!
The video accompanying this article was much more fun to watch than I originally expected. We may really be “living” in a simulation or the Matrix.
One philosopher stated that since it is objectively impossible to tell whether we’re living in a simulation or not, you should just pick whichever answer you feel better about and go with it.
I take it that “philosopher” is someone like Nick Bostrom? Always seemed like a bad take to me. The world being “virtual” only has meaning if there’s a “real world” to compare it to. In the absence of evidence of the latter, it makes more sense to me to treat this world as “real” until proven otherwise.
My take is that it’s probably wise not to pick at the threads of reality too much as it might be the nice comfy hammock suspending you over the abyss.
Dude, this is poetry.
I’m stealing it.
Have you read the “Magic 2.0” book series? Different people poking around the web find a massive text file which their computers shouldn’t be able to open, yet it opens instantly and they can quickly search through it. Once they find it has names of people, they of course look for themselves.
They find that there are all kinds of parameters in their entries, along with everything they own, including bank accounts, and data on all of it. Then they usually try editing it…
But the file doesn’t just exist on the internet. Some people have found it somehow with a Commodore 64 or VIC-20. One guy in the series found it on a 9 track tape in 1973.
The text file *defines reality* and for some the power to edit the world becomes a bit too tempting.
Who is the dreamer and who is the dream?
This is just the beginning. The future is going to make it easy to manipulate actors in post-production (poses, etc.), change what actors look like (virtual makeup and clothing included), and change what they sound like (voice replacement complete with accents). A few talented actors, a camera, and green cloth will be enough to make entire movies that seemingly have dozens of people involved. Post-production is where a few people talking to each other on camera will be turned into movies of every genre.
Have a look at Unreal’s MetaHumans. That future is already here.
I can’t wait to see a digital Bogart or Edward G. in a contemporary gangster film.
Yup, it’s one of the tricks pulled off with VFX software and game engines. I believe it was The Weather Channel that did a good demonstration of game engines on set.
Why didn’t they use Blender? Trying to composite video in Unreal is just weird.
I don’t see what Unreal brings to this workflow.
So, the school of thought here is that you can do more prep on set for less time in post, rather than less prep on set and more time in post. With that in mind, the benefits of putting time on set with virtual production are: live tracking (little to none needed in post); the creatives can review shots combined with backgrounds right after shooting (to check whether the lighting immerses the actors and real props in the scene); and Unreal can sync timecode with audio for quick turnaround with editors and faster ideation. Later, with a bigger budget and some success, a studio can integrate LED walls or other lighting techniques such as DMX for better immersion with the actors, and the director and DP can dial in their scenes in real time. Other renderers may only do offline rendering, or can’t combine everything above in real time. Blender may one day do all of this too, but I don’t think it can at the time of this reply. If this doesn’t help, then my best suggestion is to try it out yourself with a small DIY setup and see if it works for you? Cheers.