At the moment, virtual production with all its advantages is reserved for films with large budgets, because the technology is extremely expensive. Particularly costly are the huge MicroLED walls (such as Sony's Crystal LED or Samsung's The Wall systems), assembled from special modules, on which the virtual backgrounds are rendered in real time with correct perspective.
To make virtual film production feasible for teams with smaller budgets, the research project "Virtual Film Playground" was started at the Ostwestfalen-Lippe University of Applied Sciences. There, a team led by Stephan Schuh and Jennifer Meier is developing alternative techniques to replace the large MicroLED walls with commercially available monitors.
To avoid being too restricted during camera work by the rather small size of a monitor compared to virtual backgrounds (or even traditional green screens), a special trick is used: the monitor showing the virtual background can be moved in sync with the camera, while the Unreal Engine renders the perspective-correct background in real time (just as in the "big" virtual productions). When the camera moves toward or away from the monitor, the background image is correctly rendered smaller or larger. In this way, the usable viewing angle during pans and camera moves is increased considerably.
This is made possible by equipping both the camera and the mobile display with an HTC Vive tracker, each of which sends its position in space to a PC, which then continuously adjusts the rendering using the team's custom code for the Unreal Engine.
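The core of such a tracked-screen setup is an off-axis ("generalized perspective") projection: from the tracked positions of the camera and the monitor's corners, a frustum is computed so that the image on the screen looks correct from the camera's point of view. The sketch below illustrates this standard technique; the function and values are illustrative assumptions, not the project's actual Unreal Engine code.

```python
# Off-axis frustum from a tracked camera position and screen corners,
# all in a shared world coordinate system (e.g. from the Vive trackers).
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def normalize(a):
    length = math.sqrt(dot(a, a))
    return [x / length for x in a]

def off_axis_frustum(pa, pb, pc, pe, near):
    """pa/pb/pc: screen lower-left, lower-right, upper-left corners.
    pe: tracked camera position. Returns (left, right, bottom, top)
    frustum extents at the near plane."""
    vr = normalize(sub(pb, pa))   # screen right axis
    vu = normalize(sub(pc, pa))   # screen up axis
    vn = normalize(cross(vr, vu)) # screen normal, toward the camera
    va, vb, vc = sub(pa, pe), sub(pb, pe), sub(pc, pe)
    d = -dot(vn, va)              # camera-to-screen-plane distance
    left   = dot(vr, va) * near / d
    right  = dot(vr, vb) * near / d
    bottom = dot(vu, va) * near / d
    top    = dot(vu, vc) * near / d
    return left, right, bottom, top

# Example: a 1.6 m x 0.9 m monitor centred at the origin, camera 2 m
# in front of it. As the camera (or the monitor) moves, the frustum
# is recomputed each frame, which is what keeps the background
# perspective-correct.
l, r, b, t = off_axis_frustum([-0.8, -0.45, 0.0], [0.8, -0.45, 0.0],
                              [-0.8, 0.45, 0.0], [0.0, 0.0, 2.0], near=0.1)
```

With the camera centred on the screen the frustum is symmetric; an off-centre camera yields an asymmetric (off-axis) frustum, which is exactly the effect that makes the monitor behave like a window into the virtual set.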
Monitor instead of LED wall
In the demonstration video a small monitor is used, but the setup also works with larger displays, which makes it more practical; in the future, the team wants to work with an 80" monitor. Depending on the size of the monitor, correspondingly tight framing must be used so that the virtual background fills the entire image behind the person being filmed. So far, a PC with an Intel i5 12500K CPU and a GeForce RTX 3060 has been used to render the backgrounds in low quality in real time; better quality requires a more powerful computer. The image is output over HDMI via a Blackmagic DeckLink Studio 4K.
A simple alternative to elaborately built virtual 3D backgrounds is also demonstrated: a 180° panoramic photo of a real room taken with an 8mm Walimex lens, and a street panorama stitched from cell phone shots. Even with these, the perspective-correct display works in real time depending on the camera orientation. In the future, it should also be possible to use videos as backgrounds.
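As a rough simplification of how a flat panorama can react to camera orientation: the tracked yaw of the camera selects which horizontal slice of the equirectangular panorama is shown on the monitor. The function and numbers below are illustrative assumptions, not the project's code, and ignore lens distortion and vertical tilt.

```python
# Map a camera's horizontal pan angle to a pixel window of a 180-deg
# equirectangular panorama; the window would then be displayed on the
# tracked monitor behind the actor.

def pano_window(yaw_deg, fov_deg, pano_width_px, pano_fov_deg=180.0):
    """Return (left_px, right_px) columns of the panorama visible to a
    camera panned by yaw_deg (0 = panorama centre) with a horizontal
    field of view of fov_deg."""
    px_per_deg = pano_width_px / pano_fov_deg
    centre = pano_width_px / 2 + yaw_deg * px_per_deg  # column the camera looks at
    half = fov_deg / 2 * px_per_deg                    # half-width of the visible slice
    return int(centre - half), int(centre + half)

# 8000 px wide 180-deg panorama, camera panned 20 degrees to the right,
# 40-degree horizontal field of view:
left, right = pano_window(20.0, 40.0, 8000)
```

Panning the camera shifts this window across the panorama, which is the same effect the video shows with the stitched street panorama, only computed per frame from the tracker data.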
The virtual film set of Disney's "The Mandalorian"
The project will soon get its own website, and the code is to be published at a later date. The current video shows a demonstration of the technology, a feasibility study; before it could be used in real film projects, aspects such as image quality, latency, and background size would of course still need work. But then this inexpensive method of virtual production would certainly be an interesting alternative to conventional techniques for certain scenes.
The advantages offered by virtual film sets
The biggest advantages of virtual film production with backgrounds rendered in real time on LED walls are probably the direct visual feedback for everyone involved (camera crew, director, and actors) and the total control over the set, including changes to the background or the lighting mood. Lighting in particular can be portrayed very realistically, since the light from the background also illuminates the actors and objects in the foreground and is reflected by them. Set changes are also accelerated immensely this way.
Do you know of any other projects that try to make virtual productions possible even in the low-budget area?