Looking at the current film and advertising market, one term is becoming more and more common: "Virtual Production" is on everyone's lips in the film industry. It describes the process of filming in front of a large LED screen that can be filled with virtual backgrounds. What sets this technology apart from "conventional" green- and blue-screen recordings is that everyone on set already has a better understanding of the scene, and the main characters can be lit in real time by the LED screen itself. The camera movement is synchronised with the background, so a parallax shift suggests real depth in the scene. As a result, and thanks to the lighting from the set, foreground and background merge naturally in a single camera shot.
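To make the parallax idea concrete, here is a minimal sketch assuming a simple pinhole-camera model (the function name and numbers are hypothetical, not from any tracking system): a point's on-screen position shifts in proportion to the camera's movement and in inverse proportion to the point's depth, which is why near objects appear to slide past distant ones.

```python
# Sketch of the parallax effect behind a camera-tracked LED wall.
# Pinhole-camera assumption: a point at depth z shifts on screen by
# (focal / z) * camera_movement, so near objects shift more than far ones.

def screen_shift(camera_dx_m: float, depth_m: float, focal_px: float = 1000.0) -> float:
    """Horizontal on-screen shift (pixels) of a point at depth_m metres
    when the camera translates sideways by camera_dx_m metres."""
    return focal_px * camera_dx_m / depth_m

# The same 0.5 m camera move shifts a near prop far more than a distant hill:
near = screen_shift(0.5, 2.0)    # object 2 m away
far = screen_shift(0.5, 50.0)    # object 50 m away
print(near, far)  # 250.0 10.0
```

Because the LED wall re-renders the background from the tracked camera's point of view every frame, this depth-dependent shift appears in the footage just as it would on a real location.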
As with game trailers, virtual tracking shots can bring impressive images from a virtual world onto a 2D screen, and thanks to the latest rendering technology in game engines such as the "Unreal Engine", they look almost lifelike.
HOW GAME ENGINES HAVE CHANGED OVER THE YEARS
Compared to the classic rendering methods of the film sector, game engines offer the advantage of performing all calculations in real time. In the classic workflow, a visual effects artist builds 3D compositions in an environment such as "Maya", "Cinema4D" or "Blender" and then runs time-consuming offline renders to compute how light falls on the objects in the scene. Depending on the complexity of the scene, this can take several hours per image. Considering that film runs at an average of 24 frames per second, the time involved in such a calculation quickly adds up.
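The arithmetic above can be sketched in a few lines (the two-hours-per-frame figure is a hypothetical example, not a measured benchmark):

```python
# Back-of-the-envelope render budget: if one offline frame takes 2 hours
# (an assumed, illustrative figure) and film runs at 24 frames per second,
# even one minute of footage adds up quickly.

FPS = 24
SECONDS = 60
HOURS_PER_FRAME = 2          # assumed offline render time per frame

frames = FPS * SECONDS                       # frames in one minute of film
offline_hours = frames * HOURS_PER_FRAME     # total offline render time

# A real-time engine must instead finish each frame within its display budget:
budget_ms = 1000 / FPS

print(frames, offline_hours, round(budget_ms, 1))  # 1440 2880 41.7
```

In other words, one minute of film at these assumed rates costs 2,880 machine-hours offline, while a game engine has roughly 41.7 milliseconds to produce each frame.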
Game engines take a different path. Since they come - as the name suggests - from the gaming world, all of the above calculations must be done in real time: the player simply does not want to wait minutes for an image to build up before the next interaction. For this reason, computer games have always looked "lower fidelity" than Hollywood films. A lot has happened in computer graphics over the past decade, though. Innovations such as "real-time ray tracing" - the calculation of light in real time - and the ever-increasing power of conventional processors and graphics cards bring computer games very close to the graphical fidelity of films. As a result, rendering methods from games are becoming increasingly relevant to the film market.
FROM GREENSCREEN TO LED-WALL
Looking at large Hollywood film productions, many scenes these days are filmed in green- or blue-screen studios.
This has the advantage that sets do not have to be physically constructed, but it presents actors, directors and artists with enormous challenges. The actors have few points of reference for interacting with the environment, and the producers have no preview of the end product through the camera lens, which leads to many iterations and rapidly rising production costs from "re-shoots". In addition, it is always difficult in green- or blue-screen work to harmonise the actors with the backgrounds in terms of colour and lighting.
The flexibility of game engines enables directors to make spontaneous creative decisions on the spot. For example, visualisations of film locations allow artists to frame the scenery directly and therefore make better creative decisions.
Compared to a "classic" film production, the production chain shifts forward: VFX artists can prepare the scenes before the day of shooting, which makes the process easier for the film team.
This also means that many effects that previously had to be laboriously combined with the filmed material in post-production can now be recorded "in-camera". This particularly applies to lighting. By placing several LED screens in front of and behind the actors, which change in real time to match the desired scene, the actors are always shown in the "right light". This is particularly impressive with reflective materials, in which the background is mirrored.
Shooting in front of a virtual background also makes it easier for the performers to find their way around. While a green or blue screen often demands a great deal of the actors' imagination, they now stand in the middle of the desired scenery. Another advantage over green- and blue-screen recordings is that the "chroma keying" process is avoided entirely. Chroma keying causes problems in post-production, especially with blurred transitions between foreground and background and with very fine detail such as hair. Because the background is recorded "in-camera", optical phenomena such as lens flares, depth of field and dynamic range are captured natively in virtual production.
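To see why chroma keying struggles, here is a toy keyer in pure Python (the function names, threshold and pixel values are all illustrative assumptions, far simpler than a real keyer): a pixel is keyed out when its green channel clearly dominates red and blue, which is exactly the hard binary decision that fails on soft edges and hair.

```python
# Toy chroma-key matte (hypothetical pixel data, hard threshold): a pixel is
# keyed out when green clearly dominates red and blue. Real keyers are far
# more sophisticated, yet soft transitions and fine hair still defeat them --
# the weakness that in-camera LED backgrounds avoid entirely.

def green_screen_matte(pixel, dominance=1.4):
    """Return 0.0 (transparent) for green-screen pixels, 1.0 (opaque) otherwise."""
    r, g, b = pixel
    return 0.0 if g > dominance * max(r, b, 1) else 1.0

def composite(fg, bg, matte):
    """Blend a foreground pixel over a background pixel using the matte."""
    return tuple(round(matte * f + (1 - matte) * b) for f, b in zip(fg, bg))

green = (40, 220, 55)    # studio green backdrop
skin = (200, 160, 130)   # foreground (actor) pixel
sky = (90, 140, 230)     # virtual background pixel

print(composite(green, sky, green_screen_matte(green)))  # (90, 140, 230)
print(composite(skin, sky, green_screen_matte(skin)))    # (200, 160, 130)
```

A half-transparent strand of hair falls between the two cases, so any single threshold either eats the hair or leaves a green fringe - which is why keying fine detail takes so much manual post-production work.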
Virtual production offers almost unlimited creative possibilities and is already used in major productions such as "Star Wars: The Mandalorian", "Westworld", "His Dark Materials" and "The Lion King". We believe that this technology represents the immediate future of the film industry.
I hope this has given you a little insight into the technology - and maybe even got you excited about it. I would also encourage you to take a look at the literature and videos below.
Literature & Videos:
THE VIRTUAL PRODUCTION FIELD GUIDE - Noah Kadner, presented by Epic Games
Virtual Production for the Class Room: