CINEMATOGRAPHY

Do Directors Dream of Virtual Sheep?


Jul 16, 2021

A cinematographer's perspective on Virtual Production within the film industry
By James Medcraft

Without doubt, the last year has been one to remember for the industry. It's made us re-address every element of how stories can be told and how they are created. With all of the struggles to get back on track, the necessity to shoot and tell stories no matter what has spurred new developments within our industry.

For me as a cinematographer, this necessity has seen past and present skills combine in a new form: the merger of video game and film technologies known as Virtual Production. It's a process that, whilst relatively new, combines traditional and bleeding-edge technologies to create a tool that, when used correctly, produces results unachievable by other means.

For the last year I've experienced the birth and growing pains of the Virtual Production industry; the highs when it works and the lows when it doesn't. I've written this as a very brief introduction to the process: its pitfalls, its benefits and where, as a cinematographer, I see its advantages for film. I've also cited some of the virtual production shoots I've worked on in the last 12 months with my honest thoughts on them, what we explored and where I think they work. This isn't a 'how to' guide but rather a 'where next' dialogue.

Before we begin, I'd like to set a little context for this piece. For those of you who don't know me, I've become a cinematographer via an untraditional route. Shooting film and stills from a young age, I studied graphic design and taught myself 3D animation. My first job was with the art practice United Visual Artists, where I worked for six years and experienced one of the most interesting periods in recent art history. Whilst there we developed new software, aesthetics and processes, all based around technology and light; it was my job not only to help in these design processes but to capture them on film. Since leaving 12 years ago I've dedicated my time to cinematography, and I love pursuing techniques which combine traditional filmmaking with as-yet-unintegrated technologies.

So what is Virtual Production? Simply put, it's the technique of using vast, set-sized LED screens to display digital environments behind actors and physical props. These environments can be photo or video assets, but within Virtual Production they are predominantly CGI environments, created and rendered in real time in relation to the camera's position; this creates the illusion of actors being within real environments which would normally be added in post production. Whilst these LED screens are predominantly used as backgrounds, they are also used for lighting, casting realistic ambient and reflected light from CGI scenes onto the set and actors. It's hauntingly similar to the nostalgic days of matte-painted backdrops and rear projection, but with far more versatility.
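To make the "rendered in relation to the camera's position" idea concrete, here is a minimal geometric sketch, not any engine's actual API: a virtual point sitting "behind" the LED wall is drawn where the line from the tracked camera to that point crosses the wall plane, so the backdrop shifts with correct parallax as the camera moves. All names and numbers are illustrative.

```python
# Minimal sketch of camera-relative ("parallax") rendering for an LED wall.
# A virtual point "behind" the wall is drawn where the ray from the tracked
# camera to that point intersects the wall plane, so the backdrop shifts as
# the camera moves. Purely illustrative geometry, not a real engine API.

def project_to_wall(camera, point, wall_z):
    """Intersect the camera->point ray with the wall plane z = wall_z."""
    cx, cy, cz = camera
    px, py, pz = point
    t = (wall_z - cz) / (pz - cz)  # parametric distance along the ray
    return (cx + t * (px - cx), cy + t * (py - cy))

# A virtual tree 14 m from the origin, behind a wall standing 4 m away.
tree = (2.0, 1.5, 14.0)
wall_z = 4.0

# As the tracked camera moves sideways, the tree's on-wall position shifts,
# which is what creates the illusion of depth on set.
left = project_to_wall((-1.0, 1.0, 0.0), tree, wall_z)
right = project_to_wall((1.0, 1.0, 0.0), tree, wall_z)
print(left, right)
```

The same intersection is computed per pixel (and per tracked camera pose) by a real-time engine; the sketch only shows why the wall image must be recomputed whenever the camera moves.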

The technology, whilst not invented by ILM, was made famous by their Mandalorian series, where the making-of caused more of a stir in the industry than the finished product. The ability to create CGI worlds which light and embed an actor within a scene is a seductive prospect; combined with the ability to change elements of the virtual set, such as lighting and geometry, in real time, it seems too good to be true. In some respects it really is a step forward for the industry, but as with all new developments, new learning curves are created.

At present Virtual Production is being marketed in many ways, from creating immersive live presentations to covering location scenes in a studio when crew or cast can't travel. Another classic example is re-shooting dialogue in difficult lighting conditions such as twilight: all you need to do is capture an HDRI on location and you can recreate that scene on a soundstage. These kinds of shoots are nothing new, but until now they've always been created with matte-painted backdrops or, more commonly, chroma screens. The development of virtual production technology opens up a world of possibilities for these shots, but I feel there's so much more that can be done with the technology which is as yet unexplored.

Normally, when a production needs to mix real-world and virtual scenes, a chroma screen is the method of choice; decades of filmmaking have made this a smooth and trustworthy technique. Almost all of us have shot green screen productions, and whilst it certainly works well, it does have its issues.

The first is lighting. When shooting on a chroma screen stage one needs to illuminate the green screen and the subject independently, in a way where the two lighting set-ups don't influence each other. We've all experienced the dreaded green spill on actors and props, issues with motion blur and hair, and, more often than not, a lack of space to light a chroma screen evenly. Then the cinematographer needs to light the actors and physical set to match the CGI environment the footage will be composited into; sometimes that scene hasn't yet been finalised, so it can often feel like shooting blind. There are issues for actors too: they need to react to things they can't see, anticipating the timings and movements of objects and characters they'll interact with. Once shot, we move to post production, involving keying, roto and match moving to embed the actor within the scene. For many scenarios this process works well, and will continue to do so; however, for some of the processes described, Virtual Production technology becomes an advantage for many productions.
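As a toy illustration of why spill is such a headache, the sketch below keys out a pixel whenever green clearly dominates red and blue. Real keyers are far more sophisticated; the threshold and pixel values here are entirely made up.

```python
# Toy chroma key: a pixel is "keyed out" (alpha 0) when its green channel
# clearly dominates red and blue. Illustrative only -- real keyers work in
# other colour spaces and handle edges, motion blur and hair.

def green_dominance(rgb):
    r, g, b = rgb
    return g - max(r, b)

def key_alpha(rgb, threshold=60):
    """1.0 = keep (foreground), 0.0 = remove (assumed green screen)."""
    return 0.0 if green_dominance(rgb) > threshold else 1.0

screen = (30, 220, 40)   # evenly lit green screen: correctly keyed out
actor = (180, 140, 120)  # skin tone: correctly kept
spill = (120, 200, 110)  # green light bounced onto the actor -- its green
                         # dominance crosses the threshold, so a naive keyer
                         # wrongly removes part of the subject

for px in (screen, actor, spill):
    print(px, key_alpha(px))
```

The spill pixel is the point of the example: bounced green light pushes foreground pixels over the keying threshold, which is exactly why independent screen and subject lighting matters on a chroma stage.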

Firstly, let's discuss the technology and its core elements. Virtual Production is a merger of three core industries: video game production, live events and film.

With computer technology becoming ever more powerful, companies such as Epic Games have developed software to render photorealistic environments in real time. Whilst their Unreal Engine was primarily designed for game environment creation, its ability to render 3D scenes in real time made it useful as an on-set preview tool in film production. As this software becomes ever more powerful, its ability to render scenes with photorealism increases too.

The events industry, primarily used to creating live entertainment events, shows and presentations, has integrated ever closer with the film industry, utilising affordable LED screens as dynamic lighting fixtures. Well-known uses of LED screens for environmental lighting were features such as Gravity and Murder on the Orient Express, which used them to create moving environmental reflections and backgrounds, adding realism to static sets.

The final piece in the puzzle is film production itself. As with the other industries, film and broadcast has seen its share of innovations within the last decade. With increasingly adventurous visual effects and post workflows, the recording of additional on-set data has become imperative. Camera positional data, HDRI photography and lens metadata are all essential ingredients for compositing and post production in modern-day productions. Within the live broadcast world, camera tracking has become the norm for overlaying virtual objects or extending a studio environment to create a more immersive and entertaining viewer experience. Whilst all of the aforementioned developments in their respective fields are exciting, it's where they intersect that has a unique benefit for filmmaking.
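As a sketch of the kind of on-set data described above, here is a hypothetical per-frame camera record combining tracking and lens metadata. The field names and units are illustrative, not any vendor's actual schema.

```python
# Hypothetical per-frame record of on-set data: camera position/rotation
# from tracking plus lens metadata. Field names are illustrative only.
from dataclasses import dataclass, asdict
import json

@dataclass
class CameraFrameData:
    timecode: str            # SMPTE timecode for syncing with footage
    position: tuple          # metres, stage coordinates (x, y, z)
    rotation: tuple          # degrees (pan, tilt, roll)
    focal_length_mm: float
    focus_distance_m: float
    aperture: float          # T-stop

frame = CameraFrameData(
    timecode="01:02:03:04",
    position=(1.2, 1.6, -3.0),
    rotation=(12.0, -4.5, 0.0),
    focal_length_mm=35.0,
    focus_distance_m=2.4,
    aperture=2.8,
)

# Serialise per-frame records so compositing/post can consume them later.
print(json.dumps(asdict(frame)))
```

Whether captured for live LED rendering or for later compositing, the principle is the same: every frame of footage is paired with the camera and lens state that produced it.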

As described, where one would normally composite virtual elements onto footage shot against chroma screens, virtual production enables us to capture everything in camera. In most instances this solves many of the problems inherent in chroma screen workflows: we light the subject more accurately, real objects carry reflections of virtual scenes, and keying, rotoscoping and match moving can be removed from post. Actors also have a greater ability to experience the worlds in which the story exists, creating more convincing interactions between what exists and what doesn't.

Whilst all of this seems like the death of chroma screen workflows, it isn't yet. To truly benefit from this innovation the industry requires substantial changes and investment in technical, production and political workflows. VFX-heavy projects tend to have a balanced timeline across pre and post production. Storyboarding leads to animatics, animatics to approved previs scenes with basic lighting, modelling and texturing. These previs scenes are then matched in reality and filmed with the necessary physical and action plate elements, which go to post, where the director and team have almost infinite control to craft the scene post-shoot. For virtual production, however, almost all of this process is required pre-shoot. Animatics now lead directly to the production of the finished environment in which the scene is to be shot. Almost every element of post production is brought forward into pre-production, requiring the production to create the finished look of the scene prior to filming. Certain elements of post production will still occur, such as clean-up and grade; however, as you can imagine, having a finished CGI look pre-shoot isn't currently normal procedure within film. This workflow is more akin to events-based productions, where the 'theatrical release', so to speak, is on stage. The video games industry has also been used to this workflow since its birth. Film is currently the only one of the three with a largely post-based workflow. So if film is going to get the most from virtual production, it's now our turn to make some changes to our working process, if we want that perfect shot in camera without any post.

From my experience the workflow change required is the foremost hurdle to overcome, and it demands a radical change of thought within the industry. As one of the most progressive creative industries, we are also one of the most traditional in a political sense. In my experience, mixing traditional film crew with live events crew and video game designers has been a melting pot of skills. It's also brought about understanding and respect for each profession and created new ways of creative problem solving.

Due to the high costs currently involved with the process, meticulous planning is essential. Broadcast is in some respects more closely tied to the events industry than to film production, largely due to the immediacy of its medium, and as such it could be an interesting sector to explore for how its workflows can help film develop into virtual production. It's no coincidence that many of the camera tracking solutions now being used in film started as broadcast solutions to make live broadcast more immersive. New production pipelines, roles and workflows are now inevitably evolving from the need to convert post-production into pre-production, or a sort of hybrid pre-post production. In my opinion this isn't anything to be worried about; it's a natural evolution of our creative world, and whilst not a 'turnkey' solution to every production, it adds a new tool to our creative arsenal. If we were worried about evolution we'd still be shooting black-and-white silent films on a theatre stage. This process will invigorate the industry in ways we can't yet imagine. To add to this, skills will become totally transferable between industries: a game designer can now be a production designer, and a gaffer a VFX consultant.

In my opinion chroma-based workflows aren't going away, but a hybrid model of chroma and LED workflows will become the norm within a decade. Stages are presently being built with LED coves as a permanent fixture, advertised as 'XR stages'. These stages not only allow for Virtual Production but also allow the LED screen to be used as a chroma screen, thus avoiding all the lighting issues previously mentioned. Many projects will require an intricate mix of both processes, so this seems like the perfect solution. I feel this type of stage will be the go-to technology for all future film studios: the best of technology and tradition.

Currently the technology is driving and dictating the creative: "a solution looking for a problem", as I've heard it described. But I feel the best creative uses of the technology are yet to be developed and will come from the cross-pollination of creative disciplines evolving in the industry. Currently many people are using the technology to replicate reality, such as car adverts shot with rolling backgrounds of cityscapes. This is a great use of the technology and is incredibly powerful for creating shots unachievable on location, but I feel it may never replace the look or realism that on-location spontaneity provides; sometimes it's the unplanned shots that work better. Where I feel the technology is most interesting, however, is in its ability to relocate actors to environments that either can't be accessed or don't exist in our physical world.

I was recently involved in a large-scale production which did exactly this. Commissioned by one of the largest video games companies, we shot a series of inter-level drama episodes where real actors were placed within the game's environment. Shot over four weeks and involving multiple 'locations' and scenarios, we created a series of multiple-outcome stories in which the actors play out parts of the game that reflect the player's progress. Not only could the actors explore the video game sets, but as cinematographer I could also film the game environments in a way that felt as natural as being on location. Ready Player One touched on this theme some time ago, with characters exploring fantasy environments in the physical realm. For filmmakers, the ability to physically 'be' within an unreal world opens up exciting possibilities.

With new tools come new skills, and as industries merge, the role of the cinematographer becomes more important than ever. Ultimately the cinematographer sets the scene, physically and now virtually, and it's our responsibility to create an image that looks great aesthetically but is also practically shootable. In traditional filmmaking the cinematographer works with the entire crew to create a set or location that gives the coverage required to bring the director's vision to life. The principle is still the same, but now the cinematographer must be far more integrated into the technological pre-production side than ever before. From my experience, the cinematographer must design the shape of the LED screen specifically for the project and the style of shots required. Whilst this is marketed as a 'turnkey' technology, in my experience one shape of screen doesn't suit all projects.

The best way I can describe it is as shooting through a digital window. To get the best results the cinematographer now has to dictate the best-shaped window to shoot through, and what shapes and positions the screens should take to provide the best view, coverage and lighting. The right combination of camera, lens and movement is also vital to create convincing illusions, but for me the most important aspect is lighting. Understanding how real-world and virtual lights match on set is by far the most important element to master in order to sell the illusion of reality. On the last production I worked on, the gaffer worked closely with the environment designers so that the same lighting style was achieved both virtually and on set.

Ultimately there's no magic formula for this technique, and it takes multiple projects for all members of the crew to understand how to achieve the best results.

Historically the principles behind this technique aren't new; they mimic techniques used for matte-painted backgrounds, the only difference being that the matte painting now responds to the camera's movement. With these new tools comes increasing responsibility for the cinematographer to craft the image in camera; the well-worn phrase 'we'll fix it in post' can't apply to this medium. As such, it's an exciting time for our industry to develop a new visual language by marrying modern technology with techniques developed at the birth of film.

A new generation will naturally grow into this workflow and realise the true potential of this technology. Do I encourage its use? Absolutely. Will it replace chroma/matte-based workflows? Absolutely not; yet. Digital hasn't killed film; it merely made us appreciate all its qualities that we took for granted. In the same respect, Virtual Production is going to create huge opportunities for filmmakers, but it will also force us to appreciate and explore the physical world in ways we haven't needed to until now. Something I'm sure we'll all appreciate after staring out of windows for nearly a year, be they real or virtual.

Here are four projects I shot over the last year, all very different, each exploring the benefits of the process in its own way. Each project is featured on my website with a more in-depth write-up, additional production images and behind-the-scenes films.

LeCol

https://www.jamesmedcraft.com/#/lecol-x-mclaren

The client brief was to convey a sense of streamlined movement and speed to promote their new aerodynamic clothing. To do this we created a stylised wind tunnel environment and placed our cyclist on a set of rollers, a rolling road for bikes. The dollied camera enabled us to create tracking shots that feel like a pursuit vehicle trying to keep up with our rider, and coupled with sweeps of practical lighting we managed to create a very convincing effect of movement through an environment. Interestingly, happy accidents add a sense of velocity to the scene: in one tracking shot a bump makes it look as though the pursuit vehicle hits a bump in the road. This set mishap actually adds a sense of unplanned reality to the fantasy environment. What made this look so convincing is the triangular arrangement of the LED screens. They surround the action, creating not only the background scene but also foreground lighting, visible in the reflective helmet.

Lucid Dreaming

https://www.jamesmedcraft.com/cinematography#/lucid-rina-sawayama-1

The first Virtual Production I was involved in, commissioned as an R&D project by Bild Studios. The director Dave Ferner created the 3D fantasy world which the artist Rina Sawayama explores within her dream. It's an incredibly well-suited treatment for Virtual Production, enabling performers to physically explore worlds that either don't exist or could never be built on the available budget. The fantasy nature of the world allowed us to explore the surreal environment and break the rules associated with real-world environments: gravity could be altered, physical objects could become fluid, and our artist could dance with an avatar version of herself. This project brought out a great sense of experimentation in the entire crew. Look out for the strings making her dress appear to float whilst falling at 03:28; spoiler, she's just sitting down and waving her hands!

Night Hawks

https://www.jamesmedcraft.com/cinematography#/masterpiece-nighthawks-1

For me, this remains the most interesting use of the technology, blending a very interesting mixture of industries. It's not only an entertaining but an educational concept that invites the viewer to explore familiar worlds in a new dimension. The piece focuses on Edward Hopper's painting 'Nighthawks': the presenter James Fox steps through the two-dimensional canvas and physically explores the stories, myths and techniques associated with the painting. By recreating the painting in 3D, James can actually walk within it and explain specific aspects in a far more immersive fashion than ever before. This kind of medium would be well suited to programming focused on the arts, science and history.

Volume

This project really explores the benefits of Virtual Production for the advertising industry and for global campaigns in general. It was funded by Epic and Imagination as part of the Epic MegaGrants program. A short narrative involving actor, product and 'location' was written to demonstrate how advertising agencies can create more engaging and streamlined campaigns. Due to covid regulations no clients were on set; they interacted with the crew via remote video links. As we had a 3D model of the product (the Triumph motorbike), we could place it within the virtual world as well as having it physically on set, allowing us to create a greater sense of space when telling our story. Personally I can see this scenario working well on a larger scale for automotive brands that want to create global campaigns across multiple locations, where restrictions, be they physical, logistical or budgetary, make it impractical to travel with large amounts of cast, product or crew.

