For decades, George Lucas and his team at Industrial Light & Magic dreamed of “the stage of the future,” the point at which technology would allow for a photorealistic virtual backdrop that could be filmed as a three-dimensional, real-time set or location.
Version 1.0 of that dream became a production reality during the filming of Season 1 of “The Mandalorian,” where the Disney+ series visited the far-off planets of “Star Wars” via a curved 75-foot by 25-foot wall of LED panels known as ILM’s StageCraft. It is a breakthrough that will change, for better and forever, how Hollywood makes film and television. It was also a breakthrough rooted in lighting as much as visual effects, one that was jumpstarted and shaped by ILM’s unique collaboration with cinematographers.
The period of Hollywood filmmaking between “Star Wars” (1977) and “The Mandalorian” (2020) has been defined, in large part, by how far studios have pushed the boundaries of green and blue screen technology, in which a scene’s backdrop could be added after filming — what film history will no doubt come to see as an intermediary step, a compromise on the way to photorealistic virtual stages like StageCraft. It was through this composited layer that films traveled to the fantasy worlds of the franchise blockbusters that became studios’ financial ballast.
The limitations of green screen technology have always been rooted in the fundamentals of cinematography. The ability to compose a shot — and capture the relationship between a subject and its environment, and the camera’s perspective on both — has always been challenged by the fact that the background is missing when a DP frames a shot against green screen. Meanwhile, for audiences to accept (consciously or subconsciously) these green screen backdrops as real, the light emitting from them — be it the sun shining on a snow-covered mountain, a baseball stadium lit up at night, or the planet of Tatooine seen through cockpit windows as the Millennium Falcon approaches — needs to feel naturalistic in how it falls on the “real” people, objects, and settings in front of the camera.
While the great cinematographers of the last 40 years became proficient at not only framing, but also simulating, the light emanations they could not see on set, and visual effects artists developed tools and techniques to better render the depth and light of these layers created months after filming, the best work came when the friction between cinematography and visual effects was minimized.
The success of StageCraft comes from the fact that it not only eliminates this friction, but adds new and exciting tools to the cinematographer’s arsenal. In fact, ILM’s first experiment with filming an LED wall as a backdrop — and the earliest precursor of what would become known as StageCraft — was sparked by the need for a creative lighting solution, not VFX world-building.
In 2015, when Gareth Edwards picked Greig Fraser to shoot his “Star Wars” spin-off “Rogue One,” the cinematographer seemed like an odd choice. Fraser came from the indie world and was a leader of a generation of filmmakers who strove to remove as much of the apparatus of filmmaking as possible from the set. He was someone who put a premium on images looking unlit while working with directors (like Kathryn Bigelow, Jane Campion, and Bennett Miller) who put an emphasis on a handheld-like connection between camera and character. “Lion,” which would come out just weeks before “Rogue One” and earn Fraser his first Oscar nomination, was the pinnacle of the DP’s ability to extract striking, evocative images from natural light and landscape.
“My biggest concern, visually, was I didn’t want it to feel false,” said Edwards of “Rogue One,” which he saw as a gritty, grounded war film he wanted to “feel like a real experience [by] trying to make it feel like we got really lucky all the time and didn’t contrive anything, which is really hard.” The first thing Fraser said when he took a meeting with Edwards was he didn’t want to make a glossy Hollywood film. “He basically said my spiel back to me,” recalled Edwards.
It was a shared vision that presented instant challenges Fraser would have to solve. Chief among them: how to light the interior of an X-wing cockpit taking part in a visceral dogfight right above the planet of Scarif.
“Scarif was effectively similar to Earth and had blue skies and blue water and sand,” said Fraser. “So when you’re up in space and you’re doing a barrel roll [an aerial maneuver in which the airplane makes a complete rotation], one side is pitch black and one side is illuminated [by] the sunlight bouncing off the planet. So I needed to come up with a solution whereby, we were never going to do barrel rolls, but the world around us could do a barrel roll from a lighting perspective.”
The traditional way to light the interior of a cockpit required an army of gaffers, grips, and electricians spinning special lighting rigs and mirrors to simulate flying and fighting in space. It was hardly dynamic or realistic enough for what the filmmakers envisioned for the visceral action scene. Racking his brain, Fraser zeroed in on the idea of surrounding the X-wing with SkyPanel lights.
“Greig had this hunch that LEDs and LED walls could be a really useful lighting instrument, and so did we,” said Rob Bredow, Executive Creative Director and Head of ILM. “We’d been going down this path for a little while, but this was truly the beginning of when LEDs were just starting to be useful for this kind of use on set.”
Fraser talked through the idea with John Knoll, the visual effects guru behind so many of ILM’s innovations on the “Star Wars” films. “[Knoll] put two and two together during that conversation and went, ‘Ah, well, because we’re wanting to try this stage of the future,’ which George Lucas kept talking about,” recalled Fraser. “Which is effectively the walls of the studio change to suit the background. So that was effectively the meeting of a practical solution to a problem that had been occurring.”
Knoll would build an LED wall that could play a dynamic, pre-visualized, green screen–like background the visual effects artists created for Fraser to test. As the image rapidly changed — like the point of view of a dogfight, or roaring through hyperspace — the light emanating from the theater-sized screen created the exact lighting effect Fraser needed. The cinematographer became so enamored with the results that he and Knoll would collaborate on using an LED wall for most of the “Rogue One” shots from inside the various spacecraft cockpits.
“A perfect example of that is one of the ships coming in for a landing on ‘Rogue One,’ and we had the content on the background [LED wall],” said Bredow, describing a far different scene, in which the heroic crew approaches its climactic destination and the exterior light slowly and dramatically moves across their faces. “You can see the sky and the sun, and it’s kind of flaring the lens. Greig was able to silhouette his actors as they’re piloting the ship in.”
If it had been a blue screen outside the cockpit, to get the same effect Fraser would have had to create a moving light source by guessing at, and then perfectly matching, what was happening outside the craft. “‘Rogue One’ was really the first time where the lighting on the actors was able to be driven by content on the walls,” said Bredow. “Some of that was high-quality content, created at ILM, so it has the right lighting ratios, but it wasn’t final-quality content. It created fantastic reflections. You got all sorts of different sources of lighting, just a different feel of the lighting on the actors than we’d ever been able to achieve in a ‘Star Wars’ movie before.”
In essence, Fraser, Edwards, and the filmmaking team were able to make decisions that were more photographic, more realistic-looking, and better suited to the story.
Fraser left “Rogue One” a believer in LED lights in general. Like a lot of DPs, Fraser came up thinking of LED lights as hard to diffuse, bounce, and color balance — they looked horrible on skin and often had green-magenta shifts. But the new generation of LED lights has a full spectrum of color that is easily controlled. New products like the ARRI SkyPanel are small, lightweight, extremely energy efficient, and allow their users to dial in the exact softness, color, and color temperature with an iPad.
“[When] I’m trying to come up with a visual solution for a director, I also want to make the solution as efficient as possible to allow the director more time on set, more money to do other things with, more of everything that the director needs to make a movie,” said Fraser. “So that’s a big thing that I personally aim for in every film, is to do my job with the least amount of resources possible so I can give the director more time and more money. With LED lighting, and that includes LED screens, the way to achieving a goal with those lights is much faster, much, much faster. Also, frankly, it’s far more energy efficient.”
An environmentalist with an eye on how film productions can lighten their footprint, Fraser bragged that he lit the entirety of the enormous West Wing set on “Vice” with LED lights that used the same energy it takes to strike a single 18K light.
The one place where LED technology on “Rogue One” was still sorely lacking was the size of the pixels on the LED panels themselves. They were far too big to be filmed directly, and had to be replaced in post, like green screen. The moiré effect coming off them was at times so strong that Fraser hung huge sheets of clear, shower curtain-like material between the camera and the screen to knock it down.
Nonetheless, the results achieved on “Rogue One” left ILM inspired. “We immediately started testing,” said Bredow. “Could we directly photograph these panels? Could we get the colors to be accurate enough to where we could put them directly into our films?”
LED panel technology would continue to progress steadily, but while it did, ILM would need to develop its game engine to the point that the previsualized backgrounds could render photo-real environments at 24 frames per second while also shifting perspective (what became known as “camera tracking”) as the camera panned, tilted, or moved through space.
In June of 2018, at the behest of Jon Favreau, who was considering using StageCraft to make “The Mandalorian,” Fraser did a proof-of-concept test before a year of resources went toward building the first full LED volume and the series’ virtual environments. The first shot Fraser attempted was with the camera on his shoulder; he wanted to test right off the bat whether camera tracking could maintain his beloved handheld aesthetic.
“I was walking behind the Mandalorian,” recalled Fraser. “My right eye was through the viewfinder and I’m watching Mando in this space. And I’m like, ‘Wow. This is pretty good.’ I open my left eye, just as I knew we were getting towards the end of the run, towards the screen, my left eye saw an LED screen that I was about to hit. And my right eye saw the beautiful 3D, all rendered. I almost fainted. I’ve never fainted in my entire life, but obviously my two eyes were telling me two different stories.”
That camera tracking fooled the DP’s eye and brain, and Fraser was convinced that — working within Favreau’s methodical Western aesthetic, which would purposefully limit quick camera movements — ILM’s camera tracking tech could indeed work. Fraser still marvels that, without anything built or any kind of back-up plan in place, ILM and Favreau pulled the trigger on basing “The Mandalorian” production on the untested, unbuilt tech.
“That’s a gutsy move, to base an entire series, the concept of the entire series, on that little sliver of success. The fact that everybody made a leap of faith like that was huge, absolutely huge,” said Fraser. “Very rarely in the industry do we make such massive leaps of technology concept and working methodology. Everything we do is always one small step beyond where we’ve been before, whereas this was a giant, kind of cataclysmic step.”
It was a step that saw Fraser taking on a producer credit and a nine-month prep period unlike anything any cinematographer had ever done. Fraser was the loudest voice in the room pushing for the LED volume to deliver on its promise of efficient production by designing it so it could serve as the sole lighting source.
It was a decision that led to thousands of new ones: Should the LED volume be curved? How long? How tall? Should the panels be horizontal or vertical? Angled? Would panels be needed above and behind the camera? Would a large format digital sensor, cutting back on depth of field, make sense for shooting see-forever vistas off the LED screens?
“Every day that I went home during pre-production on ‘Mandalorian,’ I had a headache, because we were making hundreds of decisions every day that I’ve never made before in my life,” recalled Fraser. “There’s a lot riding on these decisions.”
There were also the fun decisions, specifically the creative license StageCraft gave cinematographers to play God. By being part of the creative team helping design the landscapes and set backdrops, Fraser was essentially pre-lighting the series with his “Rogue One” second unit cinematographer Baz Idoine, who would have to take over when Fraser had to leave the production to go shoot “Dune.”
“What I loved was working in 3D; there’s such a power to being able to put the sun wherever the hell you want,” said Fraser. “Like, you can put [it] at any degree height, you could [put it] over a mountain range, just looking over a peak of a mountain range, creating a shadow on the background that’s a particular shape of the mountain range. There’s so much power. It’s very invigorating having that much power over light.”
That ability to dial in a naturalistic exterior light that would not change — no sun-going-behind-a-cloud headaches — offered ideal shooting conditions, allowing Fraser and the directors to strip down to camera and character.
To Fraser and Bredow, the StageCraft Version 1.0 of “The Mandalorian” seems as long ago as “Rogue One” was when they shot Season 1. ILM’s engineers, 24 of whom worked around the clock during that first season, were constantly adjusting to the real-world shooting lessons they were learning from the filmmakers as they put their wizardry into practical application.
The version of StageCraft used for Season 2 or George Clooney’s “Midnight Sky” is different from what Fraser is currently using on Matt Reeves’ new “Batman,” and there’s no doubt the volumes being built for Deborah Chow’s Obi-Wan Kenobi series or Taika Waititi’s newest chapter of “Thor” will be yet another step forward. Whereas the “Mandalorian” volume needed to supply the total setting for an entire series, on “Batman” Fraser is excited about how they’ve been able to experiment with the shape of the volume.
“If you know the directions you’re going to be facing, you can tailor the shape of a volume to suit whatever it is you’re building,” said Fraser. “It might be that you build multiple volumes, as in the volume changes. That’s what’s great about these LEDs, is that they come apart and they go back together again like LEGO. So if you’re building sets — I can’t obviously get into the specifics of ‘Batman’ — but if you’re building sets, you can evolve the shape of the volume to suit the sets.”
On the set of Season 1 of “The Mandalorian,” Fraser found himself the ambassador for the game-changing technology, not only helping the directors figure out how to adjust to a new mode of production, but walking curious visitors like Steven Spielberg or George Lucas himself through its capabilities. Reflects Fraser: “I’m just excited to see how other directors and DPs are going to use it.”