Hey, Academy: Here’s Why the Best Cinematography Oscar Should Be Divided Into Two Awards

The cinematography of 2013 represented a hodgepodge of capture formats with one thing in common: it was all distributed digitally. Whether a movie was shot on 35mm, RED Epic 5k or RED One 4k, ARRI Alexa 2.8k RAW or 1080p ProRes 4:4:4, 1080p on a hacked Panasonic GH2 or a Canon 5D or a C300, 2D or 3D — that movie was digitally distributed. And because the final form is digital, movies are now, by definition, a digital medium.

As film stocks and lenses — and now digital cameras — have progressed over the years, the look of movies has changed. Whether it was the zooms and grain of the 1970s, the introduction of widescreen formats in the 1950s, or the use of processes like ENR and bleach bypass in the 1990s, different eras look different (you can even observe the shift between David Fincher’s “Zodiac,” shot in Viper 2k, and “The Girl With the Dragon Tattoo,” shot in RED 4k/5k). To me, the modern look of a movie is now entirely digital, usually graded from a RAW or Log source. Because grading starts from such a flat, desaturated image, the final product often retains what sometimes looks like a touch of monochrome mixed in. I really love that look.
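For readers curious what grading from a flat Log source involves, here is a minimal, hypothetical sketch in Python/NumPy. It is illustrative only — a real grade runs through camera-specific LUTs and colorist-built curves — but it shows the basic move: a flat, desaturated image gets its contrast and saturation pushed toward a finished look. The function name and parameters are my own.

```python
import numpy as np

def grade_log_image(log_img, contrast=1.6, saturation=1.3):
    """Toy grade: push a flat, Log-style image (values in 0..1) toward a
    finished look with a contrast stretch and a saturation boost.
    Illustrative only -- not a real camera/colorist pipeline."""
    # Contrast: stretch values away from mid-gray
    graded = 0.5 + (log_img - 0.5) * contrast
    graded = np.clip(graded, 0.0, 1.0)
    # Saturation: push each channel away from the per-pixel luma
    luma = graded.mean(axis=-1, keepdims=True)
    graded = luma + (graded - luma) * saturation
    return np.clip(graded, 0.0, 1.0)

# A flat 2x2 "Log" frame: low contrast, nearly gray
flat = np.full((2, 2, 3), 0.5)
flat[0, 0] = [0.55, 0.50, 0.45]  # one slightly warm pixel
out = grade_log_image(flat)
```

After the grade, the warm pixel’s channel spread is wider (more color separation) while true neutrals stay neutral — the flat source is what gives the colorist that latitude.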

Here’s the thing: now that we’re in a digital world, I feel we need to redefine what constitutes cinematography. Starting with the Oscar win for “The Fellowship of the Ring” and emphasized more recently by “Avatar,” “Life of Pi” and “Gravity,” we’re seeing movies where a large portion was created in pretty much the same manner as a Pixar cartoon. These movies mix live action with digital environments and full animation — yet they’re not only competing with traditionally lensed movies for the Oscar, they’re often winning. If “Gravity” wins best cinematography this year (and it’s certainly a front-runner), it will be the fourth time in five years that the prize has gone to a 3D movie built on a large amount of computer-generated imagery.

I don’t think that’s fair. The Academy needs to create an entirely new cinematography category, just as it did in the old days when separate awards were given for color and black-and-white photography. There needs to be one award for conventional live-action photography and another for CGI-based filmmaking. The industry should take this idea seriously; it’s more practical than one might assume.

With the exception of my LomoKino, I haven’t shot celluloid in a dozen years. Recently, while scanning my Twitter feed, I came across an exchange between filmmakers Joe Swanberg and Alex Ross Perry, in which they wondered whether we’re heading toward a situation where big-budget movies are shot digitally while indies wind up on film. I rabble-roused that I probably wouldn’t shoot film even if I were paid to do so. When Perry asked why, I replied that I no longer think of images in terms of celluloid, and that most of the times I’ve seen modern 35mm screened in DCP, it didn’t look quite right. Another reason, which my 140 characters forced me to cut: I don’t like the film-to-digital workflow.

This year’s New York Film Festival, which screened a wide variety of capture formats in DCP, strengthened my opinions on the matter. Most of the big movies were still shot on celluloid (“12 Years a Slave,” “Inside Llewyn Davis,” “The Immigrant,” “The Secret Life of Walter Mitty,” “Captain Phillips”), but it wasn’t until I saw two movies shot on the ARRI Alexa (“All is Lost” and “Her”) that I found images that looked right to me. Alexander Payne’s “Nebraska” was shot on the Alexa with a layer of 35mm grain added to simulate film — but to me, it looked more like old VHS noise than celluloid, which annoyed me even though the images were otherwise gorgeous. I even thought the C300 was the correct choice for “Blue is the Warmest Color”; that film, along with the GH2-shot “Upstream Color,” which I saw earlier in the year, showed me that 1080p will be to Super 16 what 4k will be to 35mm in digital cinema.

Some of the film-to-DCP translations worked better than others. “Captain Phillips” was pretty good, as were “12 Years a Slave” and the fogged-up “Inside Llewyn Davis.” Ben Stiller’s “Walter Mitty” had beautiful photography, but I thought film was the wrong choice for it, especially because it’s a movie with a cutting-edge, TV-commercial aesthetic. I had the most problems with “The Immigrant,” which was shot to evoke older movies. I watched it from the front row, where I could clearly see magenta/turquoise digital fringing throughout its grain storm — even though the photography itself was stunning.

There are many filmmakers who still adamantly prefer the look of film (including Rian Johnson, who boasted publicly that “Breaking Bad” was shot on 35mm — on TV, where the format matters least), so much so that they’ve convinced themselves it looks better projected digitally than digitally sourced images do. I simply can’t agree. It’s one thing if we’re talking about 35mm projected on 35mm — but that is no longer the case. Some filmmakers are just so used to the look of film that they can’t let go, even when it really doesn’t look right. (I can’t imagine anybody actually believing that “Only God Forgives,” shot mostly on the Alexa with some RED Epic mixed in, would have looked better on 35mm.)

A handful of movies are mixing formats. “The Wolf of Wall Street” is a good example. When I shot an interview with Martin Scorsese last year, it was just before he began production, and I asked him if he intended to use the Alexa again, as he had on the Oscar-winning “Hugo.” He said yes. In the end, however, after viewing tests by Rodrigo Prieto, he shot predominantly 35mm, while relying on the Alexa for nighttime and low-light scenes.

I need to stress here: I am not against celluloid. I love the look of 35mm. Even on Blu-rays of older movies, I’d rather have a grainy image that’s sharp than a digitally smoothed version that’s soft. My issue is simply that I don’t much care for the way modern 35mm-shot movies look when projected digitally (one really good-looking 35mm movie this year that I haven’t mentioned is “Ain’t Them Bodies Saints”). This isn’t the fault of digital — digital is not film and will never look exactly like film. Filmmakers need to work with digital and get used to it, like anything else. If you treat it as a necessary inconvenience, you’ll never adapt to it.

Speaking of adapting, the late cinematographer Harris Savides comes to mind in this conversation. I interviewed him around the time of “Zodiac,” back in 2006, and he confessed to having a miserable experience shooting digital. He did, however, continue to experiment with digital in his promo work, as illustrated by a series of Delta spots he did with Mark Romanek a few years back.

On the Blu-ray supplements of “Frances Ha,” there’s quite a bit about how Harris, who shot Noah Baumbach’s “Margot at the Wedding” and “Greenberg,” helped shape the look of the movie, which was shot with a Canon 5D Mark II. Watching the supplements, I realized I had gotten my 7D around the time Baumbach was doing his initial tests, and I recall discussing with Harris the differences between the 5D and 7D. What was specific about the look of “Frances Ha” was that they sought a digital analogue for degrading the image, rather than adding grain to make it look like degraded film. They achieved their aesthetic by blending a blurred layer on top of the primary grade, resulting in an image that reminded me of what it looked like when people used a 35mm adapter on a Panasonic DVX100a.
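That blurred-layer blend can be approximated in a few lines of Python/NumPy. To be clear, this is only a guess at the general technique — blur a copy of the image, then composite it at partial opacity over the primary grade — not the actual “Frances Ha” pipeline, and the function and parameter names are my own.

```python
import numpy as np

def soften_with_blur_layer(img, radius=1, opacity=0.5):
    """Degrade a digital image by compositing a blurred copy of it
    over the primary grade. A toy separable box blur stands in for
    a production-grade Gaussian blur."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    blurred = img.astype(float)
    # Separable blur: run the 1-D kernel down rows, then across columns
    for axis in (0, 1):
        blurred = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, blurred)
    # "Normal" blend of the blurred layer at partial opacity
    return (1 - opacity) * img + opacity * blurred

# A hard vertical edge gets softened but not erased
img = np.zeros((8, 8, 3))
img[:, 4:] = 1.0
out = soften_with_blur_layer(img)
```

The effect is a halation-like softness: edges lose their clinical digital snap while the underlying image stays sharp underneath — closer in spirit to a diffusion filter than to fake film grain.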

Harris Savides’ final film, “The Bling Ring,” was shot on a RED One. Considering his preference for 35mm, I was surprised when I learned he’d gone digital. There was one moment in the movie that really showed how well he understood images that are inherently digital: a nighttime exterior house zoom that was apparently his idea. That shot was a masterpiece — a beautiful final note for him to leave us with.

And that’s all I’ve got for 2013. Now that the digital world is fairly standardized, expect a lot of the same mixed with dashes of the new as we move forward.

Jamie Stuart is a New York-based filmmaker.
