By Jamie Stuart | Indiewire | December 30, 2013, 10:00 AM
The cinematography of 2013 represented a hodgepodge of capture formats with one thing in common: it was all distributed digitally. Whether a movie was shot on 35mm; on a RED Epic at 5k or a RED One at 4k; on an ARRI Alexa in 2.8k RAW or 1080p ProRes 4:4:4; at 1080p on a hacked Panasonic GH2, a Canon 5D or a C300; in 2D or 3D, that movie reached audiences as digital data. And because the final form is digital, movies are now, by definition, a digital medium.
As film stocks and lenses, and now digital cameras, have progressed over the years, the look of movies has changed. Whether it was the introduction of widescreen formats in the 1950s, the zooms and grain of the 1970s, or processes like ENR and bleach bypass in the 1990s, different eras look different. (You can even observe the shift between David Fincher's "Zodiac," shot on the Viper at 2k, and "The Girl With the Dragon Tattoo," shot on RED at 4k/5k.) To me, the modern look of a movie is now entirely digital, typically graded from a RAW or Log source. Because the grade starts from such a flat, desaturated image, the final product often retains what looks like a touch of monochrome mixed in. I really love that look.
Here's the thing: now that we're in a digital world, I feel we need to redefine what constitutes cinematography. Starting with the Oscar win for "The Fellowship of the Ring" and underscored more recently by "Avatar," "Life of Pi" and "Gravity," we're seeing movies in which a large portion was created in pretty much the same manner as a Pixar cartoon. These movies mix live action with digital environments and full animation, yet they're not only competing with traditionally lensed movies for the Oscar, they're often winning. If "Gravity" wins best cinematography this year (and it's certainly a front-runner), it will be the fourth time in five years that the prize has gone to a 3D movie built on a large amount of computer-generated imagery.
I don't think that's fair. The Academy needs to create an entirely new cinematography category, just as it once gave out separate awards for color and black-and-white. There should be one award for conventional live-action photography and another for CGI-based filmmaking. The industry should take this idea seriously; it's more practical than it might sound.
With the exception of my LomoKino, I haven't shot celluloid in a dozen years. Recently, while scanning my Twitter feed, I came across an exchange between filmmakers Joe Swanberg and Alex Ross Perry in which they wondered whether we're heading toward a situation where big-budget movies are shot digitally while indies wind up on film. I rabble-roused that I probably wouldn't shoot film even if I were paid to. When Perry asked why, I replied that I no longer think of images in terms of celluloid, and that most of the times I've seen modern 35mm screened in DCP, it didn't look quite right. Another reason, which my 140 characters forced me to cut, was that I don't like the film-to-digital workflow.
This year's New York Film Festival, which screened a wide variety of capture formats in DCP, strengthened my opinion on the matter. Most of the big movies were still shot on celluloid ("12 Years a Slave," "Inside Llewyn Davis," "The Immigrant," "The Secret Life of Walter Mitty," "Captain Phillips"), but it wasn't until I saw two movies shot on the ARRI Alexa ("All is Lost" and "Her") that I found images that looked right to me. Alexander Payne's "Nebraska" was also shot on the Alexa, with a layer of 35mm grain added to simulate film; to me, it looked more like old VHS noise than celluloid, which annoyed me even though the images were otherwise gorgeous. I even thought the C300 was the correct choice for "Blue is the Warmest Color." That film, along with "Upstream Color," shot on the GH2 and seen earlier in the year, showed me that in digital cinema, 1080p will be to Super-16 what 4k will be to 35mm.