My college major was something called “Television, Radio, Film.” I vividly remember the first time I told one of my relatives I was taking a class on the history of television. He chuckled and groaned “Really?” as if I’d just announced I was taking a class on the history of hopscotch or bubble gum — something frivolous and childish and altogether unimportant. Why would anyone study television? The conversation moved on, and a few minutes later it circled around to interesting developments in the world of popular culture. “Hey, by the way,” he asked, “have you heard about this new gangster show on HBO? I think it’s called ‘The Sopranos’?”
Thanks to those Sopranos and the earth-shattering changes they brought to their medium, studying television doesn’t seem like such a dumb idea anymore. In fact, the whole media hierarchy of my old college degree has been reversed. When I was in school, studying film was cool; studying television was a chore. Electives in single-camera film production and screenwriting were in high demand; electives in multicamera production were avoided by all but the most serious TV careerists. The general feeling amongst my college peers, at least as I perceived it, was that film was the dream job and television was the backup plan in case the dream job never came to fruition. I don’t hang around college campuses much these days, not since that damn restraining order anyway, but I have to imagine the next generation of media makers is at least as interested in making and studying television as in movies. Why wouldn’t they be? As improbable as it seemed less than fifteen years ago, TV is just plain sexier than movies.
For proof of this drastic shift in perceptions, one need only look to the wave upon wave of recent articles hailing the rise of television and the decline of film. The latest was sparked by James Wolcott’s recent piece in the pages of Vanity Fair, which is technically titled “Prime Time’s Graduation,” but which is referred to in VF’s Most Popular widget as “Television Has Officially Surpassed the Movies.” Wolcott’s “official” judgment has a few underlying arguments, including the damning (and accurate) one that the whole experience of going to the movies has become hopelessly debased by rude, cell-phone-obsessed audience members. But the content of movies, Wolcott says, has degenerated just as rapidly as the environment around them:
“Like ‘Twin Peaks,’ ’24,’ ‘Mad Men,’ and ‘The Sopranos’ before it, ‘Downton Abbey’ enriches the iconography and collective lore of pop culture. It replenishes the stream. (It also provides the perfect layup for PBS’s next prestige import, starting in April: the BBC adaptation of Sebastian Faulks’s best-selling novel ‘Birdsong,’ which will once again elegantly chuck us into the W.W. I trenches.) By contrast: for those of us who have fallen out of romance with movies, its franchise blockbusters seem to be leeching off the legacy of pop culture and cinema history, squandering the inheritance with endless superhero sequels and video-game emulations that digitize action stars into avatars and motion-capture figures, a mutant species with an emotive range running strictly in shades of bold. And those films that aren’t aiming for an opening-weekend monster kill seem to dwell solely within a realm of discourse dominated by film bloggers and Twitter twitchers, these configurations of loyalists and lost-causers adopting a film that they fell for at some festival and cradling it like a football as they chug downfield in a deserted stadium. ‘Margaret,’ ‘Bellflower,’ ‘Martha Marcy May Marlene,’ ‘The Future,’ ‘Shame,’ ‘Take Shelter’ — these are quality titles (so I assume, I haven’t seen most of them, I shall Netflix them in the fullness of time) that become objects of obsession for a few but float in limbo for those not on screening or ‘screener’ lists… Arty entries may accrue a cult status over time that collects more disciples into the fold, but they lose the catalytic moment to set the culture humming.”
This is an interesting point: television circa 2012 adds to the culture while movies circa 2012 simply suck culture dry like media vampires. And, of course, television’s biggest indisputable advantage over movies — length, and thus the potential for depth and intricacy — comes into play here as well. Television introduces us to amazing characters — Tony Soprano, Don Draper, Coach Eric Taylor — then explores their lives and minds for dozens of hours. The longer their shows go, the richer their characters become. Movies, on the other hand, introduce us to characters for 90 minutes, and after that they’re gone forever. On the off chance they’re popular enough to warrant a sequel, their quirks and charms are often smoothed over and made more accessible, because sequels are driven by the search for financial gain, not probing emotional insights. Where television shows like “The Sopranos” or “The Wire” welcome the complexity that comes with age, movie franchises tend to favor accessibility, and they often reboot bankable properties after just two or three installments. When we meet Peter Parker in “The Amazing Spider-Man” this summer, it will not be the Peter Parker we’d come to know in three previous movies by Sam Raimi. This Peter will be a blank slate, the better to attract an easily distracted young audience.
With all of those concessions to Wolcott’s good points, though, things may yet prove more complicated than a simple “TV > Movies” mathematical formula. For one thing, even as he denigrates the state of modern cinema, he concedes that he hasn’t seen recent films like “Margaret,” “Bellflower,” and “Take Shelter,” remarkable works that possess many of the same pleasures — depth of character, acting, and narrative — that Wolcott finds in good television. If I wrote a response to Wolcott’s piece entitled “Why Film Is Still Better Than Television” and I sang the praises of “Take Shelter” and “The Cabin in the Woods” and “Undefeated” and listed eight different reasons why movies are still a better medium for visual and non-fiction storytelling, but I noted that I was making that argument without having watched “Breaking Bad,” “Game of Thrones” and “Justified,” would you take my opinion seriously? Probably not.
In my mind, there’s no question that television is on the rise. In my mind, there’s no question that television owns the cultural conversation. In my mind, there’s no question that television is better suited to take advantage of the pleasures of social media, if only because when you take advantage of the pleasures of social media in a movie theater you get scolded by James Wolcott and Matt Singer (and then Matt Singer talks about himself in the third person). But while I don’t think quality television is going anywhere, I do wonder whether this trend is a sustainable sea change or a fad buoyed by a fortuitous confluence of events. As good television shows and the networks that produce and air them grow more powerful and more profitable, will the demands of big business force the medium back towards the mainstream? As TV creators like David Simon — one of the patron saints of this new era of good television — come out publicly against the world of online TV consumption, dissection, and recapping, will websites rethink their coverage strategies? Simon’s comments were needlessly petty and grumpy, but they also hinted at the possibility that many people are writing about television right now specifically because it is cool — and if more producers like Simon denounce their work, it may not seem quite so cool for very long.
Wolcott credits the Internet with helping fuel television’s rise; everyone watches the same episode of “Mad Men” at the same time on Sunday, and everyone can participate in the same post-show conversation on Twitter. Cool arthouse movies like “Martha Marcy May Marlene” tour the country incrementally, limiting their audience and their possibilities for large-scale conversations. But the way in which movies resist instant gratification speaks to one of the things that still makes cinephilia special in the age of telemania: it’s harder to be a movie lover than a TV lover. Compare the amount of legwork required to see an underground arthouse hit like “Martha Marcy May Marlene” — following it from Sundance to acquisition to distribution to its opening at your local art house — with the ease of setting your DVR to record an episode of “Luck” after someone recommends the show to you.
In this age of streaming video, movies on demand, and instantaneous choice, there’s something pure, and maybe even a little beautiful, about having to work at a pop culture obsession. Maybe television is better than film, maybe TV is the new cinema. Maybe TV will become the dominant mainstream medium. And maybe that is the best thing that could ever happen to film. If TV takes over the mainstream, then film can expand into the margins, where it’s not such a bad thing to be treasured like a football carried downfield by an unstoppable running back. The only difference is, in this case, the stadium isn’t empty. It’s just a little bit smaller than it used to be.
Read more of James Wolcott’s “Prime Time’s Graduation.”