
Daily Reads: Film vs. Digital Explained, Remembering When ‘American Idol’ Was Good, and More


Criticwire’s Daily Reads brings today’s essential news stories and critical pieces to you.

1. Film vs. Digital: The Most Contentious Debate in the Film World, Explained.
For cinephiles who are keenly aware of the power of film and have already picked sides in the film vs. digital war, having to explain the importance of film can be tiring. However, there are many budding cinephiles, as well as curious outsiders, who simply don’t quite know the difference. At Vox, Charles Bramesco explains the most contentious debate in the film world: film vs. digital.

Once upon a time, all movies were shot using machines that would take 24 photographs or “frames” every second and instantaneously leave a negative of those images on a filmstrip. It would then be treated with chemicals and displayed for showings by running the reels containing these strips of pictures through a projector. (Some cameras can also take more than 24 frames per second — see motion smoothing, above — but this generally produces an image that looks too real to our used-to-24-fps eyes.) If that sounds like a long, laborious process with tons of room for mechanical and human error, that’s because it is. The advent of videotape and the handheld video camera made physical media somewhat easier to work with, but all it takes is one afternoon spent carefully respooling the magnetic tape on a VHS cassette (or one VCR chewing up a bunch of that tape) to realize how easily ruined it all is.

Instead of these potentially error-ridden physical procedures, many cameras now save these images as data to a digital bank, which can then be accessed like any other file. Digital video doesn’t really exist in the same way that MP3s don’t exist. As such, transportation, preservation, and even tinkering with the look of the finished product are now simpler than ever before. Recent technological advances have streamlined this process beyond what the filmmakers of bygone eras could have even imagined.

So digital is the solution, right? Not so fast. Though digital photography may be more practical, film has aesthetic merits that aren’t as easily pinpointed. Those with the spider senses to discern such things have a habit of claiming film “just looks better,” much in the same way audiophiles can tell that vinyl “just sounds better,” but these both circle back to the inherently vague “know it when I see it” phenomenon. Filmstrips are a living thing — they degrade and expand and contract and mutate and warp over time based on the conditions they’re left in. As such, they have a lived-in look. A filmstrip saved from 1979 and shown again today has clearly seen some shit. Little imperfections such as scratches or so-called “cigarette burns” (take it away, Ed Norton) appear on the strip, and quiet crackles and pops develop on the audio track.

But, in many ways, these imperfections are an argument in film’s favor. Though good ol’ film stock may represent an inferior experience in terms of pure empirical quality, it has the soft-around-the-edges look that we associate with old movies. Film is transportive; it inspires nostalgia, especially among film buffs. Compared with that, digital video can look antiseptic and polished. (This is some of what Tarantino means when he calls digital projection “television in public.” Television, too, can look antiseptic and polished.) These technical distinctions dictate the daily push and pull of the film industry, but for the garden-variety viewer, digital and analog are no more than two visual modes for a film to work in, each with its own individual vibe. Neither is better than the other, only more well-suited to the story the filmmaker has chosen to tell.

2. Remembering When “American Idol” Was Good.
This Wednesday, former cultural phenomenon “American Idol” will debut its 15th and final season on Fox. For years, “American Idol” captivated the hearts and minds of Americans as it watched amateurs become stars, laughingstocks get humiliated, and regular folks chase a dream just out of reach. Philly.com’s Molly Eichel writes about her love of “American Idol” and remembers a time when it was actually good.

But the promise of “American Idol” – and of competitive reality TV shows in general – was never fulfilled. And that’s one of the reasons it’s time for “American Idol,” the former ratings juggernaut, to shuffle off this mortal coil. What happened? The primary problem was that reality TV decided it still had to be TV. The initial goal stopped being, “Let’s bestow success on diamonds in the rough” and became, “Let’s entertain the rough.” And give the audience lots of rough.

Sob stories became so regular on “Idol” it seemed the background of every contestant was mined and stretched for as much hardship – and extra screen time – as possible. Televised auditions – this year includes a Philadelphia stop, by the way – always varied in quality. But eventually, they veered between incredible and atrocious (call it the William Hung Effect). Judges went from celebrities qualified to give contestants real advice to just-plain-celebs who could attract a fan base (remember when Ellen DeGeneres sat at the judges’ table and managed to be charming and yet say nothing at all?). Such failings, shared by a lot of reality shows, are why the reality-competition genre is largely in decline. I miss the halcyon days when “Project Runway” focused on design, not crazy characters and wild histrionics.

After the novelty of the fame-by-reality idea wore off, it became apparent that famous people were famous for a reason. Charisma is hard to define; you either have it or you don’t. Those cultural gatekeepers? Turns out they weren’t so much ignoring this huge untapped pool of talent as they were picking and choosing the type of people who could be viable celebrities. There’s a reason that, out of all “Idol” contestants, history will remember Kelly Clarkson, Jennifer Hudson, and Carrie Underwood the longest. They had that undefinable It. Would Madonna have won “American Idol”? Prince? Bowie? Springsteen? Probably not…well, maybe Prince; he can do anything. But such people are not good at being just artists; they’re good at being famous.

3. Haskell Wexler: An Inside Outlier.
The film world still mourns the loss of legendary cinematographer Haskell Wexler, who photographed films like “Who’s Afraid of Virginia Woolf?,” “In the Heat of the Night,” and “Days of Heaven,” and directed the perpetually underrated “Medium Cool.” At the American Society of Cinematographers site, cinematographer John Bailey pays tribute to Wexler and his ferocious talent.

Though much of Haskell’s Hollywood work was a veritable roll call of memorable dramas of the 1960s, ’70s and ’80s, his stream of nonfiction films was equally long, embodying a deeply activist social and political commitment. His sense of justice (and injustice) seemed to continually stoke not just his camera, but the depths of his very soul. One of those later documentaries, “Who Needs Sleep?,” is a call to action to the film industry itself. Focusing on the problem of unreasonable crew work hours, the film was prompted by the death of camera assistant Brent Hershman, who fell asleep driving home after a 19-hour shooting day. For years after Hershman’s death, Haskell argued for the adoption of a mandatory limit of 12-hour work days, sparking controversy even among the ranks of fellow crew members who perceived shorter hours not as a humane imperative, but as a pay cut.

It was a signature of Haskell’s singular focus that after decades of involvement with topics of broad national and international interest, he boldly pointed his lens at his own backyard, unafraid to speak out to an industry rife with fears of career implosion. Haskell may have had the wealthy man’s luxury of being able to speak and act publicly and loudly with impunity, but his fiscal independence was not what drove him. One look at the IMDb roster of his documentaries (including “The Living City” in 1953, “The Bus” in 1965, and “Four Days in Chicago” in 2013) is all the evidence you need that Haskell Wexler always put his camera where many others only put their mouths — or their wallets. His detractors, and there were many inside and outside Hollywood, were always outflanked by his passion for whatever cause he took on.

According to an in-depth tribute published by The Guardian, Haskell even garnered prime status on J. Edgar Hoover’s FBI “watch list” as a result of his directorial debut, “Medium Cool,” which is set amid the demonstrations of the 1968 Democratic Convention in Chicago, and for his photography of a polarizing 1976 documentary about the militant activist group The Weather Underground. The FBI said Haskell was “potentially dangerous because of background, emotional instability, or activity in groups engaged in activities inimical to the United States.” Haskell’s camera lens was a metaphor for a gun or bomb.

4. Too Critical or Not Critical Enough?
The role of the critic has been debated and discussed ever since popular criticism became, well, popular. Is it enough for critics just to recommend their favorites, or do they exist to warn people off the bevy of crap that exists in the multiplex or on their television? At his Episodes blog, veteran critic Todd VanDerWerff writes about this question in relation to TV criticism and how it’s possibly not critical enough.

We all play different roles, depending on the medium or even the work. But I do think there is a temperamental balance at work here. Roger Ebert was fundamentally an appreciator; Gene Siskel was a warning signal. That’s why their show worked and why it struggled once Richard Roeper (another appreciator) was added. Some of us just like liking stuff and are always looking for the good amid the bad. Others are leery of the idea of giving a pass to something that is, on some level, mostly crap, because it does a few interesting things. The best critics can turn on a dime and do something completely unexpected. But even the best of us usually fall into one of those two categories.

I’m, broadly speaking, an appreciator. Ebert is the critic whose writing has most influenced me. My mentors have been appreciators. Most of those I’ve hired and championed have been appreciators. It’s just the way I skew. So consider all of this preamble to the idea that when blogger Kevin Drum held me up as an example of something that bugged him in TV criticism shortly before Christmas, I was more or less sympathetic to what he was struggling with. I did, indeed, list 60 shows on my year-end top TV list, and when looked at as a percentage of all scripted shows on TV (which ended up numbering 409), that seems way too high. It’s around 15 percent, and if you accept Theodore Sturgeon’s old maxim that 90 percent of everything is crap (as Kevin does), it’s at least 5 percent too high…

Kevin has also run headlong into the fact that most TV critics are, at their core, appreciators. I know that doesn’t sound right, since so much TV is so bad, but it is. In fact, because so much TV is so bad, lots of TV critics tended to be appreciators. For decades, you had to really look for the diamonds in the rough, amid the formulaic pap. But they were always there. And TV is also skewed toward believing that things will turn themselves around. A show could always get out of a slump. A series you hate could air an episode you love. And on and on. Because it’s forever unfinished, you always have to leave the possibility that it will get good at some point (or the reverse).

And yet, if you look at critics in other media (especially appreciators), we’re all dealing with something that I think audiences are feeling, too: the glut of choice. It wasn’t uncommon for film critics to suggest their lists could easily extend to 30 or even 40 titles this year, and the same has been true in both music and books for decades. Movies and TV used to have barriers to entry, gatekeepers who stood in the way of the most esoteric stuff. But those are mostly gone now, and the incentive is to make stuff that’s good or at least distinctive, so you’ll stand out a little bit from the pack. Are there plenty of shows that eke out an existence being just mediocre enough to continue to attract an audience? Sure. But there are also increasing niches for shows like “Rectify,” which have minuscule audiences but pay for themselves in prestige.

5. The Ragged Charm and Undeniable Greatness of “Chimes at Midnight.”
Legendary director Orson Welles’ 1966 film “Chimes at Midnight,” about Shakespeare’s Sir John Falstaff and his tumultuous father-son relationship with Prince Hal, has been notoriously difficult to see for years. However, a new restoration by Janus Films and the Criterion Collection has allowed for a brief theatrical run in New York from now until January 12th, preceding an eventual Criterion release. RogerEbert.com’s Glenn Kenny writes about the ragged charm and undeniable greatness of one of Welles’ best works.

Rather than adapt (and by commercial cinema convention necessarily condense) a single Shakespeare work, as he did with his backlot modernist “Macbeth” of 1948 and his on-the-run-in-Venice “Othello” of 1952, Welles, harking back to the theatrical productions he concocted with teacher and friend Roger Hill as a private-school prodigy in the 1920s, devised a stand-alone scenario centered on the larger-than-life tragicomic figure John Falstaff, cutting and pasting “The Merry Wives of Windsor” and both “Henry IV” plays, putting ostensible comic relief on center stage. The now portly-to-say-the-least Welles had finally, at age fifty, ripened into Falstaffian dimensions and so took the part. Shot in Spain in glorious black and white (Edmond Richard, who had shot Welles’ “The Trial” in 1962 and would go on to lens Luis Buñuel’s final films, was the cinematographer), on provisional locations and with post-synchronized dialogue, the movie is an incredibly dynamic piece of filmmaking that is also in many respects kind of threadbare. Which is to say: there are apparent limits to how decent looking and sounding a version there will ever be.

[Pauline] Kael’s laudatory review begins and ends with blunt complaint. “You may want to walk out during the first twenty minutes of Falstaff. Although the words on the soundtrack are intelligible, the sound doesn’t match the images.” She’s not wrong. The dialogue was not recorded during shooting, and as became Welles’ practice during much of his latter filmmaking career, the director himself put on different voices to dub some of his actors. Note that Fernando Rey, the Spanish-born actor who plays Worcester, speaks in tones with no Spanish accent; Welles is dubbing in his part. (This approach would not do for John Gielgud, the unmistakably-voiced master who plays Henry IV.) Kael later complains about cutting that seems to want to camouflage the dubbing issues; I find this point an arguable one.

Welles’ vision of Shakespeare was never a stodgy one, and even when making a period film his style gloried in a kind of modernist abruptness; through cutting, Welles always makes the material’s pulse speed up to potentially dangerous levels. The cut from Welles approaching his temporary bed to a shot of Jeanne Moreau’s Doll Tearsheet sitting up on the mattress and lowering her blanket to address him would remind one of an edit in a French New Wave film, had Welles not been doing the same kind of cutting since even before the purposefully jagged “The Lady From Shanghai.”

6. A Made-Up Place With Superficial Roots: Why “Star Wars” Matters.
Amidst the new “Star Wars” films, all the debates over their quality, and the criticisms of their perceived redundancy, it’s often easy to forget that “Star Wars” is one of the few examples of popular entertainment that has had a meaningful impact on large swaths of people, both in and outside America. On her blog, Sonia Saraiya writes about her most recent trip to India, jet lag, and “Star Wars: The Force Awakens.”

Maybe that’s why “Star Wars” matters so much? I can’t think of another American fairy tale that delivers the same combination of longing and vastness that I feel in this weird liminal space between time zones and continents. A lot of people I respect and trust didn’t love “The Force Awakens,” and there’s a degree to which I even applaud them for their reaction, because so much of the cult around “Star Wars” is maddeningly superficial. But whatever I feel for it goes beyond critical response, or beyond, I don’t know, sentences. To me, “Star Wars” feels like coming home. Like it gets where I live, or to be exact, it gets that I don’t really live anywhere except for in my head. There is some tragedy of my existence that “Star Wars” understands and welcomes, and it’s not even that peculiar of a tragedy, because half the planet feels the same way. It’s just some tragedy of existing. Where the world we live in feels endlessly meaningless, but there’s this vision of another world where nothing is meaningless, and machines have feelings, and your family is the center of the universe, and the shades of your own personal struggles have visible, tangible relevance.



