Criticwire’s Daily Reads brings today’s essential news stories and critical pieces to you.
1. The “Hollywood Blackout” at the 1996 Academy Awards. When the Oscars announced their nominations this year, the whiteness of the nominees was lost on no one. Within a day, the #OscarsSoWhite hashtag had drawn attention to it across social media, and not long after, the Academy announced it would take significant steps to diversify its membership. But did you know that there was a similar outcry about twenty years earlier? The New Republic’s Esther Berger examines the 1996 incident and how the times have changed since then.
But the awards season of 1996 was one of the few times that whiteness made national headlines. There were calls for a boycott, questions about whether a black Oscars host and producer should step down, and disavowals of racism by white Academy members. Jesse Jackson’s Rainbow-PUSH Coalition announced an Oscars protest, telling reporters: “It doesn’t stand to reason that if you are forced to the back of the bus, you will go to the bus company’s annual picnic and act like you’re happy.” Landon Jones, then-editor of “People” magazine, was in attendance at the 1995 Academy Awards when he noticed something strange. “First of all, the audience was entirely white,” he told me. “Then I realized that the seat-fillers were entirely white.” The seat-filler positions — beautifully-dressed extras who swoop in when audience members leave their seats so that TV viewers don’t notice empty chairs — often went to the friends and family of Academy members. None were black. Neither were any of the people who went onstage to accept an award that night. “I came back to ‘People’ and said, ‘We need to do a story about this,'” Jones continued. The following February, after the nominations were announced, a team of “People” reporters called every nominee to ask a single question: “Are you black?” “We know Meryl Streep is white,” Jones said, but in a pre-IMDB age the only way to find out the race of lower-profile nominees was to work the phones. “The reporters hated it when I said they had to ask all these nominees if they were black,” Jones said. The results of that survey were accompanied by a 3000-word cover story alleging that “exclusion of minorities has become a way of life in Hollywood,” describing the effects of institutional racism within the movie industry, from the Academy to studio boardrooms and union rosters. “People” found that in 1996, only 2.3 percent of the Director’s Guild and 2.6 percent of the Writer’s Guild membership were African-American. 
The numbers in the union of set decorators were even smaller. “What’s wrong is considerably more significant than whether Whitney Houston…gets another conversation piece for the powder room,” “People’s” Pam Lambert wrote. “Hollywood’s creations are the mirror in which Americans see themselves — and the current racially skewed reflection is dangerously distorted.” The issue sold poorly on newsstands — institutional racism can be less glamorous than the latest celebrity divorce — but it received widespread media coverage. When Jones went on CNN’s “Reliable Sources” to promote the story, the panelists were amazed that “People” had produced such a substantive piece of journalism. One guest described it as “the sort of thing we might have expected from the ‘New Republic’ or ‘New Yorker.'” Another asked why those magazines hadn’t done a story like this themselves. Howard Kurtz, the host, had a theory: “One of the reasons you don’t see this in magazines like ‘The Atlantic’ or the ‘New Republic’ is that some of those magazines have almost no black staffers themselves. They might be reluctant to make an issue of this.”
2. Everybody’s a Critic. And That’s How It Should Be. One of the best and worst parts of the Internet is how it has democratized criticism, taking what was once the province of ivory towers and bringing it to the masses. But alongside the benefits, it has also created a culture of half-baked, blustering opinions masquerading as criticism. Yet some people argue it’s a good thing that everybody’s a critic. In an excerpt from his new book, The New York Times’ A.O. Scott writes about how everybody’s a critic, and that’s how it should be.
On the Internet, everyone is a critic — a Yelp-fueled takedown artist, an Amazon scholar, a cheerleader empowered by social media to Like and to Share. The inflated, always suspect authority of ink-stained wretches like me has been leveled by digital anarchy. Who needs a cranky nag when you have a friendly algorithm telling you, based on your previous purchases, that there is something You May Also Like, and legions of Facebook friends affirming the wisdom of your choice? The days of the all-powerful critic are over. But that figure — high priest or petty dictator, destroying and consecrating reputations with the stroke of a pen — was always a bit of a myth, an allegorical monster conjured up by timid artists and their insecure admirers. Criticism has always been a fundamentally democratic undertaking. It is an endless conversation, rather than a series of pronouncements. It is the debate that begins when you walk out of the theater or the museum, either with your friends or in the private chat room of your own head. It’s not me telling you what to think; it’s you and me talking. That was true before the Internet, but the rise of social media has had the thrilling, confusing effect of making the conversation literal. Like every other form of democracy, criticism is a messy, contentious business, in which the rules are as much in dispute as the outcomes and the philosophical foundations are fragile if not vaporous. We all like different things. Each of us is blessed with a snowflake-special consciousness, an apparatus of pleasure and perception that is ours alone. But we also cluster together in communities of taste that can be as prickly and polarized as the other tribes with which we identify. We are protective of our pleasures, and resent it when anyone tries to mock or mess with them. Obsessives and dilettantes, omnivores and geeks, highbrow and low, we are more likely to seek affirmation than challenge. Some people love opera. Others love hip-hop. 
Quite a few are interested in both. “It’s all good!” you might say. But you don’t believe that, any more than I do. Some of it is terrible. There is, axiomatically, no disputing taste, and also no accounting for it. And yet our ways of thinking about this fundamental human attribute amount to a heap of contradictions. There is no argument, but then again there is only argument. We grant that our preferences are subjective, but we’re rarely content to leave them in the private realm. It’s not enough to say “I like that” or “It wasn’t really my cup of tea.” We insist on stronger assertions, on objective statements. “That was great! That was terrible!” Or maybe that’s just me. This newspaper, after all, pays me to turn my personal impressions of movies into persuasive arguments — not only to share my feelings about movies but also to assess them and provide some useful counsel to readers. So it might seem as if I’m setting out here to make a self-serving point. Don’t trust the Hollywood insiders who control the Oscars! Ignore the quantified peer pressure of Rotten Tomatoes or Box Office Mojo! Listen to me! And sure: I do have a stake in defending the relevance of my own job, even as I grant that it’s kind of a ridiculous way to pay the rent. Critics are sometimes appreciated — or even, in rare cases, admired, like Roger Ebert — but we are more often feared, resented or ignored altogether. In the popular mind, critics are haters and killjoys. Maybe we’re sadists, like the viperous, martini-swilling New York Times theater reviewer in “Birdman.” Or maybe we’re masochists: In spite of that cruel caricature, “Birdman,” an Oscar best picture, is “Certified Fresh” by Rotten Tomatoes (I think it’s vastly overrated, by the way, but that’s just my opinion). The ability of critics to make a living may be precarious, but criticism remains an indispensable activity. 
The making of art — popular or fine, abstruse or accessible, sacred or profane — is one of the glories of our species. We are uniquely endowed with the capacity to fashion representations of the world and our experience in it, to tell stories and draw pictures, to organize sound into music and movement into dance. Just as miraculously, we have the ability, even the obligation, to judge what we have made, to argue about why we are moved, mystified, delighted or bored by any of it. At least potentially, we are all artists. And because we have the ability to recognize and respond to the creativity of others, we are all, at least potentially, critics, too.
3. Who Should Pay for the Arts in America? The National Endowment for the Arts turns fifty years old this year, and it’s struggling to find a way to make the performing arts available to everybody. The NEA argues that the arts are a public good, and that the progress of society and of the arts go hand in hand. For The Atlantic, Andy Horowitz explores who should pay for the arts in America and how the system that supports them can be improved.
One morning last August I visited Williams College in Massachusetts to teach a workshop on “building a life in the arts” with a group of racially, geographically, and economically diverse young people working at the Williamstown Theatre Festival. Later that night I attended a show at the theater, where I saw these idealistic apprentices taking tickets from, ushering, and selling merchandise to an overwhelmingly white audience — mostly over 60 and, judging by appearances, quite well-off. The social and cultural distance between the aspiring artists at Williamstown and their theater-going audience couldn’t have been more pronounced. This gulf is quite familiar to most producers and practitioners of the performing arts in America; it plays out nightly at regional theaters, ballets, symphonies, and operas across the country. The current state of the arts in this country is a microcosm of the state of the nation. Large, mainstream arts institutions, founded to serve the public good and assigned non-profit status to do so, have come to resemble exclusive country clubs. Meanwhile, outside their walls, a dynamic new generation of artists, and the diverse communities where they live and work, are being systematically denied access to resources and cultural legitimation. Fifty years ago, the National Endowment for the Arts was created to address just such inequity. On September 29, 1965, President Lyndon B. Johnson signed the National Endowment for the Arts into existence, along with a suite of other ambitious social programs, all under the rubric of the Great Society. Johnson imagined these programs as ways to serve “not only the needs of the body and the demands of commerce but the desire for beauty and the hunger for community.” Half a century later, the ethos upon which the NEA was founded — inclusion and community — has been eroded by consistent political attack. 
As the NEA’s budget has been slashed, private donors and foundations have jumped in to fill the gap, but the institutions they support, and that receive the bulk of arts funding in this country, aren’t reaching the people the NEA was founded to help serve. The arts aren’t dead, but the system by which they are funded is increasingly becoming as unequal as America itself.
4. The Nasty World of Theater “Clearances,” and Why It Matters to Filmgoers. In the film industry, theater chains employ a practice called “clearing,” in which they demand that distributors not license films to nearby competitors so that their audiences aren’t diluted. The Washington Post’s Ann Hornaday examines the nasty world of theater clearances in light of Landmark Theaters’ suit against Regal Cinemas.
Clearances were initially instituted as a useful way for theaters to balance out their programming and avoid over-saturating markets with a small number of movies. But in several instances, clearances have had the opposite effect, drastically limiting the choice of which films theater owners can book, and which their customers can see. As that contradiction has become more pronounced, clearance practices have increasingly come under scrutiny as unfair, anti-competitive and possibly illegal. Ultimately, it’s the distributors who decide where their films will play: Disney will put “Star Wars: The Force Awakens” wherever it wants to. But that choice is subject to negotiation. Exhibitors — especially big chains, which can sometimes be the only theatrical option in smaller cities — routinely use their market clout to persuade (or threaten) studios to play only at their theaters. Whether tacit or explicit, the implication is clear: If the distributors don’t play ball this time, they might have trouble booking their wares in the same chain’s theaters down the line. Over the past several months, more bookers, exhibitors and distribution professionals have begun to come forward with information on the industry’s most cutthroat practices: The Department of Justice’s antitrust division launched an investigation last year into whether clearance practices at the nation’s biggest chains (Regal, AMC and Cinemark) violate federal law. Lawsuits similar to Landmark’s have been filed against Regal and AMC in Texas, Georgia, California and New York. “Clearances in some instances can be reasonable, particularly when they relate to specialized or art film, rather than commercial film,” Landmark president and chief executive Ted Mundorff told me in December, when he was preparing his company’s complaint. 
“But what [Regal] is doing is actually predatory.” Rather than a reasonable means of preventing market saturation, he continued, it’s become “something to protect their business by targeting much smaller competitors, and I think there’s a distinction there.” Ironically, that’s precisely the same distinction that Washington’s local theaters have been making for years — about Landmark. Josh Levin, who owned the West End Cinema, said he was “flabbergasted” when he read Landmark’s filing. “Reading their complaint felt like the story of my life with the names changed,” Levin said. “The practice of clearing that Landmark is complaining about is a practice they engage in, in every single market where they have a cinema.”
5. “Dirty Grandpa” Is a Dumb Comedy, But It’s Also a Movie About Fear and Death. You may have heard about the new Zac Efron-Robert De Niro vehicle “Dirty Grandpa,” a raunchy comedy in which the pair road-trip to Daytona Beach and party with college kids. It’s gross and dumb, but it’s also very much a movie about confronting one’s own mortality. Uproxx’s Charles Bramesco explores “Dirty Grandpa” as a meditation on fear and death.
Most of the film is eaten up by a road trip to Daytona Beach that the two men undertake in order to secure willing sexual partners, with Dick performing literally death-defying feats of vitality and strength along the way. Turns out that Dick secretly spent most of his life with the Green Berets training insurgents overseas, a shocking reveal that clears a path for later scenes in which this 72-year-old man handily disposes of a half-dozen heavily armed gang members. One detour finds him essentially benching the already-hulking Jason with one arm in a flex-off against a pair of frat boys, and in another, he expertly maneuvers an ice-cream truck in a high-speed pursuit with the police. Dick must be an acutely comforting figure to a certain sort of person, living evidence that a man can advance into old age without losing touch with the things that, in the director’s cockeyed estimation, make him a man. In this R-rated Never Never Land, nobody has to age out of chasing skirt and getting loaded. Beyond that, the film seems woefully unaware of the dark irony inherent in De Niro’s participation with this project in specifics. “Dirty Grandpa” is consumed by its preoccupation with reclaiming fading greatness, but stays blind to its role in robbing the crumbling titan De Niro of his remaining integrity as an actor. Few actors emblematize late-phase decline quite like De Niro; a few decades ago, he was delivering one or two immortal performances per year, captivating audiences with his combination of ferocious intensity and insightful nuance. He’d experiment with self-parody in the late ’90s and early ’00s with “Analyze This” and “Meet The Parents,” then transition to full slumming-it mode as the ’00s rolled on. He’s shown signs of life in his recent collaborations with David O. Russell in “Silver Linings Playbook” and “Joy,” but in most recent films, De Niro’s resigned frown has been a sad reminder of the compromises that aging demands. 
That “Dirty Grandpa’s” casting director would tap De Niro for this aggressive denial of everything he stands for, and then reinforce how low he’s fallen by giving him so much embarrassing schlock to recite, is a gobsmackingly antithetical move just about on par with slaughtering a calf at a PETA sit-in. What makes “Dirty Grandpa” a sophomoric fantasy where men sell themselves in order to smooth the transition into impotency instead of a brilliantly subversive black comedy (aside from shameful miscasting, and fart jokes) is its commitment to its own delusion. When the film nears its end and Jason nears his wedding day, he’s got a choice to make: He can either go through with the marriage and advance at his law firm, or throw it all away and live on a ship for a year with his boho college crush that he happens to run into outside Daytona. Under the tutelage of his vicariously invested grandpa, Jason pulls a reverse “The Graduate” and sabotages his own wedding at the last moment, fleeing to hop on the magic bus and live what we’re made to believe will be a more fulfilling, wholesome lifestyle. By the film’s own measure, this is a happy ending, though Jason has just torn several lives to shreds, including his own. But he gets to fulfill the all-too-common office-slave fantasy of blowing it all up and starting anew pursuing a personal passion, as blithely unrealistic as that might be.
Tweet of the Day:
Tonight’s Grease brought to you by a fantasy of a 1978 fantasy of a 1971 fantasy of a 1959 fantasy of the early 1950s.
— Phil Gentry (@pmgentry) February 1, 2016