
Daily Reads: The Need for Diversity in Film Criticism, Why Pop Culture Is Finally Getting Hacking Right, and More

Criticwire’s Daily Reads brings today’s essential news stories and critical pieces to you.

1. Where Are All the Diverse Voices in Film Criticism?
Though the word “diversity” gets bandied about so often that many assume it amounts to nothing more than a quota, what it really means is increasing the number of perspectives in any given field. If every field were filled with the same type of people, its collective perspective would be limited rather than varied, which in turn stunts evolution and growth. At The Daily Beast, Chaz Ebert writes about the need for diverse voices in film criticism and how they would benefit the field overall.

My husband Roger called the movies a giant machine that generates empathy, allowing us to walk in the shoes of a person of a different race, or age, or gender, or economic circumstance. From there, he said, the greater understanding that results can inspire increased kindness and compassion. It is not enough to have reviewers who understand how to discuss film. We need reviewers who can speak deeply and with nuance because of their lived experiences. The trusted voices in film criticism should be diverse ambassadors who have access to the larger conversation. If we can’t recognize ourselves within the existing public discourse, we are implicitly being asked to devalue our experiences and accept a narrative that is not our own. Excluding diverse voices from the conversation de-emphasizes the value of our different experiences. It is critical that the people who write about film and television and the arts — and indeed the world — mirror the people in our society.

Recently, I partnered with the Hawaii International Film Festival to launch the first Ebert Young Writers for the Arts program in Hawaii. Roger and I fell in love in Hawaii and developed an interest in its history and culture. We attended the film festival there for many years and witnessed the importance of a strong authentic voice in film. This program is meant to broaden and strengthen film criticism culture in Hawaii. In a swiftly changing media environment, informed writing and criticism on cinema by diverse voices is vital to a strong film culture and industry. For our first class of eight students, we chose Chicago-based Kevin B. Lee, an award-winning filmmaker, educator, and film critic, as a mentor.

Championing diversity is a particular passion of mine. The issue affects the kind of stories we see every time we step into a movie theater or turn on a television or pick up a newspaper or digital tablet. If only one race or gender is allowed to tell their story, then the experiences of so many other lives will be left off the screen. So many moviegoers have actively sought out independent and foreign titles in order to diversify the stories they consume.

2. Pop Culture Is Finally Getting Hacking Right.
Anyone who has paid attention to pop culture since personal computers and the Internet became ubiquitous knows that “computer hacking” has long been portrayed in a cheesy, unrealistic light. Most of the time it involves someone “breaking into the mainframe” over tense music until someone else says, “We’re in!” It has always been vague and absurd, mostly because for a long time few people understood cybersecurity. But in the past few years, that has changed. The Atlantic’s Joe Marshall explores how pop culture is finally getting hacking right, pointing to Michael Mann’s “Blackhat” and shows like “Mr. Robot” and “Silicon Valley.”

In some ways, cyberthrillers are just a new kind of procedural — rough outlines of the technical worlds only a few inhabit. But unlike shows based on lawyers, doctors, or police officers, shows about programmers deal with especially timely material. Perry Mason, the TV lawyer from the ’50s and ’60s, would recognize the tactics of Detective Lennie Briscoe from “Law & Order,” but there’s no ’60s hacker counterpart to talk shop with “Mr. Robot’s” Elliot Alderson.

It’s true that what you can hack has changed dramatically over the past 20 years: The amount of information is exploding, and expanding connectivity means people can program everything from refrigerators to cars. But beyond that, hacking itself looks pretty much the same, thanks to the largely unchanging appearance and utility of the command line — a text-only interface favored by developers, hackers, and other programming types.

So why has it taken so long for television and film to adapt and accurately portray the most essential aspects of programming? The usual excuse from producers and set designers is that it’s ugly and translates poorly to the screen. As a result, the easiest way to portray code in a movie has long been to shoot a green screen pasted onto a computer display, then add technical nonsense in post-production. Faced with dramatizing arcane details that most viewers at the time wouldn’t understand, the overwhelming temptation for filmmakers was to amp up the visuals, even if it meant creating something utterly removed from the reality of programming. That’s what led to the trippy, “Tron”-like graphics in 1995’s “Hackers,” or Hugh Jackman bravely assembling a wire cube made out of smaller, more solid cubes in 2001’s “Swordfish.”

But more recent depictions of coding are much more naturalistic than previous CGI-powered exercises in geometry. Despite its many weaknesses, this year’s “Blackhat” does a commendable job of representing cybersecurity. A few scenes show malware reminiscent of this decompiled glimpse of Stuxnet — the cyber superweapon created as a joint effort by the U.S. and Israel. The snippets look similar because they’re both variants of C, a popular programming language commonly used in memory-intensive applications. In “Blackhat,” the malware’s target was the software used to manage the cooling towers of a Chinese nuclear power plant. In real life, Stuxnet targeted the software controlling Iranian centrifuges to systematically and covertly degrade the country’s nuclear enrichment efforts. In other words, both targeted industrial machinery and monitoring software, and both seem to be written in a language compatible with those ends, which suggests that Hollywood producers took care to research what real-life malware might look like and how it’d likely be used, even if the average audience member wouldn’t know the difference. Compared to the sky-high visuals of navigating a virtual filesystem in “Hackers,” where early-CGI wizardry was thought the only way to retain audience attention, “Blackhat’s” commitment to the terminal and actual code is refreshing.
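For readers who have never looked at the kind of code the excerpt praises, here is a minimal, purely hypothetical C sketch of the flavor of snippet it describes: a short loop that reads an invented cooling-sensor value and reports a falsified “safe” reading, the way Stuxnet reportedly spoofed centrifuge telemetry. Nothing here is taken from “Blackhat” or from Stuxnet; the function names, constants, and scenario are made up for illustration.

    /* Hypothetical illustration only -- not code from "Blackhat" or Stuxnet.
     * Simulates the flavor of terse, low-level C the article describes: a loop
     * that reads an invented cooling-sensor value and reports a falsified
     * "normal" reading instead of the real one. */
    #include <stdio.h>
    #include <stdint.h>

    #define SAFE_TEMP_C 47   /* falsified "safe" value reported to operators */
    #define POLL_CYCLES 5    /* small fixed loop so the demo terminates      */

    /* Stand-in for a memory-mapped sensor register (pure simulation). */
    static uint16_t read_cooling_sensor(int cycle) {
        return (uint16_t)(60 + cycle * 8);  /* pretend the tower is overheating */
    }

    int main(void) {
        for (int cycle = 0; cycle < POLL_CYCLES; cycle++) {
            uint16_t real = read_cooling_sensor(cycle);
            /* Report the fake, reassuring value instead of the real one. */
            printf("cycle %d: actual=%uC reported=%dC\n",
                   cycle, (unsigned)real, SAFE_TEMP_C);
        }
        return 0;
    }

Even a toy like this shows why the command-line aesthetic the article mentions reads as authentic on screen: real low-level code is terse, text-only, and unglamorous.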

3. Why “Creed” Is the Greatest Underdog Movie Since “Rocky.”
Ryan Coogler’s “Creed,” the seventh film in the “Rocky” franchise and the first to star Michael B. Jordan as the son of Apollo Creed, has been racking up critical acclaim for everything from its direction to its performances, along with strong box office receipts (it has already surpassed its budget). Rolling Stone’s David Ehrlich examines the “Rocky” franchise and argues that “Creed” is the greatest underdog movie since the original “Rocky.”

Rocky Balboa is the greatest loser that cinema has ever known. A piddling Philly prizefighter who was considered a bum by the few people who knew his name, the battered bruiser didn’t have to beat heavyweight champion Apollo Creed in order to fulfill his destiny as a blue-collar hero. But by “going the distance” — and withstanding an unholy barrage of fists in the process — the seemingly indomitable boxer proved his worth and earned himself the pride that had eluded him all his life. In doing so, Rocky became the rousing face of a country bloodied by Vietnam, Watergate, and a mess of other Nixon-era national embarrassments. The franchise’s iconic first installment, which begins 40 years to the day before “Creed” began barnstorming the box office last week, became the highest-grossing film of 1976 (and a surprise Best Picture winner, snatching the prize away from a sobering collection of stone-cold classics that included “All the President’s Men,” “Taxi Driver,” and “Network”) because of how palpably the sentimental melodrama conveyed the virtues of defeat.

As the champion, Rocky’s loss would have felt like a brutal blow; as the challenger, losing to Apollo Creed with integrity was a galvanizing reminder that underdogs don’t have to win in order to emerge victorious. That’s what we love about them. America may be the most powerful country in the world, but losing is what we love. Losing is what allows us to keep fighting, and fighting is what we do best. We loved Rocky for being an underdog — we didn’t have to love him for being a winner.

To make the leap from a single film to a massive franchise, however, he would have to become a winner. And when he became a winner, he got boring. (See every Rocky movie after “Rocky II.” Yes, we know, he beats Ivan Drago and thus is singlehandedly responsible for ending the Soviet Union and the Cold War, but still.) Victory is a narrative dead end for this saga, and as filmmaker Andrew Bujalski observed in a recent New Yorker essay: “The greatest fictional fighter of all time … and the chapter of his life in which he actually reigns as champ registers barely as a footnote. But where would the drama be in watching someone merely maintain his dominance?” In order to remain emotionally relevant, Rocky has consistently been forced to contrive himself into the challenger’s corner. And as Stallone’s fame went stratospheric, it became increasingly difficult to perceive the Italian Stallion as David rather than Goliath. In the beginning, Rocky won even when he lost. By his fifth film, he lost even when he won.

4. What It’s Like to Be a Nielsen Family.
Despite living in the age of DVR, streaming content, and Internet torrents, Nielsen ratings are still the standard metric for determining how many households are watching a given program on television. But what does it mean to be a “Nielsen family,” one of the households that statistically represents the television-viewing public? At Vulture, Melvin Mar writes about exactly that, and about how much fun it was to screw with the system.

In 2006, I was newly married, living with my wife and dog, when we received a door-hanger flyer inviting us to consider becoming a Nielsen family. My wife was intrigued, as her main hobby was watching television. In fact, she’s unusually methodical about it. Every fall season, she creates a grid of the TV schedule, color-coded to indicate the shows she will watch and what she anticipates will be canceled or renewed. Her predictions tend to be eerily accurate. So when the opportunity arose to become viewers who could actually be counted and heard, we seized it. We became a Nielsen household, despite the rather significant fact that I worked in the entertainment industry. I worked in movies, not television, and I was in between movies at the time, so technically unemployed – or at least that’s how I justified it.

A week later, Paul from Nielsen came by our apartment. Paul set up every television with a black box, accompanied by a chaotic clutter of electronics behind the set. All this equipment connected to our internet and was used to feed our viewing data to servers nightly. There were now rules to watching television. Every time we turned on the TV, we were required to log in. There were eight buttons. My wife was No. 1; I was No. 2. If we had any additional people watching, they would be logged in as guests three through eight. Full disclosure: I often logged in my dog Joe as guest No. 3. (He sat next to me, so I thought that should count.) Every 15 to 20 minutes, the lights on the black box would start blinking to prompt us to confirm that we were still engaged and watching. This was especially interesting when you fell asleep watching TV and then woke to a barrage of lights in your face, as if you were tripping on drugs or having a seizure of some sort.

Besides the procedural aspects of viewing television as a Nielsen family, there were the regular surveys. Paul came over every few weeks to check on the equipment and ask a battery of questions. “Did you buy a new car?” “Any changes to the household?” “Did you get a new job?” No, dude, still here talking to you at two in the afternoon. All this information was apparently fed to a system that would create profiles for who was watching what in America. For our troubles, we were paid $15 a month. I was unemployed at the time, so, yes, we partially did it for the money, too.

During our time as a Nielsen family, my wife took her duties seriously, while I had fun with it — maybe too much. I fiddled with the number of guests viewing at our home, and often hooked up people I worked with and liked — for example, David Duchovny, whose first season of “Californication” had just debuted in 2007. We had a lot of “viewing parties” for that show, though I never told David about it. And I’m still convinced that my viewing habits helped keep “Chuck” (starring my friend Zachary Levi — I also never told Zach) on for five seasons. I thought of myself as their silent ratings Obi-Wan, watching over their shows.

5. Cataloguing Frank Ocean’s Obsession With Film.
Filmmakers are often influenced by other kinds of art, like literature, painting, and music. The same can be said of musicians, many of whom are shaped by the films in their lives. Pitchfork’s Simran Hans explores Frank Ocean’s music and how it has been influenced by a wide range of films.

It’s hard to predict much about Frank Ocean’s new album from its title alone. Surely “Boys Don’t Cry” is a callback to the Cure’s 1980 album of the same name, but what if Ocean is referring to something else? What if instead, it’s a stoic nod to the 1999 film of the same name? Directed by Kimberly Peirce, “Boys Don’t Cry” is based on the true story of American trans man Brandon Teena, though it is as much about the broad themes of identity, nascent sexuality, and body politics as it is about the violence experienced by transgender bodies.

It wouldn’t be the first time Ocean’s music has alluded to a movie. From the Richie Tenenbaum outfit (yellow blazer, striped sweatband) he wore during his performance of “Forrest Gump” at the 2013 Grammy Awards to the mention of the “Dragon Ball Z” character Majin Buu in “Pink Matter,” Frank Ocean is obsessed with film and TV. Sometimes, Ocean quotes movies directly – the “too weird to live, too rare to die” line in “Lost” is lifted from Terry Gilliam’s madcap desert orgy “Fear and Loathing in Las Vegas” (1998), whose influence also looms large in the “Pyramids” video. In other instances, his imagery is subtly suggestive; the drugged-up silver-spoon students in “Super Rich Kids,” for example, are from the same cinematic universe as their “Less Than Zero” (1987) counterparts. Occasionally, Ocean’s film references are esoteric; who is “Novacane’s” “model broad with the Hollywood smile”? With her “stripper booty and a rack like wow,” it’s not that much of a stretch to read his “brain like Berkley” pun as a cheeky wink to Elizabeth Berkley in “Showgirls” (Paul Verhoeven, 1995).

Pitchfork’s own Ryan Dombal described Ocean’s 2012 album “Channel Orange” as a “‘Magnolia’-style cross-wired heartbreak epic,” with its collage of multiple narratives connected by the thematic through-line of unrequited love, and indeed Paul Thomas Anderson’s film would fit neatly within the canon of new New Hollywood movies from the 1990s that Ocean references. But “Channel Orange” and Ocean’s 2011 mixtape “Nostalgia, Ultra” don’t just engage with independent films — they also reference Gen X blockbusters and big-budget, conservative films like “Pretty Woman” and “Forrest Gump.” Ocean — a bisexual black millennial — uses these films to insert himself into a distinctly American mythology. He is neither fanboy nor voyeur. He is Richard Gere in a tux. He is Jenny Curran. He is “Leaving Las Vegas.” He is the history of American movies, revised.

