Criticwire’s Daily Reads brings today’s essential news stories and critical pieces to you.
1. Why Is Stan Lee’s Legacy in Question? Anyone wondering whom to credit for the superheroes now dominating multiplexes everywhere will be pointed to Stan Lee, the 93-year-old longtime public face of Marvel Comics. By all accounts, Lee should be widely heralded in the comics community, but in recent years his legacy has come into question. Vulture’s Abraham Riesman explores Lee’s work and legacy, and why he’s become something of a tragic figure.
People are almost always surprised when I tell them Stan Lee is 93. He doesn’t scan as a young man, exactly, but frozen in time a couple of decades younger than he is, embodying still the larger-than-life image he crafted for himself in the 1970s — silver hair, tinted shades, caterpillar mustache, jubilant grin, bouncing gait, antiquated Noo Yawk brogue. We envision him spreading his arms wide while describing the magic of superhero fiction, or giving a thumbs up while yelling his trademark non sequitur, Excelsior! He’s pop culture’s perpetually energetic 70-something grandpa, popping in for goofy cameos in movies about the Marvel Comics characters he co-created (well, he’s often just said “created,” but we’ll get to that in a minute) in the 1960s. But even then, he was old enough to be his fans’ father — not a teenage boy-genius reimagining the comics world to suit the tastes of his peers but already a middle-aged man, and one who still looked down a bit on the form he was reinventing. And yet, Lee has no superhuman resistance to the aging process. “My eyesight has gotten terrible and I can’t read comic books anymore,” he recently told Britain’s “Radio Times” in a rare moment of departure from his usual cheerful, product-promoting talking points. “Not only a comic book, but I can’t read the newspaper or a novel or anything,” he said. “I miss reading 100 percent. It’s my biggest miss in the world…It’s awful to feel a thousand years old.” A comic-book Methuselah, Lee is also, to a great degree, the single most significant author of the pop-culture universe in which we all now live. This is a guy who, in a manic burst of imagination a half-century ago, helped bring into being “The Amazing Spider-Man,” “The Avengers,” “The X-Men,” “The Incredible Hulk,” and the dozens of other Marvel titles he so famously and consequentially penned at Marvel Comics in his axial epoch of 1961 to 1972. 
That world-shaking run revolutionized entertainment and the then-dying superhero-comics industry by introducing flawed, multidimensional, and relatably human heroes — many of whom have enjoyed cultural staying power beyond anything in contemporary fiction, to rival the most enduring icons of the movies (an industry they’ve since proceeded to almost entirely remake in their own image). And in revitalizing the comics business, Lee also reinvented its language: His rhythmic, vernacular approach to dialogue transformed superhero storytelling from a litany of bland declarations to a sensational symphony of jittery word-jazz — a language that spoke directly and fluidly to comics readers, enfolding them in a common ecstatic idiom that became the bedrock of what we think of now as “fan culture.” Perhaps most important for today’s Hollywood, he crafted the concept of an intricate, interlinked “shared universe,” in which characters from individually important franchises interact with and affect one another to form an immersive fictional tapestry — a blueprint from which Marvel built its cinematic empire, driving nearly every other studio to feverishly do the same. And which enabled comics to ascend from something like cultural bankruptcy to the coarse-sacred status they enjoy now, as American kitsch myth. All of which should mean there’s never been a better time to be Stan Lee. But watching him over the last year — seeing the way he has to hustle for paid autographs at a convention, watching him announce lackluster new projects, hearing friends and collaborators grudgingly admit his personal failings — it’s hard to avoid the impression that, in what should be his golden period, Lee is actually playing the role of a tragic figure, even a pathetic one. On the one hand, the characters associated with Lee have never been more famous. But as they’ve risen to global prominence, a growing scholarly consensus has concluded that Lee didn’t do everything he said he did. 
Lee’s biggest credit is the perception that he was the creator of the insanely lucrative Marvel characters that populate your local cineplex every few months, but Lee’s role in their creation is, in reality, profoundly ambiguous. Lee and Marvel demonstrably — and near-unforgivably — diminished the vital contributions of the collaborators who worked with him during Marvel’s creative apogee. That is part of what made Lee a hero in the first place, but he’s lived long enough to see that self-mythologizing turn against him. Over the last few decades, the man who saved comics has become — to some comics lovers, at least — a villain.
2. Why Netflix Doesn’t Release Its Ratings. Netflix has become a one-stop shop for its own brand of original content, with popular shows like “House of Cards,” “Orange Is the New Black,” and “BoJack Horseman” dominating media coverage. Yet the streaming service doesn’t release its ratings to the public, often letting the perception that everyone is watching its shows stand in for actual data. For The Atlantic, Jason Mittell examines why Netflix doesn’t release its ratings.
In January, NBC claimed to have discovered a way to estimate Netflix’s viewership, which revealed that NBC’s top shows are more popular than Netflix’s, and thus that the reported death of broadcasting has been overstated. Predictably, Netflix claimed NBC’s revelations were “remarkably inaccurate,” leading to the type of intra-industry feud that generates good headlines. But beneath the scuffle, the more interesting story is what the feud says about television popularity today, Netflix’s unique business model, and why viewers really should care. For starters, the business models for American broadcasters like NBC and streamers like Netflix (or Hulu, or Amazon) are drastically different. The core product for NBC and other commercial broadcasters is viewers — or at least estimates of viewership they can sell to advertisers via the crude common currency of Nielsen ratings. Broadcasters sell audience eyeballs to sponsors, and so they need to know what programs are popular so they can price their ads accordingly. For decades, the broadcast networks have been competing for ratings points to maximize their ad sales, a battle that became increasingly fierce in the 1980s with the rise of cable television. For NBC and its kin, popular programs mean profits, as they’re effective bait to deliver viewers to sponsors. Netflix, meanwhile, doesn’t care about viewers, only subscribers — its revenue comes from maintaining and expanding the ranks of people who find spending $10 a month to be a worthwhile investment. It accomplishes this not by creating individual hits, but by offering a slate of programs with broad appeal and reach, including original series and movies, as well as a back catalog of older television and film offerings. Like other online-streaming companies, its ultimate goal is to provide sufficient material to justify the ongoing subscription cost, persuading customers to buy into the brand itself. 
An individual hit is certainly useful toward that goal, but only insofar as it helps expand the service’s reputation and reach. Neither commercial broadcasters nor online streamers view “television programs” as their products — for both, programming is a means to their ends of selling audiences to advertisers or subscriptions to audiences, respectively. For broadcasters, popularity (at least as measured through the inexact ratings system) equals profit, since every increase in viewership means increased ad revenues with no additional costs to produce and procure programs. For streamers, actual popularity is less important than perceived popularity — Netflix gains the most by having its programming seem more popular than it is, as that helps generate interest from potential subscribers, and helps current subscribers justify their monthly fees for access to the hottest programs. Netflix’s refusal to release actual viewer numbers serves this end, as it can market a series as a “hit” without any reality checks to deflate that perception.
3. Why TV Is Finally Embracing the Realities of Race. If you’ve been paying attention to television recently, you’ll have noticed a lot of buzzed-about shows that tackle race head-on, like Netflix’s “Master of None,” ABC’s “Fresh Off the Boat,” and FOX’s “Empire.” Not that long ago, television was afraid of even touching the topic of race, let alone building shows around diverse casts and non-white leads. Variety’s Maureen Ryan examines why exactly television is embracing the realities of race right now.
The fuel of any comedy or drama is conflict, but for years, TV didn’t bother to tap into the rich vein of material that comes from being black or brown in America — in large part because most TV creators and directors are white men. “I really wanted to do a show about being the black guy in the writers’ room,” says Kenya Barris of the ABC sitcom he created, “Black-ish.” “The things that are said if you’re the only woman or you’re the only black guy or you’re the only heavyset person — the s—t that you hear sometimes is out of this world.”…Barris changed the dynamics to a degree — the comedy is mostly centered on Dre and Rainbow Johnson’s home — but he kept the “black guy in a white workplace” concept. In every episode, Dre (Anthony Anderson) consults his ad-agency colleagues, and in those meetings, he is usually either the only black person or one of two African-Americans present. Barris and his writers have wrung an admirable number of pointed jokes from the way the white and black characters talk past each other and cheerfully treat Dre with a combination of condescension, friendly obliviousness and needy insecurity (everyone wants Dre to be their token black friend). Perhaps it’s not surprising that those regarded as outsiders — men and women of color, white women and gay men and women — are often the ones running or starring in shows that are the most forthright and bold when it comes to matters of race, class, sexuality and gender. Like Barris, these artists have raw material to spare. In its first season, “Jane the Virgin” put an #immigrationreform hashtag on the screen, and this season, it has deftly folded the undocumented immigrant status of Jane’s grandmother, Alba Villanueva, played by Ivonne Coll, into its ongoing narrative. “Empire’s” second-season premiere featured a rally that referred to the Black Lives Matter movement.
“Fresh Off the Boat” jokes about the weirdness of suburban life in Orlando, especially “white-people food.” One of the core characters on “Quantico” is a Muslim woman who wears a hijab. Most of the shows in this current wave of racially aware comedies and dramas don’t spend a single episode “solving” a difficult problem, and certainly “Black-ish’s” Feb. 24 episode, which focuses on police brutality, does not try to be as tidy as Very Special Episodes of decades past. “Black-ish” allows characters to have differences of opinion, says Tracee Ellis Ross, who plays Rainbow Johnson. “Even hurt feelings are had,” she says.
4. Wouldst Thou Like to See The World: On “The Witch.” Robert Eggers’ new horror film “The Witch” has proved a hit amongst many critics, but has polarized audiences, some of whom feel disappointed by the hype while others have found plenty to like. The film itself has produced debate about “what it all means,” with everything from feminist to historical readings. On his blog, Bill Ryan explores the many readings of “The Witch” out there before presenting his own and why it carries “a very specific power” for him.
Early in “The Witch,” the new horror film that is the feature debut of writer/director Robert Eggers, a baby disappears. By this I don’t mean the infant boy’s crib is discovered empty one morning — I mean that while his teenage sister Thomasin (Anya Taylor-Joy) is playing peek-a-boo with him out in the yard that stretches from their family’s secluded farm in Colonial-era New England to the vast woods beyond, between closing her eyes and opening them, the baby, named Samuel, who had been on the ground, on a blanket, on his back, looking up at her, vanishes. Bewildered, Thomasin tells her parents William (Ralph Ineson) and Katherine (Kate Dickie), and they attempt a search (and it’s only the family who is able to search; there is no one else in this part of the country near enough to ask for help), but it comes to nothing. The assumption is that a wolf has taken Samuel, but the viewer of “The Witch” knows different. Shortly after the disappearance, Eggers cuts from the farm location to somewhere else, somewhere perhaps in those woods. We see Samuel, with a knife being brought slowly down over him. Soon after that, we see the figure who’d wielded the knife, an old woman, a witch (Bathsheba Garnett), chunks of bloody flesh in a pile near her, grinding these chunks, and presumably other matter, with a mortar and pestle. If, after that, you’re given to assume that just about anything goes in “The Witch,” you wouldn’t be far wrong. Eggers’s film is the most relentlessly, even cruelly unsettling horror film I’ve seen since Lars von Trier’s “Antichrist” came out in 2009. It’s absolutely mesmerizing in its horror, Satanic in its imagery; it shows an understanding of the genre’s potential that pretty much no other recent horror filmmaker appreciates, in almost every frame. I liked it. I thought it was good. 
“The Witch” has been much talked about, and was released wide, riding a wave of hype that many people were bound to think was undeserved, such is the nature of hype, but that sort of disappointment rarely has much to do with the film one is being disappointed in (such is the nature of disappointment). Still, some kind of backlash, whether stemming from honest objections to the film itself, as some of the criticism doubtless has, or from…something else, was inevitable. Currently, the big complaint people are having with “The Witch” is that it “isn’t scary.” Many of these people, from what I’ve seen, also consider the film “boring” and “so bad.” I have no plans to address those criticisms here, because, in the parlance of our times, “I can’t even,” which in this case is short for “I can’t even fucking live in this world anymore if this is the kind of conversation that’s going to dominate, please God, ease my pain, I’m sorry for swearing.” So that shit can fuck off. It’s boring and so bad, etc. What interests me more is trying to describe my own reading of the film, which, while I know is at least on some level shared by others, is nevertheless not the “interpretation” (and those quotation marks are more precisely used here than is the norm) that most people who’ve seen “The Witch” seem to favor. And please understand, by pointing that out, I’m not trying to toot my own horn — I’m still arguing with myself about this movie. Plus, when I brought this whole thing up on Social Media the other day, I used a rather more absolutist tone than I should have. It is foolish to claim, or to imply, as I did, that what I don’t see in “The Witch” isn’t there at all. In fact, it would not shock me in the least to discover that among the people who disagree with my take on the film is Robert Eggers himself. I haven’t bothered to find that out one way or another, because, and I mean this with all due respect, and I think he would understand, on a fundamental level I just don’t care. 
“The Witch” carries for me a very specific power, and like anyone who finds a piece of art that matters to them, and whose love for it comes from somewhere not shared by everyone, I’m not interested in being told I’m “wrong” by the artist.
5. Mainstream Creep: Keeping Feminist Film Criticism Subversive. Online film criticism has never been more democratic or more widespread, but it remains strikingly homogeneous: most film criticism is still written by white men, presenting a narrow picture of critical opinion. For Hazlitt, Kiva Reardon ponders the topic of feminist film criticism, and how online journals can solve the access issue without bowing to popular influence.
Film is deeply wedded to capitalism. There’s not only the sheer cost of making a movie, but how capital is used to define a film’s success: the box office. While album sales and New York Times best-sellers lists chart the money-makers in their mediums, the discussions aren’t often around how many millions it took to get the cultural product made in relation to how much it raked in. For film, profit is often central to discussions, regardless of the amount. (Take “Tangerine,” the rightly celebrated indie darling of 2015, which became a talking point not only for its trans storyline and the fact that it was shot entirely on iPhones, but also because it only cost $100,000 to make.) Not surprisingly, this close relationship to the bottom line affected the nature of film writing, too. The silent era saw the birth of fan magazines, which were folded into the industry machinery with the creation of the star system. The voyeuristic pleasure of a tabloid tale sells a magazine, which then fuels that star’s ticket-selling power, and on it goes, to this day, like an ever-ravenous ouroboros. As the film medium became more entrenched in popular culture, and as newspapers and other publications began to hire film writers, mainstream film criticism’s journey was one of working from the inner circle to the margins. Instead of lauding the stars and marvelling at new technologies, it involved sparring about taste, and, crucially, butting up against the basic premise that movie magic can be quantified by revenue. Feminist film criticism, by contrast, has different origins, removed from the pervasive power of the film industry: second-wave feminism. Take the grandmother of the feminist film periodical, “Camera Obscura,” which was launched in 1976 out of a discussion group of grad students at University of California at Berkeley and is now printed by Duke University. 
Its first issue states that the concept for the publication “evolved from the recognition of a need for theoretical study of film in [America] from a feminist and socialist perspective.” In its 40-year history, “Camera Obscura” has stayed true to this mandate, publishing pieces by the stalwart stars of graduate seminars like Kaja Silverman, Constance Penley, Mary Ann Doane and Wendy Hui Kyong Chun. Their work fits “Camera Obscura’s” idea that “feminist film analysis recognizes that film is a specific cultural product, and attempts to examine the way in which bourgeois and patriarchal ideology is inscribed in film.” But while “Camera Obscura” was born on the margins, it faces an issue that challenges academia regardless of subject matter: with costly subscriptions and limited hard-copy distribution, “Camera Obscura” only reaches the margins. Criticism such as this often caters to a certain, privileged set. My own first encounter with feminist writing on film was at university. In a course on French New Wave Cinema, taught by the brilliant scholar Alanna Thain, the icons of cinematic history Jean-Luc Godard and François Truffaut were dismantled through a feminist lens; works of female filmmakers, such as Agnès Varda, were positioned as central. At the same time I was introduced to bell hooks and her cornerstone theoretical book on gender, race and media studies, “Reel to Real,” and Tania Modleski’s work on Alfred Hitchcock, which gave me a means to grapple with my love for this director’s often less than female-friendly films. I’m indebted to this time in my studies — and not just to the ideas I was exposed to, but also to my parents, who helped pay for the pricey course packs that held the printed pages that would come to shape me. Had I been born a few years earlier, things might have been different: I missed the heyday of self-publishing zine culture in the 1990s. These publications were radical but also ephemeral.
Zines were not preoccupied, as Western literary culture is, with an occupation of physical space in order to demarcate legacy — zines were made in limited runs, sold at shows, and snail-mailed to friends. While researching specific feminist film zines, I found mention of them in a collection owned by Amy Mariaskin, which is now part of Duke University’s Culture Zine Collection. With no PDFs or scans, I can’t read them unless I road trip to North Carolina. Access once again became an issue. The Internet was supposed to fix all of that, but the process of digitization is also a hierarchical one — one that’s not outside of the influence of dominant power structures that privilege recording the stories of certain races, sexual orientations and genders.
6. “Dances With Wolves”: A Best Picture, 25 Years Later. At the Academy Awards ceremony in 1991, Kevin Costner’s epic Western “Dances With Wolves” took home Best Picture and Best Director. For The New Republic, Will Leitch examines Costner’s magnum opus 25 years later, and how the film has more to say about the 1990s than about the 1860s.
The movie, of course, takes place during the Civil War, but every frame of the film screams early ’90s. This is as much a relic of its time as Milli Vanilli, “2 Legit 2 Quit,” and Operation Desert Storm. The movie means to be a capsule of a specific time in American history, but it ends up being about a specific time in Hollywood — a time that seems just as far away. There are ’90s touchstones everywhere, starting with the opening credits, which show us Orion Pictures, the acclaimed studio that soon thereafter went bankrupt. There are mullets everywhere, even on Union soldiers who more likely would have been a little less Party In The Back. The movie casts Robert Pastorelli, a “breakout” sitcom star from “Murphy Brown,” in a “wacky” comedic role. (No offense to the late Pastorelli, who died of a morphine overdose in 2004, but there may be no more ’90s actor than him, with the possible exception of his “Murphy Brown” co-star Grant Shaud.) And it has the most ’90s artifact of all, Kevin Costner, back when he was the most assured, stable movie star in the world, the next coming of Jimmy Stewart, a guy who could headline both Oliver Stone paranoid nihilist thrillers and two-hankie Whitney Houston romances and not even break a sweat. The movie even lets him try an accent out a couple of times, which is as much a ’90s staple as a Dan Quayle gaffe. But more than anything, “Dances With Wolves” feels like a ’90s movie because that was the last time a big Hollywood studio would ever bankroll a movie like this. This was a big huge expansive ambitious insane project, with Kevin Costner and no one else anybody knew starring in a three-hour (three hours!) tale of a soldier’s personal journey. It was also directed by the star, who is particularly fond of giving himself lingering close-ups. (Costner loves close-ups of his own face the way Ben Affleck loves shots of himself shirtless and working out before cutting to a shot of a helicopter flying over downtown Boston.) 
The idea of a prestige movie this immense being released by a major studio now is absurd. The success of this film led to a few more hyper-long ambitious passion projects (namely, Oliver Stone’s “JFK” and Spike Lee’s “Malcolm X”), but you never saw that sort of thing again. This was the last time a movie star had so much clout that he could do something so patently crazy and expensive, and get a major studio to finance it — and thus the last time it could pay off. (During production, the movie was known as “Kevin’s Gate,” a reference to the famous flop “Heaven’s Gate,” which bankrupted United Artists.) Afterward, Mel Gibson could make his “Passion of the Christ,” and Angelina Jolie could make “By the Sea,” but only through independent studios or, increasingly these days, Netflix or Amazon. The very fact that “Dances With Wolves” exists makes it a relic of its time.
Tweet of the Day:
The crit buzzword of my youth was “subversive.” Then came transgressive. Now it’s immersive. I’ve spent 35 years in a Cole Porter lyric.
— Tom Carson (@TomCarsonWriter) February 23, 2016