Monday, January 31, 2011

Film Noir, Part One

‘The Big Combo’ (1955)

FILM NOIR — like pornography, no one can tell you what it is, but they know it when they see it.

Actually, that’s not entirely true. Problem is, if you ask 100 different film buffs for a definition, you’ll probably get 100 different answers. Here’s how Merriam-Webster takes a stab at it: “a type of crime film featuring cynical malevolent characters in a sleazy setting and an ominous atmosphere that is conveyed by shadowy photography and foreboding background music; also: a film of this type.” And that’s a pretty good place to start.

I, along with several other aficionados, would throw in a few other qualifiers. To me, a “true” film noir is a specifically Hollywood crime drama made sometime between the mid-1940s and the late 1950s, characterized by cinematography with shadowy low-key lighting and an urban-inflected story with the strong potential to unnerve its audience.

Some refer to film noir as a “genre.” I don’t. To me, the hallmarks of noir aren’t as radically different from those of other kinds of crime films as, say, the western’s are from those of other kinds of action movies. I consider film noir to be a subgenre of the crime film, along with the whodunnit, the gangster picture, and the suspense thriller. (Originally a French term, film noir — “black film” — would be rendered in the plural as films noirs to be grammatically correct. But since the phrase has been adopted into English, I will Americanize the plural as “film noirs.”)

I learned about film noir early on in my readings about cinema. By the time I finished studying cinema history in college, I either had seen or was familiar with many of the canonized “classics” of this kind of film: Double Indemnity (1944), The Big Sleep (1946), Out of the Past (1947), In a Lonely Place (1950), and some others. However, my interest in film history lay elsewhere (the European and Japanese art films of the 1960s), so the subgenre largely escaped any extra scrutiny from me.

When I attended grad school, where feminism was a particularly strong influence in my female-run film-history department, my main impression of film noir was as a misogynistic kind of movie. I understood that the subgenre primarily and actively sought to return newly empowered American women of the post-World War II years — those employed in traditionally male jobs on the homefront while most able-bodied men were fighting overseas — to their customary roles as homemakers. After all, many antagonists in film noirs are manipulative women — femmes fatales — hoping to ensnare a fall guy (usually the protagonist) in some evil scheme: cunning witches whom the audience would want defeated and put in their place.

Although I think it’s important to recognize this aspect of film noir, it’s far from the be-all and end-all of the subgenre, and a portrayal of a wicked femme fatale need not contaminate the entire movie with irredeemable misogyny and thus render it totally without value. Besides, this perspective overlooks those entries with strong, sympathetic women at their centers: Laura (1944), Phantom Lady (1944), Mildred Pierce (1945), and others. Also, some recent feminist criticism of film noir reinterprets the femme fatale as a figure of women’s resistance against a male-dominated world.


‘Phantom Lady’ (1944)


What makes a movie a film noir and not something else? First, it’s important to remember that the term “film noir” was applied by French critics to certain American movies only in retrospect. At the time these films were released, popular audiences didn’t think of them as a special category of cinema. No U.S. viewer in the 1940s would say, “Hey, they’re showing a film noir at the Bijou tonight!” Only in the years after their first release were these movies seen, primarily by French critics, to share common and distinctive traits. And the vast majority of literature about film noir was written in the years after the subgenre’s “classic” period.

Why limit myself to movies made during the 1940s and ’50s when considering film noir? After all, lots of pictures released in the years since have been hailed by critics — or have proclaimed themselves — as neo-noir: films that update the hallmarks of classic noir with more contemporary settings and attitudes, films such as Point Blank (1967), Body Heat (1981), and Bound (1996). Also, why limit myself to Hollywood movies? There are entire books published on “British film noir,” “Japanese film noir,” and the like. So, the term must have meaning for films beyond American shores. Right?

Well, if anyone wants to defend neo-noir as a direct continuation of the subgenre or stick up for noirish films made overseas, I won’t stop them. In fact, such defenders would probably have some intriguing observations to make. But such cinematic candidates for the mantle of film noir will be missing what is, to me, the most salient characteristic of the subgenre: its constraint and shaping by the Hollywood Production Code. (The parameters of film noir that I am about to lay out are my own, but many of the following observations about the subgenre, and about Hollywood cinema in general, have been made elsewhere by others.)

Film noir emerged at a time, Hollywood’s studio era, when most kinds of American movies sought to give their viewers a reassuring view of the world around them. This meant portraying the world of the characters — a surrogate for the world of the viewers — as a wholly contained, apprehensible, and ultimately comforting environment in which (ideally) to live one’s dreams. Westerns would give their audiences parables of a hero rising to the occasion to vanquish a villain. Musicals would tell stories of romantic couples falling in love and living happily and effortlessly ever after. Hollywood films would usually affirm by the final fade-out that the characters’ world — and thus the audience’s world — was ultimately an understandable and uncomplicated place to live. This mission was abetted by the Production Code, which would often snuff out any story element that would challenge such an unfissured worldview, especially when challenges took the form of sex, violence, or questioning institutional authority.

This imposed sense of optimism and prudishness was, film-wise, uniquely American. Adult-oriented British productions, for example, could pepper their dialogue with damns and hells — words usually unutterable on Hollywood soundtracks — and their costume pictures could include low-cut necklines that showed off more décolletage than the Production Code would have allowed. Of course, a film like Gustav Machaty’s Czech production Ecstasy (1933) — with its scenes of a skinny-dipping, streaking, orgasming Hedy Lamarr — would have been precluded once the Code took earnest effect in 1934.

As a result, some of the most singular or peculiar moments in Production Code-era Hollywood movies may be seen as inadvertent (or sometimes not so inadvertent) indications of the messy, splintered human realm outside the theatre, a view of the outside world that the Code tried to benignly suppress. For example, overly elaborate musical numbers — perhaps with lots of dancing bodies wearing ornate costumes — could be seen as a kind of overblown compensation, as a stealthy but showy stand-in for the sex act that the Code would never allow on the Hollywood screen.


Barbara Stanwyck and Fred MacMurray in ‘Double Indemnity’ (1944)


Perhaps nothing challenged Hollywood’s and the Production Code’s sanitized view of studio-era society as much as the horrors and carnage of the Second World War. The evils that Hitler’s Germany unleashed around the globe, and the blood and devastation it took to defeat the Nazi forces, impressed upon once-insular America that the human world was never as tidy as Hollywood’s world. James M. Cain’s novel Double Indemnity — with its libidinously motivated, murderous lead characters — had originally been serialized back in 1936 but was thought unfilmable by Hollywood because of the story’s cynical view of human appetites. However, news of the war and its repercussions reminded America day after day that such a pitiless vision of humankind had a basis in reality. The implication of grim and chaotic truths beyond the screen had crept into studio-era Hollywood movies before, but in the decade or so after the Second World War, they took on more ominous and threatening insinuations.

To me, rupturing the Production Code was film noir’s most important function, even if rupturing anything was never the films’ intention. But in so many noirs, the pat answers and happy endings that had provided story closure in so many earlier movies no longer seemed to be enough. And as the postwar world insinuated its way into Hollywood’s crime dramas, the movie conventions that upheld such a simple worldview appeared to break down. The new uncertainties manifested themselves as odd intrusions into the world of the stories. In The Big Sleep, one of the murders is never accounted for. In Murder, My Sweet (1944), the detective hero ends the story incapacitated. In Mildred Pierce, the title character ends up going back to her good-for-nothing husband with no assurance that her previous ordeals won’t repeat themselves. And most extravagant of all, in Kiss Me Deadly (1955), the world seems to be annihilated in a nuclear explosion. The inability of the Production Code to tie up such defiant loose ends is what gives these narrative disruptions their troubling resonance.

When the Code began to lose its hold on and authority over studio content in the mid-1950s, ultimately being scrapped in 1968 in favor of today’s age-tiered rating system, more direct portrayals of social anxieties could be shown on the screen, lessening the portent of their indirect references elsewhere in the narrative. The sexually charged homicidal couple in Double Indemnity smolder with erotic energy in their scenes together precisely because the film can’t acknowledge the intensity of their libidinal urges in an unambiguous way. The sexually charged homicidal couple in Body Heat catch fire in their soft-core sex scenes, but when clothed, their moments together lack steam — it having been unambiguously blown off in the bedroom. While contemporary Hollywood movies contain their own kinds of ruptures, the insinuation of a nasty, messy outside world is no longer one of the crime drama’s; recent crime films will usually just confront that nasty world head-on. While I think that greater permissiveness on the post-Code Hollywood screen is, overall, a positive development that can produce more liberating films, the removal of the Production Code also removed the need for the kind of indirection and insinuation that can so fascinatingly fissure a crime story.


Gene Tierney and Clifton Webb in ‘Laura’ (1944)

So, it’s difficult for me to think of a film like Bound, for all of its obvious indebtedness to the subgenre, as an example of film noir — most conspicuously because the story concerns two lesbians, and the Production Code would never countenance any overt portrayal of homosexuality. In my mind, a “true” film noir would try to skirt or conceal the very existence of gay love, and in doing so, complicate the narrative in an idiosyncratic way. One example is Otto Preminger’s Laura, which goes to great lengths to code its murderer, Waldo Lydecker (Clifton Webb), as tacitly gay, only to give him a disbelief-inducing heterosexual motive for his crime. Such intriguing ruptures are, to me, the essence of noir.

That is why I regard as film noirs only those Hollywood (or American-targeted) movies made during the postwar studio era. In my next installment on the subgenre (whenever I get around to writing it), I’ll catalogue other characteristics that I look for. Till then, I need to go down some very mean streets.

Saturday, January 29, 2011

Partying Like It’s the Clinton Years!


I guess I’m getting old. I’m just not as quick to anger about politics these days. It used to be that whenever I would read a newspaper’s opinion piece that I disagreed with, I would dash to the keyboard and peck out my angry response as a letter to the editor or as an Internet forum post. These days, I’m more apt to just roll my eyes and say, “There they go again.” (And the fact that I’m saying something so Reaganesque shows just how benthic my standards of expression have sunk.)

However, now and again, I’ll read something that I’ll just toss away at the time — but then, it will stay with me. And stay. And stay. And stay. And finally, I’ll decide to write something about it. But by then, a good chunk of time will have elapsed, enough to dull anyone’s interest in my response to what I read. But I guess that’s why God created the Internet. Now, I can blather on, to my heart’s content, about yesterday’s news, blissfully ignorant of the fact that no one is interested in what I have to say. Isn’t technology wonderful?

’Round about New Year’s Eve, I was reading the San Fernando Valley’s Daily News, and my eye lit upon an op-ed titled “For New Year’s — Partying Like It’s 1999” by Tom Purcell. I found the premise of the piece agreeable: how much better off the United States was at the end of the last decade of the twentieth century than it is at the end of the first decade of the twenty-first. The article begins with a checklist of things that made the twelvemonth of the Prince song so yearn-worthy:

• The unemployment rate was 4.2 percent in 1999.

• Dot-com stocks were still creating lots of paper millionaires.

• The U.S. deficit for that year was $1 billion — that’s right, “billion” with a “b,” a far cry from the $1 trillion to $2 trillion it is nowadays. [Actually, according to U.S. News and World Report, the U.S. budget had no deficit but instead ran a surplus of $125 billion. So, things were even better than Purcell says.]

• Things were going so well, we had to make up crises, such as Y2K, the Millennium Bug!

In casting doubt on the validity of the Y2K crisis, Purcell goes on to say: “Y2K, wrote The Wall Street Journal, was, essentially, a giant hoax.” But as far as I can tell, it wasn’t the reporters of the WSJ who pronounced Y2K a hoax; it was the Journal’s conservative opinion columnist James Taranto. That tidbit told me that Purcell was writing from a perspective that I probably wouldn’t agree with. The gist of Purcell’s piece was that things were going so well during the ’90s — and he itemizes a few of those things — that he would rather celebrate the bygone year of a decade ago than celebrate the upcoming one.

To my eye — and if you know me at all, you would probably guess that I would say this — Purcell never mentions the most significant difference between those idyllic times of the fin de siècle ’90s and today’s more worrisome atmosphere: the disastrous presidency of George W. Bush. In fact, Purcell never mentions Bush once in his entire article.

Purcell is a humor columnist (as well as a scribe for The Colbert Report), so his piece was obviously never intended as a penetrating look into how we went from the prosperous ’90s to the floundering present. But to me, Purcell’s piece was a paean to the prosperity and good times of the Bill Clinton years — without giving Clinton any of the credit. In fact, Purcell mentions Clinton only once: in the context of him being an accomplice to the Y2K “hoax.”

If Purcell had decided to look more deeply into the causes of what made the ’90s so good, maybe he would have attributed a heaping helping of those good years to Clinton’s policies (yes, some of which were shaped in grudging concert with a Republican Congress), such as his tax rate on the wealthiest 2% of Americans. And if the humorist wanted to look more deeply into why things went so bad in the ’00s, maybe he would have incriminated Bush’s countermanding of those policies. But I suspect that such a conclusion would conflict with the writer’s right-of-center views.

It seems to me as though Purcell is bending over backwards to ignore Clinton’s decisive role as the shaper of the ’90s prosperity and Bush’s role as the prime mover of the undoing of that prosperity. I get the idea that Purcell believes that the ’90s would have been good years regardless of who was in the White House, and that the following years would have been bad irrespective of who inherited the Oval Office. If that is indeed Purcell’s perspective, I disagree with it. I think that much of America’s flourishing in the ’90s was due to the hard work of the Clinton Administration and its partners. And I think that most of this country’s reversal of fortune was due to the Bush Administration’s reversal of Clinton’s work.

There is, however, something that Purcell and I would agree on: times were so good during the ’90s that they allowed room for the fabrication of faux crises. But if I were to name an exemplar of such a counterfeit calamity, I wouldn’t name the Y2K frenzy, which, for all I know, may have been a legitimate looming threat that was successfully averted. I would point to something else: the decadent luxury that our government took in impeaching a president over his sex life.

Friday, January 28, 2011

The ‘Challenger’ Disaster

Today is the twenty-fifth anniversary of the Space Shuttle Challenger disaster. One conclusion reached about why the disaster took place is that the Challenger’s launch was rushed. Why would it have been rushed? That day — January 28, 1986 — was supposed to have been the day that President Ronald Reagan gave that year’s State of the Union address to Congress. And Reagan was well-known for manipulating spectacles to bolster his presidential image. I get the idea that the Challenger launch was rushed so that Reagan could use the event as American can-do dressing for his speech that night. (After the disaster, Reagan postponed his address until the following week.) Whenever I’m reminded of the Challenger, I wonder if those six astronauts — plus one civilian high-school teacher — may have given their lives because Reagan wanted to lend a speech of his some patriotic pizzazz.

Wednesday, January 26, 2011