Thursday, December 25, 2014

I Am a Tingler!



For as long as I can remember, a light touch to my head or face, or maybe even a certain sound of voice, would give me a quick tingling sensation in my head.  The tingling was very pleasurable, like swarming beads of euphoria dancing around in my cranium, and sometimes around my face, with joyous abandon.  This sensation was very infrequent — it would only occur a handful of times a year — and very fleeting.  I could only enjoy the ecstasy of the sensation for a few seconds before it was gone and I had to get back to whatever I was dealing with before it hit.  Because the sensation was so rare and so ephemeral, I barely thought about it at all, despite the immense pleasure it gave me.  And it never occurred to me to give the sensation a name.

That all changed last year.  It changed while I was surfing the Web.  I don’t remember how or why I came across what I did.  Given the importance of the discovery to me, you’d think that I would remember my path to it in great detail.  But no such luck.  What was my discovery? 

Last year, while exploring the Internet, I stumbled upon the phrase Autonomous Sensory Meridian Response, or ASMR for short.  And I learned that it described that tingling sensation that I had experienced all my life.  My electronic travels led me in short order to a series of YouTube videos by various users, videos whose purpose was to trigger the euphoric feeling in their viewers, usually by the video-maker whispering to the camera.  So, this pleasant tingling sensation, I realized, was something that others felt as well, and it was something that could be induced.  My mind was blown like Louis Armstrong’s trumpet.  

I’m trying to remember more about my reaction to this discovery.  But I recall so little about that day that it might as well have been 100 years ago.  Maybe I don’t remember it very well because I was so overwhelmed: not only did other people feel the sensation, but it was important enough to have a name.  The only other thing that I can remember about that day is my bewilderment about why it had never occurred to me to try to trigger the rapturous tingling in myself.

YouTube’s most popular “ASMRtist,” GentleWhispering, who has over 300,000 subscribers

I checked out some of the ASMR videos, most made by whispering female YouTubers, often role-playing some moment of intimacy or tender attention, such as a haircut.  As I watched these videos, wave after wave of intense tingling swirled around the inside of my skull.  Although each wave lasted only ten seconds at most, and usually less, it would quickly be followed by another.  I felt as though I had discovered some high-inducing opiate. 

And now, I need to make the disclaimer that everyone who experiences ASMR is quick to note: Autonomous Sensory Meridian Response is not sexual.  Despite some of the ASMR video-makers’ affectations of intimacy, this is not a sensation that I experience below the belt.  While others (I have since learned) report feeling it throughout their bodies, my own experience is usually confined to my head and never goes lower than my shoulders.  I wouldn’t be surprised if the brain’s activity during ASMR draws upon the same pleasure centers that are activated by sex, but the sensation itself is not erotic.

While “Autonomous Sensory Meridian Response” is a scientific-sounding name, nothing is known about it scientifically.  Scientists have not found a way to measure ASMR.  Some studies have begun that examine the ASMR experience with an MRI, but last I heard, those studies were still in the preliminary stages and far from complete.  The phrase “Autonomous Sensory Meridian Response” was coined by one Jenn Allen.  She felt that such a phrase was needed because those who would talk about the sensation, lacking a better name, would liken it to a sexual experience and call it things like “brain orgasm.”  Allen wanted to distinguish the sensation from sex, so she invented a name without carnal connotations.   

An ASMR role-play video by the YouTube user VeniVidiVulpes

You might wonder why it’s taken me so long to post about ASMR.  Well, when watching my first batch of tingle-triggering YouTube videos, the intensity of the sensation began wearing off after about half an hour.  I returned to watching videos the next day, but the tingling, when I felt it at all, was barely noticeable.  I had to stop watching the videos for several days before the intensity of the tingling would return.  Because of this, I took a hiatus from ASMR videos for several months.  Only recently have I started watching them again, and only recently did I truly realize just how important this euphoric experience is to me. 

I have also joined an ASMR community on Facebook.  From what I have read there and elsewhere on the Internet, I get the idea that others who experience ASMR find it very easy to trigger in themselves.  If this is true, I seem to be someone in whom the sensation is difficult to induce and who, unfortunately, becomes inured to it relatively quickly.  Maybe I can find a way to change that.  At the moment, the YouTuber who can best induce ASMR in me is the user GentleWhispering (a.k.a. Maria), but other video-makers who specialize in triggering ASMR — or “ASMRtists,” as they call themselves — abound on YouTube.  For those who don’t experience ASMR, these videos will appear boring, perhaps not unlike the way pornography would be boring to a viewer without a sex drive. 

Since I have trouble remembering how I came across the phrase Autonomous Sensory Meridian Response, I’m embedding a video of a This American Life radio broadcast, in which the correspondent, who also experiences ASMR, recounts her discovery of the phrase and the community that has grown up around it.  I’ll also link to a couple other articles on the Web.


Two years ago at this time, I didn’t know the phrase Autonomous Sensory Meridian Response.  Now, I realize that I’m part of a small community that has the ability to induce a euphoric natural high in ourselves.  Because of that, I feel very lucky.  I hope that the scientific studies of ASMR bear fruit before too long, and we can all learn more about this intriguing and pleasure-provoking phenomenon.  

Thursday, December 18, 2014

The Death of Cinema Lives!


For a long time, I’ve been saying that Hollywood has become driven by sequels and properties with pre-sold identities.  Now, in a fascinating and rather scary article called “The Birdcage,” writer Mark Harris tells us that we are entering a time when sequels and franchises are the business.

Also, Jason Bailey of Flavorwire writes that Hollywood movies are now so expensive that they have priced such acclaimed filmmakers as Francis Ford Coppola, David Lynch, and John Waters out of the business.  His article is titled “How the Death of Mid-Budget Cinema Left a Generation of Iconic Filmmakers M.I.A.”  

It seems that the happening place to be for thoughtful, character-driven dramas these days is on television.  Breaking Bad, The Good Wife, Orange Is the New Black, Mad Men, House of Cards, and other critically acclaimed and popular series are sparking the water-cooler conversations the way feature films did back in the 1970s.  Meanwhile, big-screen movies are increasingly reserved for high-budget, high-grossing, lowest-common-denominator projects, with pre-existing identities, targeted mainly at children, adolescents, and very young adults.  Harris and Bailey second my opinion, reinforcing it with the facts on the ground.  If you love movies — and are uneasy about their future — both articles are must-reads.


P.S. Here is another offering from Mark Harris: “The Day the Movies Died.”

Friday, December 12, 2014

A Tortured Rationale for the Iraq War?


This week, the U.S. Senate Select Committee on Intelligence announced its report on the George W. Bush administration’s use of torture.  Here is how Wikipedia describes the document:
The 6,000-page report, which took five years and $40 million to compile, details abusive actions by CIA officials (amounting to systemic mistreatment of detainees) and various shortcomings of the detention project. On December 9, 2014 — eight months after voting to release parts of the report — the SSCI released a 525-page portion that consisted of key findings and an executive summary of the full report. The rest of the report remains classified. 
The report details actions by a number of CIA officials, including torturing prisoners and providing misleading or false information about CIA programs to government officials and the media. It also revealed the existence of previously unknown detainees, the fact that more detainees were subjected to harsher treatment than was previously disclosed, and that more techniques were used than previously disclosed. Finally, it offers conclusions about the detention project, including that enhanced interrogation techniques did not help acquire actionable intelligence or gain cooperation from detainees.
The report has been blasted by its critics as inaccurate, incomplete, egregiously one-sided, and politically partisan.  And it has reignited the recurring dispute in this country about the usefulness of torture and what it says about us as a people whose government is willing to employ it.  For example, Charles Krauthammer points to the absence of any new terrorist attacks since 9/11 as confirmation of torture’s effectiveness (Q.E.D.).  And former Vice President Dick Cheney, a strong supporter of what he called “enhanced interrogation techniques” while he was in office, now appears on news show after news show as a ubiquitous advocate of the practice.

Of course, I’m no expert on the subject, but most of what I’ve read about it says that torture doesn’t work.  A tortured prisoner, according to intelligence experts, will only end up saying whatever his tormentors want him to say.  In the words of Ali Soufan, “a former FBI special agent with considerable experience interrogating al-Qaeda operatives”: “When they are in pain, people will say anything to get the pain to stop. Most of the time, they will lie, make up anything to make you stop hurting them. That means the information you're getting is useless.” 

The Senate’s torture report also dredges up memories of the Iraq War itself.  A number of news stories regarding people like former counter-terrorism czar Richard A. Clarke and Bush’s Secretary of the Treasury Paul O’Neill have said that Bush wanted to invade Iraq at the very beginning of his tenure.  Indeed, when Bush established his administration, he appointed several advocates of “regime change” in Iraq — such as Donald Rumsfeld, Paul Wolfowitz, Douglas Feith, and Richard Perle — to positions of power.  As the news interviews with Clarke and O’Neill suggest, the terrorist attacks of 9/11 provided Bush with an opportunity to implement his long-held ambition to overthrow Saddam Hussein.

Now, I would like to say something that the more misanthropic corner of my mind has suspected ever since the Abu Ghraib prison story broke.  I didn’t go around announcing my suspicion to everyone I knew because it would have made me sound like a far-left nutjob.  This is the suspicion: I’ve long feared that much of Bush and Cheney’s “intelligence” about Saddam Hussein’s weapons of mass destruction in Iraq came from tortured prisoners of the War in Afghanistan.  In other words, Bush and Cheney so badly wanted a rationale to invade Iraq that they had prisoners of war tortured until those detainees parroted the regime-change hawks’ notions of WMD.  We now know that these notions of a nuclear-armed Iraq were erroneous, but I can imagine “enhanced” interrogators tormenting a detainee about WMD and the prisoner saying that Hussein was acquiring them just to make the pain stop.  And this is the main reason, my sardonic side suspects, that Bush and Cheney so vigorously defend torture: it (along with other debunked “evidence”) helped to provide the Iraq War’s false casus belli.

This all sounds extremely cynical, I know, and as much as I disliked the Bush administration, I still gave it the benefit of the doubt that it wouldn’t go quite so far to achieve its dubious ends.  But a news story in the National Journal now gives credence to my formerly far-fetched suspicions.
December 9, 2014 — A Senate investigation into the CIA’s use of brutal interrogation practices released Tuesday suggests that at least one detainee supplied false intelligence contributing to erroneous claims by the Bush administration that former Iraqi dictator Saddam Hussein possessed weapons of mass destruction and was working with al-Qaida. 
A footnote buried in the Senate Intelligence Committee’s 500-page report references a Libyan national known as Ibn al-Shaykh al-Libi who “reported while in ... custody that Iraq was supporting al-Qaida and providing assistance with chemical and biological weapons.” 
Some of that intelligence from al-Libi was used by former Secretary of State Colin Powell during a speech to the United Nations attempting to justify the 2003 invasion of Iraq, according to the footnote, despite al-Libi later recanting the claim. 
That speech by Powell, delivered on Feb. 5, 2003, was a pivotal part of the lead-up to the invasion of Iraq, wherein the secretary discussed Iraq's “deadly weapons program” and the country’s “involvement in terrorism.” 
No weapons of mass destruction were ever discovered in Iraq, nor was Hussein found to have deep, crucial ties to al-Qaida. It is unclear how significant al-Libi's testimony was to the Bush administration's insistence that Hussein possessed them.
To be sure, the story does not say that al-Libi supplied the false intelligence as a direct result of torture, and the article states that it’s not certain to what extent, if any, al-Libi’s “testimony” contributed to Bush’s justification for the invasion of Iraq.

But the fact that this news story comes as close as it does to confirming my worst suspicions is bad enough to get me to write this post.  The known history of the Iraq War — its being based on false intelligence, the hasty and heedless way that Bush rushed into it, its incompetent mismanagement — already makes it an egregious calamity.  If the false intelligence used to justify it turns out to have been the result of torture, this would boost it to a cataclysmic tragedy.

For a long time now — and the new Senate report says that I had a good reason — whenever I listened to Cheney or any other regime-change apologist defending the use of “enhanced interrogation techniques,” this is what I heard between the lines of what they said: Torture is good because tortured detainees tell us what we want to hear.

Sunday, December 7, 2014

007’s Next Nemesis: Stephen Hawking?


I’m not the world’s biggest James Bond 007 fan, but here’s an intriguing idea: the fictional secret agent’s next movie antagonist should be played by Stephen Hawking.  Yes, that Stephen Hawking — the British theoretical physicist and cosmologist, who expressed interest in portraying such a role in a recent interview.

Virtually everyone who knows anything about Dr. Hawking greatly admires his expansive intelligence and his perseverance in the face of a terrible degenerative disease, and we are thankful that a computerized speech synthesizer allows him to continue communicating with the rest of the world.  But the modern imagination is also mindful of how a withering of one’s physical abilities and an over-dependence upon technology can run the risk of sundering us from our own humanity.  Imagining the dark side of someone with Professor Hawking’s genius whose deteriorative illness has alienated him from the rest of the world — as he speaks to it through an impersonal-sounding artificial device — sets the stage for an intriguing story of good against evil.  So, the idea of Stephen Hawking playing a Bond villain makes for a fascinating concept.

However, production has just this month commenced on the 24th James Bond actioner, titled Spectre, for distribution late next year.  If Hawking is ever going to play a Bond bad guy, such an endeavor won’t come about for at least two years.  Many years ago, Hawking was told that he didn’t have long to live, but he beat the odds anyway.  I hope that his health continues to hold out, and that he and the people behind the Bond franchise can make this intriguing concept a reality.  Yes, let’s see Professor Stephen Hawking play the secret agent’s adversary in the 25th James Bond film!

Friday, November 7, 2014

‘I’ve Just Seen a Face’



“I’ve Just Seen a Face” is one of my favorite Beatles songs, and I know I’m not alone in saying that.

However, I’m not sure how many listeners realize just how demanding that song’s rhyme scheme actually is.  And a big reason for this is the difference in how the English word “been” is pronounced on the two sides of the Atlantic.

In America, the word “been” is a homophone of “bin.”  But in most of Britain (and parts of Canada), “been” is a homophone of “bean.”  When pronounced the British way (as Paul McCartney does on the Beatles’ recording), the three-syllable rhyme scheme of “I’ve Just Seen a Face” becomes more apparent:


Had it been another day
I might have looked the other way
And I’d have never been aware
But as it is, I’ll dream of her
Tonight

Not every verse follows this pattern, but in the ones that do, the rhyming absolutely soars!

Still, when American singers cover the song, they usually pronounce “been” as “bin,” and the elaborate rhyming recedes a little.  So, I’m wondering whether most of the song’s listeners are as tuned in to the way that “I’ve Just Seen a Face” rhymes as I am, or whether it’s something they think about at all.


Originally posted on BeatleLinks Fab Forum in 2011.

Saturday, October 25, 2014

A Horror Film for Halloween: ‘The Innocents’


To celebrate Halloween, many filmgoers search for a scary movie to help get themselves in the mood for a time of ghosts, goblins, monsters, children trick-or-treating, and adults partying in costumes they’ll regret wearing in the morning.  So, with Halloween only a week away, I’d like to recommend my favorite horror film for viewing, as something to help folks get into the Halloween spirit (so to speak): The Innocents, a black & white gothic ghost story from 1961.

The Innocents is masterfully helmed by English director Jack Clayton, which is surprising since it was only his second feature, following Room at the Top (1959), the celebrated “kitchen sink” drama credited with helping to launch the British New Wave.  Although The Innocents is based on Henry James’s famous 1898 novella The Turn of the Screw, the movie is more directly drawn from a 1950 stage adaptation by William Archibald, also called The Innocents, from which the film gets its variant (and more descriptive) title.  Archibald also collaborated on the film’s script with Truman Capote (in a rare screenwriting stint), with additional dialogue supplied by John Mortimer.

As I’ve said before, I’m not especially big on horror films, even though the classic Universal monster movies of the 1930s and ’40s spawned my youthful interest in film.  One of my reasons for not liking horror anymore is that the genre is based on fearing things rather than understanding them.  But I become intrigued when the source of the horror is within the protagonist — rather than being something external — because such stories encourage us to examine our concepts of identity and self.  So, although I probably wouldn’t sit still for a horror movie about a main character battling monsters, a film told from the perspective of a werewolf (The Wolf Man [1941, 2010], The Curse of the Werewolf [1961], etc.) or any other “resist the beast” protagonist would more easily grab my attention.


Until very recently, I understood that a horror film had to contain some sort of fantastical or otherworldly element — the dead returning to life, humans transformed into other creatures, beings from other worlds, and so on — to qualify for the genre.  If a film’s story concerned only subject matter that could be found in the lived world — serial killers or the witchfinder generals of history, for example — then it wasn’t a horror movie.  Such a film might be a thriller or a frightening mystery movie, I thought, but an absence of any supernatural theme disqualified it as horror.  However, conventional wisdom now says that some films about deranged humans, such as Alfred Hitchcock’s Psycho (1960), Tobe Hooper’s The Texas Chainsaw Massacre (1974), and the typical slasher flick, or other scary movies with real-world evils, can also be counted as horror.  Moreover, two silent films frequently categorized as horror, The Hunchback of Notre Dame (1923) and The Phantom of the Opera (1925), both starring Lon Chaney, exclusively concern dramatis personae who are ostensibly mortal humans; the title characters’ deformity or disfigurement, something that can happen in the lived world, serves as the films’ only “terror.”  So, I seem to stand corrected.  What does all this have to do with The Innocents?

Because Clayton’s film is based on the well-known Turn of the Screw, it’s not too much of a spoiler to say that the film’s driving force — as in its literary source — is the uncertainty over whether the movie fits at all into my earlier definition of what a horror film can be.  Are the happenings on screen a supernatural story of ghosts that are “real” within the narrative?  Or are the happenings only the product of the protagonist’s repression-fueled imagination?  The Innocents never answers these questions in any unambiguous way.  I think that the film gives slightly more weight (but not too much) to the all-in-the-head side, but if more were done to enhance the real-ghost-story side, this would probably have made The Innocents look like a generic horror movie, which is something Clayton wanted to avoid. (He made the picture in response to the superficiality of Hammer Films’s popular monster movies, one of the most conspicuous worldwide examples of British cinema at the time.)

Deborah Kerr as Miss Giddens
The Innocents’ story concerns Miss Giddens (Deborah Kerr), an unmarried minister’s daughter approaching middle age in Victorian England.  Suddenly needing a livelihood, the inexperienced Miss Giddens accepts a position as governess to the orphaned niece and nephew of the absentee owner of a country estate (Michael Redgrave in a cameo).  Being a man-about-town and world traveler, and now saddled with the children upon the death of his brother, the bachelor uncle makes it clear that he does not want them in his life and that he is never for any reason to be bothered with whatever goes on at the estate.  Miss Giddens travels to the large country mansion, where she meets her grade-school-aged charges, Miles (Martin Stephens) and Flora (Pamela Franklin).  The children are charming, but they also act disturbingly mature — at one point, Miles kisses Miss Giddens goodnight lingeringly on the lips — as well as secretive.  As the days go by, Miss Giddens (but no one else) sees two spectral figures, a man and a woman, appearing and disappearing on the estate.  She learns that the young governess who preceded her was in an abusive relationship with the uncle’s brutish valet, which included indiscreet sex throughout the mansion, and when the valet mysteriously died, the young governess drowned herself.  Without anyone else’s corroboration, Miss Giddens becomes convinced that the figures she sees are the ghosts of the valet and the young governess, who are trying to possess the bodies of Miles and Flora in order to continue their sexual relationship.  Miss Giddens takes it upon herself to exorcize the ghosts from the children by getting the young ones to acknowledge their (implied) past sexual abuse by the valet and former governess. 

If my synopsis makes The Innocents sound like heavy going, it isn’t.  The thorny issues are only subtext that enhances the film’s inchoate sense of dread.  Released in the U.S. by Twentieth Century Fox in 1961, the movie needed to be passed by the bowdlerizing Hollywood Production Code, which had been faltering and liberalizing since the 1950s but which was still in force.  As a result, the sexual abuse is only insinuated, and some viewers contest whether any such abuse is part of the story at all.  But the implication adds to the idea that Miss Giddens is motivated by sexual repression. However, the Production Code’s approval slightly hampers the mood when, at one point, the children are said to be speaking in profanities, and the strongest language that the audience hears is when Miles calls Miss Giddens a “damned hussy.” 


The main reason why I’m recommending The Innocents is that this has been the only film I’ve ever seen to really scare me.  I first saw this movie on television when I was very young — in grade school myself, I think — and very much into monster movies.  Hearing that The Innocents was a horror film, I made an effort (in those pre-VCR days) to see it on TV when it was shown.  Back then, there was a certain formula that I wanted horror movies to follow: monster comes (back) to life; monster causes mayhem; monster is killed at the film’s conclusion (at least, in certain cases, until the next sequel) — a tidy way for a kid to mentally “control” whatever is frightening, don’t you think?  And most vintage horror films did indeed follow the life-mayhem-killed pattern.  One reason why The Innocents unnerved my younger self so much is that it not only failed to follow the pattern but threw the whole pattern into question by problematizing the concept of what exactly a monster was. (Being so young when I first saw The Innocents, I didn’t consciously pick up on the pervasiveness of the film’s sexual themes, which, as I said, were muted to begin with.)


Beginning with the sound of Flora’s a cappella voice singing a mournful song of lost love and death over the Twentieth Century Fox and CinemaScope logos, The Innocents hints at haunting things to come, and it soon delivers.  Moreover, the film contains some of the most unsettling images (by cinematographer Freddie Francis) I’ve ever seen, but they’re not unsettling in any obvious way: even the most brightly lit scenes convey an air of menace.  To this day, the close-up of a bug crawling out of the mouth of a decorative stone cherub stays with me as the cinema image that did the most to send chills up my spine.

Also crucial to the film’s effectiveness are the preternaturally precocious performances of Martin Stephens and Pamela Franklin as the children, the “innocents” of the sardonic title.  With their well-behaved manners but their simultaneous ability to suggest a dark side, the youngsters balance on a knife’s edge between the adorable and the uncanny.  At the film’s beginning, Miles is away at boarding school but is expelled for hurting the other boys and swearing at them.  When Miss Giddens meets him, Miles is impeccably polite and well spoken, the very picture of good behavior — he couldn’t possibly be guilty of the accusation!  But he is evasive when she questions him about his expulsion, and he sometimes turns her queries back on her with apparent adult-like cunning.

Pamela Franklin as Flora, Kerr, and Martin Stephens as Miles

In scenes such as these, we get the idea that Miss Giddens’ visions may be the product of her repressed attraction to the uncle.  (Why else would the film hire a well-known star like Michael Redgrave for such a small role?)  But the characters of Miles and Flora are so schizophrenic, and the young Stephens’ and Franklin’s performances are so disquieting, that we might also think they are indeed possessed by demons.  (How could Flora foretell that Miles would soon be returning from school?)  By maintaining such a precise equilibrium between the psychological and the seemingly supernatural, The Innocents keeps us guessing — in an intriguing and entertaining way — what’s really going on.

And it makes for enthralling viewing for film-lovers in the mood for a horror movie, whether it’s Halloween or not.

Thursday, October 23, 2014

Defacing Renée Zellweger


If you’ve been surfing the Web these last four days, you’ve undoubtedly had at least a glancing encounter with the negative buzz over Renée Zellweger’s new appearance.  You know that she attended Elle magazine’s 21st annual Women in Hollywood Awards on October 20, exhibiting a face that had obviously been retouched by plastic surgery, retouched to the point where it was virtually impossible to recognize the star of such popular movies as Jerry Maguire and Chicago.  The 45-year-old actress doesn’t seem to have had an enormous amount of work done — merely eyelid surgery, Botox injections, and a brow lift, according to one plastic surgeon — but the work that she did have done altered what was most distinctive about her face.  

I haven’t been following the controversy (if that’s the right word for it) very closely, but my takeaway is this: lots of people on social media criticized her new appearance (many of them, inevitably, in snarky and tactless comments), and others responded to these critics, blasting them for “shaming” the practice of cosmetic surgery per se.  Zellweger herself has responded in a way that neither confirms nor denies that any facelifting occurred.  However, the idea that she may not have had plastic surgery brings to mind clichés about bridges in Brooklyn.

The “shaming” brouhaha raises once again the issues of how society views the aging female body in general, and of Hollywood’s adoration of the youthful female body in particular.  Observers “shame” Zellweger’s new appearance, and her defenders shame the shaming.  The defenders ask what a Hollywood female star past the first flush of youth, a star who hasn’t been seen on the screen for quite some time, must do to keep her career afloat.  And the discussion unearths the entangling root issue of women’s objectification by men.  Anne Helen Petersen writes in BuzzFeed:


Hollywood is horrible to aging women, broadly, but it’s particularly horrible for women [such as Zellweger] whose images are rooted in a youthful form of themselves. It’s not just Lindsay Lohan, in other words, who has to struggle with expectations pinned to a much-younger version of herself. That’s why Julia Roberts and Reese Witherspoon keep playing variations on the same roles, praised for their apparent agelessness, and why Demi Moore and Nicole Kidman struggle to reinvigorate their stardom. Indeed, the most flattering form of praise for a longtime female star isn’t “Look at their varied and complex career!” but “[Insert Star Here] Doesn’t Age!” ...
The last time we really “saw” [Zellweger on screen], she was [her youthful] image. Now she’s labeled a distortion of it, even though, in truth, it’s society’s reaction that’s the dark mirror of our expectations — not Zellweger’s still beautiful face. 

However, without contradicting the insightful observations that several writers have made about the predicament of female stars in Hollywood, and what this says about the predicament of women in the larger society, I think this focus on “shaming” cosmetic surgery overlooks the obvious.  I think that Renée Zellweger’s surgically altered face has become such a big story precisely because she looks so radically different.  In particular, the surgery removed her face’s most distinguishing characteristics — especially her hooded eyelids and full cheeks — which gave her on-screen persona its unique personality.  Instead of Zellweger’s familiar attractive-in-a-slightly-quirky-way face, we now see a sculpted face lacking any truly special features.  Yes, as Petersen says, Zellweger is still beautiful, but she’s beautiful in a bland, uninteresting fashion.  She looks more like Daryl Hannah than the Renée Zellweger we’ve all come to know and appreciate.  

I feel as though, by altering her appearance so drastically, Zellweger defaced a national treasure, a treasure that would not have been defaced by time, a treasure now forever gone.  I get the idea that a lot of other moviegoers feel the same way, and that is what touched off the social-media frenzy, much more so than criticisms of plastic surgery in general.  This isn’t a matter of shaming.  It’s a matter of mourning.


Remembrance of things past

Tuesday, October 21, 2014

Dan Savage: Halloween Is America’s Carnival

Halloween costumes from Victoria’s Secret

The fact that I’m posting something about Halloween more than a week before the event says something about the higher profile that the unofficial holiday has acquired over the last decade or two.  From what I can tell, decorations for Halloween — images of jack o’ lanterns, ghosts, black cats, cobwebs, etc. — now adorn places of business almost as far ahead of the holiday as Christmas decorations do before Christmas, another indication of Halloween’s importance.  But despite the ubiquity of spooky iconography in all of the decorations, Halloween’s newfound prominence seems to derive from adults being able to dress in extravagant costumes that they would never wear on any other day.  In particular, Halloween allows women the opportunity to flaunt their sexuality more openly, and the holiday’s ultimate symbol now appears to be the plethora of revealing and/or suggestive — the adjective “slutty” is frequently heard — women’s costumes that are available to buy.  This shift from spooky to sexy strikes me as the main reason for the holiday’s recent greater standing, at least among single adults.



A lot of ink has been spilled over the pros and cons of women’s sexy/slutty Halloween outfits (for the most part, I’m pro), so I won’t add to the opinions already out there.  But Halloween’s new emphasis on sex looks to me like America’s back-door (so to speak) acquisition of something that the country has heretofore lacked but which is observed in much of the rest of the world: Carnival, country-wide celebrations “which mark an overturning of the norms of daily life.”  The best-known example of Carnival is its celebration in Brazil, but most other non-English-speaking countries take time out of the year to overlook the rules of social decorum, such as Fasching in Germany and Fastelavn in Denmark.  During these occasions, adults can show off their inner lives, which often means masquerading as their secret selves and unleashing their libidos (putting the carnal in Carnival).  Here in the States, Mardi Gras, largely confined to the city of New Orleans, was as close as we got.


Dan Savage
I thought of writing something about Halloween as America’s version of Carnival, but once again, I’ve been beaten to the punch.  In an article from 2009, “Happy Heteroween,” sex-advice columnist Dan Savage compares Halloween not only to Carnival but also to gay-pride parades, viewing the holiday as the heterosexual version thereof.  So, I thought that I would use this blogpost to link to Savage’s entertaining account, which also explores some of the controversies that have resulted from Halloween’s shift from the sinister to the sensual.  I hope that you enjoy the read.

Oh, and if I’m not too early — Happy Halloween!


From a 2013 episode of ‘Totally Biased with W. Kamau Bell’

Sunday, October 12, 2014


Whether “No Bra Day” is real or fake, it still seems like a cool idea.

Sunday, October 5, 2014

‘Gilmore Girls’


The dramedy Gilmore Girls (2000-07), which debuted on the late WB Television Network 14 years ago today, was a favorite show of mine.  I watched it loyally from its second season (I got hooked when it became the lead-in to Smallville) until its series finale five years later.  Ten years ago, I even bought the first three seasons on DVD, but I ended up not watching them as often as I thought I would.  I knew that Gilmore Girls developed a devoted following, but I was unprepared for the Internet’s explosion of excitement when the series was made available for streaming on Netflix at the beginning of this month.

Thanks to this welcome cyberspace hullabaloo, I was inspired to watch my Gilmore Girls DVDs for the first time in a long while.  And I was pleasantly reminded of why I like the show so much: its cozy ambience, its superb cast, its witty rapid-fire dialogue, its fully realized characters.  I’m now happy to be reacquainted with the series.  It feels a bit like catching up with a good friend after a long absence. 

Lauren Graham as Lorelai (left)
and Alexis Bledel as Rory
For those who don’t know, Gilmore Girls follows the day-to-day travails and triumphs of Lorelai Gilmore (Lauren Graham), an inn manager in the (fictional) town of Stars Hollow, Connecticut.  Lorelai’s wealthy mother and father, Emily (Kelly Bishop) and Richard (Edward Herrmann), live very well in Hartford’s high society.  Years ago, they groomed Lorelai to be a part of that world, but she was too much of a free spirit to belong in such a conformist, repressive environment.  When she was 16, Lorelai became pregnant, and after the birth of her daughter — also named Lorelai but who goes by the nickname Rory — she ran away from home to get away from her controlling parents.  As the series begins, Rory (Alexis Bledel) is now 16 herself (making Lorelai a very youthful 32) and an excellent student accepted to the prestigious (and very expensive) Chilton private preparatory school.  Unable to afford Chilton on her own, Lorelai goes cap in hand to her parents.  To get their estranged daughter and grandchild back into their lives, they loan Lorelai the money on the condition that she and Rory have dinner with them every Friday.  These dinners become the locus of most of the story-driving tensions from episode to episode. 

Lorelai’s life in the quirky town of Stars Hollow includes her managing job at the Independence Inn and her occasional romantic relationships, especially her will-they-or-won’t-they flirtation with hunky local diner owner Luke Danes (Scott Patterson, who also played the “spongeworthy” guy in that episode of Seinfeld).  Meanwhile, Gilmore Girls also chronicles Rory’s days in the equally repressive and controlling environment of Chilton (and after she graduates, Yale University), particularly her ambivalent association with her frenemy Paris Geller (Liza Weil).  Another story line concerns Rory’s on-again/off-again relationships with town good boy Dean Forrester (Jared Padalecki) and city bad boy Jess Mariano (Milo Ventimiglia). Gilmore Girls’s hook is the wisecracking, pop-culture-referencing rapport between the whimsical Lorelai and the more down-to-earth Rory, a relationship more like sisters than mother-daughter.  When selling the series, creator Amy Sherman-Palladino pitched a show where a mother and her daughter were best friends.

Because Gilmore Girls is decidedly woman-centered (a number of commentaries credit it for passing the Bechdel test with flying colors), I’m not surprised that — as in the days of the show’s first airings — its following is mostly female.  In the years when the show was on the tube, I would surf the Web for Gilmore Girls discussion groups.  In all cases, the cyber communities were overwhelmingly populated by fans with XX chromosomes.  This was fine, but their on-line discussions tended to be limited to the romantic relationships on the show (the favorites among the fans were Lorelai-Luke and Rory-Jess), which didn’t leave much room for discussion of the show’s other merits.  If I had first heard of Gilmore Girls through these discussion boards, I would have thought that the series in question was nothing more than a soap opera.  Also, popular culture at the time (e.g., Saturday Night Live sketches) seemed to harbor the idea that male viewers of Gilmore Girls were mostly gay and equally fixated on Lorelai’s and Rory’s love lives. 

I thought that Gilmore Girls’s reputation as a show with an almost exclusively female and gay-male audience was a huge disservice to such a well-crafted and searingly insightful series.  Despite Gilmore Girls being undeniably estrogen-powered, there was no reason, I thought, why my fellow heterosexual men couldn’t be equally spellbound by the series and rid it of its undeserved reputation as a “mere” chick show, as a show primarily about romantic relationships, tugging heavily and blatantly on the heartstrings.  With its fully fleshed-out characters and its keen, nuanced glimpses into the machinations of social hierarchy, among other elements, Gilmore Girls was so much more than that.  (Statistically, of course, the series must have had many other straight male viewers, but we seemed to be M.I.A. whenever the media discussed the show.)  So, on one of the Gilmore Girls discussion boards, I posted a satirical piece about straight men being excessively stigmatized for liking the show.  I titled what I wrote (drawing upon Gore Vidal’s neologism for heterosexuals) “Grims for Gilmore Girls,” which I’d love to repost, but the discussion board has vanished from the Internet, and I can’t find a copy of the piece anywhere.

The third Gilmore girl:
Kelly Bishop as Emily
Anyway, one of Gilmore Girls’s most perceptive elements was how it portrayed upper-middle-class life as downright Machiavellian: the rich (i.e., the world of Richard and Emily Gilmore) were always manipulating someone in order to acquire or preserve whatever piece of turf was at stake.  One episode that emphasized this was “Tick, Tick, Tick, Boom!,” in which the otherwise amiable Richard cold-bloodedly screws over his business partner in order to save himself from bankruptcy.  In fact, the premise of the entire series (a recalcitrant Lorelai compelled to have weekly dinners with her frosty parents) was an effort by the elder Gilmores to manipulate their estranged daughter back into their lives.  I wanted to write something on this aspect of the show, but someone beat me to it, and did a very nice job.  In New York magazine, Lilly Loofbourow has written an article titled “What Gilmore Girls Gets Right About Money and Love,” in which she insightfully details the characters’ intertwining of economics and affection: “Money is rarely [only] about money in Gilmore Girls; it’s about coercion, it’s about power, but it’s also about creating financial channels for love where other methods have failed.”  The piece is a refreshing change from the ubiquitous fan-authored Internet odes about how Lorelai and Luke should get married, or about how dreamy bad-boy Jess is, and I enthusiastically recommend it.

Michael Winters as Taylor Doose
Thinking of the show also reminded me of the number of times during its production that I bumped into cast member Michael Winters, who played Stars Hollow busybody Taylor Doose, at my local diner and made small talk with him.  The weekend after the first broadcast of the episode “They Shoot Gilmores, Don’t They?,” I ran into him at a local restaurant, and we had a brief conversation about the episode.  He spoke highly of its director, Kenny Ortega.

At the same diner on another day, I also ended up sitting at a table next to Alexis Bledel.  She munched on a salad and never took her eyes off her laptop, which she viewed with an increasingly consternated look on her face.  It appeared as though she were reading bad news on the display.  Had she looked more relaxed, I might have gone up to her and told her how much I enjoyed Gilmore Girls.  But since she didn’t look happy, I gave her space.  

On the other hand, I have no amusing anecdotes about the time I saw cast member Keiko Agena — who played Lane Kim, Rory’s best friend — at my local Fatburger.  Other than I get to say “Fatburger,” which is a funny word.  


The ‘almost’ Sookie: Alex Borstein as Ms. Swan
on ‘MadTV’
However, Gilmore Girls came within a hair’s breadth of making me never want to watch it.  The role of Sookie St. James, Lorelai’s best friend and eventual business partner, was played to bubbly perfection by Melissa McCarthy, who has since gone on to become the closest thing Hollywood has had to a plus-size female movie star in decades, thanks to her scene-stealing success in the big-screen comedy Bridesmaids (2011).  Fans of Gilmore Girls who love McCarthy’s work on the show wonder why it took the rest of the movie-going world so long to catch on to the actress’s charismatic dexterity with both comedy and drama.  But even though any other actress is now unthinkable in the role, McCarthy wasn’t Sherman-Palladino’s first choice to play Sookie.  That distinction goes to Alex Borstein, the comedian who made her name on the comedy-sketch series MadTV (1995-2009), most infamously as the recurring character of Ms. Swan (formerly Ms. Kwan), a stereotypical, English-mutilating Asian manicurist that Borstein played in yellowface.  Borstein’s unironic rehashing of shopworn Asian stereotypes was bad enough, but she and the creators of MadTV went an extra step: when complaints about the character came pouring into the show from irked Asian American viewers, Borstein and the creators went out of their way to deny that Ms. Swan was ever meant to be Asian, making credibility-crushing arguments in the process.  As a result, ever since her debut as Ms. Swan (a character that I never found in the least bit funny), I have detested Borstein’s on-screen presence.  Since MadTV wouldn’t let Borstein out of her contract, she wasn’t able to accept the role of Sookie, so as a consolation, Sherman-Palladino cast her in a handful of supporting roles throughout Gilmore Girls’s run.  I can bear Borstein’s appearances in these small roles, but had she been cast in the major role of Sookie, I don’t think that I could have watched Gilmore Girls at all.  So, I’m glad that the casting of Sookie worked out the way it did.

Bledel and Graham reunited in 2010 for an ‘Entertainment Weekly’ photo shoot

Welcome back, Gilmore Girls.  It’s been too long.  And I’m glad to see that a TV show with crackling dialogue and such well-drawn female characters is still so fondly remembered and can generate such enthusiasm and excitement. 


A haiku that I wrote in response to the most incredulity-inducing aspect of ‘Gilmore Girls’: Every week, Lorelai and Rory would be shown continually chowing down on cheeseburgers, pizza, tater tots, and assorted sweets without exercising, but they would still look svelte in each episode.  I suppose that cathode-tube fantasy is the female equivalent of James Bond bedding virtually every woman he meets.  

Gilmore Girls, Interrupted:  Since so many viewers were disappointed by the show’s seventh, and last, season (the only season without Sherman-Palladino), especially its awkwardly retrofitted series finale, the aether has swarmed with rumors of a Gilmore Girls feature film that would conclude the saga of Lorelai and Rory in a more polished manner, closer to the way envisioned by its creator.  After luxuriating in all of the buzz about Gilmore Girls’s new availability for streaming on Netflix, I hope the Powers That Be in Hollywood copper-boom and such a movie soon sees the light of day. 



Update, December 31, 2014: I’m sorry to report that if a Gilmore Girls movie is ever made, its cast will not include Edward Herrmann, who died today after a struggle with brain cancer.  I — and certainly many others — will always fondly remember his seven seasons as Richard Gilmore, as well as his definitive portrayals of President Franklin Roosevelt in the TV movies Eleanor and Franklin (1976) and its sequel, Eleanor and Franklin: The White House Years (1977).  Rest in peace, Mr. Herrmann.