Wednesday, January 21, 2015

Second Thoughts on Ozu’s ‘Late Spring’

Anyone who follows my blog (stop laughing) knows that, unlike many other cinephiles, I’m not a big fan of Japanese director Yasujirô Ozu: his films strike me as disagreeably reactionary, implicitly yearning for a “return” to a Japanese society based on patriarchy and filial piety.  However, I still watch his films to take in their unusual cinematic “grammar” — close-to-the-ground camera angles, characters speaking almost straight into the lens, abrupt cuts, unpeopled transition shots — and how it might change my positioning as a viewer.  I also remain intrigued by the sense of the transcendental or ethereal in his black & white films, a sense harder to detect, as I said before, in his color movies. 

I started watching Ozu’s films again while reading the book Transcendental Style in Film: Ozu, Bresson, Dreyer by Paul Schrader.  The book discusses how Ozu (in Japan), Robert Bresson (in France), and Carl Theodor Dreyer (primarily in Denmark) use a form of storytelling and imagery that suggests a realm or state of mind where there are no distinctions between humanity and nature: the Transcendent.  I won’t go into all of Schrader’s book here, because my focus is on Ozu, but its argument, by looking at the Japanese director’s films in relation to a concept seen to stand outside culture (the Transcendent), also risks naturalizing his conservative way of looking at the world. 
Yasujirô Ozu (1903-1963)

Indeed, the references to seasons and times of the day or year in so many of the Japanese director’s (English) titles — Late Spring (1949), Early Summer (1951), Early Spring (1956), Tokyo Twilight (1957), Equinox Flower (1958), Good Morning (1959), Late Autumn (1960), The End of Summer (1961), and An Autumn Afternoon (1962) — not only sound almost comically repetitive but, by linking his characters to the inevitable passage of time, also seem to imply that the rightness and desirability of traditional Japanese culture are equally inevitable. 

Recently, I watched, for the first time, Late Spring (晩春, 1949), the first of Ozu’s family dramas (shomingeki) to have a season/time-based title, and my first impression of the film was that the narrative was in the same conservative mold as the director-screenwriter’s other films: Noriko (Setsuko Hara), a single woman in contemporary Japan, is resistant to the idea of getting married, which her extended family pressures her to do, but she finally and reluctantly gives in after her widower father (Chishu Ryu) lies to her, saying that he plans to get married (again) himself.  Furthermore, the father’s speeches to his daughter on the benefits of marriage sound like didactic lectures straight out of an after-school special.  So, Noriko, whose single status calls the patriarchal social order into question, capitulates to an arranged marriage after constant badgering and being told a falsehood, consequently assuming her designated role in society.  And to me, the film seems to portray her capitulation in a positive light.  Not my kind of movie. 
Setsuko Hara as Noriko in ‘Late Spring’

Noriko and her father live alone together in the same house, with her performing the domestic role usually undertaken by a wife, such as picking up her father’s clothes as he drops them on the floor while changing into his yukata.  Noriko is satisfied with her role in the house, and when asked why she doesn’t get married, she says that her father would be “lost” without her.  When I heard this line of dialogue, I said to myself, “That doesn’t sound like a reason; that sounds like an excuse.”  That told me Noriko’s character hadn’t really been developed.  Also, Noriko’s Westernized and sometimes obnoxious friend Aya (Yumeji Tsukioka) married for love but later got divorced, which seemed to me like a plug for traditional arranged marriages. 
Chishu Ryu as Noriko’s father (wearing his yukata)

I watched Late Spring on a DVD put out by the Criterion Collection, a company whose meticulous attention to picture quality and to supplemental materials for canonized classics makes it a cinephile’s best friend.  One of the supplemental materials on the disc was a commentary track by Richard Peña, an associate professor in film studies at Columbia University.  After listening to his remarks on the commentary track, I began to reconsider my first impression of the film.

To begin with, I didn’t take into account that Late Spring is set in its contemporary moment, the immediate aftermath of World War II (four years after its end).  The fact that Noriko has assumed the role of the mother within her father’s house (in which the two live alone) struck me only as an example of filial piety.  But a single woman past the ideal marital age (Noriko is 27) taking on a wife’s duties in her father’s house (outside the bedroom, of course) would have struck a contemporary Japanese audience, according to Peña, as an odd arrangement.  (In recent decades, the ideal age for a Japanese woman to get married has been no later than 25, so unmarried Japanese women past that age are sometimes snidely called “Christmas cakes”: after the 25th [of December/birthday], “no one wants them,” or so the thinking goes.) 

This arrangement also (to more alert viewers) calls attention to the absence of both the mother and any male siblings, which, given the context of 1949, might have led Japanese viewers to infer that they died, directly or indirectly, because of the war.  And the absence of a larger family structure makes Noriko’s reason for not wanting to leave her father’s home more credible.  Noriko’s servitude — at least to her father — is not as positively portrayed as I first thought. 

And then there is Ozu’s singular cinematic style: the camera angles of most shots imitating the POV of someone seated on a tatami, full-frontal (as opposed to angled) close-ups of the characters, shots without any people in them, somewhat disorienting transition shots, etc.  This unusual approach to filmmaking is the reason why so many cineastes (myself included) keep returning to Ozu, and his champions say that this approach encourages the viewer not to take the on-screen proceedings at face value.  After all, if Ozu wanted to make a film merely propagandizing Noriko’s capitulation to marriage, wouldn’t he want to use a film style as “invisible” to the audience as possible?

Noriko and her father at a Noh play
I’m very aware of Ozu’s critical view of the Western influence on Japanese culture, a view he contrasts with his seemingly more positive view of traditional Japanese culture.  Late Spring early on associates the father with the accoutrements of traditional Japan: often clad in a yukata, often seated at a chabudai, enjoying a Noh play, etc.  By contrast, even though we first see her at a traditional tea ceremony and dressed in a kimono (albeit, as Peña points out, clutching a Western purse rather than its traditional Japanese equivalent), Noriko is thereafter associated with Western things: she almost exclusively wears dresses, and her room has occidental décor.

In the scene where the aunt (Haruko Sugimura) has her first serious conversation with Noriko about the younger woman’s prospects for marriage, refusing to let Noriko laugh off the idea (as she had previously), both start out seated at a chabudai.  When the conversation turns serious, a stubborn Noriko gets up from the chabudai and petulantly plops herself down on a Western-style chair.  Because of these character markers, I get the idea that Noriko’s recalcitrant attitude towards marriage is a Western influence.  (Japan was governed at the time of Late Spring by the Allied occupation, which imposed many Western ideas on Japanese society.)  Near the film’s end, when we see Noriko dressed in traditional Japanese wedding garments and formally thanking her father for his care, I’m left with the impression that this heretofore Western-styled woman has capitulated not only to marriage but also to a traditional Japanese social role.  

To me, the film seems to say that Noriko was wrong not to immediately accept her family’s desire for her to get married: in other words, she was in the wrong from the get-go.  And along with the film’s other telltale signs of cultural criticism, I’m left with the message that Western culture is a force that has vitiated a “purer,” more positive Japanese culture. 
Noriko’s father persuades her to get married

But Peña says that Late Spring is more complicated than that.  According to him, Ozu’s singular cinematic style is not the only element that elicits a critical response from the viewer; so does the story’s structure.  After all, why begin the film with a seemingly ideal mate for Noriko (her father’s younger assistant), only to take him out of the running early on?  Why leave gaps in the story that the viewer must fill in?  And most intriguing of all, if Noriko’s capitulation to marriage is portrayed so positively, how come the wedding itself is never shown?  Furthermore, Peña regards the father’s didactic-sounding speech on the positive aspects of marriage as something that the man doesn’t entirely believe himself; the speech’s very didacticism, to Peña, is so out of character that the commentator believes it to be a mere piece of “theatre” between the family members.  Ozu, to Peña and others, is too much of a modernist to be an effective propagandist, so, it follows, propaganda must not be his goal. 

To his champions, Ozu’s films are too thematically rich merely to advocate nostalgia for a Japan that may never have existed.  To them, Ozu does not naturalize Japanese culture or imply that his characters’ social circumstances are unavoidable.  What Ozu sees as inevitable — and this is reflected by his films’ similar titles — is the passing of time, and the ephemera it takes with it, which, of course, is indisputably inevitable.  What concerns Ozu, then, is how his characters occupy their time on Earth and the emotional consequences of the decisions they make.  I see a dichotomy in Ozu’s films between traditional Japan and the West.  But Ozu’s defenders say that the director’s portrayal of Japanese life is too unusual and too complex to invoke a mere dichotomy. 

Therefore, what I’ve seen as positive portrayals of things traditionally Japanese, to Ozu fans, aren’t straightforwardly positive; these are instead wistful, non-prescriptive observations of how the characters inhabit their space, making their pent-up emotions too intricate to be attributed to a single story-driving cause (as is often the case in Hollywood cinema).  For example, I see Noriko’s marriage as endorsed by Ozu, but Peña says in his commentary that Ozu, without judgment, evokes the sense that events in his characters’ lives may very well have turned out differently.  Furthermore, the marriage, according to Peña, is not the point of Late Spring because the film never shows us the wedding.  Like Peña, Ozu’s defenders say that his films, instead, communicate the poignant evanescence of all life — the very Japanese notion of mono no aware.  

Noriko in traditional Japanese bridal garments
Late Spring and Ozu’s subsequent films were commercially successful in Japan, enabling the director to make on average a film a year for the rest of his career, until his death at age 60.  I get the feeling that his audience didn’t go to his films to have their sense of traditional Japan — a sense of tradition impaired by the loss of the war and by the Allied occupation — challenged or questioned.  I think that they went to Ozu’s films to revel in specifically Japanese subjects and to approve of the traditional and conservative choices his characters (usually) make.  So, I can’t help wondering if this audience viewed Ozu’s unusual film style as consciously anti-Western, as an attempt to discover a traditionally Japanese discourse within (to them) the implicitly Western medium of cinema. 

In his book A Certain Tendency of the Hollywood Cinema, 1930-1983, Robert B. Ray says that the most successful American feature films have differing aspects to them that different viewers can key into.  A successful film will have both conservative elements that conservative viewers can appreciate and liberal elements that liberal viewers can enjoy.  I get the idea that Ozu’s conservative-styled stories entertained his tradition-minded audience, while his unorthodox cinematic grammar engaged his less tradition-minded audience. 

But Peña’s commentary about the modernist aspects of Late Spring made me realize how much of an oversimplifying dichotomy my liberal/conservative approach to Ozu’s films has been.  Just because a movie portrays something regarded as traditional or conservative without expressly criticizing it, that doesn’t automatically denote the film’s approval.  And even if Ozu’s intentions were thoroughly conservative (as I understand his politics were), his unusual shooting and editing styles blatantly rupture the “invisibility” of Hollywood film grammar and invite critical readings of his films’ conservative elements.  Finally, the understated performances that Ozu coaxes from his actors, portraying people with weighty feelings they can barely express, endow those characters with emotions more complicated than the usual story-driven Hollywood offerings.  Yes, emotions are complex, and Ozu’s underplayed and taciturn characters give us a better sense of that than most actorly monologues. 

So, I’m willing to give Ozu another chance to impress me.  Maybe I’ll eventually join the multitudes of movie lovers enraptured by his films.  Because so many voices that I respect sing Ozu’s praises so highly, there’s got to be something more going on. 

Sunday, January 18, 2015

Which is the news show, and which is the comedy show?

Thursday, December 25, 2014

I Am a Tingler!

For as long as I can remember, a light touch to my head or face, or maybe even a certain tone of voice, would give me a quick tingling sensation in my head.  The tingling was very pleasurable, like swarming beads of euphoria dancing around in my cranium, and sometimes around my face, with joyous abandon.  This sensation was very infrequent — it would only occur a handful of times a year — and very fleeting.  I could only enjoy the ecstasy of the sensation for a few seconds before it was gone and I had to get back to whatever I was dealing with before it hit.  Because the sensation was so rare and so ephemeral, I barely thought about it at all, despite the immense pleasure it gave me.  And it never occurred to me to give the sensation a name.

That all changed last year.  It changed while I was surfing the Web.  I don’t remember how or why I came across what I did.  Given the importance of the discovery to me, you’d think that I would remember my path to it in great detail.  But no such luck.  What was my discovery? 

Last year, while exploring the Internet, I stumbled upon the phrase Autonomous Sensory Meridian Response, or ASMR for short.  And I learned that it described that tingling sensation that I had experienced all my life.  My electronic travels led me in short order to a series of YouTube videos by various users, videos whose purpose was to trigger the euphoric feeling in their viewers, usually by the video-maker whispering to the camera.  So, this pleasant tingling sensation, I realized, was something that others felt as well, and it was something that could be induced.  My mind was blown like Louis Armstrong’s trumpet.  

I’m trying to remember more about my reaction to this discovery.  But I recall so little about that day that it might as well have been 100 years ago.  Maybe I don’t remember it very well because I was so overwhelmed: not only did other people feel the sensation, but it was important enough to have a name.  The only other thing that I can remember about that day is my bewilderment about why it had never occurred to me to try and trigger the rapturous tingling in myself. 

YouTube’s most popular “ASMRtist,” GentleWhispering, who has over 300,000 subscribers

I checked out some of the ASMR videos, most made by whispering female YouTubers, often role-playing some moment of intimacy or tender attention, such as a haircut.  As I watched these videos, wave after wave of intense tingling swirled around the inside of my skull.  Although each wave lasted only ten seconds at most, and usually less, it would quickly be followed by another.  I felt as though I had discovered some high-inducing opiate. 

And now, I need to make the disclaimer that everyone who experiences ASMR is quick to note: Autonomous Sensory Meridian Response is not sexual.  Despite some of the ASMR video-makers’ affectations of intimacy, this is not a sensation that I experience below the belt.  While others (I have since learned) report feeling it throughout their bodies, my own experience is usually confined to my head and never goes lower than my shoulders.  I wouldn’t be surprised if the brain’s activity during ASMR draws upon the same pleasure centers that are activated by sex, but the sensation itself is not erotic. 

While “Autonomous Sensory Meridian Response” is a scientific-sounding name, nothing is known about it scientifically.  Scientists have not found a way to measure ASMR.  Some studies have begun that examine the ASMR experience with an MRI, but last I heard, those studies were still in the preliminary stages and far from complete.  The phrase “Autonomous Sensory Meridian Response” was coined by one Jenn Allen.  She felt that such a phrase was needed because those who would talk about the sensation, lacking a better name, would liken it to a sexual experience and call it things like “brain orgasm.”  Allen wanted to distinguish the sensation from sex, so she invented a name without carnal connotations.   

An ASMR role-play video by the YouTube user VeniVidiVulpes

You might wonder why it’s taken me so long to post about ASMR.  Well, when watching my first batch of tingle-triggering YouTube videos, the intensity of the sensation began wearing off after about half an hour.  I returned to watching videos the next day, but the tingling, when I felt it at all, was barely noticeable.  I had to stop watching the videos for several days before the intensity of the tingling would return.  Because of this, I took a hiatus from ASMR videos for several months.  Only recently have I started watching them again, and only recently did I truly realize just how important this euphoric experience is to me. 

I have also joined an ASMR community on Facebook.  From what I have read there and elsewhere on the Internet, I get the idea that others who experience ASMR find it very easy to trigger in themselves.  If this is true, I seem to be someone in whom the sensation is difficult to induce and who, unfortunately, becomes inured to it relatively quickly.  Maybe I can find a way to change that.  At the moment, the YouTuber who can best induce ASMR in me is the user GentleWhispering (a.k.a. Maria), but other video-makers who specialize in triggering ASMR — or “ASMRtists,” as they call themselves — abound on YouTube.  For those who don’t experience ASMR, these videos will appear boring, perhaps not unlike the way pornography would be boring to a viewer without a sex drive. 

Since I have trouble remembering how I came across the phrase Autonomous Sensory Meridian Response, I’m embedding a video of a This American Life radio broadcast, in which the correspondent, who also experiences ASMR, recounts her discovery of the phrase and the community that has grown up around it.  I’ll also link to a couple of other articles on the Web.

Two years ago at this time, I didn’t know the phrase Autonomous Sensory Meridian Response.  Now, I realize that I’m part of a small community that has the ability to induce a euphoric natural high in ourselves.  Because of that, I feel very lucky.  I hope that the scientific studies of ASMR bear fruit before too long, and we can all learn more about this intriguing and pleasure-provoking phenomenon.  

Thursday, December 18, 2014

The Death of Cinema Lives!

For a long time, I’ve been saying that Hollywood has become driven by sequels and properties with pre-sold identities.  Now, writer Mark Harris is telling us that we are entering a time when sequels and franchises are the business, in a fascinating and rather scary article called “The Birdcage.”

Also, Jason Bailey of Flavorwire writes that Hollywood movies are now so expensive that they have priced such acclaimed filmmakers as Francis Ford Coppola, David Lynch, and John Waters out of the business.  His article is titled “How the Death of Mid-Budget Cinema Left a Generation of Iconic Filmmakers M.I.A.”  

It seems that the happening place for thoughtful, character-driven dramas these days is television.  Breaking Bad, The Good Wife, Orange Is the New Black, Mad Men, House of Cards, and other critically acclaimed and popular series are sparking the water-cooler conversations, just as feature films did back in the 1970s.  Meanwhile, big-screen movies are increasingly reserved for high-budget, high-grossing, lowest-common-denominator projects, with pre-existing identities, targeted mainly at children, adolescents, and very young adults.  Harris and Bailey second my opinion, reinforcing it with the facts on the ground.  If you love movies — and are uneasy about their future — both articles are must-reads.

Friday, December 12, 2014

A Tortured Rationale for the Iraq War?

This week, the U.S. Senate Select Committee on Intelligence released its report on the George W. Bush administration’s use of torture.  Here is how Wikipedia describes the document:
The 6,000-page report, which took five years and $40 million to compile, details abusive actions by CIA officials (amounting to systemic mistreatment of detainees) and various shortcomings of the detention project. On December 9, 2014 — eight months after voting to release parts of the report — the SSCI released a 525-page portion that consisted of key findings and an executive summary of the full report. The rest of the report remains classified. 
The report details actions by a number of CIA officials, including torturing prisoners and providing misleading or false information about CIA programs to government officials and the media. It also revealed the existence of previously unknown detainees, the fact that more detainees were subjected to harsher treatment than was previously disclosed, and that more techniques were used than previously disclosed. Finally, it offers conclusions about the detention project, including that enhanced interrogation techniques did not help acquire actionable intelligence or gain cooperation from detainees.
The report has been blasted by its critics as inaccurate, incomplete, egregiously one-sided, and politically partisan.  And it has reignited the recurring dispute in this country about the usefulness of torture and what it says about us as a people whose government is willing to employ it.  For example, Charles Krauthammer points to the absence of any new terrorist attacks since 9/11 as confirmation of torture’s effectiveness (Q.E.D.).  And former Vice President Dick Cheney, a strong supporter of what he called “enhanced interrogation techniques” while he was in office, now appears on news show after news show as a ubiquitous advocate of the practice. 

Of course, I’m no expert on the subject, but most of what I’ve read about it says that torture doesn’t work.  A tortured prisoner, according to intelligence experts, will only end up saying whatever his tormentors want him to say.  In the words of Ali Soufan, “a former FBI special agent with considerable experience interrogating al-Qaeda operatives”: “When they are in pain, people will say anything to get the pain to stop. Most of the time, they will lie, make up anything to make you stop hurting them. That means the information you're getting is useless.” 

The Senate’s torture report also dredges up memories of the Iraq War itself.  A number of news stories regarding people like former counter-terrorism czar Richard A. Clarke and Bush’s Secretary of the Treasury Paul O’Neill have said that Bush wanted to invade Iraq at the very beginning of his tenure.  Indeed, when Bush established his administration, he appointed several advocates of “regime change” in Iraq — such as Donald Rumsfeld, Paul Wolfowitz, Douglas Feith, and Richard Perle — to positions of power.  As the news interviews with Clarke and O’Neill suggest, the terrorist attacks of 9/11 provided Bush with an opportunity to implement his long-held ambition to overthrow Saddam Hussein.

Now, I would like to say something that the more misanthropic corner of my mind has suspected ever since the Abu Ghraib prison story broke.  I didn’t go around announcing my suspicion to everyone I knew because it would have made me sound like a far-left nutjob.  This is the suspicion: I’ve long feared that much of Bush and Cheney’s “intelligence” about Saddam Hussein’s weapons of mass destruction in Iraq came from tortured prisoners of the War in Afghanistan.  In other words, Bush and Cheney so badly wanted a rationale to invade Iraq that they had prisoners of war tortured until those detainees parroted the regime-change hawks’ notions of WMD.  We now know that these notions of a nuclear-armed Iraq were erroneous, but I can imagine “enhanced” interrogators tormenting a detainee about WMD and the prisoner saying that Hussein was acquiring them just to make the pain stop.  And this is the main reason, my sardonic side suspects, that Bush and Cheney so vigorously defend torture: it (along with other debunked “evidence”) helped to provide the Iraq War’s false casus belli.   

This all sounds extremely cynical, I know, and as much as I disliked the Bush administration, I still gave it the benefit of the doubt that it wouldn’t go quite so far to achieve its dubious ends.  But a news story in the National Journal now gives credence to my formerly far-fetched suspicions.
December 9, 2014 — A Senate investigation into the CIA’s use of brutal interrogation practices released Tuesday suggests that at least one detainee supplied false intelligence contributing to erroneous claims by the Bush administration that former Iraqi dictator Saddam Hussein possessed weapons of mass destruction and was working with al-Qaida. 
A footnote buried in the Senate Intelligence Committee’s 500-page report references a Libyan national known as Ibn al-Shaykh al-Libi who “reported while in ... custody that Iraq was supporting al-Qaida and providing assistance with chemical and biological weapons.” 
Some of that intelligence from al-Libi was used by former Secretary of State Colin Powell during a speech to the United Nations attempting to justify the 2003 invasion of Iraq, according to the footnote, despite al-Libi later recanting the claim. 
That speech by Powell, delivered on Feb. 5, 2003, was a pivotal part of the lead-up to the invasion of Iraq, wherein the secretary discussed Iraq's “deadly weapons program” and the country’s “involvement in terrorism.” 
No weapons of mass destruction were ever discovered in Iraq, nor was Hussein found to have deep, crucial ties to al-Qaida. It is unclear how significant al-Libi's testimony was to the Bush administration's insistence that Hussein possessed them.
To be sure, the story does not say that al-Libi supplied the false intelligence as a direct result of torture, and the article states that it’s not certain to what extent, if any, al-Libi’s “testimony” contributed to Bush’s justification for the invasion of Iraq.

But the fact that this news story comes as close as it does to confirming my worst suspicions is bad enough to get me to write this post.  The known history of the Iraq War — its being based on false intelligence, the hasty and heedless way that Bush rushed into it, its incompetent mismanagement — already makes it an egregious calamity.  If the false intelligence used to justify it turns out to have been the result of torture, this would boost it to a cataclysmic tragedy. 

For a long time now — and the new Senate report says that I had good reason — whenever I listened to Cheney or any other regime-change apologist defending the use of “enhanced interrogation techniques,” this is what I heard between the lines of what they said: Torture is good because tortured detainees tell us what we want to hear. 

Sunday, December 7, 2014

007’s Next Nemesis: Stephen Hawking?

I’m not the world’s biggest James Bond 007 fan, but here’s an intriguing idea: the fictional secret agent’s next movie antagonist should be played by Stephen Hawking.  Yes, that Stephen Hawking — the British theoretical physicist and cosmologist, who expressed interest in portraying such a role in a recent interview. 

Virtually everyone who knows anything about Dr. Hawking greatly admires his expansive intelligence and his perseverance in the face of a terrible degenerative disease, and we are thankful that a computerized speech synthesizer allows him to continue communicating with the rest of the world.  But the modern imagination is also mindful of how a withering of one’s physical abilities and an over-dependence upon technology can run the risk of sundering us from our own humanity.  Imagining the dark side of someone with Professor Hawking’s genius whose deteriorative illness has alienated him from the rest of the world — as he speaks to it through an impersonal-sounding artificial device — sets the stage for an intriguing story of good against evil.  So, the idea of Stephen Hawking playing a Bond villain makes for a fascinating concept.

However, production has just this month commenced on the 24th James Bond actioner, titled Spectre, for distribution late next year.  If Hawking is ever going to play a Bond bad guy, such an endeavor won’t come about for at least two years.  Many years ago, Hawking was told that he didn’t have long to live, but he beat the odds anyway.  I hope that his health continues to hold out, and that he and the people behind the Bond franchise can make this intriguing concept a reality.  Yes, let’s see Professor Stephen Hawking play the secret agent’s adversary in the 25th James Bond film!

Friday, November 7, 2014

‘I’ve Just Seen a Face’

“I’ve Just Seen a Face” is one of my favorite Beatles songs, and I know I’m not alone in saying that.

However, I’m not sure how many listeners realize just how demanding that song’s rhyme scheme actually is.  And a big reason for this is the way the English word “been” is pronounced on the two sides of the Atlantic.

In America, the word “been” is homophonous with “bin.”  But in most of Britain (and parts of Canada), “been” is homophonous with “bean.”  When pronounced the British way (as Paul McCartney does on the Beatles’ recording), “I’ve Just Seen a Face’s” three-syllable rhyme scheme becomes more apparent:

Had it been another day
I might have looked the other way
And I’d have never been aware
But as it is, I’ll dream of her

Not every verse follows this pattern, but when one does, the rhyming absolutely soars!

Still, when American singers cover the song, they usually pronounce “been” as “bin,” and the elaborate rhyming recedes a little.  So, I’m wondering whether most of the song’s listeners are as tuned in to the way that “I’ve Just Seen a Face” rhymes as I am.  Or whether it’s something they think about at all.

Originally posted on BeatleLinks Fab Forum in 2011.