Monday, December 31, 2007
Happy New Year!
Have a good one, everybody.
Sunday, December 16, 2007
Michael Medved’s Persecution Complex
I don’t usually read USA Today, but a copy happened to be available for me to peruse yesterday. One of the articles in the issue was conservative media critic Michael Medved’s panning of the cable-TV special The Lost Tomb of Jesus. Now, I haven’t seen the special (I don’t have cable TV), and I don’t fault Medved for the particulars of his criticism of the show’s content: he makes good points about the special’s allegedly shoddy use of archeology and history. However, sprinkled into his review are allusions to this tiresome victim mentality that some religious believers adhere to.
The first is that the “entertainment industry” as a whole is hostile towards religion.
“The entertainment industry in particular has developed a curious strategy of attempting to connect with America’s massive, ardent Christian audience with pulpy projects that openly undercut key tenets of Christianity.” —Medved
This is a gross overstatement. American entertainment media have never stopped making projects that are religiously themed. True, they haven’t been quite as high-profile as they used to be when the country was more religiously homogeneous, but from Jesus of Nazareth (1977) on through other Bible-based TV mini-series, Hollywood has never lost sight of the religious audience.
If religious themes largely faded from the movie screens in the 1970s, ’80s, and ’90s, it was because the film industry was too busy courting a youth market, and religious topics weren’t particularly high up on that questioning demographic’s tastes. Now that Mel Gibson has proven the economic viability of religious stories on the big screen, the major studios have rushed to create their own “faith-based” specialty divisions.
However, the most dismaying aspect of Medved’s writing is his insinuation that religious believers deserve the same consideration as racial minorities, effectively equating criticism of religious tenets with racial discrimination.
“...Some offended Christian callers to my radio show expressed the conviction that this project represented one more component in the aggressive secularist counterattack on traditional religious beliefs, along with best-selling books such as The God Delusion and Letter to a Christian Nation, and tireless efforts to remove crosses and Ten Commandments monuments from public places.” —Medved
I have a difficult time seeing how religious believers being “offended” by criticism of religion can be a cause for alarm. Questioning religious beliefs should be a good thing. To the secularist, questioning religion can be a refreshing reminder that free speech exists in this country. To the believer, the questioning ought to provoke thought and, in doing so, affirm one’s beliefs.
Should we be respectful of people’s religious beliefs? Of course, to an extent. However, when those same people use their religious beliefs in an irreligious way — for example, in order to deny gay people equal protection under the law — then their religion stops being merely a respectable conviction and starts being something worthy of criticism. But in cases like these, it is not religion per se that is being criticized, but the idea that religious belief should trump the Constitution.
Also, Medved falls back on a very disingenuous argument: religious beliefs, however unreasonable, should be catered to because religious believers make up a majority of this country.
“According to a Newsweek poll for its ‘From Jesus to Christ’ issue of March 2005 ... 78% of Americans say they believe ‘Jesus rose from the dead.’ The Lost Tomb of Jesus largely ignores this prevailing faith....” —Medved
There is something to be said for a majority opinion (for example, Gore winning the popular vote in the 2000 presidential election), but majorities aren’t the whole story. Democracies are also about protecting minority rights. And just because a majority of people believe something, especially something unprovable, that doesn’t necessarily mean that the belief is a good one to have.
Finally, Medved says that popular media are quick to challenge religious beliefs but slow to acknowledge historical findings that support the Bible:
“Dore Gold’s excellent new book, The Fight for Jerusalem: Radical Islam, the West, and the Future of the Holy City, is also full of dramatic proof that blows away prevailing scholarly skepticism about the historicity of King David's reign. But these richly documented discoveries never received the intensive coverage offered to feebly supported speculations that ‘disprove’ the Bible.
“Another fascinating book, The Exodus Case: New Discoveries Confirm the Historical Exodus by Swedish scientist Lennart Moller, provides gripping evidence about deliverance from Egypt and the real location of Mount Sinai. It also has inspired an ambitious feature film now in production. Considering general media instincts to slam rather than support biblical narratives, it will probably struggle to impact pop culture.” —Medved
Medved discusses these titles (among others), titles that support such plausible happenings as the reign of King David and the exodus of the Jews from Egypt, in an article whose over-arching criticism is of the proposed debunking of Jesus’ resurrection, something less plausible. To coin a phrase, this is comparing apples to oranges. I guarantee you that if a peer-reviewed book came out called Historical and Scientific Proof of Christ’s Resurrection, the media would be all over it in an instant.
—————————————
By the way, when I described the Exodus as “plausible,” I was referring to the plausibility of a massive number of people going from one place to another. I wasn’t defending the plausibility of the exact narrative of the Exodus in the Bible, such as the Red Sea parting or Moses’ people wandering in the desert for 40 years.
Friday, December 14, 2007
President Gore Impeached over 9/11
Here is something else that I posted on LiberalsOnly.com. It’s the only fiction that I’ve written in quite some time:
April 6, 2002
PRESIDENT GORE REMOVED FROM OFFICE
9/11 Cited as Chief Cause
WASHINGTON — Yesterday, Al Gore became the first U.S. president to be removed from the White House via impeachment. By a vote of 67-33, a two-thirds majority of senators in the Republican-led Senate agreed to force the 43rd president from office. The Gore administration’s inability to prevent the devastating terrorist attacks of last September 11 was cited as the main reason.
Speaking at a joint press conference following the momentous vote, Senate Majority Leader Trent Lott (R-Miss.) and House Majority Leader Tom DeLay (R-Texas) argued for the correctness of the vote’s outcome. “The American people deserve a president who doesn’t let terrorist attacks take place on American soil,” Sen. Lott said. “The vote was a victory for the American people.”
Last month, the House of Representatives pushed for articles of impeachment against President Gore after a bipartisan committee headed by former vice president Dan Quayle, a Republican, and former Georgia governor Zell Miller, a Democrat, concluded that Mr. Gore “demonstrated gross negligence and egregious incompetence for allowing these monstrous attacks to happen.”
Last week, a contingent of House Republicans presented its case for conviction on the Senate floor under the watchful eye of U.S. Supreme Court Chief Justice William H. Rehnquist. Led by Rep. Bob Barr (R-Ga.), the Republican contingent made impassioned arguments in favor of Mr. Gore’s ouster against a backdrop of photographs of 9/11 victims and the smoldering World Trade Center.
The recent tide of public opinion against Mr. Gore is in marked contrast to the mood of the country immediately after the attacks. In those tense days, the President made several televised speeches about the need for the country to “be brave” and bring al-Qaeda, the Afghanistan-based terrorist organization responsible for hatching the attacks, to justice. Poll after poll of U.S. citizens showed public approval of Mr. Gore’s handling of events in near-astronomical territory.
However, a growing chorus of critical voices — including such high-profile figures as Rush Limbaugh, The Wall Street Journal, and various personalities from Fox News — began to blame Gore for the attacks. “If you want a good reason why Al Gore has no business being president,” Mr. Limbaugh famously said last year on his nationally syndicated radio show, “I can give you 3,000 of them who died on 9/11.”
“Gore won the election in 2000 with barely 51 percent of the vote,” Mr. Limbaugh continued. “There’s no way you could call that a mandate.” Mr. Limbaugh echoed popular conservative sentiment that Mr. Gore’s narrow victory in the 2000 presidential election was undeserved.
These mounting critical voices influenced Congress to initiate the Quayle-Miller committee, which led to the impeachment.
While the President and his closest advisors have remained mum about the impeachment in recent days, several White House aides were stunned over the results of the lopsided vote. “There’s absolutely no reason for the vote to have turned out this way,” said a low-ranking aide who spoke on condition of anonymity. “The President did everything he could to prevent any terrorist attacks. He formed that task force on terrorism immediately after getting that intelligence memo titled ‘Bin Laden Determined to Attack Inside the U.S.’ No one could have foreseen al-Qaeda using airplanes as missiles; that was totally outside the box.”
On the floor of the Senate last week, Rep. Barr argued that Mr. Gore should have anticipated such a plot.
The White House aide went on to list the Gore administration’s accomplishments since September 11, including its military invasion of Afghanistan shortly afterwards. “What about the President’s sky-high standing after the attacks? He put a human face on America’s suffering and on our determination to get the people who did this. He successfully ousted the Taliban from the government in Afghanistan in October and captured Osama bin Laden a month after that. Doesn’t that count for anything?”
Current Vice President Joseph I. Lieberman is set to take the oath of office at noon tomorrow. In the name of “national unity,” some Republican lawmakers are asking Mr. Lieberman to name a GOP vice president once in office. Mr. Lieberman has remained non-committal on the issue.
At his press conference, Sen. Lott added his voice for the country to move on. “If America learned anything from 9/11, it’s that you can’t be too lax on national security. Here’s one thing you can take to the bank: If there had been a Republican in the White House on 9/11,” Sen. Lott continued, “those attacks never would have happened.”
Saturday, December 1, 2007
Sushi Dan
Sushi Dan inhabits the suite that used to house Sushi on Tap, which had to have been one of the strangest restaurants I’d ever set foot inside. At Sushi on Tap, while you were eating your raw fish and noodles, the waiters and the sushi chefs would leap out onto the restaurant floor and go into some song-and-dance numbers. The now-defunct L.A. New Times rated Sushi on Tap “Best Sushi Restaurant with a Wacky Floor Show.” And so it was. However, the difficulty of making the sushi came to consume the chefs’ time, and the floor show soon degenerated into a few local hoofers doing some unshowy improvised moves. Next thing I knew, Sushi on Tap had danced away.
Today, Sushi Dan is after a younger, hipper crowd. Rock music is always blaring from the sound system, and the über-moody lighting looks like it was designed by Helen Keller. As I settled into my table, it was hard to keep my mind on the subject of food because the restaurant boasted two exceptionally attractive waitresses. One looked lovely in a stately, statuesque way with fine chiseled features; the other had a very appealing elfin air about her. Entranced by these two dazzling beauties at the same time, I could easily understand why some societies condoned polygamy, and it was all I could do to keep my stolen glances at them as furtive as possible.
Another amusing occurrence was that everyone whom the waitresses seated at the small table next to mine kept asking to move. First, a couple on a date sat there for a few minutes, then asked to be moved to the sushi bar. Next, a couple of guys were seated there, and it wasn’t long before they asked to be moved, too. Then, two women were seated there, and the next thing I knew... Well, you get the idea. I didn’t see why sitting next to me was so objectionable. Maybe I need to shower more often.
Anyway, one of the specials that night was a lobster Langoustine in dynamite sauce over California roll. It was relatively expensive but too intriguing to pass up. The taste was pleasant enough, but I don’t think that it really succeeded as a sushi dish.
Sushi Dan is becoming known for its loud flavors, marked by a specialty in oil-rich tempura rolls splattered with sweet sauce. The result is usually a heavy taste that goes against the more subtle flavors that are sushi’s best-loved trait.
The lobster and dynamite sauce had a tongue-pleasing tang, but I don't think that they went especially well with the California roll, whose flavors were smothered by the heaviness of the lobster. Also, it was all I could do to keep the generously sized cuts of sushi and lobster in my barely closed mouth, so the logistics of chewing my food distracted me from fully enjoying the flavor of my meal. And I kept thinking that their lobster Langoustine would probably taste — and be eaten — better between two slices of bread as a lobster roll.
Since the price of the roll didn’t leave me completely destitute yet, I also ordered another Sushi Dan special: a blue crab handroll. Again, the taste was agreeable, but the flavor of the crab was so heavy that there wasn’t much room for the rice and nori to work as complements. Sushi Nozawa also serves a crab-salad handroll, but its flavor is much lighter, so the finely balanced combination of crab, rice, and seaweed works well together as a whole dish. But that’s another restaurant.
And Sushi Dan capped off my dining experience by forgetting to charge me for my glass of wine. You just can't beat free food.
While it might not serve the best sushi I’ve ever eaten, Sushi Dan does provide some intriguingly off-beat offerings and uncommon flavors. Head chef Danny Kim is a stand-up guy, and his wife Michelle makes a charming hostess. Not being a sushi perfectionist, I’ll probably go back again soon to sample some more of the dishes and to soak up the atmosphere. Oh, yes, and to scope out the waitresses.
Thursday, November 8, 2007
The Beatles
I’ve been a fan of the Beatles ever since the age of six, and while most of the music I listened to in my years growing up has lost its punch, I’ve never lost my ear for Beatles songs. Even the tunes I don’t particularly like are still very listenable and hold my interest. As a result, Beatles songs — whether performed by the four guys themselves or covered by other artists — are like mother’s milk to me, and just a few familiar notes from one of their songs can give me a solace that few other things can.
Recently, my admiration of the Beatles and their enormous musical legacy intensified — so much so that I joined an on-line Beatles discussion board, “BeatleLinks Fab Forum,” to discuss with other fans what we got out of their music. One of the site’s members started a thread titled “I Always Knew the Beatles Were Different,” in which he put his appreciation for the band in rather abstract terms. This was all perfectly fine as far as what the original poster wanted to say, but I wished to continue his thread by putting my appreciation in a language of things more tangible. Written before Phil Spector’s arrest on murder charges, my post read:
I’ve been thinking about what makes the Beatles special as well. What made them more than a flash in the pan? What makes them so enduring? I don’t think that anyone can come up with a completely satisfactory answer. After all, they didn’t come out of nowhere. If it hadn’t been for the precedents of Elvis Presley and Phil Spector, I doubt that the Beatles as we knew them would have come about.
Also, the contributions of George Martin to their records can’t be overestimated. A number of people still laugh up their sleeves at Decca Records passing on the group after they made their demos for the company in 1962. But if the Beatles had signed with Decca, what guarantee is there that the guys wouldn’t have remained a novelty group, one which was not allowed by the label to grow? It’s a bit of kismet that the Beatles and Martin were mutually open to influence and growth — even when such growth seemed to threaten making their records a “hit.”
If I were to name only three elements that set the Beatles apart from the competition, I would limit myself to these:
1. Merging rhythm & blues with Spector’s “wall of sound”: I think that the most distinctive part of the Beatles’ early sound was in adapting the elaborate tinkerings characteristic of Phil Spector’s “wall of sound” to the relatively spare instrumentation traditionally associated with rhythm & blues. When you try to reproduce Spector-like fripperies with a four-piece rock & roll band, what you get is music like the Beatles played. The addition of relatively complicated three-part harmonies on most of their songs — harmonies more accessible than Elvis’ near-baritone and not as sugary as the Beach Boys’ — enhances the instrumentation. To me, this is the secret of the “Liverpool sound.”
2. Catchy hooks and chord changes: For better or worse, virtually all of pop music depends on these. If you don’t capture the listener with a riveting series of notes and chords, chances are that the song will not be a hit. And the Beatles became the masters of the hooks. In their early songs, the most noticeable of the compelling chord changes are the ending chord, a sixth chord, on “She Loves You” and the change to a minor chord in “I Want to Hold Your Hand.” (Ringo’s beat is helpful, but not essential.) In a documentary, Roger McGuinn says that the Beatles’ chord changes are closer to those found in folk music, rather than in the rock & roll of the era. [And in a recent documentary on the making of the Beatles’ film A Hard Day’s Night, cast member Kenneth Haigh says that their melodic style was specific to the folk music of northern England.] Of course, as the Beatles progressed, their music became less dependent on this pop-music idiom, but without such a foundation, would their later experiments have found an audience?
3. An unwillingness to stand still: The Beatles could have spent their entire careers rewriting “Please Please Me” over and over again, but they didn’t. Even in their earlier, poppier songs there was a certain restlessness, whether it was in writing a love song in the third person (“She Loves You”) or incorporating electric-guitar feedback into a pop song for the first time (“I Feel Fine”). By constantly seeking new sounds, and new subject matter for their lyrics, the Beatles expanded rock & roll from a relatively primitive kind of music for juveniles to something much larger. Of course, there were outside influences, such as Bob Dylan, the Beach Boys, and the Byrds, but the Beatles chose not to ignore them. By the time the Beatles broke up in 1970, rock & roll was almost unrecognizable from its origins.
And while the Beatles were a particularly special case, something needs to be said for the great musical ferment of the 1960s. The Beatles may have been at the head of the pack, but they were in very good company — not only Dylan, the Byrds, and the Beach Boys, but also the Rolling Stones, Simon & Garfunkel, Jimi Hendrix, Motown, and more. The reasons for this artistic flowering were many and varied, and I won't go into them now. But I wonder if we will ever see such a creative explosion again.
Monday, November 5, 2007
Re: ‘Intaxication’
Economist Paul Krugman
On August 15, 2007, Phil posted a meditation on taxes, specifically how someone in his hometown of Emporia, Kansas, a Mr. John Peterson, was proposing a tax on the wealthy to bring in some needed revenues. “Intaxication” was Phil’s rejoinder. I submitted a response to his piece, but it never appeared on Phil’s blog. Here is what I wrote:
No one likes to pay taxes. On the other hand, Americans (most of us) like Social Security, Medicare, the Federal Deposit Insurance Corporation, and other federal programs. Americans also like to have access to police officers or fire fighters when they’re needed. And Americans like to drive on paved roads free of potholes and have traffic lights that don’t cause any car crashes. These everyday social fixtures that we take for granted must be paid for. What pays for them? I’ll give you a hint: it’s a five-letter word and a favorite bugbear of the Republican Party.
Much of what makes our society livable — not everything, but much — depends on taxes. However, you wouldn’t get that idea from reading “Intaxication.” Instead, Phil likens taxes to “theft,” which he defines as “taking something that belongs to someone else and using it for another’s purpose, agenda, or pleasure.” But aren’t the social fixtures that I mention above a benefit to all of us, not just “another”?
The tenor of the post — especially its references to “theft,” “coveting,” and “mugging” — implies that the best kind of taxation is no taxation at all, or at the very least, a minimal form of taxation that would make April 15 just another day on the calendar. How would this kind of taxation pay for Social Security, policemen, and pothole patches? The post doesn’t say. I get the idea that anti-tax activists think that if you eliminate all taxes, these social necessities will pay for themselves. Such a utopian belief isn’t far from the one that says functioning Jeffersonian democracies will flourish throughout the Middle East just by toppling Saddam Hussein in Iraq.
Now, I’m not writing to defend John Peterson’s particular tax idea. Being no economist, I have no idea whether it would be suitable or not for the good people of Emporia (although, as a rule, I find flat taxes regressive). I’m also not saying that some taxes shouldn’t be scrutinized or adjusted or even abolished when appropriate. But some taxes must be — must exist — for the sake of society. The Great Depression (caused by the stock-market crash of 1929) showed us the need for a social safety net: Social Security, the FDIC, unemployment relief, etc. To continue on our present course of taxation could very well force us to slash these social services severely. And I don’t see any acknowledgement of that in “Intaxication.”
Phil’s post pointed me to a 2003 article, “The Tax-Cut Con” by Paul Krugman.
There. I’ve probably lost at least two-thirds of any conservative readers out there just by mentioning Krugman’s name. He is a liberal economist and New York Times columnist who alerts his readers to the flaws (to use a value-neutral word) in the Bush economy. He is also the bête noire of many on the political right, and a small conservative cottage industry has sprung up to refute everything he says as a “lie.” Krugman’s name carries as little credibility with many conservatives as the name of right-wing think-tanker Thomas Sowell (whose writings Phil recommends and whose syndicated column I sometimes read in the Daily News) does with me. Still, I thought that I would compare some of Krugman’s comments in “The Tax-Cut Con” — and a few of my own — with some of Phil’s statements in “Intaxication”:
“We’re taxed on every hand. The federal government taxes us; the states tax us, municipalities tax us. … They even tax us when we die.” —Phil
Phil’s statement is obviously a reference to what is officially known as the estate tax or inheritance tax, but which has lately become more commonly known as the “death tax.” However, Krugman explains how the estate tax was cut by the Bush administration and why the phrase “death tax” (reportedly coined by GOP strategist Frank Luntz) is a misnomer: “[Bush’s] 2001 tax cut phases out the inheritance tax, which is overwhelmingly a tax on the very wealthy: in 1999, only 2 percent of estates paid any tax, and half the tax was paid by only 3,300 estates worth more than $5 million. The 2003 tax act sharply cuts taxes on dividend income, another boon to the very well off. By the time the Bush tax cuts have taken full effect, people with really high incomes will face their lowest average tax rate since the Hoover administration.”
Further on, Krugman says: “As demonstrated, the estate tax is a tax on the very, very well off. Yet advocates of repeal began portraying it as a terrible burden on the little guy. They renamed it the ‘death tax’ and put out reports decrying its impact on struggling farmers and businessmen — reports that never provided real-world examples because actual cases of family farms or small businesses broken up to pay estate taxes are almost impossible to find. This campaign succeeded in creating a public perception that the estate tax falls broadly on the population.”
“By the time they’re done with us they’ve taken 50% or more of the money we’ve earned by the sweat of our brows.” —Phil
Krugman: “Very few Americans pay as much as 50 percent of their income in taxes; on average, families near the middle of the income distribution pay only about half that percentage in federal, state and local taxes combined.”
“Prior to the ratification of the Sixteenth Amendment, government controlled about seven percent of our gross domestic product and employed about four percent of the total workforce.” —Phil
The Sixteenth Amendment was ratified in 1913. That year, World War I hadn’t even begun, and the G.D.P. of America was a fraction of what it became after World War II. Today, the United States is a world superpower, with a powerhouse economy, and needs a government (which is financed by our taxes) to match. Is looking at the tax needs of the U.S. today alongside the U.S. of 1913 a fair comparison?
“By 1995, our federal bureaucracy employed nearly twenty million Americans and controlled one-third of gross domestic product. That’s as staggering as it is sobering.” —Phil
Krugman: “To assess trends in the overall level of taxes and to compare taxation across countries, economists usually look first at the ratio of taxes to gross domestic product, the total value of output produced in the country. In the United States, all taxes — federal, state and local — reached a peak of 29.6 percent of G.D.P. in 2000. That number was, however, swollen by taxes on capital gains during the stock-market bubble. By 2002, the tax take was down to 26.3 percent of G.D.P., and all indications are that it will be lower [in 2003 and 2004]. This is a low number compared with almost every other advanced country. In 1999, Canada collected 38.2 percent of G.D.P. in taxes, France collected 45.8 percent and Sweden, 52.2 percent.
“Still, aren't taxes much higher than they used to be? Not if we're looking back over the past 30 years. As a share of G.D.P., federal taxes are currently at their lowest point since the Eisenhower administration. State and local taxes rose substantially between 1960 and the early 1970s, but have been roughly stable since then. Aside from the capital-gains taxes paid during the bubble years, the share of income Americans pay in taxes has been flat since Richard Nixon was president.”
“In about a year and a half the page of history will turn and in all likelihood there will be a complete shift of political power in America. I can only imagine how much more of our property and wealth will be redistributed when that day dawns.” —Phil
The Bush tax cuts have disproportionately benefited the wealthiest Americans. If more government revenues are needed — and I think they are — the best way to get them would be to roll back these particular tax breaks, sparing most folks in the middle class or below. But which tax breaks (if any) will be rolled back probably depends more on the influence of wealthy campaign contributors — to whom both Democrats and Republicans are beholden — than on the candidates themselves. Krugman says: “Wealthy campaign contributors have a lot to gain from lower taxes, and since they aren't very likely to depend on Medicare, Social Security or Medicaid, they won't suffer if the [social safety net] gets starved.” Krugman continues: “And more broadly, the tax-cut crusade [since the Reagan years] will make it very hard for any future politicians to raise taxes.”
“I suppose we might ask our founding fathers, who funded a revolution without levying taxes. In fact, wasn’t one of the principle reasons we shook off the tyranny of George III the matter of taxation without representation[?] How did we ever manage to secure our liberty without so much as a hint of an Internal Revenue Service?” —Phil
Is comparing the needs of America today with those of the America of 1776 any more appropriate than comparing them to those of the America of 1913? And is the above sentence suggesting that we Americans (other than the civilian residents of Washington, D.C.) have taxation without representation today? If so, then what do you call all of those people who work on Capitol Hill? Aren’t they called “representatives”?
“Maybe if [government leaders were to read the books of supply-side economists, those leaders would] see that tax reductions actually produce increased government revenues ….” —Phil
Krugman: “[President Bill] Clinton did exactly the opposite of what supply-side economics said you should do: he raised the marginal rate on high-income taxpayers. In 1989, the top 1 percent of families paid, on average, only 28.9 percent of their income in federal taxes; by 1995, that share was up to 36.1 percent. Conservatives confidently awaited a disaster — but it failed to materialize. In fact, the economy grew at a reasonable pace through Clinton's first term, while the deficit and the unemployment rate went steadily down. And then the news got even better: unemployment fell to its lowest level in decades without causing inflation, while productivity growth accelerated to rates not seen since the 1960’s. And the budget deficit turned into an impressive surplus.”
Elsewhere in his article, Krugman says: “It is not that the [economic] professionals refuse to consider supply-side ideas; rather, they have looked at them and found them wanting. A conspicuous example came [in 2003] when the Congressional Budget Office tried to evaluate the growth effects of the Bush administration’s proposed tax cuts. The budget office’s new head, Douglas Holtz-Eakin, is a conservative economist who was handpicked for his job by the administration. But his conclusion was that unless the revenue losses from [Bush’s] proposed tax cuts were offset by spending cuts, the resulting deficits would be a drag on growth, quite likely to outweigh any supply-side effects.”
“…It is our magnificent cultural and economic system which gives [the wealthy] the opportunity to be rich. They should give back.” — John Peterson, quoted by Phil
Throughout “Intaxication,” I don’t see any persuasive argument that refutes this statement. Someone whose perspective approximates Mr. Peterson’s was the industrialist and philanthropist Andrew Carnegie, who said, “Wealth is not chiefly the product of the individual, but largely the joint product of the community.”
I also dispute “Intaxication’s” contention that “the power to tax is becoming increasingly the power to control.” Isn’t the disproportionate accumulation of wealth by private corporations, and their allocation of the products we depend on, a power to control as well, a power less accountable to our political processes than taxation is? (Not that I have anything against private corporations as such.) For example, don’t the skyrocketing prices of gasoline (those prices exclusive of any gas taxes) influence our driving habits? However, this is too big an issue to go into here.
Just a few weeks ago, Minneapolis experienced the tragic collapse of a bridge on Interstate 35 that took eleven lives at last count. The devastation was a heart-rending reminder of how our very lives can be dependent on an infrastructure funded by tax dollars — and tax skeptics will correctly remind us that those revenues need to be spent wisely. Was the bridge catastrophe directly traceable to the Bush administration’s regressive tax policies, which leave municipalities around the country needing to stretch the monies they spend on infrastructure? Probably not. But I believe that the tragedy does foretell the multiple mini-Katrinas that could take place around this country if Bush’s tax policies stay on the books.
Suffice it to say that the United States suffers from a peculiar kind of schizophrenia: Americans love, enjoy, and take for granted many of the benefits that come from taxation. However, Americans hate paying the taxes themselves — so much so that even a few middle-income folks passionately denounce taxes that impact only the very rich.
Everybody wants to go to Heaven. Nobody wants to die.
Sunday, November 4, 2007
‘Letter to a Christian Nation: Counter Point’
Religion has become such a political weapon in this country that any high-profile discussion of a religious subject usually devolves into either a shouting match or some kind of cynical debate where scoring points over one’s opponent becomes more important than expressing an idea. Among religious debaters, popular notions include how “contemptuous” those who don’t hold fundamentalist beliefs are to those who do, and how official institutions (academia, the media, etc.) are actively “hostile” to religion. These are notions intended to kill any peer-reviewed challenge to biblical authority, and they are not conducive to an intelligent discussion. Unfortunately, they are among the positions that some of Harris’ responders fall back on.
One of Harris’ responders is the evangelical Protestant minister R.C. Metcalf with his book Letter to a Christian Nation: Counter Point. I bought the book from Amazon.com because I wanted to read an enlightened rejoinder to Harris’ original book, hopefully written in a logical, respectful tone. But while reading Metcalf’s book — which is outwardly respectful — I found myself disagreeing with most of his arguments. After I put the book down, I felt compelled to write a review on Amazon, but the website ended up not posting it. So, I thought that I would post it here instead:
I didn’t buy R.C. Metcalf’s Letter to a Christian Nation: Counter Point with the expectation of writing a review. I bought it because Sam Harris’ books The End of Faith and Letter to a Christian Nation make powerful arguments in favor of atheism, arguments that I would like to see addressed.
As an agnostic, I am about as close to an atheist as you can get while still believing in something called God. In short, I don’t believe in a personal God, and I make no claims for an afterlife. So, why do I believe in God at all? Why don’t I just throw out the bathwater altogether? I hoped to be able to answer these questions better by reading God-believing writers responding to Harris, even if the writers’ particular beliefs didn’t exactly match my own. For this reason, I turned to Counter Point hoping that the book would answer Harris’ empirically based, logical arguments in a spiritual manner equally empirical and logical. But I am severely disappointed by what Metcalf has written.
Metcalf begins his book with a fundamental flaw: his assertion that atheists “have nothing on which to ground their morality” (RCM, xi). But this is not accurate, because Harris quotes the Jain saying “Do not injure, abuse, oppress, enslave, insult, torment, torture, or kill any creature or living being” (SH, LTACN, 23). Along with Harris, I think that mere compassion for another — sincerely seeing one’s own well-being in the experiences of a fellow creature — is a legitimate grounding for a moral philosophy, one that need not subscribe to the existence of an unseen deity in order to prosper.
Another fundamental flaw is Metcalf’s defining “Christianity” in a way that isolates his brand of Protestantism from Christianity’s excesses — such as the Inquisition and the Crusades, subjects of Harris’ criticism — and then blames these excesses on the failings of the “Roman papacy” (RCM, xiii). But Harris has already warned against scapegoating “perversions of the ‘true’ spirit of Christianity” in order to claim one’s particular idea of the religion as the only one that is correct and true (SH, 11-12). Harris is right that Christianity, taking all of its denominations together, is a “muddled and self-contradictory” belief system that can be used to affirm or condemn its own various aspects. How do we know that Metcalf’s version of evangelical Protestantism is the rightful view of Christianity to hold? In other words, if Catholicism was wrong about the Inquisition and the Crusades, how do we know that evangelical Protestantism isn’t wrong about, say, same-sex marriage?
Counter Point makes some reasonable contributions to the debate about God by putting some of Harris’ biblical quotations in context. However, given Metcalf’s flawed foundations, it’s no surprise that Counter Point’s analytical structure collapses. For example, Metcalf takes Harris to task for assuming that “many of the Old Testament laws [e.g., Deuteronomy] continue to bind Christians today” (RCM, 4). However, Harris isn’t saying that at all; the reason why he goes on at length about the Old Testament’s barbaric punishments is to illustrate how Christians are cherry-picking which biblical passages to believe and which to ignore. To Harris, such cherry-picking calls into question any belief in the Bible as moral in its entirety. But Metcalf says that Jesus fulfilled ancient Hebraic law (including Deuteronomy) by changing it with his martyrdom. In this way, the Counter Point author tries — unsuccessfully — to rationalize a context in which cherry-picking and the morality of the entire Bible are not mutually contradictory. Metcalf and Harris are not on the same page.
Other aspects of Metcalf’s argument don’t sit well with me. He often quarrels with the particulars of Harris’ statements, instead of engaging the larger issues. For instance, Metcalf doesn’t address the issue of slavery in its larger context — in particular, Harris’ charge that the antebellum American South could use scripture to sanctify this inhuman custom (SH, 19). What Metcalf does instead is define biblically sanctioned “slavery” down to the way it was practiced in ancient times, a way that doesn’t apply to the enslavement of Africans in America (RCM, 20-25).
Also, I don’t share Metcalf’s absolutist view of lust and pornography as inherently degrading (RCM, 26-27). I think of sexual desire as a God-given gift, and while this gift should never be abused (everything from the rude stare to coercive sex), I see nothing intrinsically wrong with respectfully appreciating another’s appearance in a sexual way.
Moreover, I think that Metcalf’s fleeting reference to eugenics is unworthy of a serious discussion about belief in God. Metcalf makes an aside about this dubious philosophy of human engineering: he relates how the first person to conclude scientifically that prayer was ineffectual, the anthropologist Francis Galton, also coined the word “eugenics” and was Charles Darwin’s cousin (RCM, 52-53). This strikes me as a cheap shot, implying that a disbelief in prayer or a belief in evolution leads ultimately to Nazi-style genocide. This unnecessary swipe at evolution ignores the fact that proto-eugenic philosophies existed well before Galton coined the term, that Darwin himself did not endorse eugenics, and that Adolf Hitler was a self-identified Christian.
In addition, Metcalf falls back on the familiar rhetorical device of conflating archaeological evidence of some biblical historical figures with evidence that the Bible is literally true (RCM, 39). Other believers in the historicity of the Bible’s stories have made this same hasty argument, but scientific evidence of some of the Bible’s characters does not necessarily confer scientific evidence onto the Bible’s various narratives. In other words, the archaeological discovery of Caiaphas’ bones does not automatically prove that Jesus rose from the dead.
Letter to a Christian Nation: Counter Point is not a convincing rebuttal to Sam Harris. I give Metcalf credit for attempting to address Harris’ challenge and for the clarity of his writing. But his book doesn’t meet that challenge in a straightforward, reasonable, persuasive way.
***
I have also bought another response to Sam Harris, Letter from a Christian Citizen by Douglas Wilson. I’ll try to post a review of that book later.
Saturday, November 3, 2007
Yasujiro Ozu’s ‘Tokyo Twilight’
I’ve recently been on a Japanese-movie kick (I seem to come down with one once a year). Early last month (October), one of the Japanese titles that I rented from Netflix was Tokyo Twilight (1957), co-written and directed by Yasujiro Ozu. I’m not a big Ozu fan — as many foreign-film buffs are — but I can appreciate his austere, Zen-like style of filmmaking that breaks so many of Hollywood’s cinematic laws. Still, as agreeably unconventional as Ozu’s way of shooting a film is — low camera angles, characters almost addressing the camera, eyeline mismatches, “empty” shots unrelated to the characters’ actions — the narratives that his movies convey are conservative through and through. However subtle and restrained, his films use their exacting style to impart stories that are dispirited by Japan’s postwar modernity and nostalgic for a return to “traditional” Japanese values rooted in feudalism. For this reason, Yasujiro Ozu is not one of my favorite directors. His use of an innovative filmmaking strategy to tell such old-fashioned stories is a bit like a heavy-metal rock song extolling the virtues of prohibition.
As I watched Tokyo Twilight, nothing about the film changed my opinion of Ozu. The story tells of a Father (Chishu Ryu) whose wife deserted him, early in their marriage, for another man, leaving the Father with two small daughters to raise. Now grown to adulthood, the elder daughter, Takako (Setsuko Hara), is in an unhappy marriage, arranged by her Father, and she is the mother of a toddler daughter. The situation at Takako’s house has become so stressful that she and her daughter are staying at her Father’s house until she can figure out her next step. The unruly younger daughter, Akiko (Ineko Arima), is now in her late teens, still living with her Father, and verging on delinquency. In her wanderings around Tokyo, Akiko comes across an older woman, Kisako (Isuzu Yamada), who oversees a low-rent gaming establishment. We later find out that Kisako is Takako and Akiko’s mother, newly returned to Tokyo (the man she left the Father for was killed in World War II). Akiko learns that she’s pregnant, but the young man responsible won’t commit to her. She gets a back-alley abortion, but her experience leaves her emotionally wrecked. When she sees how indifferent the young man is to her ordeal, Akiko kills herself. After Akiko’s funeral, Takako goes to see Kisako and angrily tells her that she is responsible for Akiko’s death. Viewing Akiko’s disobedient personality and her death as what happens to a child raised by only one parent, Takako leaves her Father’s house and returns with her child to her husband.
Got that? A cautionary tale against premarital sex and abortion, women unhappy with their husbands compelled to stay with them, mothers blamed for intergenerational problems but fathers let off the hook — assuming that he could read the subtitles, Dan Quayle would love this movie.
Three days after I mailed the DVD of Tokyo Twilight back to Netflix, I saw in the entertainment section of the Los Angeles Times that there would be a revival of that same Ozu film in just a couple of days at the Los Angeles County Museum of Art. The announcement came in a capsule review of the movie written by Kevin Thomas. In his article, Mr. Thomas downplayed the film’s conservative messages to concentrate on the singularity of Ozu’s cinematic style and the director’s “compassion” for his characters. I had no intention of starting a debate with Mr. Thomas (which never happened, in any case) nor of discouraging anyone from seeing Ozu’s films, but having just viewed Tokyo Twilight a few days ago, I thought that I would write Mr. Thomas about my differing thoughts on the film. Here is the e-mail I sent:
Dear Mr. Thomas:
I was surprised to see your review of Yasujiro Ozu’s Tokyo Twilight in [the entertainment section] because I had just rented the same film on DVD.
I have to disagree with your review, however. You say that Ozu doesn’t judge his characters. Watching Tokyo Twilight, I sensed that Ozu implicitly condemns [Kisako] for deserting her family in a way that he does not criticize the father for arranging an unhappy marriage for [Takako]. To me, Ozu’s message is clear: if [Kisako] hadn’t deserted the family, [Akiko] would have neither had an abortion nor died. Ozu also seems to belittle falling in love — the reason [Kisako] left the family — in a way that he does not criticize miserable arranged marriages, like [Takako’s].
You also say that Ozu “has faith that children have the capacity to learn from their parents’ mistakes and not repeat them.” To me, Ozu seems to be saying that the younger generation has everything to learn from its elders and nothing to teach them. After seeing the tragic results of [Kisako] leaving the father, [Takako] decides to return to her “hard-drinking, thin-skinned” husband for the good of their child — with no adjustment on the husband’s part. What kind of marriage will she be returning to? Will the problems that drove [Takako] from her home be ameliorated in any way? Ozu doesn’t seem to care.
For all the critical praise of Ozu’s “radical” style, few have noted how deeply conservative Ozu’s stories are. To me, his later films say that Japan has been corrupted by postwar values and modernity, and that the only way to undo this corruption is a return to traditional Japanese values of patriarchy and filial piety.
I’m not criticizing Ozu for failing to pass some liberal litmus test — there are several movies I like which could be considered conservative in one way or another — but Ozu’s films, like few others, really clobber me in an unpleasant way with their anti-modernity, anti-feminist sentiments. I wonder if some Ozu champions (I’m not saying that you’re one of them) overlook how reactionary his stories are by focusing on the uniqueness of the Japanese culture.
Oh, well, at least Ozu makes distinctive films. And I’m not trying to discourage anyone from seeing them. (Fun fact: Isuzu Yamada [who plays Kisako] is only three years older than Setsuko Hara [who plays Takako, Kisako’s daughter].) I just thought I’d pass along my own thoughts about a film that I just saw this past weekend. Thanks for reading.
Friday, November 2, 2007
Cinema
Lived experience is in three dimensions — film is a two-dimensional light image projected upon a flat surface. Lived experience is usually perceived with binocular vision through two eyes — film usually portrays its on-screen subjects with a monocular image shot through a single camera lens, and our eyes experience the difference between binocular and monocular images. Lived experience is viewed continuously through our two eyes — films rely on editing, dramatically changing perspectives from shot to shot in a way totally alien to how our eyes work. Lived experience makes itself known to us as we discriminate among the hubbub of input from our five senses — film uses photography, sound mixing, and editing to filter through what the movie-making equipment has captured in order to emphasize what the filmmakers think is important. And most obvious of all: lived experience is never in black & white and does not come with a soundtrack played by an invisible orchestra.
So, when it comes to movies that I like, I usually latch on to how these films are different from the outside world, not how successfully they mimic it. Consequently, my taste in movies leans decidedly towards features, documentaries, and experimental cinema that emphasize and skillfully harness the artificiality of the medium. What about movies where identification with the lead character is (ahem) paramount, and where the cinematography, sound, and editing try to be as “invisible” as possible to facilitate the audience’s empathy? A few movies like that I count among my favorites, but they are firmly in the minority.
My love for cinema started at an early age. Growing up in the Washington, D.C., area, the first movies that I really got into as a kid were old horror films, especially the ones from Universal in the 1930s and ’40s. I think that kids’ attraction to horror movies is easy to understand: children have all of these inchoate fears about the big, bad world, and monsters become a momentary physical embodiment of such fears. I checked books out of my local library to read up on the history of horror movies — and on occasion, the history of a few non-horror movies would slip through the cracks. I learned about the history of the twentieth century by learning about the history of movies: for instance, my knowledge about World War II largely came from my realization that American movies made during that super-patriotic time were somewhat different from films made during other times. And in my childhood days before home video, catching an old black & white movie on broadcast television was easier than it is today. As I grew, monster movies would point me in the direction of other kinds of film: an interest in horror movies led to an interest in Abbott and Costello, which led to an interest in the Marx Brothers, which led to an interest in Golden Age Hollywood in general, which led to an interest in foreign films…
However, as my interest in monsters waned, my fascination with film never went away. Perhaps the greatest whetter of my celluloid appetite was a T.V. show on channel 26, the Washington-area PBS station, called Cinema 26. As I said, these were the pre-VCR days, and the nearest revival theatre was a healthy hike away. But here was a movie showcase, broadcast both on weekday afternoons and on Saturday evenings, that featured the magnificent catalogue of films distributed in the U.S. by the arthouse exemplar Janus Films. My first viewings of François Truffaut’s The 400 Blows, Akira Kurosawa’s Rashômon, Federico Fellini’s La Strada, several films by Ingmar Bergman, several Ealing comedies, and many others all were on Cinema 26 — often on a small portable black & white set on the kitchen counter (even on such a small screen, these films lost none of their impact). On much rarer occasions, I was able to get out to the American Film Institute Theatre in Washington, where, on the big screen, I first saw Alain Resnais’s Hiroshima mon amour and Orson Welles’s Citizen Kane.
First viewing Citizen Kane at age 15 especially enraptured me. I left the theatre that afternoon feeling that I had never seen a better motion picture; all these years later, my feeling hasn’t changed. Welles’s masterpiece (his first feature-length film, directed at age 25) has faced some challengers for my personal top spot over the years — Carl Theodor Dreyer’s The Passion of Joan of Arc, Stanley Kubrick’s Dr. Strangelove, Werner Herzog’s Aguirre, the Wrath of God, Chris Marker’s Sans Soleil, Sergei Paradzhanov’s Shadows of Forgotten Ancestors and The Color of Pomegranates, the films of Wong Kar-Wai — but Kane is still my favorite.
Because of cinema, I wound up making a pilgrimage to Hollywood and studying film history and theory at the University of Southern California. After getting my Master’s degree there, I worked for a while in the lower echelons of the motion-picture industry. At the moment, I’m still in Los Angeles, between jobs, recovering from major surgery, and tapping away at a keyboard writing something for people I don’t know. Funny what the movies can do.
Tuesday, October 30, 2007
‘Culture Wars’
Before I started my own Internet blog, I would occasionally leave liberal responses on a conservative blog titled “Another Man’s Meat” (it sounds like the title of a gay website, I think, but it’s really conservative) written by one Mr. Phil Dillon of Emporia, Kansas. Although I disagree with Mr. Dillon’s politics, I would respond to his posts because he expressed his right-wing opinions so well and without the belligerent bombast that is typical of conservative punditry. I tried to keep my feedback respectful in tone. I didn’t want to pick any verbal fights — after all, I was a guest on his blog, and guests shouldn’t be rude.
Back in 2005, in one of his posts, Mr. Dillon made a casual aside about “culture wars” – too casual, I thought, for such a violent turn of phrase — so I left a comment taking issue with the term. Although I signed the comment “Rob in L.A.,” Mr. Dillon thought that I was submitting it anonymously. To my surprise, he titled his next post “Letter to Anonymous,” which was addressed to me and defended his use of the phrase “culture wars.” In turn, I posted another reply, and it remains my longest articulation of my thoughts on the state of contemporary politics, and I thought I’d repost it here:
Dear Phil:
I’m surprised you thought my post was worth responding to, but thank you for doing so. Unfortunately, I haven’t had the time to read every single post on your extensive website, and my impressions about both you and your writings are probably not as complete as would be ideal. So, I appreciate your comments on the issue of compromise. However, there are some views on your blog that I don’t agree with, and I’ve posted responses to them when appropriate. I do not write on this site to stir up acrimony — there are enough trolls out there — and I hope that no one thinks I have.
In fact, the reason that I post on this site in the first place is because your own writings are so thoughtful. As I’ve said before, I have no illusions of converting opponents to my way of thinking. I post here because your writing inspires me to clarify my own thoughts. I mean that as a compliment. I also post here just to remind certain readers — in a non-confrontational way, I hope — that other points of view exist.
First, I don’t mean to hide behind the nom du lâche “Anonymous.” It must have escaped notice, but I sign all of my postings “Rob in L.A.” For those reading who wonder why I do this, it’s because my name is Rob and I live as an East Coast transplant in a certain southern California city. I’m a writer in my 40s recovering from major surgery — in other words, with way too much time on my hands. While trying to figure out how to get my career back on track (or at least pretending to), I lollygag the hours away surfing the ’Net....
On the subject of “culture wars,” I understand that the term has been hyped by the media as an exaggerated way to describe social disagreements within America. However, the term has taken on a disturbing ring of reality. Where I think that our republic (do we still say “democracy” after the 2000 election?) thrives on reasoned and considered discussion, the political media thrive on hyping arguments to the extreme. Why watch a level-headed debate on TV when more people will tune into a shouting match? Why put readers to sleep with a lucid argument when you can boil their blood with bombast? Heat sells; light doesn’t. (Fortunately, this doesn’t apply to your own website.) I think this has had an adverse effect on public discourse. As a result, most media-savvy politicians and pundits speak in absolute terms which don’t allow much room for compromise, terms repeated by their audiences. By positioning uncompromising audiences as far apart as possible, modern political speech leaves little space to come together, and this is what gives the phrase “culture wars” a meaning beyond the metaphorical.
Consider this statement by Bush political advisor Karl Rove to the New York state Conservative Party earlier this summer [2005]:
“Conservatives saw the savagery of 9/11 and the attacks and prepared for war; liberals saw the savagery of the 9/11 attacks and wanted to prepare indictments and offer therapy and understanding for our attackers. In the wake of 9/11, conservatives believed it was time to unleash the might of the United States militarily against the Taliban; in the wake of 9/11, liberals believed it was time to submit a petition.... Conservatives saw what happened to us on 9/11 and said, ‘We will defeat our enemies.’ Liberals saw what happened to us and said, ‘We must understand our enemies.’”
Putting aside Rove’s apparent disbelief in the old adage “Know thine enemy” for the moment, this is appalling talk for someone so high up in the administration. For starters, it’s untrue: Most liberals — including myself — were horrified by the 9/11 attacks, put aside petty partisanship, and supported President Bush and Prime Minister Blair in their call to hold the Taliban in Afghanistan accountable – by military force, if necessary. True, not everyone wanted to rush into a shooting war, but most agreed that some kind of tough, judicious action was required. Hey, if the Taliban (however uncharacteristic it would have been of them) had voluntarily turned over the al-Qaeda leaders to the U.S., a shooting war wouldn’t have been needed. Rove suggests that anything less than a shooting war would have been wimpy.
Secondly, Rove’s speech is extremely divisive in ways that are different, I believe, from pre-9/11 partisan rhetoric. Now, I realize that all political parties have had their “red meat” speeches to true believers that exaggerate differences between themselves and other political organizations, but Rove’s pronouncement went further. Rove basically said that “liberals” (e.g., Democrats) countenanced a murderous attack on our own soil that left [almost] 3,000 people dead. He basically said that we liberal Americans are unwilling to do what it takes to defend our own country. He basically said that to be liberal was to be disloyal.
I believe that if Bush and Rove really regarded this country as being at “war,” they would recognize the need for bipartisanship on issues of national security, and they would be doing more to bring this country together. Instead, they are hyping 9/11 for short-term Republican Party gain. This doesn’t look like leadership in a time of war; it looks like crass partisan politics.
This kind of fiery rhetoric, fanned by much of the media, is hurting this country. Every time I hear the phrase “culture wars,” I’m reminded of the bellicose tone of political discourse. It seems to me that this war-like posturing could turn into the real thing if we’re not careful. If we could think of our cultural disagreements in terms other than “wars,” it would go a long way towards positioning those disagreements as reconcilable, as something other than an all-or-nothing confrontation.
...
Regarding George W. Bush’s rhetoric about bringing freedom to the Middle East, his words don’t resonate with me. His strained and convoluted oratory (if that’s not too generous a word) about Arab democracy sounds to me like a fall-back position. As I’ve said on this site before, we went into Iraq militarily because we needed to “disarm” Saddam Hussein of weapons of mass destruction [WMD], which he was going to use against us at any moment — or so we were told. Throughout 2003, I gave Bush the benefit of the doubt whenever he asserted that Hussein had weapons. Now that no WMD have turned up after more than a year of looking for them, now that Bush’s rationale for going to war has proven untrue, now that we know we were egregiously unprepared for the aftermath of the invasion, Bush is changing the subject. All of his talk about bringing democracy to a troubled region sounds to me like Bush is making the best of a bad situation — a situation made bad in the first place because, I believe, Bush thought that a shooting war with Iraq would be quick, easy, and good for him politically in the 2004 election.
As I said before on this site, if the U.S. had some kind of moral obligation to remove Hussein from power as its primary casus belli, Bush should have just made that argument from the very beginning and tried to convince the American people to go to war with Iraq for that reason. The fact that he hyped WMD at the start suggests that a majority of the American people needed another reason for removing Hussein militarily. In other words, Bush’s new rhetoric about democracy in the Middle East after dire warnings about Hussein’s (non-existent) WMD comes off as bait and switch. Today, I’m astounded that so many Americans are giving Bush a pass on how wrong he was about WMD and how badly he’s mismanaged the war. I guess that some Americans (present company excluded?) were just so eager for revenge after 9/11 that they are willing to forgive Bush anything as long as the U.S. got to show some muscle in Iraq.
This post is already pretty long, and I haven’t addressed all of the topics that you mentioned in your blog post. Pornography especially is a complicated topic that doesn’t lend itself to short discussions. Suffice it to say: While I’m sure there is plenty of pornography out there that I would find distasteful, I don’t think that the genre — be it Playboy magazine or something stronger — is a 100% bad thing. Also, while the First Amendment, like pre-war intelligence, can be abused, I’d rather have it abused than not have it at all. Erotic entertainment has always been with us, though it was more carefully closeted before the 1960s. Its new availability via the Internet and other electronic delivery systems presents new problems regarding parents controlling what their children consume — problems that I expect will eventually be solved. But in this new availability of pornography, I do not see the fall of Western civilization.
To conclude, my bottom line is this: The more that we talk about cultural differences as “wars” and the more we present them as an either/or choice between two polar extremes, the more we lose sight of our ability to negotiate and compromise. Inflaming passions needlessly may boost ratings or rouse a partisan audience, but it also frays social civility and our need to live together. Some may think that I’m obsessing too much over a trivial and harmless catch-phrase like “culture wars,” and they may be right. But I have never seen this country so polarized, both politically and culturally. Call me over-imaginative, but I can picture this metaphorical war turning into a non-metaphorical one if we continue not to listen to each other. In the years to come, I hope that we Americans can stop talking about “culture wars” and start talking about “culture peace.”
Sincerely and respectfully,
Rob in L.A. (a.k.a. Anonymous)
***
I haven’t left comments on “Another Man’s Meat” for quite some time — in part, because Mr. Dillon did not turn out to be the reasonable conservative that I thought he was. All the more reason to start my own blog, I suppose.
Sunday, October 28, 2007
As I said, my name is Rob, and I live in L.A. I have a Master’s degree in cinema history from the University of Southern California. I’m unemployed at the moment, but I’m not too worried about money — for the time being, at least.
I’m also politically liberal and a registered Democrat. I think that the impeachment of President Bill Clinton was an utter waste of time and said more about the intolerance of some in the Republican Party than it did about the guy in the Oval Office at the time. In 2000, I was appalled that the Republicans would nominate for president someone as insubstantial and word-mangling as George W. Bush. And I was floored by the way that he got into the White House after Al Gore won the popular vote. I guess to the Republicans, it’s not how you play the game but whether you win or lose. I’ve long thought that Bush was the worst president in modern history, even when lots of people around me praised him for his handling of the so-called “war on terror.” Now, I’m glad to see that a supermajority of the citizens in this country agree with me. What took you so long, people?
As far as religion goes, I don’t belong to anything organized. I’m an agnostic but probably about as close to an atheist as you can get while still believing in something called God. At the same time, I’ve been alternately infuriated and dismayed by the stranglehold that religion — especially of the conservative variety — has on U.S. politics. But with the recent popularity of such books as The God Delusion by Richard Dawkins and The End of Faith by Sam Harris, there now seems to be some pushback by atheists, agnostics, and free-thinkers that’s been long overdue.
Oh, well, I hope to say more about entertainment, politics, and religion (especially when the three of them get together) in the future. In the meantime, thanks for reading!
Monday, September 17, 2007
The World Is My Dervish
Hello, this is Rob in L.A. beginning my first blog. My friends have been encouraging me to start a blog for years now, and it has finally happened.
The reason that my blog is titled “Adventures in Vertigo” (a friend thought of the name) is that I have it. Yes, to me, “vertigo” is more than the title of a Hitchcock film. My vestibular nerves were damaged by major surgery. Now, it’s not quite as easy to move around as it used to be. Also, the California DMV took away my driver’s license.
I’m not much of a poet. But having nothing better to do, I wrote a poem about how I see the world. I didn’t write it to create a landmark of world literature. I wrote it just to play a game with words. Here it is:
THE WORLD IS MY DERVISH
The world is my dervish
Around it whirls
Straight lines turn curvish
Placid air swirls
Flat floors turn spherical
I walk like a clown
It’s by some miracle
I don’t fall down
When not inert I go
’Round in my spell
Doc calls it vertigo
I call it hell
Life is now tribular
Havoc played on
My nerves vestibular
Balance is gone
Huge helps of humble pie
Pass as I’m hurled
Onward I stumble, my
Dervish-whirled world