Saturday, December 30, 2006

Yikes, I'm glad I'm not a pre-adolescent

Part of the pathos here should simply be chalked up to this being Long Island, but sheesh:

They writhe and strut, shake their bottoms, splay their legs, thrust their chests out and in and out again. Some straddle empty chairs, like lap dancers without laps. They don’t smile much. Their faces are locked from grim exertion, from all that leaping up and lying down without poles to hold onto. “Don’t stop don’t stop,” sings Janet Jackson, all whispery. “Jerk it like you’re making it choke. ...Ohh. I’m so stimulated. Feel so X-rated.” The girls spend a lot of time lying on the floor. They are in the sixth, seventh and eighth grades.

As each routine ends, parents and siblings cheer, whistle and applaud. I just sit there, not fully comprehending. It’s my first suburban Long Island middle school talent show.

There is something wrong with our culture and its nonexistent standards when relatives are cheering at this like fools. Kind of makes you wonder whether adolescence was worth defining as a stage of life and whether the sexual revolution has been taken to its most absurd end.

Friday, December 29, 2006

Top Ten Cool Things of the Year (in the U.S.)

This wasn't such a great year. The situation in Iraq deteriorated even more, and yet O.J. Simpson's absurd book seemed to generate more outrage among some--namely, the overemployed entertainment media. So, what was great about 2006? Not a whole lot. Doing my best to remember what on earth happened this year, I'll try to extract the few cool things that did happen, in no particular order:

1. Stephen Colbert at the Annual White House Correspondents' Dinner: In May, Colbert helped draw attention to an event that usually serves as a bad inside joke for a few hundred people by humorously criticizing the president's famous ego. He mocked Bush's supposed steely resolve--"Events can change; this man's beliefs never will"--cutting it down to the publicity stunt it is:

I stand by this man. I stand by this man because he stands for things. Not only for things, he stands on things. Things like aircraft carriers, and rubble, and recently flooded city squares. And that sends a strong message: that no matter what happens to America, she will always rebound—with the most powerfully staged photo ops in the world.

Not only was Colbert's bit itself hilarious, but it became a watershed media event. YouTube, 2006's ubiquitous website, proved especially useful in the days after the event, allowing people to access what really went down at the dinner rather than taking the mainstream media's word for it. Their word was often nonexistent, because many outlets--including The New York Times and The Chicago Tribune--initially deemed the event not newsworthy, despite its ramifications: one of the country's most clever comedians had pointedly skewered one of the most insulated of modern presidents, who was sitting just a few feet away. It should be no surprise, then, that the second target of Colbert's routine was the lackadaisical media--many in attendance at the dinner--who have so often failed to vigorously question George W. Bush about his agenda:

Over the last five years, you people were so good—over tax cuts, WMD intelligence, the effect of global warming. We Americans didn't want to know, and you had the courtesy not to try to find out. ... And then you write, 'Oh, they're just rearranging the deck chairs on the Titanic.' First of all, that is a terrible metaphor. This administration is not sinking. This administration is soaring. If anything, they are rearranging the deck chairs on the Hindenburg!

The event was remarkable for fusing politics and entertainment for constructive good, for being accessible to a mass audience, and for serving as a jumping-off point for debate about the usefulness of the media. The Colbert performance at the White House Correspondents' Dinner is perhaps the emblematic event of a year of consumer participation in newsmaking.

2. The Democratic Victory in the 2006 Congressional Elections: One can easily imagine the media narrative that would have persisted had the Republicans maintained control of Congress after the November 2006 elections: that Americans were satisfied with the direction of the War in Iraq despite visible reservations, that the economy wasn't bad enough to require a renewed focus on growing inequity--like the recent cuts in student aid and the failure to increase the minimum wage--and that, in general, Americans were satisfied with the Republican social agenda. The conventional wisdom that had prevailed for years changed on November 7.

Up against a gerrymandered Congress, the Democrats still won big in the House. They also managed to eke out victory in the Senate, knocking out some formidable incumbents like George Allen of Virginia and Conrad Burns of Montana. Furthermore, they won with great help from the so-called "liberal bloggers," whose fundraising, organizing, and message generation did justice to the term netroots. The Democrats in 2006 were not the Democrats of 1996, who relied upon the contributions of moneyed coffers and directed most of their energies to retaining the presidency rather than making inroads in Congress.

The new Congress is focusing on a different course in Iraq--though it is still up to the intransigent President to adopt one--and on economic issues that were ignored under one-party rule. The 2006 election was also a victory against apathy, with turnout up in the usually less participatory midterm elections. Especially encouraging: 24 percent of Americans 18-30 voted, the largest percentage in 20 years.

3. Harvard, then Princeton and UVA, End Early Decision: In September, Harvard, and then Princeton and the University of Virginia, moved to end the early admission option, deciding that it favored savvier applicants from more advantaged backgrounds. Although other top schools do not yet see the need to eliminate early decision, Harvard at least brought attention to the need for universities to be attentive to prospective students' disparate sophistication in navigating the college application process. Although some schools, like my alma mater Northwestern, have made some reasonable points on why early decision still suits them, it is good for universities to publicly discuss institutional inequities. Hopefully, Harvard and its peers will figure out how better to assist both low-income and middle-class students with the costs of their education.

4. TV Programming Continues to Improve: Although there is still dreck on television--there always will be--some TV programming continues to delight. While I think some of the critics' favorite shows, particularly "Weeds" and "Scrubs," are overrated, "The Office," "The Colbert Report," "The Daily Show," and HBO programming continue to shine. "The Office" was rightfully awarded an Emmy for its fantastic ability to find humor in the mundane and for the genuine performances put forth by its cast. After a reportedly bumpy first year, "The Office" figured out in year two how to adeptly adapt the British version to reflect the nature of the American workplace. (There are now French and German versions of "The Office," showing that the workplace is a universal source of humor, though not necessarily the same humor.) Office romances, career dissatisfaction, narrow-mindedness, and the American work ethic are depicted realistically and hilariously. To think, 10 years ago, the only good things going on network TV were "Frasier," "Friends" (meh), and "Seinfeld," and the sitcom (shudder) reigned supreme. Hopefully, the cool event of 2007 will be that cable companies lose their regional monopolies and prices decrease (I wish).

5. Some Bad People Went to Prison: This year, Jack Abramoff and Jeffrey Skilling, two men who enjoyed far too much power in the 1990s, were sentenced to formidable prison terms. Abramoff was convicted of conspiracy to bribe a public official, defrauding a client, and tax evasion, and was sentenced to the minimum of 10 years in anticipation that he will cooperate with investigations of other corruption cases. He and his lobbying firm contributed to the excess of Republican-controlled Washington. Abramoff was intimately involved with disgraced former majority leader Tom DeLay, whom his lobbying firm treated to trips to the Northern Mariana Islands, and Abramoff himself secretly funded trips for other Republican representatives. Abramoff's corruption is too extensive to document here, but concurrent with his demise went the careers of a lot of disgraceful people, including DeLay, Randall "Duke" Cunningham, and Robert Ney.

Jeff Skilling, former CEO of the defunct Enron Corporation, might be the person most single-handedly responsible for the California electricity crisis of 2000-2001, in which the state's recently deregulated electricity grid allowed Enron employees to manipulate energy markets there, causing prices to spike and supply to be unevenly distributed. Too bad Skilling can't provide redress for all of the pension plans that were decimated by the Enron collapse, as well as for the electricity bills and costs incurred by the state of California during the energy crisis.

6. Immigrant Rallies Across the Nation: Often defined by their demagogic opposition, tens of thousands of immigrants and those in favor of immigrant interests rallied across the nation in April, drawing attention to their contributions and advertising their presence to Congress, which was then considering immigration reform. For a group that fills mostly the least desirable jobs in the country, immigrants get a lot of flak, despite the fact that the United States is a nation of immigrants. A typical complaint against illegal immigrants is that they use up public services, but wouldn't pushing for legalization, and thereby getting them to pay into the system, be a good way to prevent this? Of course, this would have to be accompanied by pay approximating a living wage...

7. FDA Approves HPV Vaccine and Plan B OTC: The current administration has not wholly succeeded in stymieing the progress of its agencies. The FDA approved the human papillomavirus (HPV) vaccine and over-the-counter (OTC) Plan B contraception. Though religious zealots will probably tell you otherwise, there is never anything wrong with vaccinating against a sexually transmitted disease. The idea that it encourages unprotected sex or sex in general is like saying that a flu vaccine encourages not washing one's hands. The approval of Plan B also generated controversy, but it will make an unpleasant process easier to handle for many people.

8. An Inconvenient Truth Brings More Attention to Global Warming: In a year of erratic temperatures and a hostile climate, Al Gore used his influence and celebrity for good, bringing attention to the force behind climate fluctuations. It is amazing that the existence of global warming needs such a forceful defense, even though it is not in scientific dispute, but that is why Gore's contribution was so constructive. Though his film is of course not as watchable as a movie like Casino Royale, Gore does an admirable job of meticulously deconstructing the arguments against global warming and detailing those for it. If parts of our country are submerged under water in the not-too-distant future, Al Gore can (somberly) say "I told you so."

9. Google Buys YouTube: In an acquisition that didn't reek of evil, the search engine giant bought out the user-generated content giant. Not only does YouTube's searchable video content model seem commensurate with Google's own, but Google appears to be an encouraging parent corporation, managing to maintain the integrity of its other acquisitions while still innovating them--like this here software I'm using. (Meanwhile, the telecom market was consolidated by another dispiriting merger, as AT&T's absorption of BellSouth Corp. was approved by the FCC.)

10. Bob Woodward's State of Denial Released: The tipping point of popular opinion against George W. Bush must surely have been reached when Bob Woodward--establishment journalist and former Bush administration cheerleader--released State of Denial: Bush at War, Part III. Woodward's book seemed to cement the view that the War in Iraq was a mistake, and that its execution was poorly managed at best, providing an indictment of all of the key actors in the administration who consistently lied and misled over the years to save face.

This was a pretty difficult list to come up with, because this was a pretty difficult year, but I know I'm forgetting things. Please comment with any additional suggestions or any recommended subtractions.

Wednesday, December 27, 2006

Krugman introduces new considerations of fiscal policy

In his most recent column, Paul Krugman considered the Democrats' position in making fiscal policy, cautioning them not to hasten back to "Rubinomics," the doctrine named after Bill Clinton's Treasury Secretary that emphasized reducing budget deficits. The policies of Rubinomics are credited with lowering interest rates, in turn precipitating the economic boom of the 1990s, and with replenishing the federal treasury with the revenue that comes with a good economy--a greater surplus than even Clinton officials had hoped for.

In hindsight, the focus on deficit reduction was a rare triumph of long-term considerations over short-term ones. When Clinton's 1993 deficit-reduction package was introduced, many in Congress opposed it because it raised taxes on the upper tax bracket. In just a few years, the upper and upper-middle class--the interests who had opposed tax increases in 1993--became some of the biggest beneficiaries of Rubinomics, though what made the '90s distinct from the '80s was that more of the population shared in the boom. (Remember when "Help Wanted" signs were everywhere?) The deficit-reduction act of 1993 did not promise immediate benefits, as the Bush tax cuts did; rather, it fostered long-term growth, and the benefits arrived sooner than people expected. It is pretty hard to believe that Congress and the White House, albeit for a brief period, were far-sighted.

Today, Democrats, remembering the success of Rubinomics, seem eager to take it up again. They're up against an even larger budget deficit than the one that faced Clinton in 1993, but the party itself seems to have coalesced around deficit reduction policy.

In 2000, budget surpluses were projected through 2004.

And yet, Paul Krugman is warning Democrats that it might not be their best bet:
[I]t's now clear that while Rubinomics made sense in terms of pure economics, it failed to take account of the ugly realities of American politics.

And the lesson of the last six years is that Democrats shouldn't spend political capital trying to bring the deficit down. They should refrain from actions that make the deficit worse. But given a choice between cutting the deficit and spending more on good things like health care reform, they should choose the spending.

The unforeseen--and how could it be foreseen?--problem with Rubinomics is that it left a budget surplus to the Bush Administration to be held up wrongfully as an example of government largess in what former Clinton economist Brad DeLong has referred to as President Bush's "right-wing class war." In our politically charged environment, argues Krugman, we should not give Republicans the opportunity to squander a budget surplus again:

I'm for pay as you go. The question, however, is whether to go further. Suppose the Democrats can free up some money by fixing the Medicare drug program, by ending the Iraq War and/or clamping down on war profiteering, or by rolling back some of the Bush tax cuts. Should they use the reclaimed revenue to reduce the deficit or to spend it on other things?

The answer, I now think, is to spend the money--while taking great care to make sure it is spent well, not squandered--and let the deficit be.

At first read, I was somewhat appalled by Krugman's prescription, which seemed to put economic policy at the dangerous behest of politics. The "state of our politics," says Krugman, is not healthy enough to leave a budget surplus to the other party--but when has the state of our politics ever been healthy? Must we always make fiscal policy based on our fears of what the other party will do with the budget?

And yet, realities are realities, and this country needs health care reform. One reality is that there will always be interests in this country that want the government to take no action on issues that concern the nation's general welfare--like health care--because they perceive it as drawing unfairly from their incomes. Never mind that our current health care system--with the uninsured driving up premiums for everyone else, with small businesses straining to pay for employee benefits, and with an often substandard level of care for even those who are insured--is taxing everyone. The anti-government interests will always be well represented and will probably always see their interests in the short term, advocating tax cuts (for them) and decreased regulation. When their surrogates get into power, it will be against their interest to encourage good government. Now that some of them have been thrown out of power, maybe it's best that Democrats use their political capital to solve problems that need to be solved rather than leave behind a treasury to be pillaged. I don't blame Krugman for seeing it this way.

Tuesday, December 26, 2006

The Borat debate reemerges

It being the time when film critics offer their top ten lists of movies from the last year, the debate over whether the Borat movie is funny, manipulative, racist, or just plain dumb rages on. I still submit that it is highly overrated and lacks most of the humor of the television show. I have not yet seen a satisfactory explanation of why it is funny for Borat to wreck some harmless man's antiques, or of what exactly Baron Cohen is skewering about Southern gentility by re-emerging from the bathroom with a bag of feces at a dinner party. Many of the episodes in Borat seem trying, as if Baron Cohen could not get his subjects to be sufficiently ignorant, coarse, or racist without acting obnoxious himself. Even then, they don't always come off badly: the befuddled antique shop owner seems helpless, and the members of the Southern dining group understandably end the dinner early. The thesis of the movie--that Americans are an ignorant lot--seems to have been established before the movie was shot. On this score, Jonathan Rosenbaum of The Chicago Reader makes a good point:

I keep reading that this movie is a sly (or not so sly) critique of racism and intolerance based on ignorance, but Sacha Baron Cohen's apparent semi-ignorant intolerance of the Kazakhs is almost always factored out of the discussion. It's pretty easy to paint them as a pack of pathetic anti-Semites if you know nothing about them, but isn't that the kind of glibness Borat is supposedly attacking?

To me, the interesting aspect of the television episodes of "Borat" is the inhibitions revealed even in the most bigoted of his interview subjects. When Borat is driving with a blatant anti-Semite, the man goes off on Jews in a frighteningly absurd rant, going so far as to sanction the act of rounding up Jews. He then stops himself to say that we don't do that in this country. Oftentimes, when Borat starts baiting people with questions that could incense their latent racism, he receives variations on that same response: that we do not say those things in this country. Borat's bluntness is often met with cautious reluctance to disclose true beliefs, as when he gets a Republican primary candidate to hesitantly acknowledge that the logic of his beliefs means that Jews will go to hell. In "Borat," Baron Cohen reveals a nation where political correctness has been internalized, and this may not be such a bad thing. Borat the film appears to be getting accolades for being "offensive," but that's no feat if the clever humor of the television show is lacking.

Sunday, December 24, 2006

Apocalypto wasn't accurate?!?!

An article in Salon from Mayan expert Marcello Canuto about Apocalypto--nay, Mel Gibson's Apocalypto--and questions over its accuracy:

The movie tracks a young Mayan man who is captured in a surprise raid on his village. Forced to abandon his family, he and his companions are taken to the nearby city to be sacrificed. He manages to escape and, pursued by his captors, attempts to return to his village to save his family. During his getaway, he reaches a beach where he witnesses the arrival of Spaniards.

This final scene tells us that the movie focuses on Maya society on the eve of Spanish contact in the 16th century. Yet the Maya city portrayed in the movie, central to its plot, dates roughly to the 9th century. This is akin to telling a story about English pilgrims founding the Massachusetts Bay Colony, and showing them living in longhouses described in "Beowulf." In fact, Gibson incorporates Maya images from as far back as 300 B.C. Throughout the movie, these anachronisms make Maya civilization seem timeless, and undermine the idea that the Maya could and did respond to change.

Another question I had, more about the logistical accuracy of Apocalypto: how did main character Jaguar Paw so quickly and expertly find his way back to his village in the forest of the Yucatan peninsula from the city center? And who is supposed to symbolize Jesus--Jaguar Paw, or his water-birthed child? Also, why was the forest so close to the cornfields, as well as to the beaches the Spaniards land on?

Actually, I was disappointed that Apocalypto did not tackle head-on the downfall of Mayan civilization, whose supposed causes--pestilence, colonization, depression of trade, slave revolt--are in dispute; however, maybe it's for the best that Mel Gibson is not the one to do this. As Canuto says of Gibson's enormous inaccuracies:

If there were ever an apocalypse in the history of the Maya -- and herein lies the ultimate demoralizing irony of the movie -- it would be because of European contact. But in the movie, after two hours of excess, hyperbole and hysteria, the Spaniards represent the arrival of sanity to the Maya world. The tacit paternalism is devastating.

Nonetheless, I loved Apocalypto only because it was totally INSANE; and fortunately it has compelled me, as the writer hopes, to look into the real history of Maya civilization rather than trust Mel Gibson.

Tribune's silly obsession with the Obama-Rezko connection could be turned into something more useful

I'm convinced that at least half of the reason Chicago has a reputation for politics more scandalous than other parts of the country is that scandal is drummed up by the local media. Rags like the Tribune and the Sun-Times use the language of "appearances" of corruption and guilt by association to imply that which is most sinister.

Witness the Tribune's current efforts to hang the sign of a man named Antoin Rezko on Senator Barack Obama. Rezko was involved in some state government scandals, and the Tribune has since decided to implicate Obama by association. Their most recent implication is that Obama's office gave a college student an internship as a favor to Rezko, who appears to be a family friend of the intern. Obama's office is of course cornered into denying that Obama has ever done any favors for Rezko. Honestly, I wouldn't be at all surprised if the intern got her job through Rezko--not because Obama is uniquely bereft of an ethical compass, but because that's how internships generally work on the Hill.

Perhaps the Tribune could use its considerable but dwindling resources for reportage to cover the more interesting angle of how internships are doled out in Washington. The current thinking is that these relatively inconsequential jobs serve as good rewards for loyal (and big) donors to a congressman. I don't think there is anything wrong with hiring someone loyal to a candidate--a campaign worker, for instance, is a natural choice to fill a post-election job--but the kin of a big donor is someone who has gotten a job entirely because of blood association to wealth.

This leads to what I see as a larger problem of entitlement by some of the wealthy and well-connected in our society. It is revolting when people get jobs or admission to college because of their family connections. My revulsion is directed not just at the party doing the admitting but also at the donor expecting a quid pro quo. If you donate to a university or candidate, it should be out of belief in the institution or the cause, not to tip the scales in favor of your child. From what I have heard from people who have worked in admissions at colleges, it is general policy to give a big donor's black-sheep kin the benefit of the doubt. Such practices reconcile us all to playing the game, "networking," valuing superficial connections over merit, and repeating the mantra that this is just how things work. Wouldn't it be nice, though, if some of these privileged people could lead from the top and not expect special treatment for their children?

Saturday, December 23, 2006

The blogging landscape of 2006

Bloggers were the targets of many verbal floggings this year. The view toward the act of blogging became more hostile and more mocking: hostile toward those who would deem themselves worthy enough to express opinions on subjects in which they are not experts, mocking of those who feel the need to publish the mundane events of their daily lives for all the world to see.

As political bloggers on the center-left became more influential this election year, they were derided by professional columnists like David Broder and David Brooks. Broder accused them of vituperation and general ineffectiveness back in June when the YearlyKos convention was taking place. "Thus, we have blogger Jerome Armstrong, a Kos partner, arguing for mounting campaigns everywhere, no matter the odds," he chided then. And yet, some of the most unlikely success stories of the Democratic upset were those supported by the bloggers, and some of the most stunning defeats were those picked by the establishment. Thus emerged Senators-elect Jon Tester and Jim Webb. Thus languished Tammy Duckworth and Harold Ford, Jr.

Still, bloggers did not catch a break in 2006. Instead, establishment writers took it upon themselves to drown out the cacophony. Bloggers became representative of an advent of "narcissism," as another professional columnist, George Will, put it. "So much of what is done on the web is people getting on there and writing their diaries as though everyone ought to care about everyone's inner turmoils. I mean, it's extraordinary," Will lamented earlier this month. Why all of this fulminating over the massification of writing? Broder, Brooks, and Will imply that writing is for an elite, and that when the masses do it, it becomes incendiary, uncivil, and artless. Expressing one's opinion through blog channels is an act of narcissism (never mind that Will does it every week in his column for the Washington Post).

Little acknowledged in all of the commentary about blogs and bloggers is what to make of the phenomenon they evidence: that millions of people across the world are driven to write in their free time. Writing, which we feared was a dying skill--mauled by the allure of cognitive disengagement encouraged by visual media, particularly television, and by the laziness of the current vernacular--is enjoyable to millions. Sure, the instantaneousness of blogging does not promote revision, but it does not hinder it either. Though Time's Person of the Year might have been lame, I find it amazing and gratifying that those without access to the establishment channels of communication--television, newspapers--are embracing the Internet and, yes, writing. It is worth celebrating, not excoriating.

Friday, December 22, 2006

A little perspective for the holidays

Weather is such a mundane part of existence--the stuff of small talk when we've run out of other small talk--but it can destroy us all. That is hardly mundane. I am reminded of this again with the snows in Denver, or "fresh pow," as I have recently understood it to be called by ski bums (or just mythical ski bums). About a year ago, my mom and I set off to Boston to see my brother's biannual sketch comedy show. As it happened, the night we left, a big snowstorm was predicted to hit both Chicago and--if I recall correctly--Boston. (A sidenote: I don't know that I have ever been to Boston when it hasn't precipitated heavily or been very cold.)

My mom and I could have gone into the evening with one of two mindsets: (1) What an inconvenience for us, we'd better get out of Chicago, getting delayed is terrible; or (2) This is out of our control, we should only fly if the airlines assure us it's safe, and the fact that we're even getting driven to the airport amidst a snowstorm and that planes are attempting to fly out is above and beyond what we should expect. It says something impressive about modern technology, and in turn about the way we become accustomed to convenience. We chose the latter approach, which made the experience much less vexing. We sat in a cab for maybe two hours and were delayed another two hours or so at O'Hare. That's all. Somehow we made it to Boston the same evening.

I try to think of that night every time I face an inconvenience that is beyond my control. Our society so depends on everything moving smoothly that I think, every once in a while, it is worth stepping back and marveling at it all. This year, if it happens to snow in Chicago when I fly back, I know I'll be frustrated, especially if I can't get there until the middle of next week--as is the case for people going to Denver this year--but I can't help but think that because we rely on things to be convenient all of the time, we forget how unnatural such convenience is, how hard people have to work to keep stores open 24/7 and airports running through the holidays, a time when inclement weather is hardly rare. Maybe every once in a while, it's valuable for us to recognize that there are forces out of our control, especially in the form of our environment, and to leave ourselves some leeway when those situations occur.

For more perspective on the holidays, better than that which I can offer, watch It's a Wonderful Life. It's a movie that's all about putting life into perspective, and it is always appropriate at this time of year.

Tuesday, December 19, 2006

When Academia Is Absurd

The academic fields in the humanities and sometimes the social sciences (especially sociology and anthropology) are too often a wasteland of gratuitous jargon and dogmatic promulgation of strange critical theories. Words like "space," "discourse," "post-colonial," "post-structuralist," "post-[fill in the blank]," and "queer theory" are all too common, acting as crutches--the substitution of faddish critical theories for independent thought. Furthermore, some subjects of academic study are absolutely untenable, like "Sex and the City," hip hop (at least, most of what appears to be out there), and chick lit. It is only when you read undergraduate and master's thesis titles that you understand just how absurd some of the things that get passed off as academic are, and why searching for ridiculous thesis titles online is an entertaining pastime. I bring you the fruits of my labor, broken down by taxonomies of dreck:

Here's a pretty standard one:
Narrating My Body, Narrating Myself. Body Narratives of Romanian Teenage Girls

Grabbing two unrelated ideas and finding causality between the two. Bonus points if postcolonial discourse is involved:

Genocide 'n' Juice: Reading the Postcolonial Discourses in Hip-Hop Culture.

Lesbians, space, and more lesbians:

Experiences of Lesbians at the Belgrade Pride 2001 and Zagreb Pride 2002

Re-imagining 'Romanianness': The LGBT Movement Challenging the Heteropatriarchal Order of the Nation

Sexually Active Connection In Long Term Queer Female Relationships

Drowning in Loneliness and Writing the Blues: Creating Lesbian Space in the Novels of Radclyffe Hall and Leslie Feinberg

Sapphic Sisters in the City of Brotherly Love: The Interactions of Space, Community and Lesbian Sexuality

Gender bending:

Children Doing and Undoing Gender. The Case of a Polish Kindergarten Group

Sickos, Psychos, and Sluts: Images of Transgendered Women in Media Culture

"That's an Extra One:" Adolescent Transgender Identity and Self-Acceptance

How Television Viewing During Adolescence May Influence Exotic Dancers' Perceptions of Female Gender Roles: An Exploratory Study

Conflict in Congo: Locating the Gendered Body

General weirdness:

Environmentalism Without Guarantees: The Spectral and Scatological Politics of Displacement in Miyazaki Hayao's Sen to Chihiro no Kamikakushi (Spirited Away)

Images of Japanese Women Who Seek Emancipation Through the Experience of Death

Gratuitous use of vagina:

Vagina Dialogues: The Developmental Underpinnings of Older Women's Fear of Losing Youth and Beauty

Pleasing Pussy: Exploring Women-Centered Pornograph

Let's not give short shrift to the other half of queer theory:

The Racial Stereotype as Sexual Fetish: Latino Racial Fetishism in the U.S. Gay Male Cultural Imagination

Utter bullshit (which was probably written the weekend before it was due):

Self-Mutilation or Body Beautification: The Meaning of Tattooing and Piercing And Implications for Social Work Practice

(Re)producing Masculinities in Sports: Football as 'the Boys' Game'

Exploring the Relationship Between Resident Assistant Leadership Styles and Student Satisfaction in the Residence Halls

Fabulousness as Fetish: Queer Politics in Sex and the City

Comforting Touch Between Nurses and Patients: An Exploratory Study with Implications for Medical Social Work Practice

Translating Double-Dutch to Hip-Hop: the Musical Vernacular of Black Girls Play

Deconstructing the Vagina Monologues: A Taxonomic Approach to Social Change

Race, Class, Conflict and Empowerment: On Ice Cube's 'Black Korea'

Post-Graduation Career and Education Plans for Student-Athletes Who Participate in CoCurricular Activities


Systematics, Osteology, Sexual Dimorphism, Age Classes, and Population Dynamics of Teleoceras Fossiger from Jack Swayze Quarry, Clark County, Kansas, and Minium Quarry, Graham County, Kansas

and my personal favorites:

Erotic Sadomasochism: Women Finding Meaning and Opportunities for Personal Growth through Radical Sexual Practices

Brown Meets Green: The Political Fecology of Poop Report.Com

Monday, December 18, 2006

Movies I Would (Shamelessly) Like to See

December to mid-January is about the only time of the year I go to the movie theater with regularity, but right now, even though it's "Oscar Season," there are not a whole lot of promising movies out there. There are, however, a bunch of crazy ones churning about, and these may actually be the most promising. Here's what I'm going to see if I go to the theater over winter vacation:

Mel Gibson's Apocalypto: This lengthy tale about the end of Mayan civilization seems to have been inspired by Mel Gibson's religiosity and taste for the epic. Mel is proof that full-out crazy people should be encouraged to make movies more often. Whereas an imaginative but failed movie like Waterworld probably drowned in its own ambition because Kevin Costner is too sane, Gibson's Apocalypto appears to deliver. A movie with an insane premise needs an equally insane director, and Mel is just that. Like many religious zealots, Gibson has a pet obsession with the end of days, which appears to have driven him to ponder how past civilizations fell in order to anticipate how exactly the Second Coming will unfold. This is the man, after all, who said of his direction of The Passion of the Christ, "The Holy Ghost was working through me on this film, and I was just directing traffic." Even though a crazy man like Mel is best left to occupy himself with such thoughts, the rest of us are hardly loath to find the subject interesting. Apocalypto promises to be great whether it is good or just plain ridiculous.

Dreamgirls: Some people will probably dismiss this one right off the bat because it prominently features Beyonce Knowles in a serious acting role and purports to resurrect Eddie Murphy from the ash heap of also-rans, but the story itself is quite rich. Based on the career trajectory of the Supremes, Dreamgirls examines the difficulties endured by a pop music group because of their race and gender. Knowles and Jennifer Hudson's characters embody the dilemma faced by black musicians in the 1950s-1970s of whether to embrace blackness or appeal to an increasingly receptive white audience. Jamie Foxx plays the Berry Gordy figure who champions Knowles' ability to appeal to the latter group, while Hudson increasingly moves in the other direction. If nothing else, I am interested to find out how the movie treats these themes. (For a nice explanation of these themes, see Pop Matters' review.) I'm a little worried, after all, because Dreamgirls is based on a Broadway musical! and I haven't been much impressed with Broadway's treatment of serious subjects post-West Side Story. Still, like Apocalypto, Dreamgirls promises to be entertaining whether it is actually good or just melodrama with glittery costumes, big hair, and electrifying ballads.

Rocky Balboa: I am a fairly devoted Rocky fan, in part because I cannot actually remember how bad the later sequels were. No, I recall only the awesomely good of the Rocky series, such as the ambitious analogies between Communism's fall at the hands of democratic capitalism and Ivan Drago's fall at the hands of Rocky Balboa, or the operatic travesty of Apollo Creed's death by a blow to the head. I can't quite conjure up the badness of the dialogue, though I'm sure it is bad enough to make even Joe Eszterhas cringe. And I speak only of Rocky IV; Rocky V is actually the nadir and conveniently the end of the series. Until now. Just as in Rocky V, Rocky is back in Philadelphia in Rocky Balboa, now widowed (rest in peace, Adrian) and living humbly as a deli owner. Just as it did 30-odd years ago, the boxing ring calls Rocky back, and, still up a few brain cells, Rocky answers. Rocky Balboa runs the risk of taking already tired themes and wringing out what life is left in them; on the other hand, a movie about Rocky in winter could be a meditative last hurrah for the long-running series. We shall see.

Saturday, December 16, 2006

Metro Fare Hikes are Counter-Intuitive

In this country, the gospel of the market reigns supreme, often to the detriment of smart and far-sighted policy. Witness the recent proposal by the Washington Metropolitan Area Transit Authority (aka "Metro") to raise fares during rush hour, or peak time, to as much as $2.10 to head off a $116 million budget deficit. On the one hand, a good public transportation system is worth the money from its riders, and Metro is one of the better public transportation systems in the country. On the other hand, it should be expected that a public transportation system is not a money maker and is often a money loser.

What Metro loses financially, however, is gained back in quality of life: decreased traffic, less roadwork, and less pollution. It is in drivers' as well as rail and bus commuters' interest to have a good public transportation system, even if that means paying a little extra in taxes for it. There needs to be some understanding, though, that Metro is not a money-maker, that it will often lose money, and that public goods such as public transportation often do lose money. The alternatives are worse. When Metro riders say that the potential fare hikes will push them to drive to work, Metro faces a vicious cycle: it has to keep raising fares to make up for the revenue lost from commuters who no longer ride because the fares got too high.

By proposing to increase fares during peak hours (5 to 9:30 a.m. and 3 to 7 p.m. on weekdays), Metro is applying incentives to change the nature of riders' demand, encouraging them to travel in off-hours. But as letter writers to the Post have said, it's pretty difficult for most people to change their traveling times; most of us have to be at work within the peak window. Metro also might charge a higher fare to people who get off at heavy-traffic stations like Farragut North, but again, there is little most people can do to change their destination station. Market calculations of supply, demand, and incentives do not map intuitively onto public transportation like Metro. Ideally, the business district in D.C. would be more spread out and workers would have flex time, but as yet, people's working patterns are pretty rigid.

Edited to add: Virginia legislators need to increase the revenue they pay into Metro, especially considering it serves so much of Northern Virginia, and Virginia pays significantly less in taxes than D.C. or Maryland. (I'm willing to say this as a Virginia resident.)

Friday, December 15, 2006

Small Acts of Immense Laziness

On days when I walk past my apartment's trash chute to throw my garbage away, there are, almost unfailingly, large items underneath the chute that cannot fit into it but are meant to be thrown away. Most annoying is when someone places a large box underneath the chute when s/he could easily collapse the box and fit it in (or better yet, recycle it). A vacuum cleaner and a rolling suitcase missing a wheel are just some of the other items my neighbors have left under the chute. The neighbor(s) responsible are not only creating a small mess in the corridor but communicating their immense laziness. Such a neighbor apparently deems him/herself too good to properly dispose of his/her trash and instead leaves it to the people who clean our apartment complex. Behavior like this is just plain lazy; there is no other word for it. No one is above taking out his/her own trash.

Monday, December 11, 2006

The Last Unpaid Job

In the summer, Washington, D.C. becomes a town of interns, those lucky college students who come looking for connections and perhaps a little experience to build a resume in anticipation of the post-college job search. However, for those returning after college to look for a job the old meritocratic way--sending in cover letters and resumes in response to job postings--lacking "Hill experience" can be a disadvantage; lacking explicit connections, even more so. The de facto website of Congressional job openings is not necessarily a haven for paid work, as it mostly seems to post internships. Many people even feel compelled to take such positions after college, despite having earned a bachelor's degree and often accrued thousands of dollars in loans along the way.

Internships seem to be popping up everywhere nowadays, to the point that the intern is the new glorified secretary. Why? Because employers love cheap labor and because they have realized that they can expect more than a solid undergraduate record and summer jobs at the local coffee shop. They can expect experience. This sets up a vicious cycle where the job applicant cannot get hired without experience but cannot get experience without getting a job. Thus, the unpaid, often menial internship. This system undeniably favors the well-off, and these are the students who tend to fill internship spots. The numbers bear this out, according to an article by Yael Julie Fischer in Campus Progress from earlier this year:
Most students live on a tight budget; we need cash for fun, for essentials, or for tuition. Asking us to fill our resumes by emptying our wallet seems like an awful lot to ask. A USA Today survey of unpaid interns revealed that over 60% had parents earning more than $100,000 a year. Only about 20% of all families of college students earn that much. For most students, working for free is just too expensive

In that vein, Fischer makes a worthy proposal:
Students gain a lot through internships, but they also have a lot to offer. Their skills and talents deserve recognition and there is no reason they should be exempt from the Fair Labor Standards Act (FLSA). Most employers recognize the questionable legality of unpaid internships, which is why no official statistics exist on the number of such positions. Students deserve to earn minimum wage for their work despite the glamour of the job, both because students’ skills warrant compensation and because the educational opportunity of an internship should be available to all, not just to those whose parents can bankroll them or who have the time and wherewithal to hold down other paid jobs on the side.

Universities could also ramp up their funding for the apprenticeship endeavors of their students, not that such pursuits should cut into a liberal arts education any more than they already do. Finally, I still maintain that the best education for the workforce is a liberal arts education, provided the student emerges with a good grasp of what s/he was taught. Of course, there are many things one learns in the workforce that cannot possibly be imparted at a university, but the ability to analyze and think critically is woefully underrated in favor of a prestigious-looking internship that may have merely involved filing and inputting data into Excel.

Sunday, December 10, 2006

What is art? What is good art?

I just skimmed through a book called Nobrow: The Culture of Marketing, the Marketing of Culture, by John Seabrook, who has written for The New Yorker and Vanity Fair. The book suffered from a lack of organization and the author's over-reliance on personal anecdotes. Seabrook often tries to fit too much into his thesis, so that everything from Bill Clinton to the Helmut Lang store in New York City's SoHo neighborhood is emblematic of the "nobrow." He should have instead devoted more of this ambition to defining what nobrow culture means, which he never sufficiently does.

Nonetheless, Seabrook reintroduces a couple of challenging questions on the subject of art and what exactly it is or isn't. One question he resurrects is whether cultural hierarchies are legitimate or just manifestations of upper-class hegemony. My view is that, on the one hand, the old indicators that one enjoyed high culture--season tickets to the opera, patronage of the art museum, knowledge of the Western literary canon--required a certain degree of wealth. On the other hand, much of what we consider high culture today was open and accessible to the masses in the past, from orchestral concerts to Shakespeare plays. Perhaps only more recently do we categorize these performances as highbrow. Furthermore, I think judgments of quality can certainly be made about art. The process of engaging with art is intimately connected with the act of critiquing art.

Seabrook argues that this is all moot because culture today is no longer characterized by the highbrow/lowbrow duality but rather by a unified "nobrow," where what is marketable is king. To the extent that Seabrook is right about this, I believe that a new standard can be introduced to sift through the nobrow: the authenticity of the work of art. This classification may help adjudicate the argument that ensues when one side has to defend "pedestrian" tastes against another side's charges of "elitism." For instance, two music artists may be equally marketable, popular with a mass audience, and able to produce catchy music, yet one may be eminently more authentic than the other: the authentic artist composes her own work, while the other is a manufactured product. Using this distinction, we can discard the inauthentic--the Avril Lavignes and Britney Spears--and then go on to debate whether the authentic artists are actually good at what they do. Thus, those of us who get tarred for disliking art simply because it is popular with a mass audience can avoid such an easy charge and get to the heart of why something is or isn't good.

Wednesday, December 06, 2006

The endless D.C. argument

Commenting of mass proportions broke out on the blog DCist yesterday over a Portland, Oregon-based band's negative review of their experience at music venue DC9 in the Shaw-U Street-Howard neighborhood. (Oh, for your own edification, if you even care, every D.C. neighborhood has at least three names kind of going on. For instance, my office is located in Federal Triangle and just south of Gallery Place-Chinatown-Penn Quarter. Oh, and not too far from Capitol Hill. Yeah. Nutty!). Anyway, here's the start of the blog entry:

A recent tip from Dave at Indiefolkforever led us to a rather unflattering portrait of our fair city. Norfolk & Western, a Portland band that visited D.C. last month, apparently didn't have a very nice time playing DC9 or visiting the U Street/Shaw neighborhood in Northwest D.C. As part of a tour journal posted on Local Cut, the band wrote:
Washington DC proved to be a less pleasant experience for all of us. DC is not the safest city in the world to begin with, and according to my sources, the club we played at was located in a particularly bad area.

I have mixed, convoluted feelings about this band's experience, the discussion that ensued on DCist, and the larger implications of this type of discussion, all of which I will try to explain.

So anyhow, this band appears to have had a bad time in our fair city, and many commenters are taking personal offense. Really, though, whose fair city is it? Not mine. I still consider myself a Chicagoan (or Chicago suburbanite), and I'm guessing most of those commenting have only been in D.C. for a couple of years at most, so let's just establish that few of us are authorities on authentic D.C. (which also happens to be an oxymoron).

Anyway, the perennial argument in D.C., ever since hipsters settled the Northwest (NW D.C., this time, not the Pacific Northwest) and continued to move east, centers on how transplants/upper- and middle-income/white people feel discomfort toward the "real" D.C. People who come to D.C. are viewed with a skeptical eye, with everyone from suburbanites to bands from the Pacific Northwest at risk of getting slammed for not accepting D.C. as it is. Here's one comment on that blog entry that expresses such a sentiment:

[The band's] comment [on their experience at DC9] is unfortunately typical of quite a few other people I have met from the Pacific NW, and also from other crunchy "liberal" places like Vermont and Minneapolis (where I'm from). They think they're all tolerant and progressive, but aren't comfortable with actual diversity. I'm sure they thought U Street is a particularly bad neighborhood, because like 1/2 the people on the street are black.

It is true that a lot of people harbor a mental map of the District that redlines most of Northeast, Southeast, Southwest, and even some of Northwest (east of 16th Street). DuPont and Georgetown are great, Adams-Morgan is fine if you stay on the main strip, and U Street is "sketchy." Seriously, someone said that last thing to me this weekend. People like that don't make it too far out of Georgetown, which is a shame, because Georgetown nightlife sucks.

Lately, though, I've been thinking that it's silly to deride such people for their insularity. For one, we're all insular to some degree, preferring to stay in places where we're comfortable. Furthermore, do hipsters really want frequenters of Georgetown bars invading U Street? U Street establishments are busy enough as it is. Still, many of the dismissive remarks made about D.C. neighborhoods are ridiculous. I've always hated the word "sketchy," part of the lexicon of the self-sheltered urban dweller, because what people often mean when they apply that word is that a neighborhood is different to them, lacking in sports bars, Banana Republics, and yuppie condos.

Still, I can understand the worry about being mugged and think the derision of those who are worried about walking alone on a dark street is unfounded. Some commenters acted like the band was lucky not to get mugged:

As far as their "gang experience", sounds like the guys were just pushing their buttons. I agree it would be intimidating, but come on. They didn't get robbed or anything while they were here.

I've never gotten mugged (*big knock on wood*), but the D.C. residents who try to prove their street cred by deriding anyone who's uncomfortable with the idea of handing their wallet to a scary man with a gun pointed at them have either never gotten mugged themselves or are bloviating. No one wants to get mugged, and no one should have to accept getting mugged. I don't care if you're in the poorest neighborhood in the city; no decent person--no matter their socio-economic situation--should be accustomed to being mugged. Too often the assumption around here is that in poor or non-white neighborhoods, crime is part of the landscape. Even if those conditions predict higher crime rates, no neighborhood prides itself on high crime, and no neighborhood wants high crime. There is this idea, not just in D.C. but all over, that urban authenticity is predicated on one's exposure to crime. It's unfortunate that crime gets elevated to a character-building experience.

Finally, and this is slightly separate from this particular incident, I get a little fed up when hipsters act as if their conception of D.C. is more authentic than anyone else's. Nothing about the modern conception of the city is authentic. For a long time, city streets were home to cesspools of waste because of inadequate sanitation systems, urban residents lived on top of each other in ramshackle dwellings, and the young and old alike labored long hours in the industrial sector with little time for leisure activities like hanging out at the local music joint. The notion that a city is a place for enlightenment and creativity is relatively recent. I always find it especially condescending when people look down on suburbanites for not living in the city and supposing it is because they are afraid of diversity and creativity. These people forget that for a long time, the American dream was to get out of the city. Only recently is the dream to move back, and it is an increasingly hard dream to attain if one's target is a city like Boston, San Francisco, New York, or even D.C. (That's why Chicago is where it's at, but anyway...).

Tuesday, December 05, 2006

The Most Random Group of People You'll Ever See

The group of award winners at the annual Kennedy Center Honors two nights ago could not have been a more unlikely cast of characters. Capped off by George and Laura Bush and Dick and Lynne Cheney, this is the Kodak Moment of the year:

From left, Andrew Lloyd Webber, Steven Spielberg, Dolly Parton, Zubin Mehta, and Smokey Robinson.

This photo could be the setup for an extreme version of the joke that begins "a priest and a rabbi walk into a bar..." Imagine the possibilities. Also, this must be the most overrated group of Kennedy Center honorees, excluding Zubin Mehta, with whom I'm not familiar, and Smokey Robinson, who is pretty awesome. Are the winners of this award usually this mediocre? I have never paid much attention to the event until this year, because it is kind of local news I guess, but really, who won in the past? Puff Daddy? Julia Roberts? Brian Grazer?

Anyway, in honor of the event, I've been listening to some classic Smokey Robinson and the Miracles and musing, as I do every six months or so, on what a disgrace to musical theater Andrew Lloyd Webber is.

Perspective from a Century of Living

Today the Washington Post's Express has a feature [PDF Link] about a 104-year-old Kansan, Waldo McBurney, who still works. McBurney is full of wisdom. Plus, his name is Waldo. It's nice to see coverage of someone who espouses values of humility and simplicity amidst news of excess and hunger for power. Here's McBurney:

We're living in a faster age and we think too much about money and leisure. Our morals have gone downward. There's always hope, but it's hard to see how the good is going to come.

Even though McBurney has known hard work all his life, he indicates that he lived free of the stress that characterizes modern, urban living, where we operate under the idea that productivity is the result of long hours and high pressure. Again, here's McBurney:
McBurney grew up on a farm at a time when neighbors helped neighbors without asking, when life was more about work than worry. "I expect people worry now more than then. Worry is killer," he said.

This guy couldn't be more right. However, there is one pearl of McBurney wisdom that left me a little...confused:

A lot of the population thinks the biggest thing they can do is kill somebody they don't agree with.

They do? I think you lost me there, Mr. McBurney.

Sunday, December 03, 2006

Conversations I've had with people about Golden Oreo Cookies

More conversations than you might think can be had about Nabisco's new type of Oreo cookie, the Golden Oreo. I have instigated many. The tasty vanilla "creme filling" is better sandwiched between two rich vanilla crackers than the standard chocolate, in my opinion. To some, this is Oreo blasphemy, but I have never much liked original Oreos (Oreoes?). The possibilities for discussion on these points are endless. Thus, I bring you Conversations I Have Had with People About Golden Oreo Cookies:

Setting: Cafeteria at work
Me: Man, I love these cookies.
Co-worker: Yeah, you say that everyday.
Me: This is my favorite moment of the day.

Setting: the Rosslyn Safeway
Store clerk: (looking down at the apples and Golden Oreo Cookies I just put on the conveyor belt) A healthy snack.
Me: Yeah, I love those things.
Store clerk: No, I didn't like them. They just weren't right. They're too original.
Me: Oh, really? Yeah, I just love the vanilla.
Store clerk: I had to give them to my mom. I had bought the Mint Oreos, Halloween Oreos, Regular Oreos, and those ones, and my cousins stayed over and ate them all except the Golden Oreos.
Me: Oh really? You should make them buy you more.
Store clerk: Well, they went back to New York. So all I had left was the Golden Oreos, so I gave them to my mom. She loved them.

See? The source of endless conversation.

Saturday, December 02, 2006

Movie Review: The Candidate

Michael Ritchie's 1972 film The Candidate is nothing if not prescient. An indictment of the television age and its impact on political campaigning, the film suggests that running for office makes it paradoxically difficult for the candidate to be a true public servant.

Bill McKay, played by Robert Redford, is a storefront lawyer in San Diego who helps his clients, people on the margins of society, get a leg up in the legal system. He has little influence compared to a public official, yet he has no desire to run for office, even though (or because) his father (Melvyn Douglas) was once the governor of California. When he is approached by campaign operative and old schoolmate Marvin Lucas (Peter Boyle), McKay is persuaded to run for U.S. Senate only on the condition that he can say what he wants and that he will lose handily.

Indeed, the incumbent Senator, Crocker Jarmin (Don Porter), holds a safe seat. A Ronald Reagan figure, Jarmin appeals to voters by upbraiding the welfare state and the federal government, though he is not averse to calling on its resources when his popularity is at stake. He has the folksy, hard-fighting spirit of a former football star who is eminently comfortable feigning the upstanding-grandpa role. He has long been unchallenged for his seat, so he is free to parrot glib platitudes that sound logical and come off as sincere.

Then McKay comes along and manages to sound even more sincere. When asked what he thinks about property taxes, McKay has the temerity to respond, "I don't know." He does not shy away from sounding liberal at the dawn of an era when those views are becoming subject to derision and fear-mongering. Such candor invigorates his base and helps him close in on Jarmin, to everyone's surprise. Soon, his campaign managers are urging McKay to tone down the bold talk and tread lightly so he can appeal to the undecided voters, those people who may vote for McKay because he's cute, so long as he doesn't come off as too angry.

I left The Candidate with some sadness, seeing in the script the idea that the political process insurmountably removes the modern, national politician from the people s/he campaigns to represent. Upon winning the election, McKay asks Marvin Lucas "What do we do now?" One senses that the Senator-elect knows in his heart that he made more of a difference as an unambitious lawyer than he ever will as U.S. Senator. In the television age, elected national officeholders don't make news for their ideas; instead, they are there to entertain, to carry off a persona, whether it is that of a Crocker Jarmin or a Bill McKay.

Friday, December 01, 2006

Taxation with (some) representation?

Will D.C. finally get the House seat it deserves? Probably not from a Republican Congress, but look for the bill to resurface next year when the Democrats are in the majority.

D.C. Mayor-elect Adrian M. Fenty went to Capitol Hill yesterday and Utah prepared to redraw its congressional districts as members of both parties said a plan to add House seats for the District and Utah had good prospects for approval next year.

P.S. What's up with the re-emergence of the fedora? Seen here on Fenty:

Wednesday, November 29, 2006

Movie Review: Stranger than Fiction

I am a sucker for concept movies. At first viewing, I loved Being John Malkovich and Eternal Sunshine of the Spotless Mind. Adaptation and Memento were pretty good as well. Lest I forget Fight Club, another clever one. So, unsurprisingly, I began to worry that my rave reviews depended only on a film's having a surreal concept. Fortunately, Stranger than Fiction's failure as a concept movie proved I had worried inordinately.

The movie begins with clever computer graphics that emphasize the mundane routines around which Will Ferrell's Harold Crick bases his life. Crick is an auditor at the Internal Revenue Service who one day discovers that his life is actually a construct for a book written by acclaimed author Kay Eiffel (Emma Thompson). You've seen this man before. He was played by Kevin Spacey in American Beauty, Edward Norton in Fight Club, Jim Carrey in Eternal Sunshine of the Spotless Mind. This is not to mock the common movie theme of the existential crisis provoked by meaningless work; on the contrary, it is one of the perennially ripe subjects of our age.

The problem is, Stranger than Fiction assumes that a "masterpiece"--the compliment that Dustin Hoffman's English lit professor, Jules Hilbert, bestows upon Eiffel's book--need only present a story that we already know well. Death and Taxes, as the book is called, has barely progressed before we find out that Eiffel has spent years wrestling with writer's block over how to kill off Crick. I am usually impressed that authors don't get writer's block more often, but I find it hard to believe that a talented writer becomes inert for ten years over killing off a character. Unless the story is a murder mystery, the death seems relatively inconsequential to the larger purpose of the novel.

I wanted to love Stranger than Fiction because it is not often that a zany English professor is one of the protagonists in a film, and moreover, that he is permitted to make clever jokes that appeal to a more literate audience. Unfortunately, Professor Hilbert only came in handy in scenes that would have totally bombed without a closing pithy remark.

No one actor ruined the film, though--not even Queen Latifah in her absolutely pointless role as the assistant at Eiffel's publishing company. Stranger than Fiction's problem is that its central concept renders itself pointless. In Eternal Sunshine of the Spotless Mind, the concept--a device that allows the lead couple to forget their turbulent relationship and (unintentionally) reunite--imparts meaning: be careful what you wish for; learn how to embrace, or at least accept, the past, painful as it sometimes is; question the wisdom of fate, if it indeed exists as a phenomenon. In Stranger than Fiction, nothing about Crick's life as a product of third-person omniscient narration adds meaning to the story. The movie could just as easily have been about a boring IRS agent who finds love with an unconventional baker (Maggie Gyllenhaal) and learns how to live life to its fullest, sans cool script device; without the gimmick, however, it probably would not have stood out from the pack.

Monday, November 27, 2006

Generalism in a specialized world

Society becomes more complex when individuals become more specialized, and individuals become more specialized when society becomes more complex. This is a truth I despair of every time I confront it, yet it is a truth. As I have said before, I have a visceral aversion to jargon, as its use deters the layperson from understanding the subject matter at hand. At the same time I understand that it is helpful for specialists to have a language with which to communicate with one another. Still, the idea of following a specialized career track, in which one learns the language of the field while moving further away from all other fields, turns me into a commitment-phobe.

And yet, there are fields that remain generalist. Law and journalism are the two that come to mind. Both still require the practitioner to learn a language, though journalism's is a language of universalism. As a person who enjoys writing, I fear any stifling parameters on language, which appear in the law as legal jargon (too much Latin! and so forth) and a fairly strict writing format. My demands--flexibility in the form of communication, generalism--do not square with the needs of a complex society. So what is a person who wants to be useful to society, gainfully employed, and interested in her work to do?

Be cognizant of this "predicament," I say (to myself). I am in truth living amid a luxury of choices. Maybe the new anomie springs from such constant consumerism, where choices are plentiful and perspective takes too much time to summon. I once asked someone several generations older than myself why he had chosen to be a doctor. "Because back then, if you wanted to go to professional school, you either went into law or medicine," he said. It was that easy. Of course, I know it was not easy: medical school is no cakewalk. But the point is that when one has few other choices, one has less room for doubt. As one of my teachers once told me (paraphrased), "the more I have learned, the more I realize how little I know." Sometimes the source of paralysis is knowing too much.

Friday, November 24, 2006

Review of movie I haven't seen: Bobby

I continue to feel more motivation to review movies that I have not seen than those that I have. Although I have seen a variety of films recently--The Queen, The Rise and Fall of Legs Diamond, Jules et Jim, Network (again)--I only want to review Bobby, the movie about the second most famous Kennedy (or maybe third or fourth). Actually, Bobby is less about Robert F. Kennedy, JFK's brother and once U.S. Attorney General, than it is about the mood of the country when RFK was assassinated in 1968.

Though I am admittedly curious to see this movie, a few of its attributes worry me. First of all, "directed by Emilio Estevez" is not particularly encouraging. The Post says Estevez is "best known" as one of the "Brat Pack" actors, but among members of my generation, it's much worse: he's Coach Gordon Bombay of The Mighty Ducks and D2. He has said, "Quack, quack, quack, Mr. Duckworth!" in a movie. He is a poor man's Charlie Sheen (who happens to be his brother). Just to clarify: his dad is Martin Sheen, and yet he played a hockey coach in a stupid (but admittedly hilarious) kids' movie!

Secondly, the cast aims to be an exciting ensemble and as such includes one-note actors like Helen Hunt, Lindsay Lohan, Ashton Kutcher, and Heather Graham. A film that aims to create an aura around a consequential historical figure totally does itself in by casting Ashton Kutcher in any role, even as an extra. As for Heather Graham, I was not aware that she was still around, but she has the honor of being my least favorite actress of all time. Her overacting-ditz shtick made the sequel to Austin Powers exponentially worse than it already was, and she looks like an albino bug.

Finally, and most importantly, from all accounts Bobby is based upon the premise that Bobby Kennedy was a good guy. That is not to say the film is incorrect to portray him as a figure who inspired, because he did, especially amid the mounting turbulence of 1968, but it adds nothing new to the popular historical picture of Bobby Kennedy, which, as tends to be the case with historical figures, is not a very multi-faceted picture. To his credit, he visited Appalachia to bring attention to the plight of the impoverished, and he became a voice of calm in the early throes of instability in Vietnam. But he was also an aide to Joseph McCarthy during the opportunistic witch hunts of the late 1940s and early 1950s, and he was, by all accounts, a ruthless operative bent on enforcing loyalty to himself and his brother. He also gave written approval to the FBI to wiretap Martin Luther King, Jr., whom the discredited J. Edgar Hoover suspected of being a Communist, though in fairness, Kennedy also lent support to the enforcement of Brown v. Board of Education and worked hard to desegregate the government. I can certainly see how it would be interesting for a film to explore how a U.S. politician affected the American people--I often wonder how consequential political figures are to people's day-to-day experiences--but the portrayal of this oft-recounted historical moment risks bringing nothing new to the table.

Top ten Thanksgiving table conversations of 2006

...And the clichés that accompanied them:

(1) the recall of O.J. Simpson's book and interview
"I wonder how those jurors feel now, acquitting a cold-blooded murderer."
"I'm glad the media finally had some sense."
"That man will do anything to make a buck."
"The poor Brown and Goldman families."

(2) the Michael Richards racist outburst
"Kramer really is crazy!"
"The media and the politically correct police are at it again, making Kramer look bad."
"He should have apologized for being an awful stand up comic while he was at it!"

(3) Wii versus PS3
"Boy, the lines are going to be long tomorrow, huh!"
"Mom, can I get a PS3/Wii/Xbox" ad infinitum
(4) College football
"It really should be Michigan versus Ohio State this year in the NCAA football championship; it's too bad they're both in the Big Ten."
"Boy [fill in the blank team] sucks this year."

(5) Professional football
Nothing interesting

(6) Traveling
"It was a zoo at the airport this year!"
"I hate flying, what with the security checkpoints and the long lines."
"It was a zoo on the expressway this year!"
"I hate driving, what with the tolls and traffic."

(7) Thanksgiving food and how it makes you full
"Boy, I'm not going to be able to move after this tasty meal."
"This is the best stuffing I've ever had."
"You've outdone yourself with the pie."
"Well, I know what I'm going to be having for dinner for the next week [pause for effect] turkey!"
"Time to loosen the belt buckle a notch or two."

(8) The weather
"You missed some great weather we had last week, Aunt Bertha."
"Boy, I guess I chose the wrong week to come to Chicago. It sure is cold here!"
"Why, I think this is the most beautiful Thanksgiving we've ever had."

(9) The new Congressional majority
something that inevitably makes things awkward between guests of opposing political loyalties

(10) What the younger dinner guests are going to do after high school/college/work/graduate school
"So, you have your colleges picked out yet?"
"So, you know what you're going to do after college?"
"You're an English major: what are you planning on doing with that?"

Please feel free to comment with other typical Thanksgiving dinner conversations.

Big Pharma needs to reevaluate their priorities

The Post and the Times have been covering the regrouping of the big pharmaceutical companies' lobbying arm in Washington in reaction to the results of November's election. Big pharma is already alarmed that Democrats have promised to negotiate lower drug prices for Americans, which the Medicare Part D plan currently forbids. The companies are predicting a less friendly group of legislators and are pouring their energies into their big lobbying arms:

Many drug company lobbyists concede that the House is likely to pass a bill intended to drive down drug prices, but they are determined to block such legislation in the Senate. If that strategy fails, they are counting on President Bush to veto any bill that passes. With 49 Republicans in the Senate next year, the industry is confident that it can round up the 34 votes normally needed to uphold a veto.

While that showdown is a long way off, the drug companies are not wasting time. They began developing strategy last week at a meeting of the board of the Pharmaceutical Research and Manufacturers of America.

I have two thoughts after reading about such jockeying among the big pharma players and their allies in Congress. First, lobbying, at least in this industry (though I'm sure in others too), is based not so much on the merits of the interests the lobbyist is representing as on the lobbyist's own ties to members of Congress and their staff. The Times and Post articles both detail how the pharmaceutical lobbies are hiring people acquainted with members of the majority party. How much of policymaking is based not on what but on whom the legislator knows?

Second, lobbying sure costs big pharma a lot of money. This is an industry with two lobbying arms: the K Street group, whose target is Congress, and the drug representatives, whose targets are physicians. Big pharma estimates that it spends $5.7 billion a year on marketing to doctors; another group estimates that 90% of the industry's $21 billion in marketing, or $18.9 billion, is spent on physicians, a large part of that devoted to gifts large and small. The thought is, of course, that the physician will in turn prescribe the drug being hawked. Here's one physician's account of drug company "largesse":

It's gotten to the point that it's impossible not to partake of drug-company largesse when I attend a conference. Drug companies underwrite many of the talks, and even the buses that move us from lecture to lecture at no charge carry ads for popular drugs. Sure, I'll turn down the theater tickets and box seats at sporting events and the expensive tours, wine tastings, and meals, but it's impossible not to receive some form of freebie, however inadvertent.

Back at home, there's the community detailing with expensive luncheons and dinners, mostly with lectures attached, but not always. I receive invitations daily, and it's all free—not to mention the magazines and brochures that show up in my mail, without my having requested them. I can't tell who's been sending them; I wish they'd stop.

There's more too, for example, invitations to cruises, on which I could be paid as a consultant, to discuss "how I prescribe antidepressants." I was even gifted with a pricey, inscribed Mont Blanc pen when I became a medical director—I did not keep it—and the drug rep was upset, because who wants a pen with my name in gold?

Even if the physician makes an active effort to avoid submitting to the quid pro quo, studies have shown that gifts to physicians still have an impact on what the physician prescribes. Here's Dr. Charles Atkins' view:
Of course I'm influenced by them, I'm just not sure how much and in what ways. I have my suspicions, which are reflected in such questions as, Why are we so quick to abandon old medications when the new ones come out? If people spent the same amount of time, energy, and money extolling the virtues of off-patent medications, would we switch so quickly?
And Dr. Stephen Cha:
Like political contributions, these gifts are not necessarily improper, and some industry-physician collaborations can lead to important advances. But research shows that such largess affects physicians' prescribing practices and may compromise their objectivity.

Certainly if I knew that my doctor was getting $5,000 to $20,000 a year from the maker of Vioxx, I would wonder why the doctor was prescribing it.

It is often argued that when a patent on a drug expires, allowing the sale of a generic counterpart, or when lawmakers express a desire to negotiate with the drug companies for lower prices, or when pharmaceutical companies are urged to sell expensive HIV/AIDS drugs to poor countries, the research and development budget of that company will be hit so hard that it cannot possibly devote the same energies it has in the past to developing drugs. However, companies like Merck, Pfizer, and Bristol-Myers Squibb need to take a look in the mirror when they can spend billions lobbying Congress and marketing to doctors and still balk at losing some money by getting drugs to the people who need them.

Thursday, November 23, 2006

Happy Thanksgiving!

Hilarious Thanksgiving video from zefrank's The Show!

Tuesday, November 21, 2006

Marriage in France: Passé

The Post featured a surprisingly complex and well-reported article today about the decline of marriage in France. Marriage rates across the Western world have been declining for a while, and even more so in parts of Europe. The declining marriage rate in northern Europe is seen as a rejection of the institutions that promulgated marriage--the Catholic Church, the traditional family, the closed society--nowhere more so than in France.

In 2004, the most recent year for which figures are available, the marriage rate in France was 4.3 per 1,000 people, compared with 5.1 in the United Kingdom and 7.8 in the United States. The only European countries with rates lower than France's were Belgium, at 4.1, and Slovenia, with 3.3.

The knee-jerk response in some American quarters will of course be that the French are "godless," "socialist," and "relativist." However, the consequences of the marriage decline have not been chaos, broken homes, or rampant polygamy:

Contrary to predictions three decades ago, when the marital downslide began, French family social structures have not disintegrated. Instead, society has accepted and embraced changing attitudes. French law stopped distinguishing between children born in or out of wedlock more than 30 years ago.

Those willing to believe that the marriage decline stems from a dissolution of morals are too eager to ascribe moral failings to a people, and they miss what is so fascinating about this trend: there are neither the same incentives nor the same influences to get married as there used to be:
The tax breaks the French government offers married couples, which are not as substantial as U.S. marriage tax reductions, are not enough to persuade most cohabitating couples to formalize their relationships. In France, the greatest financial and tax incentives target the number of children a couple has rather than the parents' marital status.

The couple profiled in the story did not see a reason to get married, but they have two children and have cohabitated for many years. It is tempting to compare the U.S. and France, but given the two countries' different populations and sizes, it would be difficult to draw conclusions. It is fair to say, though, that marriage in the U.S. has become a racket. When so many weddings devolve into a game of keeping up with the Joneses, it's no wonder that some people would just as soon avoid the game. Can a society that marvels at the "huge rock" on a woman's finger and fawns over million-dollar weddings really judge one that does not consummate as many such affairs?

Ségolène Royal, who last week won the Socialist Party nomination for president in next year's election, and Francois Hollande, the party's leader, have had four children during their 25 years of cohabitation. French Defense Minister Michèle Alliot-Marie, another possible presidential contender, has spent nearly 22 unmarried years living with Patrick Ollier, a member of the National Assembly.

"We never had time to get married," Alliot-Marie said in a recent interview. Royal has expressed distaste for the notion, once calling marriage a "bourgeois institution."

"I don't see how marriage would bring any more to our union as a couple," [Sandrine] Folet said. "It doesn't take away anything, it doesn't bring anything."

Like anything else, institutions that no longer seem functional may be deemed irrelevant. France and its neighbors may be headed towards total secularism, but that does not mean that they are valueless and rudderless.

Monday, November 20, 2006

Roundup of the Ridiculous

I have never been filled with such glee.

This video is a brilliant (albeit totally unintentional) illustration of Corporate America.

Bank of America sings U2's "One"

I used to think that TV commercials that cheesed up classic rock songs were made by cynical people, but now I'm willing to believe these people are for real.

Secondly, I love Professor John Orman:

The political party formed by U.S. Sen. Joseph Lieberman after he lost the Democratic primary in August has a new chairman - and it's not Lieberman.

However, according to the bylaws adopted by its new chairman, Lieberman critic and Fairfield University professor John Orman, the senator is an eligible party candidate.

According to bylaws established by Orman, anyone whose last name is Lieberman may seek the party's nomination - or any critic of the senator.

Orman seized control of the Connecticut for Lieberman Party this week after registering as its sole member and electing himself as chairman.

Orman has triggered a process that will force Lieberman and state elections officials to decide the future of a party created solely to return the senator to Washington.