Saturday, August 28, 2010

The Wiki Vacation



Last year, the state of Michigan, despite being hit as hard as any state in the union by the economic downturn, spent $10 million to buy ad time on national cable networks to entice tourists to come visit the Wolverine State. The state legislature allocated a total of $18 million for the marketing budget, including over a million dollars just to produce a series of ads.

It's interesting to consider why people choose their vacation destinations. Obviously, the geographical locations of family and friends are paramount in many people's decisions. Beyond that, specific hobbies and recreational interests are often integral in choosing where to go. I've visited Hibbing, Minnesota and Metropolis, Illinois in recent years, two locations that are of little appeal to someone who isn't already interested in Bob Dylan and Superman, respectively. And for some people, timeshares or tradition may play a role in how they determine where to spend their time or dollars. But does anyone see a TV commercial and decide, "Hey, I need to go to Michigan this summer"?

One thing that I've never heard anybody say is "I need to take a vacation that will allow me to discover America." If someone were to utter such a thing, I would expect there is a good chance she or he would be laughed at. But for a good chunk of the 20th century, the Romantic idea of traveling for the sake of traveling, making a Kerouac-inspired road trip to get in touch with both the world and oneself, was an idea celebrated in our culture.

Maybe this is a concept worth revisiting. I propose that at least once in a lifetime, every American should make a pilgrimage to another part of America for reasons not based on relationships, interests, or TV ads, but rather for no reason whatsoever. Let the journey, not the destination, be the reward, and be open to spontaneous discovery. Of course, one should drive instead of fly, and follow country roads and state highways rather than Interstates wherever possible. Let Ichabod and Mr. Toad be your guides as you go "merrily, merrily, merrily, on your way to nowhere."

Still, we do need some kind of destination, if for no other reason than we have to know when to turn around and come back home. But I assert that this destination should be as random as possible. And the best way I know to achieve this is to use the random page generator on Wikipedia. Keep hitting it until an American municipality comes up, and this becomes your vacation destination (a small script, sketched after the list below, can even automate the draw). I just attempted this three times and came up with these locations (which probably aren't covered by a tourism campaign):

1. Elk Falls Township, Kansas. Population 196. You could probably meet a significant percentage of the population. How many people who go to Las Vegas can make this claim? You could visit Mount Olivet Cemetery, and maybe wade through South Fork Wildcat Creek. If I went to this township, I would try to track down the site of the school where Miss Dora Simmons taught in 1870 (which I know about thanks to Google digitizing a 1912 book about Kansas history).

2. Hopkins Park, Illinois. Population 711. A rural community where nearly half the residents live below the poverty line, but it sports a nifty website detailing redevelopment plans. It seems that Octoberfest might be the best time of the year to visit.

3. South Branch, New Jersey. Part of Hillsborough Township, population 36,634. The Wikipedia page is taken verbatim from the Township website, which proclaims that the community was settled by the Dutch in 1750, with the Narticong Tribe of Native Americans living there before then. Diamond Jim Brady built a house for his mistress there.
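For what it's worth, the random draw itself is easy to automate. Below is a minimal Python sketch of the idea, assuming the requests library and Wikipedia's public MediaWiki API (its list=random query). The "Place, State" title check is just a rough heuristic of my own, not a real test for municipalities, so some American towns will slip past it and the occasional non-town may sneak through.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"  # public MediaWiki API endpoint

US_STATES = {
    "Alabama", "Alaska", "Arizona", "Arkansas", "California", "Colorado",
    "Connecticut", "Delaware", "Florida", "Georgia", "Hawaii", "Idaho",
    "Illinois", "Indiana", "Iowa", "Kansas", "Kentucky", "Louisiana", "Maine",
    "Maryland", "Massachusetts", "Michigan", "Minnesota", "Mississippi",
    "Missouri", "Montana", "Nebraska", "Nevada", "New Hampshire", "New Jersey",
    "New Mexico", "New York", "North Carolina", "North Dakota", "Ohio",
    "Oklahoma", "Oregon", "Pennsylvania", "Rhode Island", "South Carolina",
    "South Dakota", "Tennessee", "Texas", "Utah", "Vermont", "Virginia",
    "Washington", "West Virginia", "Wisconsin", "Wyoming",
}


def random_article_title():
    """Return the title of one random article from the main namespace."""
    params = {
        "action": "query",
        "list": "random",
        "rnnamespace": 0,  # articles only, no talk or user pages
        "rnlimit": 1,
        "format": "json",
    }
    resp = requests.get(API_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["query"]["random"][0]["title"]


def pick_destination(max_draws=1000):
    """Keep drawing random articles until one looks like "Place, State".

    The title pattern is only a heuristic: many U.S. place articles follow
    the "Place, State" convention, but some don't, and an occasional
    non-place will match. Treat the result as a starting point, not an
    authority.
    """
    for _ in range(max_draws):
        title = random_article_title()
        if "," in title and title.rsplit(",", 1)[1].strip() in US_STATES:
            return title
    return None


if __name__ == "__main__":
    destination = pick_destination()
    print(destination or "No destination found this time -- hit it again.")
```

Of course, half the point of the exercise is the serendipity, so clicking Wikipedia's "Random article" link by hand works just as well.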

I'm thinking that hitting up these three locations by car would enable anyone to learn more about their country, and probably more about themselves, and would likely make for a more fulfilling vacation than golfing or sitting on a beach. Hopefully, in a few years I'll be able to take such a trip myself. If nothing else, such an endeavor should give me something to blog about. And maybe I could get a cut of a state's marketing budget.

Saturday, August 21, 2010

Staying Power vs. Holding Up



In the mid-1990s, a band called The Presidents of the United States of America attained as much success as any band could hope for. Coming out of the hot Seattle music scene, albeit with a much different ethos than the original wave of Seattle bands, they managed to put out a platinum album with multiple hit singles, play the late night shows on several occasions, and earn Grammy nominations in two straight years. And by being deemed worthy of a Weird Al parody, they achieved a mark of cultural relevancy. They were an appropriate band for their time-- in some ways their output was just as nihilistic as other Gen X productions, but rather than revel in the dark and depressing implications of this worldview, they simply looked askance at everything and pounded out a series of songs with catchy melodies and ironic lyrics. The bleakest moment on their debut album involved cursing at a cat, and their most radio-friendly chorus declared: "Millions of peaches/peaches for free/millions of peaches/peaches for me." (Incredibly, this song, which also contained the lyrics "Peaches come from a can/they were put there by a man/in a factory downtown," got a Grammy nomination for best pop performance, and if not for a new Beatles song that year, might have won the thing.)

When we talk about whether an act has "staying power," it is usually done retrospectively, with the benefit of hindsight. Yet even at the height of their popularity, there was a general sense that The Presidents lacked staying power. They rushed out a second album shortly after the first (titling it simply by slapping the Roman numeral II onto the name of their eponymous debut). I remember reading a Rolling Stone review at the time which declared not only that this was a bad record, but that it indicated the band had nothing more to give. Although it is hard to find a contemporary review on-line, I did find this one from the venerable Yale Herald, which concludes that the best hope for the band is to disappear and come back with a new album that would seem like a debut, finishing with the statement that "otherwise, the charming musicians of PUSA might end up buried by the mountain of Seattle-grunge mimicry they so effectively exploited."

In fact, these prophecies couldn't have been more dead-on. Their next album was a collection of studio scraps called Pure Frosting. They broke up before the turn of the century, then managed to reform and make a couple more low-key records in the next decade, but never again entered the radar screen of Letterman, The Grammys, or Weird Al.

So clearly, this band didn't have staying power. But does that mean that their first album doesn't "hold up"? Technology allows us to live in an archival era, in which media that was originally intended to be ephemeral has been preserved and consumed by audiences that creators never envisioned. And so when you watch a DVD of a television show or movie from a previous generation, or listen to music that was recorded long ago, or even read a reprinted comic book, a question that is often asked is "does this hold up?" This is different than asking about staying power, as something with staying power may not necessarily "hold up"--I think of a band like Kiss, a group with incredible staying power, despite making records which many critics would argue fail to hold up. It should also be pointed out that a certain product may "hold up" even if its producers lacked staying power. To my ears, the first Presidents album still manages to "hold up," even as the band has slipped into obscurity.

But what do people actually mean when they ask if something "holds up"? My theory is that "does this hold up?" is code for "is this embarrassing?" An obvious example would be a movie with poor (by today's standards) special effects, such as when the Kirk Alyn Superman takes off to fly in the 1940s movie serials and suddenly becomes animated (in contrast to the 1978 Christopher Reeve Superman, which holds up pretty well). An ideologically embarrassing portrayal of women or minorities is another patently clear manner in which something could be said to no longer "hold up."

But more problematically, it seems that bygone media is seen as not holding up if it is perceived as lacking the sophistication that we have today. And how do we measure sophistication? Since this is such a fraught appraisal, we often settle for a rather facile evaluation. The more earnest, straightforward, or commercial the narrative and the thematic elements, the less sophisticated it is considered to be, while irony, complexity, and noncommercialism are regarded as marks of sophistication.

But here is where things get dicey: even while perceived sophistication allows a product to "hold up," at the very same time it often works against the formation of "staying power." The Presidents of the United States of America might have sold a lot of records in their brief time, but even at their apex, they never sold as many T-shirts as Kiss.

Friday, August 13, 2010

Whimpers, Bangs, and Reality



When humans first started entertaining themselves with staged narratives, audiences expected protagonists to make some type of "dramatic exit." Oedipus needed to gouge his eyes out. Medea had to kill her children and fly off in the sun god's chariot. A lot has changed since the days of Aristotle's three unities-- we no longer expect the action to take place in real time, in one place, with one mood. But we continue to anticipate narratives that end with shock and awe. It's no coincidence that when T.S. Eliot set out to Modernize literature by repudiating and reinventing what came before, one of his most enduring lines became "This is the way the world ends/ Not with a bang but a whimper." It was a shock to the sensibility that insists closure requires a shock to the sensibility.

But contemporary storytellers who have attempted to make a statement by ending with a whimper have more often than not met with a backlash. When Larry David tried to emphasize the fact that his characters were supposed to be "hollow men," viewers and critics alike derided the last episode of Seinfeld. But that episode at least had some narrative closure--unlike The Sopranos, which caused many people to contact their cable company to report a technical failure.

So in this postmodern era, creators are confronted with a dilemma. We have entered a philosophical age in which closure is resisted and the existence of singular symbolic events is doubted. Furthermore, those who seek mimesis in their art are offended by the notion that we need dramatic events at the climax of a situation. After all, in real life a dramatic event usually happens first (if it happens at all), leaving people to slowly pick up the pieces rather than being spurred on to another dramatic moment. But on the other hand, audiences usually don't care about philosophical zeitgeist or representations of reality; they want what they have always wanted--a story with a climactic ending!

Given this tension between artist and consumer, somebody has to compromise. In our culture, that, for the most part, has been the artist. If someone were to sit through the top 10 movies of the summer box office and then go to the beach and read through the top 10 fiction bestsellers, I would speculate that they would get 20 stories that end with a bang instead of a whimper. But the events of the past week leave me wondering if there has been or will be a shift on the part of the consumer.

I'm not suggesting that people are suddenly embracing non-dramatic, ambiguous endings. Rather, I'm suggesting that instead of changing fiction to fit reality, they are changing reality to fit fiction. JetBlue flight attendant Steven Slater became an overnight sensation for reasons that no cultural theorist need elucidate. I'm sure that even for people who leave jobs on good terms, there is something anti-climactic about farewell cards and cake in the breakroom. And even though the HOPA dry erase gal turned out to be a hoax, the fact that her stunt was seen by millions of people in one day's time certainly unearths the same deep-seated phenomenon that Slater tapped into.

So while Aristotle asserted that on-stage drama allowed people to achieve catharsis without having to experience (additional) real-life drama, our contemporary culture has created memes that A) make it harder to distinguish between the real and the imaginary, and B) potentially allow dramatic events to bleed over from the realm of make-believe. This is attributable not only to the Internet, but to so-called reality television as well. As contestants on these shows have become more aware of exactly how to garner maximum attention, it stands to reason that an emerging generation of solipsists is learning how to manipulate events in order to align with their inner scripts. And the fact that those scripts seem to mesh with the kind of script that consumers have demanded since Aristotle's day may make a lot of people happy. But there may be a few folks who suddenly want to gouge their eyes out.

Saturday, August 07, 2010

Snooki vs. The Babe



On July 27, Nicole Polizzi, Paul DelVecchio, and Michael Sorrentino rang the opening bell on the floor of the New York Stock Exchange. These individuals, cast members of MTV's Jersey Shore, are better known to millions of people as Snooki, DJ Paulie D, and The Situation, respectively.

The very next day, NBA player Chris Bosh rang the opening bell on the floor of the New York Stock Exchange. He is known to millions of people as Chris Bosh.

At one point in history, when newspapers and motion pictures were the dominant media, anyone of accomplishment in America had a nickname bestowed upon them. Sports heroes, predominantly baseball players, boxers, and football players, were anointed with monikers and titles. Shoeless Joe Jackson, James "Cool Papa" Bell, Johnny "Blood" McNally, and Sugar Ray Robinson are some of my favorites. If you study baseball rosters from that time, seemingly every team had a "Kid" and a "Buck" (and before the 1950s, when the word took on a feminine context, many teams had a guy nicknamed "Chick"). Some players were so accomplished they had both a permanent nickname and a mythological name: Harold Grange was known as either Red Grange or "The Galloping Ghost," while George Herman Ruth had a series of names known to anyone who has seen The Sandlot. But it wasn't just athletes: politicians, entertainers (musicians in particular), and mobsters also received new monikers. Charles Lindbergh became "Lucky Lindy." Ernest Hemingway was "Papa."

But while some may look to the past for a golden age of nicknames, others might look at the present. No MTV producer came up with the name "The Situation." Apparently, the guy started calling himself that long before cameras started following him around. And while the character himself may not be mainstream, the concept of self-dubbing is. You don't have to wait around for a newspaper to give you a nickname anymore; the Internet has enabled everyone who wants one or more "handles" to come up with them with a few keystrokes. But even in the off-line world, people are more emboldened to view nomenclature as malleable. When I first started listening to sports radio in the 1980s, callers had names like "Mike from Brookfield" or "Bill from Racine." When I worked in sports radio a few years back, we had callers named "Dr. X," "Big Boo," "The Reverend," and "The Legend."

So the contrasting situation on Wall Street last week highlights an interesting shift in culture. Just at the time when we are emboldened to create our own nicknames, the cultural heroes who were recipients of nicknames in the past are no longer getting them. (For the record, I don't consider derivatives of given names like "D-Wade," "T.O.," or "Shaq" to be nicknames...and nicknames forced on us by shoe companies don't count either. Nobody ever really called Michael Jordan "Air").

These trends can probably be attributed to shifts in media. Mainstream journalists are no longer in the habit of mythologizing what they report (which, in the case of newspaper sports coverage especially, I maintain is a mistake), but the Internet (and reality television, for that matter) has allowed self-mythology to flourish. And so we have seen a strange revolution that even Karl Marx wouldn't have predicted. But then again, that the nicknamed and the non-nicknamed both partook in the same ceremony last week suggests that perhaps not much has changed after all. The names may change, but the bell keeps ringing.

Sunday, August 01, 2010

I Didn't Elect the Sheriff



I drove about 200 miles (one way) through Wisconsin this week, and I was reminded that it is political season. Despite a gubernatorial and a U.S. Senate primary coming up next month, the most prevalent yard signs were for county sheriff races. In fact, I found that I didn't have to pay attention to signs announcing that I had crossed a county line--if I was curious, the change in yard signage would amply inform me of the political boundaries.

Seeing all of the sheriff signs, with the inevitable images of starred badges prominently displayed, made me nostalgic for 2002. That fall, during my brief career as a radio news reporter, I covered a contentious county sheriff race involving about seven candidates. I also covered a two-person district attorney race. County races usually aren't that interesting, since incumbents tend to rule for as long as they want, but that particular year there was no incumbent D.A., and the incumbent sheriff was an appointee who had never been elected. Adding to the unpredictability, no scientific (or unscientific) polling was done, so until the actual primary, there was no firm reason to expect any one of the candidates to win.

As I did not reside in the county I was covering at the time, I personally did not cast a vote in the elections. And I was kind of glad about that-- not only because it helped ensure my impartiality in covering the races, but also because I had no idea who I would have voted for. It obviously wasn't for lack of information. Not only did I attend candidate forums and hear the candidates speak in a variety of settings, I personally interviewed them (conducting one on-air forum for the D.A. candidates myself), and in some cases, I observed them do their duties in a way that few citizens do. But at the end of the day, I couldn't say with certainty who would have made the best sheriff or prosecutor.

Electing a legislator seems to be an easier task. As my high school civics teacher advised, the test for voting for a candidate is to say: "I don't have time to run for office this year, so which one of these people would do the job the most like I would?" While we can align ourselves ideologically with a lawmaker, how do we confidently do that for an administrator of justice?

Take nothing away from the two individuals who won the elections and their qualifications for the respective jobs in 2002, but I have to think that in each case, a sizable portion of the citizenry cast their votes based on external factors. The winner of the sheriff's race may have benefited from having his father serve in that same capacity several years prior, and the winner of the D.A. race probably picked up a number of votes when he pledged to donate a sizable chunk of his salary back to the county. Cronyism may be bad, but is populism necessarily any better?

I doubt you will find much support from anyone for taking away voters' power to choose these officials (heck, it's hard enough to take the power to select baseball all-stars away from voters), but I would suggest that doing so wouldn't be that far outside the existing status quo. We may elect sheriffs, but we don't elect police chiefs. We elect the D.A., but not the other prosecutors who try cases on behalf of "the state" (nor do we elect public defenders, for that matter).

As a check against cronyism or a spoils system, perhaps we could allow for public recalls of certain government offices. But much as I enjoyed covering the elections of 2002, and much as I enjoy using campaign signs as geographical guides, it wouldn't bother me to remove a little bit of power from the people.