Sunday, May 29, 2011

It's Not the End of the World, and I Don't Feel Fine



I've decided to give in and write about Harold Camping's "end of the world" prediction that didn't come true last week (actually, Camping predicted "The Rapture," which is a little different from predicting the end of the world. And I used to think the word "impeach" was the most misunderstood concept in America). Over the days leading up to May 21, I attempted to avoid reading or hearing news accounts of the prediction, and I tried to avoid taking part in any conversation about it (though I was less than successful on both resolutions). What was my objection to joining this national "water cooler" topic? I was just annoyed that something that happens with regular frequency was being treated as a special occurrence. If somebody put out a press release saying that he had been abducted by aliens, he would not be granted any special audience with the American people. At best, he would have his name added to the long list of "Contactees" on Wikipedia. And of course, Camping himself had already made such a prediction, in 1994, and it did not receive nearly as much attention as this one. Logically, one would think that if any prediction were to receive attention, it would be the first one, with any subsequent claim afforded less consideration. So, reasoning that the story had no right to exist, I did my best to plug my ears, close my eyes, and deny the reality that it did.

But an Associated Press essay written shortly after May 21 inspired me to examine the matter after all. My first instinct was to blame the situation on the news media's need to find content to disseminate (see my post two weeks ago). But the essay points out: "As with so many curious cultural blips, from the balloon boy to the angry flight attendant, it's easy to say that attention to this was created and fed by the media. But that doesn't account for the social networks — for the millions on Twitter who made topics like 'rapture' and 'judgmentday' trend throughout the day." I would argue that the same need for content that drives news outlets is also a driving force behind Facebook and Twitter. Just as networks and newspapers feel pressure to find content, social media users have to talk about something.

But then again, for every trending topic there are countless topics that wither on the vine. What is it about "doomsday" that inspires discussion? Forget Harold Camping for a second: if you surveyed a cross-section of the American population by asking when the Mayan calendar ends, I think a majority would correctly respond "2012." But if you asked when the Mayan people thrived, where they lived, or anything else about them except when their calendar ends, I highly doubt a majority could answer.

Examined from another angle, it seems illogical to be concerned about the end of the world, when the much greater likelihood is the end of our world. No events of global significance occurred on May 21, but if it was a typical day, over 150,000 people around the world died. Every day is an apocalypse for somebody; every day is judgment day for someone. And we will all get our turn. So why are so many preoccupied with the expectation that it will be a universally shared experience?

My theory is that even though most people probably don't know what the word "telos" means, we live under its influence. We are accustomed to joining stories in progress. If we come late to a party and a group of friends is watching a movie, we don't ask them to go back to the beginning. If they press pause and tell us what we missed, that is usually sufficient to proceed. But once we've invested in a story, we hate to have to leave before it is done. If the history of the world is a story, we've accepted that we are latecomers who have been filled in on what we missed. But it is disconcerting to know that we will be checking out early. It would give us closure to know that we won't miss anything, that events will not be proceeding without us.

But then again, given that news stories and Internet posts about balloon boys, angry flight attendants, and Harold Camping's predictions are what people are actively consuming, perhaps we can rest assured that after shuffling off our mortal coils, we won't be missing anything worth talking about anyway.

Saturday, May 21, 2011

Myth, Progression, and the Future of a Genre



I plan to do something this summer that I don't think I've ever done before: go to the movie theatre four times. With rare exceptions, I only see movies based on comic book franchises, so with Thor, Captain America, The X-Men, and Green Lantern all getting blockbusters, I'm making an unprecedented number of contributions to the box office. Factor in a couple of animated direct-to-DVD movies (another Thor and another Green Lantern) and the two-hour (before commercials) series finale of Smallville, and I'm feeling sufficiently catered to.

As with any movie based on source material, it's interesting to note how and why filmmakers chose to deviate from the original narrative. But there is a difference between comic adaptations and almost all other kinds. Movies based on literary works or even on ancient myths are working with a structured narrative with a clearly defined endpoint. Although any given comic book story can have a resolution, the universes that spawned these characters are ongoing. And it wouldn't exactly be lucrative to end them. But aside from the financial interest that corporations (Disney and Time Warner now own all Marvel and DC characters, respectively) have in keeping the characters' stories from ending, there is a cultural argument for their continued existence. If, as some have argued, characters such as Superman, Batman, and Spider-Man are our culture's answer to the Greek myths, then continued additions to the tapestry indicate continued cultural vitality.

But that also creates a unique challenge for the creators (or caretakers) of the characters. How do you progress characters through a narrative that does not end? The only other medium that comes close to facing such a dilemma is the television soap opera. But soaps are built around ensemble casts, and even the most iconic characters are phased out or de-emphasized over time.

A practice that has arisen in the industry over the last several decades in response to this challenge has been termed the "illusion of change." In the early 1990s, DC Comics killed Superman, eventually replacing him with four different Supermen. It was a bold move, resulting in lots of media attention and renewed interest in the comic. After some time had passed, the original Superman returned to life and assumed his old role, with the replacements fading into the background. And this general process has been repeated since then: almost every superhero has at some time or another been replaced, only to return and claim the old job back. Occasionally some other radical change occurs: Spider-Man's Aunt May dies, Superman splits into two separate energy beings, Batman's butler Alfred quits his job, or Wonder Woman starts wearing pants. But eventually things revert to a "classic set-up," until the next time the status quo is temporarily disrupted.

Of interest to me, then, are the unusual instances when a change to the status quo sticks, becoming so enmeshed in the ongoing comic narrative that it also becomes part of other media adaptations. Perhaps the best example of this is the marriage of Superman and Lois Lane. For over 50 years, according to conventional wisdom, one of the things that made the Superman myth work was a love triangle that actually involved only two people: Clark Kent loved Lois Lane, but Lois loved Superman. It was thought that if you destroyed this dynamic, you would destroy much of what makes the Superman story appealing. But more than 15 years ago, Lois irrevocably learned that Clark and Superman were one and the same man, and she married him. Since then, two separate television adaptations, including the recently ended Smallville, have also had the two of them walk down the aisle.

Another example of a change that sticks can be found in the Bat mythos. As a general rule, superheroes don't age. But Robin grew up. In 1984, Dick Grayson moved out of Wayne Manor and adopted the identity of Nightwing. Subsequently, a number of other characters have fought alongside Batman under the name Robin, but the original Robin never came back to the nest...until recently. When Bruce Wayne returned from one of those deaths that superheroes tend to come back from, he found that Grayson had shed the Nightwing costume and honored his legacy by becoming Batman. And the erstwhile Robin was actually doing a pretty good job in the role. So rather than force his ward to regress, Bruce allowed Dick to continue watching over Gotham City, while he himself put the cape and cowl back on to patrol the rest of the world.

Whether that particular change to the classic Batman set-up will stick is anyone's guess. The track record of the industry would seem to indicate that it won't. But I hope it does. Comic book movies have been going strong for over 10 years now, and the next couple of years will continue to see some high-profile projects. But as the initial wave of trilogies starts to wind down, it will be interesting to see whether these iconic characters stay in the public consciousness. The X-Men franchise has reverted to prequels. Next year's Spider-Man film will be a "reboot," supposedly retelling the origin that was established in the 2002 film. Batman's director has said that after the next movie, his franchise will be ending.

The first real decade of comic book movies has drawn heavily on multiple decades of comic book stories. But if the genre is to be sustained, I think the source material needs to continue to expand, evolve, and explore new terrain. If the majority of the changes to our myths continue to be illusory, I doubt I'll have another summer like this one for a long time.

Saturday, May 14, 2011

Time Isn't a-Changin'



"Even the article which you're doing, the way it's going to come out, don't you see it can't be a good article? Because the guy that's writing that article is sitting in a desk in New York. He's not even going out of his office. He's going to just get all of these fifteen reporters and they're going to send him a quota...He's going to put himself on, he's going to put all of his readers on, and another week he'll have some space in the magazine."--Bob Dylan to Time Magazine, 1965

"The only person you have to think about lying twice to is yourself or to God. The press isn't either of them, and I just figured they're irrelevant."--Bob Dylan to CBS (Ed Bradley on 60 Minutes), 2005.

Bob Dylan is famous for going through metamorphoses in his career, but as he nears his 70th birthday, one thing has been constant: his skepticism that the media can portray reality (and particularly his reality) accurately. Not that mass media has been static throughout Dylan's 50-year career: the online world would have been foreign to the young hipster who sat down (to borrow a media cliché for "being interviewed") with Time Magazine in 1965. But the aging curmudgeon waded into that world yesterday, in order to combat his old enemies. Dylan has had a website, a Facebook page, and a Twitter account for some time, but nobody thought that the man himself had much to do with them. So it was a surprise when he posted a message yesterday on his website to his "fans and followers."

The gist of the missive is that there has been a lot of misreporting over the last year about Dylan's Far East concerts, and he manages to insinuate that the media apparatus of no fewer than three countries is flawed. He takes the American press to task for not double-checking the facts they were reporting (such as the notion that he had been denied permission to play in China, something he says is a false story created by a spurned promoter). He calls out the British magazine Mojo by name for misreporting attendance figures and the composition of the audience ("check with concertgoers," he exhorts). And he subtly mocks the Chinese press for promoting him as a '60s icon alongside pictures of Joan Baez, Che Guevara, Jack Kerouac, and Allen Ginsberg: "The concert attendees probably wouldn't have known about any of those people. Regardless, they responded enthusiastically to the songs on my last 4 or 5 records. Ask anyone who was there."

In all three cases, he not only diagnoses the flaw, he also prescribes a solution. And the common theme is to "ask" or "talk to" anyone who might be able to give truth and insight. It's amazing that the very malady Dylan identified in the media in 1965 is still the cause of so much misinformation today. It's not that our media overlords are trying to influence public policy by knowingly slanting information. It's that they've got space to fill, deadlines to meet, and sometimes limited resources to work with. And if anything, these conditions have been exacerbated over the years. Resources are stretched even thinner as new media dries up traditional advertising revenue streams. In 1965, the editors at Time magazine at least had a week to put together their stories; now they have to update content constantly. Back then, they also faced only limited competition; now they are scrambling to compete with countless alternative news sources. Operating in such a climate, it is easy to see how meticulous fact-checking can be sacrificed. And in today's media, once a story is out there, it can be retweeted and linked exponentially, and with each promulgation the disseminator feels less responsibility to check for accuracy.

There is one other thing that Dylan perceived about the media in 1965, one more limiting factor in its ability to "tell the truth": "I’ve never been in Time Magazine, and yet this hall is filled twice, ...I don’t think I’m a folk singer; you’ll probably call me a folk singer, but the other people know better. The people that buy my records and listen to me don’t necessarily read Time Magazine." What Dylan is recognizing here is the mainstream's inability to fully comprehend a subculture. The outsider can't become an insider in such a limited amount of time. When he finds out at the outset of the interview that the reporter will be attending his concert, he tries to warn and prepare him: "Okay you hear it, see it, and it’s gonna happen fast and you're not gonna get it all, and you might even hear the wrong words."

So even if a reporter takes Dylan's advice and tries to seek out the truth first-hand, cultural barriers will likely still prevent him or her from truly "getting it." In such a hopeless situation, it's not surprising that the confrontational tone Dylan adopts throughout the Time interview eventually takes a magnanimous turn. Right after he remarks that the readers are being "put on" and that the reporter's article will eventually mean nothing, he says: "I'm not putting that down, because people have got to eat and live... (long pause) but at least be honest about it."

Saturday, May 07, 2011

A New Old Media



The term "shot heard round the world" was first applied to the bullet that apparently started the American Revolution at the Battle of Lexington and Concord. This event occurred in 1775, but it took until 1837 before Ralph Waldo Emerson retroactively bestowed the title. And in a metaphorical sense, he was not wrong. The world did hear about this shot. It just took awhile. News didn't travel fast in those days, as exemplified by the Battle of New Orleans, a War of 1812 battle that was fought after the war was over, since news of the peace treaty did not reach the combatants for two months.

If I had easy access to news archives (and the time to study them), I would be interested in investigating how much coverage the newspapers of that era devoted to the dissemination of the news itself. My hypothesis is that they mentioned nothing about how people acquired knowledge of the major events of the world. I think it would have been taken for granted that information was at a premium, that it did spread, but that it spread in uneven, unpredictable, and imperfect ways. In such a climate, I wonder how much responsibility news publications felt to be the official standard-bearers of truth and accuracy. Our contemporary sensibility is that media outlets should be held to the highest standards of factual accountability, almost as if they were presenting exhibits in a courtroom. But from what I know of the newspapers of early America, they were highly ideological. They would seek to inform the public, but they had an agenda for doing so (and one can surmise how facts, already slippery things when people attempt to view them neutrally, become distorted when viewed through the prism of a particular bias).

Last week, a Navy SEAL fired a shot that was likewise heard around the world. And unlike with the shots fired in the 18th and 19th centuries, the time lag between the shot and the world's hearing about it has drastically diminished. But not only did the media report on the shot itself; part of the story was how the public came to learn about it (and how they reacted to it). The role of social media was highlighted in many articles, with stories about how the news first leaked on Twitter, how a man in Pakistan inadvertently tweeted about the operation before the shot was even fired, how sports fans in Philadelphia started a celebratory chant after many of them received the news on mobile devices, and how many people first found out about the death of Osama bin Laden through either Facebook or Twitter.

I personally found out from a Facebook post. I commented on the post, saying, "This is the biggest news I've ever learned from Facebook." Moments after seeing the post, I ran across another post claiming that bin Laden had been killed in a bombing. Later I switched over to "old media" and learned that it was not a bombing but an intense firefight with Navy SEALs. Later I learned that there was only one armed resister in the compound. Still later I learned that bin Laden had used a woman as a human shield, later I learned that the woman was his wife, and still later I learned that he didn't use her as a human shield at all. And later still I saw several links on Facebook to purported pictures of bin Laden's corpse (though I knew enough not to click on them).

It's easy to blame social media for the rapid dissemination of false or misleading information in the wake of a major news event. But neither Twitter nor Facebook existed at the time of the Columbine school shooting, and we now know that much of what was reported in the immediate aftermath of that event was false. And Facebook had nothing to do with the misinformation that was spread after the death of Pat Tillman.

Newspapers have been referred to as the "first draft of history," a description that simultaneously celebrates and cautions. And it is apparent that we have entered an age in which even that description needs revision, one in which a new media has taken on that function. But as exciting as it has been to see a new media develop in my lifetime, to be able to watch huge stories break on platforms that I wouldn't have recognized even a few years ago, I would be even more excited about the emergence of another "new media," one that could theoretically have developed centuries ago, but never did.

Wouldn't it be nice if, somewhere between the first draft of history and whatever draft is currently being disseminated in the halls of academia, we had some forum through which the public at large could be informed about events that, through the passage of time, we can more clearly comprehend? Obviously, investigative journalists can break stories at any time, but with the pressures media outlets face to stay current, unless some outside agent provides motivation, that first draft runs the risk of becoming ossified. So I propose a new media, one that can find its niche through its devotion to the old. And there is a bit of a precedent: if Emerson could "report" on an event 62 years later and give it a title that still reverberates, there is no reason the same can't be done for shots fired today (or last week).