Saturday, September 29, 2012

Upon Further Review


According to the 2004 film Finding Neverland, the original director of the Peter Pan stage play, fearful of a lack of reaction by staid Victorian audiences, brought in children from orphanages to seed the audience.  When the children clapped and cheered wildly, the older folks followed suit.  Unfortunately, the Internet tells me that this is pure artistic embellishment.  Audiences loved the play without any external psychological prodding.

But I don't doubt that, had it been necessary, such a ploy would have been successful.  The famous Asch conformity experiments of the 1950s showed that people will frequently alter their perception if they feel that they are out of step with a majority.  It's not that they change their minds; it's that they literally change their understanding of what their senses apprehend.  The authors of the book Scorecasting actually attribute much of the home field/court advantage in sports to officials being unconsciously influenced by home crowds.

Of course, being a Green Bay Packers fan, I would see evidence of such an influence in last Monday night's already historically infamous outcome.  Naively, I wasn't stressed out when the officials first indicated a touchdown for Seattle on the game's last play.  Knowing that all scoring plays are now reviewed on instant replay, I was fully confident the call would be overturned.  So when my confidence was shattered, I was left looking for explanations.  How can someone be that incompetent--to assert something as true when there was "incontrovertible visual evidence" to the contrary?  Clearly, the replacement referee was not the best at assessing what had been happening at "game speed," but could he really be that bad at judging slow-motion replay?

His ability to perceive had been altered by the environment he was in.  In addition to judging the outcomes that had already occurred, he was clearly conscious of the outcomes that could result from his assessment of the replay.  The Seattle fans were cheering deliriously, convinced that their team had won.  A principle of psychology (loss aversion) is that it is always harder to give up something you think you already have than to miss out on something you never counted on (which is why games that involve blowing a big lead always feel worse than games where you rally but come up just short, even though both are losses).  Seventy thousand people would have blown up in (irrational) anger had the call been overturned.  Never mind the millions of people who blew up on Twitter; the immediate context governed the official's mindset.

But the millions of people on Twitter did end up having an effect.  I have seen some people argue that the deal that was reached this week between the NFL and the regular officials could not have been the result of public pressure.  This sentiment rests on the notion that NFL fans are essentially a captive audience, that no amount of poor officiating could turn their eyeballs from their TV sets.  But as I previously discussed in relation to NBC's coverage of the Olympics, there is a psychological cost to bad P.R. that wears on those who have no concrete fiduciary incentive to make concessions.  Bill Simmons argued the same in a recent column for Grantland: "as soon as the commissioner and these owners were put in the position of dreading interactions with everyday people, this was over. So I'd argue that we DID have leverage, and we used it the old-fashioned way."

But Simmons is still privileging immediate physical interaction as the context that normalizes perception.  As our interactions become increasingly mediated by technology, I wonder to what extent the principles of crowd influence will need to be examined and reevaluated.  For a couple of generations now, television has made events accessible to mass audiences, but only in the last few years have those events truly become communally shared.  And just as people have long looked to others to validate their perceptions, people are now turning to the digital crowd.  What that means is still being worked out, but perhaps we can hope for a day when it is truly impossible for anyone to "seed" a desired response, and when officials (in all walks of life) will be influenced by more than just the sentiments of the immediate crowd.

Sunday, September 23, 2012

After the Behind the Music


Wikipedia tells me that the 15th season of VH1's Behind the Music began a couple of weeks ago.  This was a surprise to me; I had assumed that BTM had gone the way of MTV's TRL long ago.  Although these two shows appealed to different audiences, in the pre-social-networking era of music they were the dominant means by which artists were presented to the public as "important" or "relevant."  I never watched an episode of TRL from start to finish, but at the turn of the century I never passed up an opportunity to watch BTM.  I've always been a sucker for narrative expositions of artists' careers packaged with retrospective soundtracks and booming voiceovers reminiscent of National Geographic filmstrips.  BTM was so successful for VH1 that it predictably led to overexposure (for some reason, it is indelibly linked in my mind with the Regis Philbin version of Who Wants to Be a Millionaire?, another turn-of-the-century ratings hit and cultural phenomenon that was run into the ground by its network).  But it wasn't just overexposure that led to a backlash against BTM.  It also suffered from its formulaic storytelling.  Every band or artist was portrayed in a three-act play: in the first act, the underdog rises; in the second, there is a descent into debauchery and tragedy; and in the final act, the band or artist finds redemption through perspective and maturity (and invariably records new material that is always inferior to that produced in the first two acts).

The problem with sticking anyone's life into a classically mythic narrative structure, though, is that the "Hero's Journey" is a fiction.  This fiction has conditioned us to believe that once anyone has ascended from a low point, they won't return to the depths.  I remember feeling vaguely betrayed by Leif Garrett.  The Leif Garrett BTM in 1999 featured an emotional and uplifting reunion between a repentant Garrett and a friend whom he had rendered a paraplegic by crashing his car while high on drugs twenty years prior.  Of course, not long after the episode aired, Garrett was busted for trying to buy drugs (he's been arrested twice subsequently for similar offenses, most recently in 2010).  It was ridiculous of me to have any expectation that Leif Garrett's behavior would conform to the behavior we expect from fictional characters.  In a fictional story, the child star perpetrator of a tragic car crash would either die or repent and live an exemplary life.  A series of less spectacular relapses as the character recedes into obscurity would not be on the table.

What precipitated my trip to BTM's Wikipedia page in the first place?  Two news stories this week once again reminded me that the BTM narrative structure is too convenient for reality.  First, apparently Green Day's Billie Joe Armstrong had an on-stage meltdown and is now in rehab.  I don't remember ever seeing the Green Day BTM, but I know that this does not fit that narrative.  It would have been acceptable back in the days when he was being arrested for public indecency for mooning a Milwaukee audience; the Dookie-era, punk-rocking Billie Joe Armstrong would have been a good candidate for such a happening.  But not the 40-year-old father of two, Habitat for Humanity volunteer, and political advocate.  Likewise, news headlines yesterday trumpeted a drug arrest for Fiona Apple.  I hadn't thought about Fiona Apple for years (I somehow missed that she had released a critically acclaimed album this summer...maybe because I had a busy summer).  A Fiona Apple drug arrest would have fit the narrative of the young rebel who told off the world after upsetting Hanson at the 1997 MTV Video Music Awards.  It does not fit the narrative of a 30-something woman touring behind an album that features celesta, bouzouki, and autoharp.

But even if this weekend had never happened, there would still have been more than enough evidence that the BTM narrative was a construct.  Two examples come immediately to mind: Some Kind of Monster-era Metallica, in which the members somehow found a way to display adolescent growing pains despite having famously cut their hair several years prior, and Brian Wilson, who, despite being declared "back" in 1976, spent the following decades oscillating between mental and physical states of vigor and infirmity.

So in the final analysis, we should all know by now that real people's lives, especially lives that have had the dramatic amplification of the trappings and effects of celebrity, can't be collapsed into mythological narrative frameworks.  But as our engagement with actual myths recedes and our fascinations with celebrity and dubious "reality" converge, we end up projecting such structures where they oughtn't be projected.  Perhaps our only hope is that such narratives reach a saturation point at which we are forced to admit that, at least in American rock star lives, there are no third acts.

Saturday, September 15, 2012

Why We Have Time Inflation


This blog has been around for seven years now.  If I wanted to write a boring post (even more boring than usual, anyway), I could document significant changes that have happened in that time, both in myself and in the world.  Or I could draw attention to what has stayed constant throughout that time.  And I'm sure the day will come when I will be tempted to do such a retrospective.  But this year is not that year.  While much has changed in seven years, that span is still a blip on the radar screen of an average human lifespan.  I'm more intrigued by the changes that can occur over a long lifetime--what it must be like for a 70-year-old now to reflect upon what has changed in a half century (I'm struck by Bob Dylan telling Rolling Stone magazine in an interview published this week that "When you ask some of your questions, you’re asking them to a person who’s long dead. You’re asking them to a person that doesn’t exist. But people make that mistake about me all the time. I’ve lived through a lot").

But for all of the paradigm-shifting cataclysmic events that have occurred throughout human history, I think we take one remarkable constant for granted.  Every day is 24 hours.  Yes, we've found artificial ways to prolong our ability to be productive in a day (e.g. electricity, caffeine), and we've innovated ways to increase the number of days that a typical person spends on this Earth.  But the day itself as a unit of time has not increased or decreased.  And this is somewhat remarkable because over time most human phenomena increase.  Our possessions accumulate.  Our memories become crammed.  Our population grows, our financial markets exhibit inflation, our students' grades inflate.

But in terms of time expenditure, our innovation should allow us to do the opposite.  Clearly, a lot of tasks take less time than they used to.  Communicating across distance is absurdly easier than it has ever been.  Producing written documents for dissemination is absurdly easier.  Copying most products is absurdly easier (I think of the story told at Bob Uecker's statue ceremony--before he found his niche as a broadcaster, Uecker briefly worked as a scout.  The story goes that he once sent in a scouting report that had mashed potato and gravy stains.  People forget how difficult it was to re-copy something that had been handwritten.  I'm sure a lot of documents back then had stains on them).  Speaking of potatoes and gravy, throughout human history one of the biggest time sucks has been the process of putting food on the table.  But now, preparing meals is absurdly easier than it has ever been.

There has been a decrease in time spent working in America, but I think one can make the case that the decrease hasn't been commensurate with our increases in efficiency (a think tank called the New Economics Foundation advocates for a worldwide 21-hour work week; shockingly, governments and industries have been slow to embrace this standard).  Meanwhile, nobody talks about "time inflation."  While the time in a day stays the same, the time we devote to participating in or observing particular events grows.  I would love to see some stats on whether business lunches and committee meetings are longer than they used to be.  What I do know from stats is that in the last 100 years, the number of days that K-12 students go to school has doubled, and the school day has gotten longer.  When I was in kindergarten, our class was split into the "morning" kids and the "afternoon" kids.  Now it is unthinkable that kindergartners would go home at lunch.  Movies today are on average 20 minutes longer than movies produced during Hollywood's "Golden Age."  In that same time frame, baseball games have gotten 30 minutes longer.  Thirty years ago, night baseball games started at 7:35 p.m. local time, so that they would be done by roughly 10:00.  Now they start at 7:10 and usually last well past 10:00.  Thirty years ago, NFL games on Sundays were scheduled for either noon (Central time) or 3:30 p.m.  Now they are noon or 3:25.

Conventional wisdom is that attention spans have gotten shorter in recent years, but how does that accord with the reality that we are doing things longer?  My theory is that as it has become harder to sustain attention on any one task, and as we frequently have at our disposal the means to divert our attention, we really aren't doing things longer; we are doing more things in a diluted time block.  The first known use of the word "multitask" was in 1966--not coincidentally, about the time an observable uptick in the length of events begins.  For all of the revolutionary changes that occurred in that decade, the invention of this word may be the most underrated.  People now find it easy to sit through a three-and-a-half-hour baseball game because they are spending a good chunk of that time staring at electronic devices in their hands.  Take away the devices, and I'm guessing there would be a great outcry about the length (and unwieldy pace) of the game.

And with that, I think this blog post is done.  It only took me about three hours, during which I did about four other tasks.

Saturday, September 08, 2012

The Rise of the Team


No one would ever care enough to pose this as a poll question: "What do you think is the biggest difference between the ancient Olympics and the modern Olympics?"  But if that question ever was posed, I think the most popular answer would be either that it is global or that it is commercialized.  Some might respond that athletes now wear clothes, that women are a bigger part of the games, or that participants no longer glorify Zeus.  And all those are valid answers, but I propose that the most underrated change is that team sports are now part of the Olympics.  As far as I can determine, there were no team sports in ancient competition (not even relays).

As the NFL commences competition for the 2012 season this weekend, we are again reminded that team sports are now dominant in our culture.  And though the specific sport may change depending on the nation, in most cultures around the world, team sports are much more popular than individual sports.  But in human history this is a relatively recent phenomenon.  I recently read this fascinating article about the most famous American athlete of the 1870s--a competitive walker.  Other popular sports of the late 19th and early 20th centuries were boxing, wrestling, tennis, horse racing, bowling, golf, and baseball.  Baseball of course stands out on the list as the lone team sport, but it's worth noting that of all the team sports, baseball is most like an individual sport (the study of advanced metrics in baseball is so far ahead of that in other team sports in part because, as it has been called, baseball is "an individual sport disguised as a team sport").  It wasn't until well into the 20th century that football and basketball began to pass the individual sports in popularity.  And it wasn't until that century that people began to talk about "teams" as groups of human beings rather than oxen.  Although the word "team" has existed for centuries, the word "teamwork" (in the sense of "a group of people setting aside individual goals for a common cause") was first used in 1909--meaning there are some people alive today who are literally older than teamwork.

Football is the most popular sport in America for a variety of reasons, but I don't think it's a coincidence that it is the sport that most subsumes individual players' identities.  Other team sports are also seeing an erosion of individual cachet, and somewhat fascinatingly in the NBA, it's the players themselves who are doing it, as they eagerly embrace the "superteam" concept.  To come back full circle to the Olympics, the ante on the very concept of a "team" was upped in 1992 with the "Dream Team," a nickname which has subsequently been doled out to the point of banality.  But I think it's worth mentioning that in the "Trial of the Century" (speaking of banal cliches), it was not a single attorney (Johnnie Cochran's lead role duly noted), but rather a "Dream Team" of lawyers that is credited with securing O.J. Simpson's acquittal.

Although sports continue to be the context in which we most often think of "teams," the example above helps demonstrate that "teams" are pervasive in every aspect of our culture.  Certainly, the vocabulary of the workplace is now liberal in describing workers as part of a "team," with the term "team building exercises" yielding over six million Google hits.  The Avengers became one of the top-performing movies of all time on the strength of the premise of putting individual superheroes on a team.  And another intriguing new application of the term is pop culture's use of it to assign one's allegiance to a particular individual (e.g. "Team Coco" or "Team Edward").

So what is the significance of our cultural migration toward "team building"?  I don't have time to write the book about that subject, but I did consider it in light of the recent political conventions.  The term "Team Obama" yields over 1.5 million Google hits, while "Team Romney" comes in at about half a million.  But I think those results may be deceiving.  First of all, I wouldn't use that as a metric to prognosticate election results, given the former's head start in building his team.  But in a larger sense, I question where the fans' true allegiances may lie.  Concurrent with the recent rise in the popularity of the "dream team" concept has been the introduction of free agency.  We are increasingly seeing individuals as impermanent and replaceable parts.  And with the knowledge that the members of the team will sooner or later give way, we end up clinging to the external symbols of the team itself--the logo, the uniform, the colors--and our loyalties and allegiances to those symbols may override any other concern.

Monday, September 03, 2012

Just a Bit Outside


On the last day of fourth grade, my teacher went to the front of the room and began to announce and distribute awards--certificates of accomplishment.  I had no idea prior to this ceremony that any of these awards existed.  Nobody in our class had aspired to any of them, but everyone who received one was more than happy to accept the certificate proffered to them.  Anxiety mounted in my gut as I wondered if I would be receiving one.  At last, it was announced that I was the class "outstanding weather person."  Even now, I maintain that this was a well-deserved designation, since I had of my own volition decided, during a meteorology unit in science class, to keep a record of daily weather statistics (in other words, I copied down the trivial information given by the TV meteorologist that most people ignore, then handed it in to my teacher unsolicited).

Of course, I needn't have feared whether I would be given an award.  It eventually became apparent even to my fourth-grade brain that the teacher had contrived a specific award for every member of the class, so that all of us would go home happy and satisfied that we were high achievers who were well-equipped to handle the rigors of fifth grade.

We are a society that likes to give awards.  Sometimes there are stringent qualifications in place to objectively determine who gets an award (such as who receives a gold medal for the 100-meter dash).  More often, a group of people uses subjective criteria to identify the worthy party (because everyone knows committees are the most reliable method for making decisions).  Sometimes, awards are arbitrarily created in order to boost a kid's self-esteem.  But for all the trophies, plaques, medals, ribbons, banners, and certificates of accomplishment that are issued, we are also a society that values humility.  For some reason, the minute someone starts talking about how deserving they are of some kind of award, that is considered a possible reason not to give them said designation.  Meanwhile, those who claim to have no merit are often looked upon kindly when merit is assigned.  The proper way to act when given a compliment is to denigrate oneself (Michael Jordan's Hall of Fame induction speech stands out as a public relations disaster because Jordan insisted upon doing the exact opposite).  Likewise, those who make a habit of self-deprecation are often lauded more highly than anyone else, and consequently are given the highest accolades.

I thought about all of this while watching Friday's dedication of a statue of Milwaukee Brewers announcer Bob Uecker at Miller Park.  It was a curious combination of toast and roast.  Uecker's autobiography is subtitled "The Man Who Made Mediocrity Famous."  He has built his entertainment career on self-deprecation.  And yet the "honors" section of his Wikipedia page is longer than any of the other sections: he belongs to four halls of fame (yes, one of them is the pro wrestling hall of fame, but still).  He has his name on the Miller Park "Ring of Honor," and now he has a statue alongside Hank Aaron and Robin Yount.  During the statue ceremony, much of the humor was predictably in the pattern Uecker has established, with many of the speakers making cracks at his expense.  (Bob Costas remarked of the statue: "Pigeons all over the Midwest migrated to Milwaukee to pay their respects.")  But of course, there were plenty of accolades for one of Wisconsin's all-time greatest entertainers, with much said about his talent to make us laugh, as well as about his personal characteristics, his great loyalty being particularly highlighted.

But I found it odd that pains were taken to highlight aspects of his famously undistinguished playing career.  It was pointed out on multiple occasions that Uecker hit a few of his 14 career home runs against legendary Hall of Fame pitchers.  A slide show was presented in order to highlight specific statistics so as to portray his career in the best possible light (at one point making a ridiculous comparison of his career fielding percentage as a catcher with those of the other Milwaukee statue recipients, who played different defensive positions).  It was almost as if the Brewers organization felt obligated to point out to people that Uecker has always exaggerated his ineptitude.  But why would they feel this obligation?  Everyone knows that Uecker's statue was awarded for entirely different reasons than Aaron's or Yount's.  But something apparently made the powers-that-be uncomfortable about the possibility of leaving people with the impression that Uecker's playing career was a complete and total failure.  Perhaps Uecker's self-deprecation inspires a kind of defensive reaction among those who revere him.  But ironically, if it were not for his self-deprecation, there would be no reverence to begin with.

It makes me wish that instead of "Most Outstanding Weather Person," I had been awarded "Most Humble Weather Person."