Saturday, November 24, 2012

Untitled


I have never been a huge fan of brevity.  I realize concision is all the rage these days, that we need to say what we want to say in 140 characters or fewer.  When I started this blog, Blogger, along with Myspace, was probably the dominant platform by which people shared their ideas with the world.  Other nascent social media sites, such as Xanga and LiveJournal, saw users who would often pen lengthy missives.  Now, Facebook, Twitter, and Tumblr encourage, if not require, users to boil their thoughts down to terse communiqués.

But it's not just in the online world that I am conscious of the favoring of succinctness.  In a series of vocations I have had in my life, I have been made constantly aware that boredom is the enemy, that if I'm not "to the point" with what I want to say, my audience will tune me out.  This is always difficult for me to accept, since I came of age in an environment where I didn't realize that tuning out was even an option.  When you have four television channels, channel surfing gets old fast.  When you can't scroll through or click on items that provoke interest, you simply read pages in order, in their entirety.  When a sentence you have just read doesn't make sense, rather than click on to the next thing, you go back and reread the sentence multiple times.  When a song you don't like comes on the radio, rather than switch stations (which often seems like more trouble than it's worth when you have to tune a dial), you suffer through it until another song comes on.  And when a teacher is in front of the room talking, there is no phone for you to resort to, so you might as well pay attention.

This is not to say that I haven't, to some extent, embraced the culture I now inhabit.  I don't read nearly as many 19th Century novels as I used to, certainly not nearly as often as I scroll through my Facebook feed.  I am committed to posting something at least semi-lengthy on this blog once per week, but I post on Facebook several times a week (in light of my frustration with character limits, though, I still limit my Tweeting).  And I fully appreciate the ability not to suffer through bad songs.  Likewise, I no longer feel the need to listen to awful TV sports broadcasters, since I can watch a game while following stats or having message board conversations online.

But I do worry that our media is leading us to a more superficial existence.  I know that I'm not the first person to express such a worry.  And yet, I wouldn't regard this move toward "the shallows" as an entirely recent phenomenon.  I fully realize that an irony in the above paragraphs is my suggestion that the mid-2000s web and the media of television and radio represent the "good old days" of human engagement, of lengthy discourse.  I do think it's true that the web today is more "shallow" than the web ten years ago, and that the cable TV multiverse is more "shallow" than the network TV of my youth, but of course I recognize that the discussion is relative.  My understanding is that when Ralph Waldo Emerson toured to give lectures in the 19th Century, he would speak for hours.  Likewise, preachers' Sunday sermons could last the length of today's football games.  But something happened to our collective attention span long before anyone knew what a "world wide web" was.  The Internet was not the first invention to encourage people to devote less time to engaging with depth.  More than 100 years prior, the telegraph may have been the first invention that incentivized brevity in communication.

Actually, I would argue that the invention of the newspaper headline, which didn't come about until the late nineteenth century, represented a notable new idiom.  Book titles in the first couple of centuries after the printing press were notably different from titles today--they weren't rhetorical in the sense that they didn't attempt to inspire purchases.  Authors strove to truly summarize the contents of the book, which usually meant long (and by today's standards unwieldy) titles.  And for a long time, newspaper articles appeared without a headline attached.  But since newspapers were regularly produced and eminently disposable, writers and editors realized they were competing for a potential reader's attention.  The complexity of any news story was compressed into a mere few words; one could regard headlines as the first tweets.  Or perhaps a better comparison would be what some have referred to as "click bait."  Long before people relied on a mouse to navigate their consumption of content, headline writers anticipated the mentality that would drive people to engage or disengage.  And of course, the famous "inverted pyramid" of newswriting helped condition people to expect an instant payoff in return for their attention.

So now, given the century-long move toward shallowness, I don't hold out hope that we can put the genie back in the bottle.  But I wouldn't mind seeing, on principle, headlines that don't give us what we expect.  In 1986, a magazine editor somewhat famously declared that he had found the most boring headline ever in the New York Times: "Worthwhile Canadian Initiative."  I'm not sure if that particular initiative proved to be worthwhile or not, but I know that if the Canadians ever came up with an initiative to do away with headlines entirely, I would find that worthwhile.

Sunday, November 18, 2012

Iconic Golden Sponge Cakes


If someone had never heard of Twinkies before this week, they would probably conclude, based upon media descriptions, that Hostess manufactured a product called "iconic Twinkies."  If only the parties involved in "The Great Schism" had known that one day the word "icon" would bring to mind golden sponge cakes with creamy fillings, maybe they could have set aside their differences in order to focus on defeating a common threat.

I suppose it's not inaccurate to refer to Twinkies as "iconic," given that the product is instantly recognizable to hundreds of millions of people and does possess some kind of historical cultural significance.  But by these criteria, innumerable brands qualify as "iconic."  Essentially, if a product achieved market penetration at some point during the "Golden Age of Television," it is now iconic.  But if I were a CEO, I would much rather try to sell a non-iconic product.  Iconic products suffer from having too many associations.  By transcending their utilitarian value, they are rendered non-utilitarian.  Or more accurately, their utilitarian value is their mere existence, and they can therefore serve their function simply by sitting on a store shelf, unbought.

Consider these two hypothetical scenarios--Option A is that Twinkies will continue to be sold, but you will not be able to eat one for the rest of your life.  Option B is that Twinkies will cease to be sold, but you are permitted a private lifetime supply, and you must avail yourself of the product frequently for the rest of your life.  I suspect that if people were forced to choose, the vast majority would select Option A, and not out of a sense of perceived civic obligation.  As the world shifts from analog to digital, and as economic uncertainty permeates every aspect of everyone's life, products that have survived multiple generations become imbued with the power of providing psychological comfort.  Alternatively, products that had appeared transcendent but stand revealed as impermanent become imbued with a different kind of power--the power to provide the opposite of psychological comfort.  And this would explain the reported "Twinkie Raids" over the past few days.  People who have never had much of a physical craving for the product are not immune from having a psychological craving for true icons, for permanence (and yes, Twinkies may very well be unique among food products in that they probably could sit in someone's cupboard permanently).

Given that this was the "weekend of the Twinkie's twilight," over the course of the past few days I found myself thinking about other "iconic" entities.  Late Saturday, I realized that Notre Dame is the Twinkies of college football.  Certainly, in some eras, iterations of their uniform have resembled Twinkies.  And over the last couple of decades, Notre Dame football has subsisted more on its iconic stature and historical significance than on its contemporary relevance.  In recent years, I think it has gotten to the point that even Notre Dame haters can't help but lament the program's lack of success.  Schadenfreude isn't freude when there is only perpetual schaden.  Those who actively root against Notre Dame would much rather they lose high-profile bowl games than have them fail to make bowl games at all.  Notre Dame's fall from grace has had the effect of giving psychological discomfort to college football fans everywhere.

But wait.  Who is the top team in the nation now?  Maybe there is hope for those golden sponge cakes after all.

Saturday, November 10, 2012

The Positive Push


Every significant action or statement in human history has been met with at least some negative criticism.  It is impossible to take an action without inspiring a reaction (and of course, that's how it should be; we wouldn't want to live in a world where any one person unilaterally did the thinking for all others).  And now, especially in the era of mass media, anyone who disseminates a message to any kind of an audience must be ready for some kind of a negative response.  It's easier than it's ever been to broadcast (or at least narrowcast) something, but it's even easier to issue a criticism of something that has been broadcast.  For most of American history, if someone wanted to issue a public response to a public figure, they would either have to stand on a soapbox, or else they would have to mail a letter to a newspaper and hope it was published.  By the latter part of the 20th Century, you could call a talk radio show.  By the tail end of that century, you could write something online.  And now in the social networking era, anyone with a Twitter account may function as a public figure, and anyone with a Twitter account may criticise a public figure.

So if statements or actions inspire negative reactions, and more statements and public actions are being made than ever, I'm comfortable asserting that our culture is more negative than it's ever been.  I don't want to oversimplify matters.  I'm well aware that for all the hand-wringing in recent years over the supposed loss of "civility," there has never been a society where dissension has always been expressed with perfect civility.  But again, the key to my assertion is quantity.  The incivility is not necessarily more objectionable than in the past; it's just that there is more to be uncivil about.

And that's why I was intrigued to read about the existence of something called the Bills Mafia.  The Bills Mafia is a loosely organized group of Buffalo Bills fans--an organization that started on Twitter and now exists in real life.  The Bills Mafia originated directly from the aforementioned cesspool of Internet negativity.  A couple of years ago a Bills player wrote something stupid on Twitter, then an ESPN reporter used the same platform to mock the player, and then a group of Bills fans used the same platform to mock the ESPN reporter.  All parties have since moved on, but the Bills Mafia remains.  And now the organization forswears negativity.  Grantland author Ben Austen writes that "Thomas DeLaus, a 24-year-old front-end supervisor at Walmart, and Nick Primerano, 31, who sells communications systems to the federal government, broke down the ethos of the group, and really of the Bills faithful more generally":
"It's been a rough decade," Thomas said.
"But we're a positive push for growth," Nick chimed in.
"The hashtag can't be used for negativity."
"No matter what, we're about team."
"Whether wide right … "
"And no matter what happens at 'The Ralph' tomorrow … "
"It doesn't matter if it's zero degrees at the game … "
"We back the players."
"Community," said Thomas.
"Once Bills Mafia, always Bills Mafia."
The "wide right" quote is a reference to a missed field goal over 20 years ago that cost the Bills their best shot at a Super Bowl championship.  Austen writes about how Bills fans treated Scott Norwood, who missed that kick: "Rather than try to murder the real Scott Norwood, Buffalonians embraced him when the team returned from the 1991 Super Bowl. At a rally held for the team in downtown's Niagara Square, 30,000 fans chanted for Norwood to come to the dais. "I know I've never felt more loved than right now," a weeping Norwood told the crowd."

One thing that used to bother me immensely as a young sports fan was hearing fans of my local teams criticise players on those very teams.  I couldn't reconcile how you could like a team and not like the guys who played for it.  Packers fans have a reputation for being among the best, if not the best, in professional sports (decades-long waiting list for season tickets, undying loyalty, etc...).  But that doesn't mean that they have always loved individual players unconditionally.  The likes of Tony Mandarich and Terrell Buckley--high draft picks who held out for big contracts and then woefully underperformed--could attest to that.

Now, one could certainly make the case that many athletes who are booed deserve to be booed, and that it is valid for fans to criticise athletes if the criticism is fair and is not "personal."  That may be, but how efficacious is such action?  It certainly has little, if any, bearing on how a team's personnel department or coaching staff manages a roster.  Fans would probably argue that it provides a psychological release, that tweeting venomous statements about, say, Jermichael Finley after he drops passes is more personally satisfying than not tweeting anything.

But reading Austen's article, I get the impression that being a member of the Bills Mafia, which means "backing the players" even when the players don't deserve such support, is highly satisfying.  One might counter that this should not be so, that to blithely and happily accept years of more losses than wins, to remain irrationally optimistic, and to cheer for underperformers is foolish.  But a counter to that counter is that to be a sports fan is inherently foolish to begin with, so one might as well choose to be foolishly happy.

This is not to suggest that a sports fan should apply such a mentality in all areas of their life and never utter a negative sentiment in any context.  As I stated at the outset, we wouldn't want to live in a world where anyone could take action of any sort without any fear of reprisal.  But perhaps we need to find a way to better choose not only how, but when, to allocate our right to make reprisals. 

It could be that an entity like the "Bills Mafia" is a cultural aberration--a geographically isolated pocket of irrational loyalists.  Or it could be that it is representative of the beginning of a trend--a response to a saturation of negativity now weighing heavily on all of our discourse.

Saturday, November 03, 2012

Should We Tell Kids That Their Lives Are Weird?


Last week, Slate magazine ran an article entitled "How Can We Make Middle School Less Awful?"  The authors made a number of compelling arguments.  Among them: A) our society doesn't prioritize improving middle schools for a number of reasons, one being that adults are in charge of improving schools, and adults would rather not think about middle school, as they would by and large rather not call to mind their own middle school experiences; B) middle school is important, since it is a predictor of high school success, which is in turn a predictor of later life success; C) schools that strive to meet the emotional needs of middle schoolers also end up ensuring that students' academic needs are met; and D) meeting the emotional needs of middle schoolers means making them feel like their environment is both safe and fair.  Personally, I'm on board with all of the above.  But I also wonder what may be missing from this assessment.

For me, maturing into adulthood has always meant being constantly surprised--not necessarily by the present, but by the past.  The mindset of a child is that the external environment is the norm, and that one must therefore align one's perception with that norm.  Perhaps for most people this mindset even continues throughout life; part of being able to adapt to life changes involves recalibrating to a new norm.  In other words, there is a passive acceptance that "that was normal then; this is normal now."  But I've always had the habit of going back and re-evaluating the old norm, which usually has meant a retroactive change of perception.  I can't help but realize that a world which I once thought was normal is actually, in hindsight, totally weird.  So I'm inclined to say "this is normal now; so that must have been weird then."

For example, when I was in sixth grade, it was commonly accepted among my peers that there was a Satanic cult in our town which conducted ritual sacrifices in "the old Monarch Range building," presumably while listening to Metallica (actually, in the 1980s, easy acceptance of allegations of ritualistic Satanism was not limited to middle school kids).  Five years later, my peer group had a new norm.  No longer were we thinking that Satanists had infiltrated our community--the focus had turned to urban gangs.  Of course, the examples need not be sensationalistic.  The sociology of fads, the measure of cultural capital, comprehension of morality (and consequences for breaching moral imperatives), notions of conformity--all of these factors illustrate that the reality of a preadolescent (or any unemancipated minor, for that matter) bears shockingly little resemblance to the reality that they will one day inhabit.

But we don't make too big of a deal of this.  We generally communicate to children within the context of the reality they inhabit and allow them to proceed, sometimes gradually, sometimes dramatically, into the next reality.  And then once they are there, we don't ask them to go back and relive the old one. 

But what would be an alternative to this?  We could sit middle schoolers down and inform them (in so many words) that the world they are living in is actually a weird one, separate from the normal one that their parents live in.  We could trace out the implications of the weirdness of their existence.  This might mean telling them that many of the things they like now won't matter to them in the future, and that many of their current friends won't be their friends in the future.  This may allow us to better isolate for them what is normal about their weird existence.  We could emphasize that how they treat other people and how well they learn in school are parts of their lives that do have implications long beyond their present, weird reality.

And perhaps we could do better to bring our supposedly normal reality to them.  If a study comes out that shows the causes and effects of bullying among pre-adolescents, rather than keep such information to ourselves, perhaps we could share it with them.  If they seek to transgress for the sake of transgression, we might shift the focus away from why a boundary is set, and instead disarm them by showing that we understand the sociology of transgression.  If they seek to have some measure of control over their environment, is there a way to enfranchise them without necessarily giving them voting privileges?

I'm not suggesting that we force our children to grow up too fast.  But I am suggesting that if they are already outgrowing something, maybe it's best not to let them labor under the illusion that what they are growing into is anything but a penultimate, weird existence.  But maybe that's too weird of an idea to catch on.