Saturday, August 29, 2009

Where Were You When You Learned Ted Kennedy Existed?

When I was a kid, John F. Kennedy might as well have been Abraham Lincoln. To me, they were both towering and tragic historical figures, but long dead, and belonging to an era that passed before my birth. I have clear recollections of the occasion of the 25th anniversary of Kennedy's death; I remember watching television specials about his presidency and his assassination, both at home and in the classroom. My (fifth-grade) teacher that year had a great interest in American presidents, so I became well-versed in the JFK saga (I also remember reading about twenty pages of a Jackie Kennedy biography before finally succumbing to boredom). And the degree of mythologizing I was exposed to served to cement my perception that JFK's Camelot was just as distant from my world as the fictional Camelot was.

Of course, time is relative, and I've come to realize that back when I was first becoming cognizant of the cultural impact of JFK, he really hadn't been gone all that long. Twenty-five years is more than two lifetimes for a fifth grader, but now that the then-recent Challenger explosion is almost 25 years in the past, I have an appreciation for Faulkner's famous line: "The past is never dead. It's not even past."

But in fact, my realization that JFK wasn't, in a sense, dead or of the past came about only a few years after that 25th anniversary. I don't think I can accurately convey in words the mixture of surprise and dumbfoundedness I experienced upon learning that JFK had a younger brother who, in the modern world of color TVs and CDs, still served in the U.S. Senate. All the while I had been learning about our 35th president and the prestigious family that he came from, nobody had seen fit to inform me that a guy who used to sit at the dinner table with this mythic, legendary, larger-than-life figure was now regularly toiling away on legislative initiatives. (Interestingly, nobody told me about Bobby Kennedy, either, which would have gone a long way toward clearing up the befuddlement I felt whenever NFL announcers would say they were broadcasting from RFK Stadium in Washington. "Shouldn't that be JFK Stadium?" I would wonder.) I wouldn't have been less shocked to suddenly learn that Elvis Presley had a little brother named Calvin who still recorded songs that would occasionally show up on the adult contemporary charts and who would often tour the country as an opening act for Neil Diamond.

But again, the perspective of time has enabled me to see why Teddy Kennedy's long legislative career was not a point of greater emphasis to schoolchildren. Whatever the scope of his accomplishments as a lawmaker, they fell short of curricular inclusion--which is no great shame, as few senators are thought worthy enough by textbook editors to get their names in bold print (and fewer still for good reasons). But it was precisely because of those accomplishments that Ted Kennedy was able to carve out an identity that to some degree separated him from the mythology and the legacy of his brother.

Yet what ultimately interests me, as I reflect on the Senator's own legacy, is not how history will remember him, but how we remember history. (And my apologies to the entire Kennedy clan for my brutal attempt at chiasmus.) Every year about this time, much is made of the Beloit College Mindset List. We are reminded that cultural references fade over time, that the next generation lacks the same cultural touchstones, and that, in short, the old gives way to the new. But given the uncompromising assumptions that are often made about what younger generations should know (such as, in my case, the unspoken existence of a surviving Kennedy), and the subtle way in which the young are subjected to an inculcation of nostalgia, I wonder if there are some people who were, like me, at one time shocked to learn of the existence of Teddy Kennedy, but who watched his funeral coverage this week as if they had always known that he was an accomplished senator from the Commonwealth of Massachusetts.

Saturday, August 22, 2009

Brett Favre, Shakespeare, and Scapegoats

I'm fully aware that the world doesn't need someone else to opine about Brett Favre, but after going back and forth all week, and changing my mind several times, I've finally decided to give it a shot and write about him.

I get the sense from monitoring media (and Facebook statuses) in both Wisconsin and the nation that the quarterback is the subject of just a little bit of enmity. He's faced a smattering of allegations, ranging from being too addicted to public acclaim to relinquish the spotlight to calculatingly wiggling out of training camp duty. But at the heart of all the invective thrown Favre's way is a great impatience with his admittedly absurd degree of equivocation. It's a cliche to describe the American public as "forgiving," and though I don't generally disagree with the notion, I do think that Favre is learning the same lesson that presidential candidates do every election cycle-- we despise public "flip flopping," and we will punish those who engage in it.

The notion that resoluteness is a virtue, albeit at times a tragic virtue, is embedded in our cultural DNA. Shakespeare has Julius Caesar state that "I am constant as the northern star, Of whose true-fix'd and resting quality There is no fellow in the firmament." (Though his intractability, most notably in his refusal to listen to warnings, is of course one of his tragic flaws). Yet on the flip side, many attribute Hamlet's tragic end to his legendary indecisiveness. Going back ages further, Sophocles made a career out of creating and subsequently killing characters on the basis of their unwavering devotion to a certain cause. To bring the discussion back to our home soil, the great American novel, Moby Dick, is constructed around a central figure who pursues his "monomania" to the point of destruction. And then there is The Great Gatsby, which speaks to both sides of the issue. The titular character is decisive, resolute, and ultimately suffers a tragic fate, while the supporting characters are indecisive, irresolute, and ultimately suffer the perhaps worse fate of a shallow and empty existence.

The point of invoking these literary examples is to ponder the degree to which we have historically been fascinated with projecting our own anxieties about commitment and resolution onto characters. And I believe Brett Favre is now (if he wasn't before) a national literary character, with all the rights, privileges, and curses that this designation confers.

But why should we have anxiety about commitment and resolution? For good reason, actually. You can start with the oft-repeated fact that 50% of marriages end in divorce. But less dramatically, who among us hasn't committed an act of extrication? We RSVP that we will be at the wedding, but then something comes up. We accept the job offer, but then realize that we just can't go through with it. We join the club, team, or organization, make a commitment to be there, and then hope that no one notices when we slink away. And this behavior is contrary to what has been instilled in us. And it can be argued that it is instilled in us for good reason-- society would fall apart if there were no sense that statements of declaration have consequences, if social ties and obligations could be severed at the behest of the merest whim.

So because of our unwavering resolution about the importance of resolution, but also because of our collective guilt and anxiety about our unwavering irresolution, two types of scapegoats emerge. In the spirit of jealousy, we need to create and then kill figures who put us to shame with the power of their convictions (and so the violent ends of Julius Caesar, King Oedipus, Captain Ahab, and Jay Gatsby). But we also can't let anyone too openly flaunt a contempt for resolution. And this is why Hamlet was killed, why Ross Perot was made a national punchline nearly two decades ago, and why Brett Favre is getting the reception he is today.

But unlike a fictional character or a politician, Brett Favre's final narrative is not ours to construct, but his. And in this way, he truly does have the power to assert resolution. If he can guide the Vikings with the constancy of the Northern Star, sports fans will acknowledge that he has no fellow in the firmament. Otherwise, as always, there will be a consensus that there is something rotten in the state of Minnesota.

Saturday, August 15, 2009

Fahrenheit 98.6

A few weeks ago, I explored how our culture celebrates certain behaviors and attributes when displayed by children, even as those same behaviors are not lauded when exhibited by adults. As an example, we don't give adults Pizza Hut gift certificates when they have finished a set number of books. Yet even if we don't put theory into practice, and even as we show movement toward a postliterate society, I believe we still have a strong theoretical underpinning in our society that holds that reading is an inherently constructive act. Whether that theory is built on solid empirical data or whether it is a result of a kind of educational indoctrination (or somewhere in between), the fact is that to read a book, any book really, is vaguely considered to be "better" than watching a television show-- in the same way that eating broccoli is considered to be better than eating ice cream.

As such, we have no problem with our educational institutions mandating concepts like "Sustained Silent Reading" or DEAR ("Drop Everything and Read"), in which children are given freedom to choose reading material, then given an allotted time to peruse it. Of course, if a school allowed students to spend an hour in sustained silent Xbox playing, there would probably be quite a bit of opposition, even as research shows that children can and do develop cognitively from playing video games.

So knowing that there is a privilege assigned to the act of reading, I wonder if this privilege could theoretically exert an influence in policy beyond that which is applied to captive schoolchildren. Many people, most famously Ray Bradbury, have speculated about and warned against a possible dystopian future in which a government censors books, perhaps even removing all of them from circulation. I have yet to see anyone speculate on the opposite scenario, in which a government actually forces books upon a populace. I'm not talking about the issuing of one-sided propaganda, but rather a situation in which a government actually seeks to benevolently foster growth and learning.

Such a scenario may not be all that far-fetched when one considers that there are theocracies in the world today that more or less shut down for ritual prayers. And there are many cultures, not just in Latin America, that observe an afternoon siesta. So what about a 30- or 60-minute block of time in the afternoon in the USA for good old-fashioned SSR? Broadcasters and Internet-service providers would be required to shut down, as would all commerce. If you wanted to envision a truly unswerving approach, you could even close down roads and force everyone to pull over to the side to read.

Though I truly believe that there are dormant ideological roots in our society that could help such a notion flourish, I am not unaware that to suggest locking down commerce in the middle of the business day (or really any time of day in this era of 24-hour service) is, as of now, a deal-breaker. But there are a couple of factors which, should they come to pass, would perhaps at least open up some soil around those aforementioned roots. Here is what would need to happen:

1) Sustained economic crisis: As we realize that the status quo is not working, we would become open to even radical alternatives.
2) A de-emphasis on specialization: This could also come about because of economic distress. As we realize that we may not be able to lock into a single career, and as we realize that time devoted to intellectual development can help us flourish in diverse settings, we become open to alternatives to business as usual.
3) An emphasis on quality over quantity: Americans already put in more hours than workers in many other nations. Sociologists are making the case that we are gaining nothing from our increased endeavors-- except for increased anxiety. As we realize this, we may rebel.
4) A technological backlash: As a culture we are still in the adjustment phase to the new reality that we can be reached 24/7. It really wasn't that long ago that if you weren't home, you weren't reachable. While we have gained much from cell phones and BlackBerrys, we have also given up much. A compromise period of an hour "off the grid" for everyone could be a way to reconcile convenience and anxiety.
5) An emphasis on cultural literacy: E.D. Hirsch has been trying for years to raise awareness of this issue, but at some point it might come to pass that despite the world of information at our fingertips, we will realize that we don't know anything about anything. The collective shame of this realization will motivate us to take action.

Whether or not the above factors fall into place and this scenario ever comes to pass, I wouldn't mind reading a book about a society that tries it.

Saturday, August 08, 2009

Beer Summits and All-Star Games

It's been a little more than a week since the term "beer summit" entered our national lexicon. I was fascinated by the spectacle of President Obama inviting Henry Louis Gates and James Crowley over to the White House, primarily because it struck me as a scenario that one would expect never to advance past the hypothetical stage. I feel like I know a little about what I'm talking about here, because anyone who has read this blog with any degree of frequency knows that I habitually float hypothetical proposals. And though I happen to think I have some pretty good ideas (sometimes), I would be shocked if some of these proposals ever came to pass. The idea of having a couple of high-profile, controversial figures over to the White House for beers seems to me the type of thing a blogger would propose, not something that would actually happen.

Quite often, good ideas are simple ones, the kind of solutions to problems that a child could come up with. As an example, picture a group of kids playing a whiffleball game. The score is tied after 11 innings, and it's getting dark fast. They've got to figure out some creative way to end this game and declare a winner. A home run derby strikes me as the most likely solution, but a case could be made for other methods. In the end, who doesn't think that the kids would come up with something?

Yet when confronted with a similar dilemma (a lack of pitchers instead of darkness) at the 2002 Major League All-Star Game, the commissioner literally threw up his hands. The game was declared a tie, and no one went home happy.

So what is it that allows creative ideas to flourish among kids in a backyard, while inventiveness is stifled in a Major League ballpark? Obviously, it is that there are no real stakes involved in the former situation, and the kids are not hemmed in by fear of critical backlash. Indeed, Major League Baseball has suffered some criticism for the safeguard it put in place after the 2002 season to prevent a recurrence of the problem: a rule that the winning league in the All-Star Game gets home-field advantage in the World Series. But overall, the negativity inspired by the move is counterbalanced by those who like the idea, and in any event, from a P.R. standpoint it sure beats the outcry that would ensue after another tie (which would certainly have happened in 2008).

Most corporations (and even governments) are not unlucky enough to have their crises unfold on television before an audience of millions. But I believe the same principle that hamstrung Bud Selig and company all too often prevents good, albeit new and unconventional, ideas from being implemented in those milieus. Not only is there a fear of how new ideas will be accepted by the public, there is the inertia of tradition ("we've never done anything creative to end an All-Star game, so we surely can't start now!").

Yet as far as I know, the "beer summit" is unprecedented. In theory, it was a very creative way to attempt to get some closure and resolution for a socially jarring incident. It was a goodwill gesture that one would think would have widespread mass appeal, with the president going the "regular guy" route and bypassing the now-tired public relations strategies of releasing statements and giving speeches (though it was an apparent eagerness by the White House to release a statement that arguably exacerbated the situation in the first place).

Yet for all that, in the week since the summit, the president's approval ratings have gone down. To what can this be attributed? Well, perhaps a large section of the population is unwilling to forgive the First Bartender for his initial response to the incident. Perhaps the health care debate has now overshadowed this moment. Or perhaps there is one other factor at work.

We have become accustomed to having voyeuristic rights. Survivor and other shows have taught us that we should expect to be able to sit in when "the council" is meeting. The paparazzi have given us access to intimate moments in the lives of public figures. And even on the campaign trail, politicians trade on the media's supposed ability to reveal the "true side" of a candidate. So I can't help but wonder if there was some public resentment, even if unconscious, over not being allowed into the party. As we watched the Gates/Crowley affair unfold like a story (and the participants themselves make for some pretty good characters), we wanted to be able to see the denouement. But instead, we were ordered to clear out of the theater.

I'm not at all advocating that there should have been cameras in the Rose Garden and microphones on the table. But I fear that because there weren't, we might now be stuck with a culture that will play for ties rather than victories.

Sunday, August 02, 2009

Back to School? How About Never Leaving?

Google News reports over 9,000 "back to school" stories over the last week. Retailers have an obvious interest in keeping this phrase at the forefront of our cultural consciousness, and media outlets are all too happy to have a retail phenomenon to report on, so no matter how tired we get of hearing this phrase every year at this time, it seems a good bet that it will continue to haunt generation after generation of schoolchildren. But I wonder if students should be so loath to embrace the concept. In fact, I wonder if this phrase (and the marketing push that it symbolizes) has such inertial power that it can by itself ward off the year-round school movement. If "back to school" is indeed an unassailable annual ritual, we need an "away from school" to precede it. Of course, it was economic considerations that originally led to the concept of a summer vacation, and though the economic landscape of this country has changed, I wonder if commerce hasn't morphed into an equally powerful influence on education. (Additionally, it should be pointed out that various tourism lobbies have sought, often successfully, not only to preserve the status quo, but to extend the back end of summer vacation.)

But despite the challenges posed by commerce and tradition, there are those who continue to advocate that we need to rethink the school calendar: that children will learn better if they don't have a large gap in the middle of the year, that too much time is spent every fall re-adjusting and reviewing, that too much regression slows students' educational advancement. Of course, these are valid points, and there is a fair amount of research that substantiates such claims.

Meanwhile though, there continues to be another strong and persistent theme in education reform, which may be summarized as a backlash against standards-based curricula. These voices argue that too much time is spent "teaching to the test," that students' individual educational needs aren't being met, that an emphasis on "skills and drills" is supplanting the ability of teachers to find creative and meaningful connections between the world and the classroom.

At first glance, these controversies seem mostly unrelated. But I believe that a new educational initiative just might be able to make everybody happy. Is there a way to make students actually want education to be a year-round endeavor? Call me naive, but I think when students look forward to summer break, they are looking forward to escaping from institutionalization, not from learning. What if students could be challenged with a summer curriculum that is not about standards, skills, and tests, but about individual pursuit of knowledge and personal growth?

Here is what I propose. Students would be given the opportunity to negotiate a summer-long course of study with a "mentor." Mentors could be recruited by school districts, and include not only teachers, but retired teachers, professionals in an appropriate field, or perhaps even older students or aspiring teachers. Students would be asked what they want to learn about, and a series of research questions would be finalized. There would be an emphasis on hands-on experience in order to answer the questions, but some reading would also be required. Perhaps some students could work collectively and cooperatively. Maybe informal discussions could be scheduled, but the groups of students and mentors would meet outside of the rigid structure of the classroom.

I could foresee that for some older students, summer employment would be folded into the course. (And perhaps businesses, which could benefit from the arrangement, could help with funding). Family vacations could easily be incorporated into the project. And in the end, the "how I spent my summer vacation" essay would be rendered much more meaningful, as the student would (hopefully enthusiastically) set about to not only describe their relevant activities, but to answer their original research questions.

Of crucial importance to the workability of this proposal, the retailers would still get to have "back to school" sales. Though hopefully students will no longer assume that "school" necessarily refers to a building.