Saturday, December 29, 2007

Goodbye Spidey

I've read in a couple of different places that marketers make a priority of targeting adolescents because consumer choices that one makes in those formative years tend to become ingrained habits that persist well into adulthood. Though I've often said that if everyone had my consumer habits the economy would collapse, I can't claim to be immune from the pull of adolescent buying habits. For example, I started subscribing to USA Today Baseball Weekly as a 6th grader in 1990, and I still subscribe today (under its new name Sports Weekly), more out of habit than anything else. Likewise, I started purchasing Spider-Man comic books around that time, and never really stopped. Until now.

Why the drastic move? A little backstory is needed. Spider-Man was created in 1962, and over the intervening years, his character grew and changed. He graduated high school, graduated college, experienced triumphs and tragedies in not just his costumed life but his personal life as well, and finally in 1987, he got married to Mary Jane Watson. Pretty much immediately after this event, writers and editors started scheming for ways to undo the marriage. They thought that this made Peter Parker less relatable, and that he should be eternally unlucky in love.

I was pretty much ignorant of the controversy surrounding the marriage, having come aboard in about 1991, but I kept reading Spidey comics through some weird times, all brought about by attempts to undo the marriage. In 1994, they established that the Spidey of the previous 30-some years was a clone. The plan was to have the Peter clone retire and live happily ever after with Mary Jane, while the original Spidey would return as a relatable single guy. This turn of events blew up so spectacularly that it was (somewhat simplistically) blamed for Marvel Comics literally going bankrupt (they've since been bailed out by their lucrative film franchises). The clone plan was scrapped (if you really want to know more about it, there is a Wikipedia entry). Then in 1999 came the infamous "relaunch," with Mary Jane seemingly killed in a plane explosion. Sales went absurdly low and she was brought back (comic book characters are notoriously able to resurrect themselves).

Still, Marvel's editor-in-chief, Joe Quesada, has been openly scheming for years about his desire to end the marriage. His solution came this week in the form of the universally reviled "One More Day." Long story short: Peter's Aunt May was shot, partially because Peter had decided to make his secret identity public. She is in a coma, teetering on death's door. Blaming himself, Peter vows to do anything to save her. In comes Marvel's answer to the devil, a character named Mephisto. Mephisto offers Peter and Mary Jane the chance to save Aunt May. All they have to do is agree to let him rewrite history so that they were never married. They reluctantly agree, and Peter wakes up, now living with Aunt May. He goes to a party where Mary Jane gives him the cold shoulder. Also at this party is his friend Harry Osborn, who had died in a great story 15 years ago. (I thought the death of Harry in Spider-Man 3 was poignant, but the comic book death was incredibly moving).

So not only do we fans lose the marriage (and unscientific Internet polls showed fans overwhelmingly in support of the marriage), but it appears as if we have lost 20 years of history (covering my entire reading lifetime). Not only that, but our "relatable" hero just made a deal with the devil. Words escape me. However, they do not escape our less-than-esteemed editor (some fans have taken to calling him "Joephisto"). Here's what he had to say:

Sometimes when I look at the way that the lines of opinion have been drawn in comics about the marriage, I see the argument falling into two basic camps. The fans may not perceive it this way on the surface, but it is what's happening when you look at it clearly. When we fall in love with these characters, we claim ownership over them in our own way; so for some fans, Peter belongs to them and no one else. So, the way I see it, there are two sides of the argument, two segments of fans. On one side, there is a contingency of fandom that wants Peter to age along with them and live life as they do. He needs to get married, have kids, then grandkids, and then the inevitable. On the other side, there are fans that realize Spidey needs to be ready for the next wave or generation of readers, that no one can lay claim to these icons, no one generation has ownership and that we need to preserve them and keep them healthy for the next batch of readers to fall in love with.

The rhetoric here is appallingly bad. First, it is arrogant and condescending ("Fans don't realize it, but I am smart enough to see the truth"). Second, he uses no fewer than three logical fallacies. He's got a strawman (right Joe, there are so many fans clamoring for Spidey grandkids). He uses a slippery slope, and he tosses in a false dilemma for good measure.

He's actually right that there are two camps, but he's a bit off in his assessment of the situation. Here's the way I perceive it. One group of fans indeed claims ownership of the character, and they see Peter as someone they want to relate to. On the other hand, there is a group of fans that don't really want to relate to Peter--they just want to see drama, and I will concede that there is more room for drama with a single Peter. If Joephisto would come out and say it is the second camp that he wants to appease, then it would be less galling. But he actually claims that it is for the first camp that this change was necessary. Any Spider-Man fan under the age of 30 grew up with a married Peter Parker. Like me, they started reading when he was already married. There is obviously something about the character that has made a whole generation of fans relate to him. And now that generation has been cast aside, but for whose benefit? Joephisto claims that it is for the future generations, but I wonder. Joe is a little older than most comic fans. He grew up in the era when Peter was still single. Could all of his blustery rhetoric about future generations be designed just to cover a reactionary shift back to the preferred status quo of his generation?

So until sales once again dip so low that Marvel is forced to undo this, I'll be boycotting the Spidey comics, or at least I'll stop buying them. I will probably pursue methods of reading them that don't put money in Marvel's pocket (adolescent habits are hard to break, after all). And for a less articulate but more humorous fan response to the situation, you can check this out (warning: it contains immaturity, foul language, and literal bathroom humor, and is in very poor taste):

http://uk.youtube.com/watch?v=PjN0ThpswRc

Saturday, December 22, 2007

What's in a Name.

Being a Bob Dylan fanatic, a while back I drove 90 minutes to see the unconventional biopic I'm Not There, a film which has six actors, including a woman and a black boy, portray seven characters based on Dylan, though none of them are actually called "Bob Dylan." However, this may not be as indicative of true fanaticism as my concurrent quest to read literally a couple hundred reviews of the film. I've grown quite good at glossing over plot synopses and focusing on whatever subjective elements are offered by the reviewers.

What I've learned is that there is an absurd amount of groupthink in reviews. I guess I'd already known this thanks to my similarly obsessive interest in reading reviews of comic book movies, but it is disappointing to see that art house features get the same kind of treatment. In any event, I should have been pleased when I encountered a review that offered something new, but I wouldn't say I was exactly pleased when I read the novel observation in The London Telegraph that the actors' names in the opening credits appear with punctuation, namely a period. I hadn't noticed this when I saw it in theaters. The reviewer somewhat gratuitously points out, "that's highly unusual: even in real life very few of us ever dot our cheque or job-application signatures." The Telegraph went on to inform me that Franz Kafka used to dot his signature, and that a literary critic saw this as evidence of Kafka's desire for "cosmic finality," i.e. suicidal tendencies.

The reason I was less than pleased with this sequence of thoughts is that I used to be one of the few who actually would sign my name with a period.

However, upon further analysis, I feel that I have no need to retroactively seek professional help. Nor am I inclined to agree that Kafka was asserting a need for "cosmic finality" with his unusual signature. So what was it all about? I think for an answer, we need look no further than I'm Not There.

The film's director, Todd Haynes, has stated the premise behind the film isn't so much to portray the life of Dylan, as to show that Dylan embodies an American philosophy and mythology, the idea that "authenticity" is a false vision, that it is not attained but created--in other words, the idea that identity is not imposed but constructed, and this is to be celebrated.

At the center of identity is an individual's name, which is usually imposed (and not just in the sense that people rarely name themselves, but also in the limited pool of names in circulation). The philosopher Althusser saw this as a key step in what he called "interpellation," the way in which society imposes subjectivity.

I first started "playing" with my name in 8th grade, for a brief time choosing to sign all three of my names, as I wasn't able to reconcile the arbitrary way in which most of us dismiss our middle names (or perhaps I was influenced by Pittsburgh Pirates pitcher Jerry Don Gleaton). I kept this up until my sophomore year of high school, when I reverted to the default two-name pattern, only to come up with the "period" innovation my senior year. I was even insistent with my high school yearbook editors that my name appear with a period. I have no clear recollection of any reason, Kafkaesque or otherwise, which triggered this move. In hindsight though, I realize I was deconstructing the form.

This lasted a couple of years, and my practice of adding a period to my name received scant notice. My next move, though, did result in attention. I created entirely alternative names, and messed around with a few, before settling on "Azor." A few years ago I finally took the legal step of changing my middle name. Over the years, I've bristled when anyone has asked the significance of my name. "It's just a name," I've repeated a few times, not even offering a hint that I adopted it. It should be noted that I made the move prior to developing a strong interest in Dylan, but to anyone wondering about the name change, I would have to proffer Dylan's answer, given in a 2004 interview with Ed Bradley: "Some people are just born with the wrong name. (Pause). It happens."

It turns out that this was only an edited answer, with the full-length answer just recently leaking out:
Ed: Tell me how you decided on “Bob Dylan.”

Dylan: Well I think it’s pretty much — I don’t know, I was talking to the guy in KISS one time, y’know Gene Simmons, he’s a guy that used to have another name. I don’t know what it was. And he just said it popped into his head one day. And who else — I was talking to somebody else too. Well, all the rappers, y’know? A lot of rappers give themselves different names, because that’s who they feel they are, y’know? They’re not that person that everybody knows when they go to school. They’re more into other things and they need another name.

Ed: You were into other things?

Dylan: Yeah, I mean, you call yourself what you want to call yourself. This is the land of the free.

The reference to the national anthem is well in line with Haynes's attempt to portray Dylan as the fulfillment of an American vision. What is also notably absent from the discussion is the assumed appropriation of Dylan Thomas's name. Here is what Bob Dylan himself once said about this:

"Get that straight, I didn't change my name in honor of Dylan Thomas. That's just a story. I've done more for Dylan Thomas than he's ever done for me. Look how many kids are probably reading his poetry now because they heard that story."

I think I understand how Dylan feels here. Yes, the name "Dylan Thomas" might have served as an antecedent, but what is difficult for people to understand is that this does not imply significance. People are desperate to discover some significance to the name change, but the significance lies in the fact that there is no significance, that significance is present only in its absence. You can only find it in realizing that it is not there.

Wednesday, December 19, 2007

Play-offs Schmlay-offs

Fantasy football team owners everywhere are furious with the Philadelphia Eagles' Brian Westbrook. To make a long story short, Westbrook had a chance to score a touchdown Sunday, but intentionally stopped himself from doing so because he realized that if he stopped at the 1-yard line his team could run out the clock without giving the ball back to Dallas. While this was great for Westbrook's real team, it was horrible for many fantasy teams that had him on their roster. What made it even worse is that this time of year, fantasy leagues are in the play-offs, and therefore his move likely cost the season for some owners.

I have no sympathy for anyone who lost in the play-offs this week, because I think the idea of having play-offs in a fantasy league is absurd. I realize it is standard procedure, but it shouldn't be. It is ingrained into the mind of the American sports fan that every season must culminate with play-off games, but no one questions why. The truth is that play-offs are simply a marketing gimmick. This truth briefly flickered into public consciousness when NASCAR, of all sports, added a play-off component. Fans saw this for what it was: an arbitrary move that serves to increase interest and TV ratings while not actually moving toward a more fair representation of who should be crowned the "champion."

In truth, the other sports with play-off systems are not really any better than NASCAR, it's just that we've had generations to become accustomed to them. The longer the regular season is, the more accurately we can read who the best team is. From a purist's standpoint, the team with the most wins in a 162-game baseball season is the best team. The NFL is a little bit harder to justify, though it's hard to argue with the need for a play-off this year to determine that the New England Patriots are the best team in the league.

All that said, I can see why we have play-offs, and as a sports fan, I won't call for their abolishment. However, there are two inescapable conclusions to be drawn: first, play-offs in fantasy sports should be outlawed, and second, we live in some kind of a weird world where the one sport that legitimately needs a play-off is the one holdout that resists one.

Wednesday, December 12, 2007

Attendance Policies in Composition Classes: A Rogerian Essay

I recently assigned my ENG 101 students the task of writing a Rogerian essay. I thought I'd be a sport about it and write my own, just for fun. Enjoy:


It is a sad reality for teachers at all levels that complete student satisfaction is unattainable. This is especially evident in the volatile realm of grading and assessment in higher education, where the stakes for student achievement are high. There is a fundamental tension in the economy of assessment and grading that will always carry the potential for dissatisfaction. It is in the best interests of students to demand a flexible grading system, which will allow them to receive high grades for minimal effort, while it is in the best interests of instructors to supply a rigid system that will grant high grades only for maximal effort. In composition courses, the potential for frustration is often greater, given the subjective nature of assessment.

Despite the reality of inevitable student dissatisfaction, it is possible and desirable for instructors to take some steps to minimize it. They should be willing to listen to student concerns, and make an effort to understand the student’s perspective. As a matter of course, they should also be willing to articulate to students their philosophy of assessment and explain why they put a premium on certain elements. This paper will explore the common practice of assessing first year composition students partially on the basis of class attendance.

In a recent classroom discussion on the matter, many students expressed to me unhappiness with this policy. They’ve articulated to me three primary reasons why. First, they believe that attendance policies should be associated with primary and secondary school, with legal minors who aren’t equipped to decide whether attending school is to their long term benefit. Students argue that once they get to college, they are mature enough to make their own decisions about the relative benefits of attending class, without a draconian authority figure demanding they attend, in effect treating them like the children they no longer are.

Second, students maintain that instructors often don’t understand the real world demands that keep them from attending class. It’s not as if they don’t want to go to class, they say, it’s that they can’t. In ideal situations, going to class will enrich a student’s mind, but in the short term, this is not going to help put food on the table. Other students have family obligations. For example, when a student has a sick child, the health and well-being of the child far and away outweighs any benefit that could be gained by attending class.

Third, students tell me that since they are paying for the right to attend class, they should be treated as consumers. If a patron buys a movie ticket, then chooses not to attend the movie, there is no added penalty beyond the loss they’ve already suffered. By this reasoning, if a student pays tuition, then fails to attend class, they’ve already been “punished” by not getting their money’s worth, and according to the students’ argument, in our free economy we have the right to squander our money as we see fit.

Finally, though no student brought it to my attention, I think there is one more argument they could make. In composition classes, students write essays. Doesn’t the quality of these essays indicate the achievement level of the student? If students can write “A” quality essays without having to go to class, aren’t they still deserving of an “A”?

Though these arguments are far from specious, and though I have put the best construction I could on them, from my instructor’s perspective, I have a counter-argument to each of them. First, regarding the idea that attendance policies are more applicable to lower levels of education, I think students who make this claim are missing a key distinction between attendance requirements at the two levels. Even if attendance is required in order to attain a certain grade in a college course, it is not legally compulsory that students attend. If a student under 18 is habitually truant from high school, they will find themselves in front of a judge, and they will be given legally binding orders to return to school. If a student over 18 is habitually truant from college, they will suffer no legal ramifications. No truancy officer will be knocking on their door. If they don’t want to go to school, they don’t have to.

In regards to the argument that students have real world demands that prevent them from coming to class, I would assert that though there are valid reasons not to attend class, there aren’t valid reasons to skip class and then still claim a right to the benefits conferred upon those who do attend class. Life is about making tough decisions, and sometimes we are confronted with dilemmas that force us to choose to give up something we want in order to keep something else that we want or need. In instances where students determine that they can’t attend class, I bear them no hard feelings, but I also don’t feel obligated to allow them special privileges.

As for the argument that students are consumers, I don’t accept the premise. Taken to its logical conclusion, if this statement were true, we would simply ask the student to send in a check for “X” amount of dollars and award them a diploma. This would certainly cut down on overhead. (In fact, I think there are some on-line schools that operate on this principle). There is a difference between the type of economic exchange in most consumer situations, in which the customer expects a tangible good or service, and in education, in which the “customer” is given something intangible, though also, in my estimation, invaluable. Also, as a side note, it is not completely accurate to say that all students are paying for their education, given that many of them have received scholarships or grants. When these students choose to skip class, they aren’t squandering their own money, but the money that others have invested in them.

Up to this point, all of my arguments have been negative. I’ve discussed why I think the students who are against attendance as a grading component aren’t seeing the big picture. Yet, if I persist in keeping attendance as a part of my assessment, I feel as if I should provide positive arguments, reasons why it should be part of my policy.

My first positive point also happens to address the possible argument that students who can write good essays without coming to class deserve good grades. I think what many students fail to understand (perhaps because we instructors don’t do a good enough job emphasizing it) is that we tend to be process-oriented rather than product-oriented. To many students, the value of a class is measured in outcomes, or the products that are produced. In this way of thinking, a writing class is measured in the essays that are produced.

While this may be true for certain courses (such as a capstone course in graphic design, for example), it is not a philosophy of mine or of any composition instructor I know. Although the finished products are important in this course, the main goal is to give students an overview of rhetoric as a discipline, of the art of argumentation. We want to help students to not only become better writers, but better critical thinkers, able to evaluate the rhetorical choices that others make. Every class, I attempt to come up with approaches that will further this objective. If I didn’t believe this, I truly would be cheating students out of their money’s worth. As such, it is my belief that every class is a valuable part of attaining the full scope of the course, and a student who is not regularly in class simply cannot claim to have learned as much as a student who attends regularly (of course, I am assuming that students who attend are always mentally engaged with the material, which might be a generous assumption). Therefore, it stands to reason that it would be inaccurate to award each of these students the same grade, even if their products, their finished papers, are relatively similar.

There is another reason for holding students accountable for attending class. When a student misses class, they hurt more than their own education. I believe that they are also adversely affecting the educational climate of their classmates. When a student enrolls in a course, they should be able to assume that they are entering a learning environment, where they will benefit not only from their instructor’s expertise, but also from the unique perspectives of their peers. When a student is not in class, that means they are not participating in class discussions, and the dynamics of the class are altered for the worse. Also, if there are many students habitually absent, it causes low morale among the students who make the choice to attend class, perhaps influencing their own decisions about whether to attend. In short, though some would disagree, I think that students have a responsibility not only to themselves, but to their peers.

Still, despite my insistence that attendance is a crucial part of the learning experience and an entirely appropriate component for assessment, I can’t help but hear the cacophony of student voices objecting to the policy, and I am willing to compromise. At an English department meeting prior to the school year, instructors discussed attendance policies. While all were in agreement that poor attendance should adversely affect one’s grade, instructors were split on whether there should be a positive effect to reward students who do show up for class. “In the real world, you don’t get a cookie for showing up for work,” said one instructor, “you get to keep your job.” While this is true in the real world, I’ve already established that I don’t think the comparison between education and commerce is a valid one, and I’m willing to concede that it doesn’t hold true here, either. I’m making the assumption that if students are coming to class, they are learning, and that should reflect on the final grade (though I’ll admit that many of my colleagues would say that it is too much of a stretch to assume that students who come to class are necessarily learning).

Furthermore, I realize that “stuff happens,” and it is unreasonable to expect students to be able to attend every class. That is why I believe it is reasonable to give students three free absences before their grade is affected. Here, I believe real world comparisons are valid. In a fifteen week course that meets three times a week, three absences represent a ratio of one sick day for every fifteen work days. This is an offer that most unions would jump at.

I’m resigned to accept that no matter the policy, some students will be unhappy. However, I hope that by demonstrating a sensitivity to student positions, articulating my reasons, and promoting some compromises, most students will be comfortable and relatively satisfied with having attendance be a part of student assessment.

Saturday, December 01, 2007

The Beauty of Geeks

A couple of years back, when my wife was teaching third and fourth graders, one of her students accused another youngster of referring to him as "gay." When confronted about the matter, the accused made the case that he was simply commenting on his classmate's state of happiness.

Aside from the intrinsic humor in this anecdote, it is also illustrative of how the knowledge of a now-defunct denotation of a certain word endures even to a generation that literally has no first-hand memory of anything of the 20th Century. I'm sure there is no shortage of words in circulation today that have sharp deviations from their original etymology, but this particular term has had such a radical shift in meaning in such a short amount of time, and has the added weight of carrying such heavy political and cultural signification, that both definitions are seemingly embedded in our cultural DNA.

Yet there is another word that has, in the same or even less amount of time, undergone just as marked of a change, with significantly less overt recognition. As of the mid-1970s, or less than one generation ago, the word "geek" was defined in dictionaries as "one who bites the heads off of chickens at carnivals." By the mid-1980s, when I was in elementary school, I knew the word as a derogatory term synonymous with "nerd." (I became aware of the original definition of the word through a Nintendo Jeopardy game in the early 1990s.) By the mid-1990s, I came to associate the word "geek" with one who has esoteric knowledge about non-academic subject matter (as opposed to nerds, whom I regarded as possessing specialized knowledge in academic subjects). The Wikipedia page offers many definitions of the term, most of which are compatible with my previous definitions.

While I think the circus/carnival origin of the term "geek" is fairly well known, I get the sense that it is far from universally known, and I'd be surprised to hear of a third grader who would offer a similar etymological defense as the one cited above. In other words, the shift in meaning of the word "geek" was even more sudden and is even more complete than the shift of the word "gay."

This raises the question not only of how the change happened, but why it happened. Necessity being the mother of invention, it makes sense that the rise of the information age would give rise to the need for some kind of term to describe emergent subcultures of people devoted to the suddenly burgeoning and splintering realms of pop culture. Though professional sports have been around in America for more than a century, was it possible to be a "sports geek" before the existence of ESPN, which started in 1979? Though motion pictures had been around for decades, could one be a "[insert genre] movie geek" before the VCR hit the mass market in the late 70s?

But even though a term may have been needed at that time in history, why did it have to be a term that was previously used to describe sideshow freaks? I'm not sure it's even possible to answer that question, but I (naturally) have a theory. There must have been a tremendous amount of ambivalence toward those original geeks (from the human spectators, if not the poultry, who likely had one reaction). They were likely regarded with a mixture of esteem, envy, revulsion, and fear. They were likely esteemed because of the specialized skill they exhibited. Not everyone can bite the heads off of chickens. They were likely envied because, well, anyone who can command a spotlight has always been envied, even if they command attention for the most dubious of reasons (and I doubt that this is peculiar to our reality TV show era). They would have been reviled for obvious reasons, and they would have inspired fear in that they held up a mirror to humanity and exposed the barbarism that still resides in the human breast.

The question I pose is this: in the Trekkie, the sports nut, the comic book guy, the computer technician, the cinephile, etc, do we simultaneously celebrate and loathe them? Do we esteem, envy, revile, and fear them? If the answer is yes, I would posit that we can explain how the word "geek" has transferred denotations so effortlessly.