Thursday, December 31, 2009

Last post of whatever the heck we're calling this decade

With the days winding down where the date above them ends with a 9, the window for recapping the year that's about to end is, itself, coming to an end. Within a day or so of New Year's Day it seems like the appetite for such things dries up, and then people are focused on where the new year is going rather than dwelling on where the previous year went.

This year presents an opportunity that's greater than the typical year-end recaps, as it is perceived as the end of a decade. Well, of course it's the end of a decade; a decade is a period of ten years; on any given day it's the completion of a window of time that started ten years previous. Obviously, today it will be true that the ten-year period that started January 1, 2000 will come to a close, and as such, that appears to justify devoting air time and column inches to making lists of bests/worsts/whatever in that period. With 24-hour news, specialty networks, and the internet there are certainly plenty of outlets that need filling.

Being the first decade of the century it does seem as though the counting question that divided people ten years ago at this time can still come into play. Namely: Does the decade start at 0 and run to 9, or start at 1 and run to 10?

As we approached the end of 1999, in addition to people stockpiling supplies in fear that programmers had not corrected their code pertaining to the use of two-digit years, the main argument was whether January 1, 2000 constituted the commencement of the new millennium or whether that event actually fell to the first day of 2001. The answer was: It only matters to people who give a crap about delineating history on a base-10 structure.

Clearly common parlance has now declared that the new period started in 2000, whether that's right or wrong. Even though that makes the first decade of the previous millennium only nine years long, the adjustments to the calendar that have occurred over the last couple thousand years really make nit-picking about such things rather moot.

That the year is the length that it is makes some sense to the extent it reflects the relative position of the earth in its orbit around the sun, but as that period of the earth's orbit doesn't match 365 of the 24-hour periods we call days there's the need for extra days in leap years, so even what we call December 31 does not always correspond with where the earth was in its orbit the last time we called a day December 31. However, it's the system we have. All references to a given day are what we choose to interpret as having significance, but at least there's some system in place to try to keep a modicum of order. Grouping years into arbitrary periods like decades (and centuries, and even millennia) is pure semantics; if one wants 2009 to be the end of a decade it's the end of a decade.

When 2020 rolls around it will be far enough away from 2000 for the millennium-or-not debate to be old news, and it will be the start of a period where the second half of the date will all be pronounced the same for ten years straight—that is, two-thousand twenty, two-thousand twenty-one, two-thousand twenty-two, etc.; it will be what will later be called "the Twenties" and become the first easily referenced decade of the twenty-first century (and will be followed by seven more such distinguishable decades, to close out the hundred years that commenced in 2000). As we complete a decade of years where the first part of the spoken date—two-thousand—was the same, that part will remain the same through the century, so it does no good in providing a handy nickname for the period completed at the end of today.

I've heard some attempts at branding this decade "the aughts," hearkening back to the turn of the previous century, but I'm skeptical that in the era of Twitter such an antiquated term will catch on, especially as it hasn't happened yet (despite some trying to make it so for half of the expiring decade).

When VH1 resurrected their I Love the... pop culture retrospective series (after making multiple versions of I Love the '70s, I Love the '80s, and I Love the '90s) for the first eight years of this decade they clearly punted on coming up with a term, wussing out with I Love the New Millennium. Had they boldly named it "I Love the Aughts," or, really, anything that didn't involve the entire next thousand years, then this would all be resolved, but instead we're in a bit of a vacuum about the determining factor. We'll just have to wait and see what term will be adopted.

I suspect "the first decade" might prove the most easily understood when people in the future attempt to refer to this period, but admit it lacks sufficient panache to get a TV series named after it.

The coming decade will have a majority of years ending in "teen," and "the Teens" likely will end up being their moniker when all is said and done, with years ten, eleven and twelve remaining implicitly included.

The impending ten years, with the second half of the full date finally hitting double digits, will also prove an interesting period, pronunciation-wise: While the single-digit years lent themselves only to being spoken as "two-thousand one," "two thousand two," etc., and not "twenty-o-one," "twenty-o-two," etc., the Teens could see a shift to "twenty ten" over "two-thousand ten" (and so on). Personally I prefer "two thousand," being a mere extra syllable over "twenty," but I have no illusions about exerting general influence in such matters. (Or really in any matters, but I digress.)

Of course, the triple-syllable "eleven" we'll hit the following year will almost certainly make prefacing that with "twenty" less of a mouthful, and that probably will be the end of "two-thousand" until perhaps 2020, when "twenty twenty" may sound too much like (what probably will by then be an outdated) nighttime news program. Not that it's likely there'll even be news programs then, but some of us will still be around from the time when such things existed, and may find it sounds weird speaking the date that way.

I'd lay pretty good odds that 2021 will be "twenty twenty-one" even if the year before it was "two-thousand twenty," and the "twenty" will supplant "two-thousand" as the common reference for the remainder of the century.

And not to sound morbid but I don't expect to still be around when next we have this what-to-call-the-first-decade conundrum. Unless, of course, reincarnation is what awaits me in the afterlife, in which case my future self may be having this same rumination 100 years hence, in whatever incarnation computers and the internet have taken.

Note to theoretical reincarnated future self: If the content of the current internet somehow persists into the 22nd century and you come across this, it could prove what the afterlife is. Of course, if the expansion of what's on the 'net continues to increase as it has, and then there's 100 more years' worth of blogs and photos and YouTube videos, finding this will prove an even greater proverbial needle in a universe-sized haystack. (That expression may mean nothing to you by that time, but you should be able to look it up in whatever has become of Google.) Unless you happen to be searching for "world's largest corn dog" (presently #3!) or "romanesque broccoli"—the two posts of mine that show up reasonably high in the lists of results by search engines in the early 21st century—it's highly unlikely that you'd ever find that these words existed. (Which, considering that hardly anybody here in the 21st century can find these words, is the realistic expectation.) Especially in light of the greater likelihood that in 2109 the media (whatever that is at the time) will be too busy recapping the bests and worsts of that decade.

Whatever they're calling it.

Happy 2110.


And for my reader back in 2009: Happy Old Year's Day!

Sunday, December 27, 2009

Closing down the holidays

No one loves the timing of Xmas as much as the media. Coming a week before the end of the year, it creates a dead zone of focus, where the public doesn't really expect much in the way of reporting. Magazines can fill their pages with recaps of the year that's coming to a close, which they can assemble ahead of time so the staff can take time off. The government takes a break so there's not much to keep tabs on there. The perception is that everyone just wants a pleasant respite from the usual crap for a week or two, which is almost certainly the case.

However, were there not this window created by the proximity of Xmas to December 31 for this collective breather to occur the media might not have this opportunity to fall back on retrospectives and best-of lists. It's not that they wouldn't still do that stuff, but if they had to continue with active reporting during the latter half of the twelfth month they wouldn't be the only cover stories on the newsstands.

Of course, conversely, were it not for the proximity of New Year's Day to Xmas one wonders whether that holiday would be what it is. Okay, let's not even pretend it would be. Having January 1 off would be just another day not at the job if it didn't carry the association with "the holidays," and, most important, with it being the end of that period. After a month of having "the holidays" dominate everything it becomes more necessary than usual to blow off proverbial steam.

It celebrates the return to normal, which is definitely necessary. Something must signal that, okay, really, the decorations need to come down. And you get some college bowl games while enjoying the day off for no other justification than you had to change the calendar.

Thursday, December 24, 2009

Breaking it down, Xmas edition

In the popular holiday song "Santa Claus Is Coming to Town" it's pretty easy to see how opening with "You'd better watch out" and later a line like "He sees you when you're sleeping, he knows when you're awake" could be somewhat creepy, with Big Brother-esque overtones. However, the closing of that couplet that ostensibly softens the potential for paranoia—"He knows if you've been bad or good, so be good for goodness' sake"—is completely undone by that first part. The assertion that he has you under constant surveillance renders the potential for "being good for goodness' sake" impossible; any good done is not performed for its own sake but in order to avoid making the sort of mistake that, under Santa's purportedly ubiquitous gaze, would get you on the "naughty" list. Any otherwise altruistic gesture becomes selfish because, at least on a subconscious level, it's done to avoid the negative consequences associated therewith.

And what sort of error in behavior could bring one to that disgraceful status? According to the opening stanza, crying alone is sufficient ("you'd better not cry"). Sure, the next part alludes to pouting, but as that's mentioned separately crying becomes its own item on the list, independent of any connection with pouting. So if you're sad you'd better just suck it up because only stoicism gets one on the "nice" list.

If anyone actually paid attention to and followed the words of the song he'd be emotionally stunted and generally distressed about Christmas. He might find himself unwittingly compelled to deconstruct the lyrics in a manner not unlike this. And who'd want that?


Speaking of holiday song lyrics that just slide by, what about this one from "Winter Wonderland": "We'll have lots of fun with Mr. Snowman until the other kiddies knock him down" (emphasis mine). It is simply taken as a given that the joy brought about by putting all the effort into making a snowman and having whatever constitutes "lots of fun" with it shall be brought to an end by uninvolved children who seek destruction out of what seems merely mischievous but almost certainly suggests deeper psychological issues—ones that really should not be glossed over so glibly, even if "down" is a convenient rhyme for "clown" (which the pretense in that prior line takes the snowman to be).

Kids may be kids, but to take down a snow sculpture—even an amateur one—with such wanton disregard for others is not something to be tacitly encouraged in popular song.

The kids are already paranoid from Santa watching them; we needn't be turning them into delinquents as well.


No, "Winter Wonderland" makes no overt reference to Xmas or any other holiday, but really, when else do you hear it? It's a "holiday" song by association if not intention.

And it really has no application to areas like Southern California where there's little potential to make snowmen. Perhaps a sculpture in the style of a sand castle, but not actual snow.

Of course, the sunshine has enough songs about it; that the snowy landscape of winter gets one is only fair, even if it is just trying to romanticize the bleakest season.


Heck, the reason Xmas (the capitalistic appropriation of Christmas) is as big as it is probably stems from being in winter. Sure, the marketing possibilities are attractive, but one could concoct a holiday that takes advantage of consumerism and put it in any month and get a decent level of involvement. To have it really take off, one needs to have it take place at a time of year when the darkness exceeds the light, when cold reigns, when people are forced to pretend that getting bundled up is fun.

Had it started in the southern hemisphere, during their winter, and then fallen in the northern hemisphere's summer I see little chance it would have the same popularity amongst those above the equator. Not that it would have no following, but it would be on par with, say, the U.S.'s 4th of July, or perhaps Halloween—good, but not something that people anticipate for a month beforehand.

People don't need a respite from long, sunny days in the same way that they do from gray, snowy days. Sure, sweltering summer heat can suck just as much as freezing winter cold, but it's never so hot that they close schools (much to the chagrin of children who may have to attend classes during those days).

And imagine the Christmas carols that would come from that.  Bah humbug.


Merry Christmas, or Merry Xmas, or just happy Friday tomorrow.

Tuesday, December 22, 2009

Illuminated quiz

What can one tell from the egregious, Griswold-esque displays of Xmas lights?

a) who is most excited about the holiday
b) who has the greatest need to compensate for feelings of inadequacy in other areas
c) who will have the largest electricity bill come January
d) all of the above
e) none of the above
f) more of a couple of the above but not so much one of the other
g) not quite enough of all of the above or not quite enough of none of the above but somewhere in between
h) what was the question again?


And here's an example of a not-quite-egregious amount of lights, from a house down the street in my neighborhood.

Sunday, December 20, 2009

Jingling the bells

This time of year it's a bit difficult to escape Xmas music when one is out in public. That it's playing in shops is to be expected, of course; it serves as an advertisement for the holiday shopping season by reminding us of the excitement we felt as children, listening to these traditional songs over and over while parents decorated the house and we waited to see what Santa would bring us (unless we were Jewish, or Muslim, or our parents were Jehovah's Witnesses...).

Even with the ubiquity of this music out there I still spent two full hours one night this week downloading Xmas tracks; my wife wanted to increase our holiday music selections, so I set about the task of making her happy. However, it soon started taking me down my own Santa Claus Lane.

Despite having Bing Crosby drilled into my ears for decades—actually, because of having had Bing Crosby drilled into my ears for decades—I still downloaded his and the Andrews Sisters' version of "Jingle Bells." As these songs go, you really can't go wrong with Bing. The man is Christmas, music-wise. I mean, when I pause to think about it, I cannot think of a single Crosby song that is not holiday-related. I'm sure the man had a lucrative singing career beyond just that, and I imagine I must have heard at least one song he did outside that genre, but his association with this time of year has utterly washed away whatever my mind may have been able to retain about the rest of his oeuvre.

This year, with eMusic's mid-year acquisition of the Sony catalog, the site had the "real" version of "Jingle Bells" (the one that gets played on the radio) rather than some poor-sounding re-creation (which was all they had last year at this time). That's the thing about these songs: There are specific recordings that were the ones we heard on the radio, over and over, and those are the ones that we want to keep hearing. It's fine and well if some newer artist recorded his or her take on the song, but when it comes to Bing we only want that particular one with the Sisters backing him up. All other versions that may have been put on tape with him singing may as well be destroyed; as far as our minds are concerned they're as worthless as a three-dollar bill.

(Last year, when searching the site, the only Bing Crosby available was some live recording that had an eerie echo quality that sounded like it had been recorded in an empty mansion. Before this year's mission my wife made a point of noting that she didn't want any "spooky" versions.)

The main song she sought, the one she noted as her favorite holiday song, was one she had to hum. "You know, ding-ding-da-ding, ding-ding-da-ding, ding-ding-da-ding…"

"You mean 'Chorus of the Bells'?" I said. (Of course, by that I meant "Carol of the Bells." Hey, at least I was close.)

Searching the eMusic site for that track name returned hundreds of hits. None included the New Age rock version by Mannheim Steamroller, but I did find an actual orchestral recording of Leonard Bernstein conducting the New York Philharmonic. (I also learned that it was originally known as the Ukrainian Bell Carol, composed by a Ukrainian composer named Leontovych.) And I found another version performed in Caribbean style, and another done a cappella, and another with merely harps, and a funky rendition by Shawn Lee's Ping Pong Orchestra.

Sampling merely a tiny fraction of all those versions is part of what took so long, and that's a big part of the joy and pain of seeking out Xmas tunes: There's so many of them.

Digging into the plethora of Xmas music revealed that all the songs one hears over and over on the radio and in stores, etc., are merely the tip of a proverbial holiday iceberg. You may think you know that every artist and his brother has recorded a Xmas album, but until you see how much is out there you have no idea. After a while I started to wonder, as someone who'd heard these songs growing up, if I was contractually obligated to record an album myself. Then I worried that if I kept searching I'd discover that I already had.

There's a reason why places like Starbucks have CDs featuring a sampler of popular Xmas songs next to the register. (Well, there's a reason beyond the obvious, that they know it will sell.) It may not include every favorite tune one remembers, but the mere moments of effort required to acquire it (presumably glancing at the package while waiting for one's coffee) is the ideal amount of time one should spend getting some Xmas tracks for one's collection in order to maintain that "holiday spirit." To explore the depths of all that's out there (or even all that's on a single music website) runs the risk of being crushed under an avalanche of carols.


Perhaps it would be different with some glasses of "holiday cheer" before embarking on such a task. In this case, maybe friends only let friends drink and download.

Ah, holiday traditions of the future.

Thursday, December 17, 2009

Chestnuts roasted

If you live to be 93, Mel Tormé doesn't want you to have a Merry Christmas.

In one of the most covered songs of all time, his "The Christmas Song," the "simple phrase" of wishing the aforementioned Merry Christmas only extends "to kids from one to ninety-two." If you happen to live to see a 93rd year the writer of the song apparently thinks you've had enough merry Christmases.

Come December... well, it seems "the Velvet Fog" is offering another simple phrase for kids who are over 92: It hasn't been said but distinctly implied, it truly sucks to be you.

Go sit quietly for another seven years and maybe you'll get a mention on the Today Show.


Personally, I think you 93+ folks deserve as good a holiday as all those young whipper-snappers, but that's just me.

Friday, December 11, 2009

I went to college. It was okay.

More unnecessary rumination about college that probably should not be shared publicly. Will I ever learn?

I'm not proud to admit this, but I don't look back at my years of attending classes at a state university as having really taught me all that much. For the classes I took only out of obligation, what I had to get from them to pass I pretty much forgot after the semester ended, and for the classes in my area of focus it was more a matter of just proving that in high school I'd learned how to read and assemble an essay. As regards "creative" writing, what I got was a reminder that it took putting some effort into making the story good (something I already had figured out), and that the professor I kept getting (coincidentally) really didn't give much of a shit, and that most people who take those classes aren't doing so because they have talent but because they think the class will be easy. I'm not suggesting it was an utter waste but it was hardly a transformative experience.

There were classes where I did learn some things that stuck with me, although not for any utilitarian reason: the music survey courses I took as electives (an overview of the Romantic period in classical, and the history of jazz). Having a general interest in music (but not having sufficient talent to be a musician), and already having an appreciation for classical and jazz (both of which I listened to during high school, as good background for studying), I found they provided subject matter genuinely interesting to me. By no means did a couple semesters of high-level glimpses of these areas turn me into a scholar who could lecture on the topic, but they did allow me to vaguely identify the distinction between bebop and big band swing, to have a rough idea of what the forms of a symphony were (or at least how many movements there were), and how to pronounce tricky composer names like Dvorak. Ultimately, it allowed me to participate in a conversation on such a topic if one came up at a cocktail party, to avoid seeming like a completely uncultured philistine.

Which, if one gets down to it, is pretty much the practical application of most of what one gleaned from the classes one took at college outside of one's major.

I'm not suggesting that these have come up terribly often, but that the subject matter stuck with me at all, so that I could at least have a rough idea what someone else was talking about if the topic did arise, is more than I can say about most of the rest of the classes I took (and that includes ones in my major… alas). That I actually enjoyed those classes while I was taking them—more than can be said for a lot of others I sat through—is a nice bonus.

And unlike the material I purchased for other classes (which was either sold back or is sitting on a shelf or in a box), what I bought for those classes still comes up sometimes on the iPod (well, the jazz at least; we've discussed the issue with classical before).

But by no means was going to college a waste. Having a degree, if nothing else, proves I can stick with things, see them through, and most important, put up with the bullshit inherent in the system.

That, ultimately, is what one really needs to learn. And how to not seem like a completely uncultured philistine at a cocktail party (such as your spouse's office holiday party, which you may be attending this time of year).

Thursday, December 10, 2009

Learning... take off, eh

My years of university classes, and thousands of dollars spent on tuition, did not imbue me with a lasting knowledge of, say, what distinguishes a Manet from a Monet (other than how to pronounce each name correctly), even though I did take an art history class. A few months of repeated viewings of the movie Strange Brew (a spin-off from SCTV, featuring Rick Moranis and Dave Thomas as their characters Bob and Doug McKenzie, and Max Von Sydow as the villainous brewery operator) during my high school years has left me with the ability to quote lines from it still to this day, despite not having seen it in over 20 years.

In fact, there was a time, before I even started college, when I could recite the entire movie, verbatim, from memory. My mind, thankfully, has not retained that level of recall, but it's certainly got more lingering in there from the many afternoons my friends and I watched that videocassette, over and over, than I gleaned from the periods I spent in that classroom on campus, looking at slides of famous paintings. I do recall that the sculptor who did The Thinker—name started with an R, I think—also was big on ballerinas.

I'm not a complete philistine.

If only the art history involved Canadian comedic actors perhaps I would have remembered more about it.

Sunday, December 06, 2009


Accepting that on the 'net no one will be convinced by your argument alleviates the time-consuming task of composing an argument; you simply state your opinion, with little or no support, and know that those who already agree will continue to agree and those who do not will continue to disagree.

It's liberating.

Thursday, December 03, 2009

A couple weeks ago, in a galaxy far, far away: More rambling

As I did three years ago in the middle of the first week of December, it's time for a meandering and completely unnecessary post touching on George Lucas' most popular works that succeeds in being not geeky enough for the geeks and too dull for everyone else. Ah, tradition.


A couple weeks ago, while flipping through channels—by which I mean: scrolling through the guide for the satellite-based TV system; there's no more simply going from one channel to the next without knowing what I'm going to find when I get there, but I retain that "flipping" expression as a pleasant anachronism—I came across a program on the History International channel called Star Wars: The Legacy Revealed, wherein scholars discussed the philosophical themes of the Star Wars saga. Footage of experts on areas such as mythology was interspersed with actual scenes from the movies (so clearly it was sanctioned by Lucasfilm). Why wouldn't it be? What filmmaker would not want his work analyzed in such a context, to be compared to classic works going back thousands of years? One could almost feel George Lucas plugging in the elements while reading Joseph Campbell to create the structure for his mythos, so nothing would be more apropos.

It's not that there is no foundation for such analysis of the story of how Anakin Skywalker turned to evil; certainly that has some similarities to the story of Lucifer's fall from grace, and his desire for power making him be willing to adopt the Dark Side is not unlike a Faustian deal (both points made by those interviewed for the program). The transformation of the Republic into an oppressive empire seems obviously based on the rise of the Nazis in Germany (another point made in the program). Heck, it's a History Channel show, so there's almost an implicit requirement that either Hitler or the Civil War be mentioned.

One might question whether Lucas' work warrants this level of scholarship, but that's not worth disputing; the mythological pieces are definitely there to be found, and I suspect intentionally so. Star Wars is contemporary mythology in its structure.

However, even in the bit I watched (and I recall seeing it previously, as originally it aired a couple years ago) I found myself having the following reaction during the portions where the scholarly talking heads were discussing Anakin in the same breath as classic works like Paradise Lost and then a scene from the film where Hayden Christensen acts with the skill level of a junior high production: If only the execution of the overall saga had lived up to that mythic structure.

It's not that the themes aren't there to be mined, and in A New Hope (1977) and The Empire Strikes Back (1980) the movies turned out pretty well. However, the story in those was all about Darth Vader, galactic badass, not whiny Anakin Skywalker; there was no melodramatic overacting to be done from behind the mask.

On the subject of Anakin being whiny, this was addressed by perhaps the best quote in the entire special. Clerks director Kevin Smith was interviewed and noted that there was no incongruity between how Anakin was and how Darth Vader was; the petulant youth is precisely who would grow up to be the merciless tyrant, he posited, and although it's complete pop psychology I must admit I think it trenchantly accurate. (Would Hitler have been so inspired to seek power had his ego been assuaged by his attempts at being an artist?)

For those who were paying attention, you'll notice I only mentioned two of the six films as having been done well—and probably not coincidentally they were the first two made. That's not suggesting that the other four were crap by any means, or that New Hope and Empire were without their flaws. However, I'm not here to go off on the tangent of specifics about how well or poorly the others were done; that's not the point. My lingering perception was merely that only the first two made struck me as living up to the quality level of storytelling that is associated with the works to which the saga was being compared in this special.

But here's the thing: Even though I am critical of 2/3 of the films in the overall saga (most specifically the ones that are identified as Episodes I – III), I'll still stop and watch at least part of them if I come across them on TV. They provide a modicum of entertainment (which may be based more on the potential that the two good ones established for the saga than on their content), even knowing precisely how the story turns out.

And not only will I stop and watch those movies but I'll stop and watch a special that touches on the philosophical themes in the movies… even though I've seen that before.

So clearly Lucas did something right, and maybe the credit balance built up by 1981 was enough to coast along from then on out. Or maybe the focus changed from storytelling to special effects and sound mixing. The look of the CGI and the THX sound in Episodes I – III are pretty phenomenal, even if the performances from the live actors leave something to be desired. Maybe that's good enough.

Of course, I can't help but wonder that if the series had started with Episode I, and been the same story-wise, whether, some 30 years later, there would be a special made about it on the History Channel.

Tuesday, December 01, 2009

Coming back

The concept of reincarnation, with one's essence/energy/soul carried from one corporeal state to the next, does seem to imply some level of administration (if nothing else) to be performed by (what would easily be identified in our mode of envisioning the universe) a deity. It seems a bit too complex a system to operate without some bit of oversight.

One might wonder: Why would this deity set up a system whereby beings keep running the same race (so to speak) over and over? However, the more confounding query, it seems to me, would be: Why would this deity only give us one chance to run this race? The only way to grow, to advance, to develop is through repetition, through practice; a single lifetime only allows so many opportunities for growing as a being.

Obviously the tricky bit about the concept is that we seem to come into the world with the proverbial tabula rasa; what was the point of gaining the experience from previous lifetimes if we just forget it when we start over? And there I think the knowledge comes back to us slowly, in gradual ways, and often not in ways we consciously grasp.

Yes, that is a convenient explanation, but it's no more convenient than any other philosophy, is it?

Thursday, November 26, 2009

Turkey, not the country

Happy Thanksgiving to my American readers.

Have a good Thursday, rest of the world.

Tuesday, November 24, 2009

It's the most wonderful time... well, it already has been being the most wonderful time...

With still a couple days until Thanksgiving the Xmas specials have already commenced (interestingly, with a new animated show featuring characters from the movie Madagascar). This isn't that bad, all things considered. Heck, some stores have had holiday-themed displays up for months.

I know people who lament this stretching of the season—and by that I mean the shopping season—over such a long period; I have been one of them. It seems like the stores are striving to shove Xmas down our throats before Labor Day.

While the dismay has been directed at the merchants who start the pattern earlier than we remember it from when we were young, the vitriol would seem to be better directed toward the people who actually buy these holiday-themed items in August. If the stores weren't making money putting that merchandise out when they do, they wouldn't do it. Economics trump all other considerations.

However, it's too blithe to blame those who have their shopping completed before Halloween. Any such ill will about them is probably more envy that they don't procrastinate like us, combined with the romanticized vision we have of hitting the shops when there's at least a chill in the air (or, here in SoCal, when we have to stop wearing open-toed shoes). They aren't upholding the tradition, and Xmas, more so than any other holiday, benefits from tradition (or at least the appearance of tradition).

Here's the thing: Our children will grow up in a world where they will only know the holiday shopping season to run a majority of the year. That will be their "tradition." And someday when we tell our grandchildren how shopping for Xmas didn't always last all year round they will look at us blankly, and ask us to tell them again about when Xmas trees were actual objects and not merely holograms.


No other holiday ever came up with such ripe marketing potential. It was naïve to think it wouldn't be exploited to a greater and greater degree.

Don't blame Xmas. Blame all the other holidays for not coming up with anything to compete with its potential and give the stores something else to drive into the ground at other times of year, something that would give people who really like Xmas another part of the year to look forward to.

Something to stimulate the economy without stores resorting to touting Xmas in summer.

Whether people could afford two of those a year is another story, especially these days. So anyone thinking of trying that alternate Xmas may be best served to wait for things to get better financially.

Timing is everything.

Saturday, November 21, 2009


More pictures from our trip to the Big Island of Hawai'i have been posted to the photo site, for those of you who enjoy that sort of thing.

There are many shots of Kilauea Volcano inside Hawai'i Volcanoes National Park in this post and this one.

There are also some, like this fern on the grounds of the B&B in Volcano Village where we stayed, in this post.

Thursday, November 19, 2009

Shrugging it off

My wife has copies of Ayn Rand books in our library, which I believe were acquired while she was in university. (I say "library" as though it were more than three bookshelves in the spare bedroom. I know.) Atlas Shrugged. We the Living. Possibly others. I think she has read at least some of them. She was a better student than I was.

There have been moments when I reflect back on my time in university and how I never read any of Rand's books—however, I don't recall ever being assigned to read them, so this is not entirely surprising. Still, I somehow got through years of taking classes—and being, technically, an English major, I took a lot where reading was assigned—without actually reading every book assigned, so it is entirely likely that even if, say, Atlas Shrugged had been on a list for a class, I would not have made it through it. Or necessarily have even cracked it open.

College taught me the far more useful skill of being able to make the most of what I had read and downplay what I hadn't completed. If you think that's a cop-out, you are an educational traditionalist with grandiose notions about the collegiate experience. And almost certainly you read much faster than I do. You may or may not care for Dead Poets Society (but that may reveal more about your preference for the thespian abilities of Robin Williams).

Now that I am not under ostensible obligation to read books, and certainly not required to try to read them in timeframes that were unreasonable for me even in my proverbial heyday, I sometimes find myself thinking that I should tackle these bits of the canon that eluded my experience.

However, people with whom I have conversed on the topic of Rand's books generally dissuade me from bothering.

I do find myself not so much wanting to read them; I merely want to have read them. Or at least for some reason there are moments where I think I'm supposed to want to want to have read them. I suppose that may stem from those instances where my degree comes up and people figure that acquiring it should have required familiarizing myself with a work like Atlas Shrugged. It's not that I got a degree to comply with anyone's expectations, but there's still something there that hits upon some insecurity I can't quite explain or justify. But one I get over quickly.


When one looks at the way my life has gone up to this point, it's not like doing what I should has been a common theme. Should would have dictated I focus on college rather than work more. Should would have had me graduate sooner and get a different job.  Most of all, more than likely, should would not have me married to my wife.

Should would not have resulted in my life being any better.

What I might need to know about Atlas Shrugged in order to understand erudite jokes on the subject I'm sure I could glean from the summary on its Wikipedia page (not that I'm going to actually go to that much effort, but I suppose some day I might). I'll never be able to speak authoritatively on the book, no, but somehow I imagine my life will turn out okay despite that. Or perhaps because of that.

From what little I can claim to have heard about the book it touts the beauty of selfishness (but, as noted, I could be wrong about that). What is selfishness if not eschewing what one should do for what one wants to do?

Reading the book would be tantamount to missing its point altogether. Probably.

I'm sure if this post gets read by someone who actually has made it through the book he/she will correct me if I'm wrong. Such is the glory of the internet: It gives the know-it-alls somewhere to show off.

(Here's hoping the know-it-alls have a good sense of humor.)

Tuesday, November 17, 2009

A last word on this Palin thing

If someone asks me if I watched the Sarah Palin interview on Oprah I'll simply note that I'm ignoring the erstwhile Alaskan governor in the hope that others will follow suit and she'll drift back out of the public consciousness.

That I consider leading by example.


I concede I don't know Mrs. Palin personally, and only can claim to have any familiarity with the public persona that was cultivated during the campaign and over the intervening year in the mentions of her in various media. Were she merely someone I met at a cocktail party she might seem perfectly pleasant. I could comment on the brief business trip I took to Anchorage ten years ago, and how I thought her home state to be quite picturesque; she could relate how she was an avid hunter or something. "Really?" I'd reply. "Shooting them from a helicopter? Well, that must be quite the thrill," I'd remark out of politeness, and I'd probably find some excuse to mingle elsewhere. Likely I wouldn't come away with the impression that she was the proverbial sharpest tack in the box but that she did have a certain charisma. Ultimately, I'd be left ambivalent.

The problem is that from the national spotlight I did learn a bit more about her than the party small talk would reveal, and it only reinforced what the impression from the party would have been. Namely: Not someone whom I'd want to spend the better part of an afternoon hour watching be interviewed by Oprah Winfrey.

Ultimately, my disinterest is more of a referendum on the way that most news organizations continue to dwell on her; while she and (I presume) her team of publicists work to keep her on their radar, one cannot begrudge them for doing their job; it's the media outlets who lazily comply who need to get a message. That being: If you feature her, I will not watch; if you mention her book I will change the channel (unless it's The Daily Show poking fun); if you discuss her as a legitimate presidential candidate in 2012 I will consider you to have abandoned even the pretense of performing journalism. And I'll vaguely aspire for that supposed Mayan-predicted apocalypse to come true, just so you will have something else to report.


And really, it's McCain who deserves the most ire, for allowing himself to be convinced to select such an imprudent running mate and starting this whole debacle.

It's his fault that now when people hear "Palin" they associate it with Sarah rather than Monty Python alum and travel enthusiast Michael Palin.

That's what bothers me the most, I think.


And yeah, I realize that by mentioning this I'm not really doing a swell job of ignoring her. That starts... now.

Friday, November 13, 2009

It's the end of the world as we know it (and John Cusack is freaking out)

Some glib conjecture based on seeing a billboard and a 30-second commercial (with the sound off) for the premiering movie 2012.

The billboard claims "We were warned." The trailer features John Cusack fleeing from rampant destruction that is the end of the world, as apparently predicted by the ancient Mayans.

If the idea (and I fully concede the extent of my knowledge is an episode of Penn & Teller: Bull... that debunked the notion, which I saw late at night and don't recall that well) is that the Mayans knew a long time ago when the world would crumble in an orgy of destruction at the hands of the universe, and the end of their calendar was alerting future civilizations to that, but there was nothing that could be done about it, that doesn't strike me as a "warning"; a warning (in my mind) constitutes information presented to allow one to avoid danger through one's actions. If the end is, in fact, nigh, the only reasonable response seems to be making peace with it in whatever way one believes appropriate.

What I can't figure out: If the world is ending, with tremendous earthquakes and presumably all other sorts of ostensible natural disasters, to where does he think he's escaping? It's an understandable reaction to the stimulus of imminent danger, the survival instinct kicking in, but it seems like the sort of thing someone who failed to come to grips with the afterlife (or the lack thereof) would do. It merely delays the inevitable.

Sure, it wouldn't be much of a movie if our protagonist simply spent the entirety of the movie praying or meditating in order to reach an internal sense of serenity about the end, and then was calmly crushed under rubble, but if we were, in fact, "warned," conceivably that's what he should have been doing.

Ultimately I guess I must admit I do feel warned, but only about how I shouldn't bother with the movie. That seems contrary to the purpose of marketing, but hey, maybe that's how it works these days: Convince the audience the movie is so pointless that they feel compelled to see it merely to confirm that.

It certainly would be setting realistic expectations for the movie-going crowd. Which, I must admit, is a nice thing to do for people as apparently we have only a few more years left.

Oh wait. Is the movie itself the warning? Holy cow. I imagine never has something so secretly profound involved so much CGI.

And all this without me having to actually see the movie. Bravo, mediocre filmmakers. Now, if you'll excuse me, I have some peace to make with something, or at least stock up on canned goods.

Thursday, November 12, 2009

How far is that?

On the HGTV shows about people buying houses I've noticed in the opening voiceovers the host's narration often notes units of distance in amounts of time; e.g., "The apartment is only ten minutes from downtown."

Ten minutes by what method of movement? By car? On foot? Riding an alpaca?

The narration never specifies. Presumably the producers figure the audience doesn't really care; they assume the audience merely wants to hear the sales pitch.

Which is probably true.

The shows aren't so much about giving sound advice. They are real estate porn touting the upside of buying a home in an exciting locale.

Details would only serve to ruin the moment. Can't bust a nut with the narrator specifying how far "ten minutes" is. Nope.

Tuesday, November 10, 2009

A visit to the mailbox

Way back in May when we were sending out the wedding invitations we addressed one to Barack and Michelle Obama, at the White House.

It wasn't just a stunt; we were offering a legitimate invitation for them to attend. Obviously we had absolutely no expectation of them showing up, much less that it would even be acknowledged. But we had enough invitations and stamps to spare, and it cost no more postage to send it to 1600 Pennsylvania Avenue than any other address.

The wedding came and went back at the end of June, and the first couple was not in attendance, which was precisely as we anticipated. The day turned out great. Their absence took nothing away from the event.  Personally, I didn't think of them at all that day.  Silly me, I was focused on, you know, getting married.

Then yesterday evening I got home and, as I do every day, I checked the mail. Amongst the bills and junk I saw a small, cream-colored envelope. It was hand addressed to both of us, with our full names.

In the corner the return address identifies the sender: The White House.

I knew immediately what it was regarding. Our wedding invitation is the only thing we've sent to the White House, and it's one of the few things that has had my full name on it.

I dared not open it before my wife got home. I knew she'd want to be there when the envelope was pried open.

When she got home I met her at the door, gave her a kiss, and then handed her the envelope. She looked at the return address, paused, and then started to cry (much as she did a year ago on election day) tears of joy.

She was too excited to perform the task so she handed it back for me to open, carefully, without damaging the envelope any more than absolutely necessary. And then when I was being too methodical she took it back and we tore a cautious hole along one end.

Inside was a note, embossed with an emblem, with a message wishing us well on our wedding.

At the bottom were the signatures of the president and first lady.

It may have been a stamp, but nonetheless there were their signatures. On a card that had been sent to us.  From the White House.

And only six months after we sent out the invitation. Given the economy and the wars and health care, I'm kind of amazed it showed up that quickly.


It's undoubtedly best that the first couple did not attend the ceremony back in June. As exciting as that might have been, I have to imagine Secret Service patting down all our other guests would have been something of a distraction. And I cannot help but think them being there would have taken the focus off of us. It's the one day it does get to be about us, so that would have been something of a bummer, I must think. I'm just sayin'. No offense to them.

So the card is better. Whenever it showed up.

Thursday, November 05, 2009

The unexpected benefit of the Obama presidency

Let's pretend that in honor of the recent anniversary of last year's historic election day I offer this post.

Wanda Sykes has her new late night talk show premiering this Saturday, which I expect won't suck as much as most talk shows, and part of why I say that is because recently we watched her very funny HBO comedy special (I'ma Be Me).

One of the topics she discussed was her excitement over having a black president. She conceded his mixed-race heritage, and then jokingly admitted if he ever messed up really bad she'd change her tune to "Who voted for the half-white guy?"


If my wife and I have children of our own (that is, specifically by my sperm combining with her egg) they will, technically, be of mixed race. Which is to say they will be similar to President Obama.

I already know that if the scenario where we have these children comes to pass, my children will have skin tones that won't match mine. At some point I'll be out with the kids but without my wife, and someone will come up to us and, thinking my children adorable (which, of course, they will be), say something inadvertently stupid that assumes the children were adopted. The person's intention undoubtedly will have been to compliment me for taking on the responsibility of helping some children in the foster care system, but he or she will be falling prey to the cultural bias against mixed-race heritage; a lot of people do reproduce with someone of the same race, and thus their offspring resemble them in that way, so it's not inexplicable that such remains the paradigm a lot of people use for making such assumptions.

As insulting me presumably will not be the intended result of the situation, I won't consider it appropriate to get offended and respond with anger. Yelling, "Listen, dumbass, you need to pull your head out of the 18th century!" seems unlikely to help educate the person to have a broader spectrum of ideas about such things. Also, I won't deserve to be praised for something I didn't do. So perhaps I will smile politely and softly explain, "Thanks, but they're not adopted. You see, they're like the president; my wife and I are similar to the president's parents (but reversed)."

In this vision the person nods in acknowledgment, understanding the reference without requiring me to explain further, and, most important, without feeling horribly embarrassed. Perhaps the person is a little wiser for the experience.

And that's all thanks to the fact that there'll be a well-known public figure to whom I can allude, who got elected to the White House. Regardless of one's political affiliations, everyone knows who the president is.

I think that explanation will sound a lot classier than, "Thanks, but really all I did was get my wife knocked up."


There's an alternate scenario wherein some idiot makes an obviously insulting remark and I kick his ass, but let's keep it positive. And that one has not even an indirect link to the guy at 1600 Pennsylvania Ave., so I can't tie it in with today's ostensible thesis.

Which, for some reason, I'm considering pertinent at the moment.


And rest assured:  I will be very motivated to teach our potential children that if they grow up and go into politics to make sure they don't screw up. It won't merely be for the country's sake but for mine as well.


Monday, November 02, 2009

Do I dress up for Halloween? You have to ask?

Before I reveal what this year's Halloween costume was, for some reason it seems apropos to do a little costume recap of the last several years:

2004: Death takes a holiday

At a party, I'm a zombie pirate

And at the office, let's call it Jimmy Fallon with neck wound (and a tale of that)

(Yes, the same neck wound makeup used in both.)

At a party, I'm a zombie punker

And at the West Hollywood festivities on actual Halloween (with the story for that) I'm just kind of messed up:

2007 seems not to have been photographed. That, or the evidence has been destroyed.

2008: I'm a PC (and the story of it)

And for 2009?

That requires video...

This was my wife's handiwork. Literally.

Saturday, October 31, 2009

Last-minute costumes

A few weeks ago my wife and I went to a party store to search for Halloween costumes. While in there we looked at many of the packaged costumes and I was reminded of the glory of the copy that the packaging writer must compose due to licensing restrictions, especially for costumes that clearly are based on an actual person but where that person's name cannot legally be used.

A long curly-haired wig on a man with some round-framed sunglasses is "Radio DJ" because they can't call it "Howard Stern" (who's actually an on-air personality, but that is a distinction that wouldn't necessarily make the costume sell any better). That sort of thing.

However, the one that struck me as both the most inspired and most risible was a shoulder-length bowl-cut wig where on the photo it was paired with a shaggy mustache and round-framed sunglasses. This was in the "'60s" section (and why costumes can be decade-specific is a topic for another time), along with the tie-dyed paraphernalia. The picture on the package was a dead ringer for the male half of a popular duo who had a variety program, the late Sonny Bono. Obviously they couldn't call it "Sonny Bono," but what did they turn that into? "Hippie singer" or something like that perhaps?

No, rather than try to adapt it based on his occupation they changed it to words that sounded somewhat like his name: "Silly Boy."

Whether that pertains to him being the butt of the jokes on the TV show he shared with then-wife Cher or to his time as a California politician is another question.

(Note: It is entirely possible that there's some other reason why the wig has that name, but nonetheless the picture on the bag is meant to look like Sonny Bono, so I'm sticking with this explanation.)


Another set of costumes that were particularly noteworthy are the "sexy" versions of just about anything. Nurse, cop, super-heroine, and the like are staples of any party. Pretty much anything that can be adapted to show skin on a female is all that's necessary to qualify in that category.

One would think that male horror icons would be beyond the scope of that area. However, there's also (I'm not making this up) "Sexy Freddy Krueger."

It starts with the brimmed hat and claw gloves that undoubtedly are part of the (what we'll call) standard Krueger costume. However, the iconic striped sweater, with slashes, is instead a mini-dress. So, apparently, if Freddy's legs hadn't been disfigured along with the rest of him in the fire and he showed them off he would have been making People magazine's list rather than tormenting children in their dreams. (Or both; I suppose those aren't intrinsically mutually exclusive.)

I'm not suggesting that there aren't people whose fetishes go that way; obviously the company wouldn't bother (in this case) to pay the licensing fees to be able to use the actual character name unless it believed it would sell enough costumes to make back that investment.

What's more disturbing is that someone at the company who owns the rights to Freddy signed off on turning a character that struck fear into… well, something that is only scary in the way it causes one to question the psyche of the costume wearer.


Here's hoping you won't be resorting to any costume mentioned above.  Happy Halloween!

(Photos of my costume will be revealed in the next post.)

Thursday, October 29, 2009

Who's sexyist?

Recently Esquire magazine declared English actress Kate Beckinsale the "Sexiest Woman Alive." Given that this is an annual event conducted by the magazine, it's a title she will hold only for a year, after which point presumably she will cease to be sexy ever again.

Sexiness is apparently a fickle suitor.

Given that Esquire doesn't also select a sexiest man alive this practice could be seen as sexist. Of course, given that People is handling that male title it may be more of a copyright thing. It's entirely likely that both are sexist.

Whether the fact that Esquire selected a 36-year-old for the position, ostensibly suggesting to their readership that it's perfectly acceptable to fantasize about women who are technically old enough to run for president (not that Kate is eligible, due to matters of citizenship), makes it any less sexist is arguable. They still seem stuck on making their top criterion the requirement that the woman look phenomenal in her underwear, so it's something of a mixed message to the readers: Women are still worthwhile as they transition out of youth, but only as long as their looks haven't succumbed to the nature of aging.

But hey, the covers of women's magazines are filled with images of good-looking female celebrities as well, so maybe the appeal of being sexy runs hand-in-hand with unavoidable sexism.

What's not debatable: The term "sex" can go both ways (connotation-wise), depending on the suffix applied. (It's also subject to bad double entendres, but that's another story.)

Which brings us to the real purpose of today's post: language rumination.


If one takes the root word "sex" and appends "-ist" to it, the resulting term, sexist, pertains to gender; a "sexist" remark is made about a person's sex. However, if one appends "-y" to that root instead, sexy generally connotes something that is attractive in a way pertaining to copulation, i.e., to the act of what (in common parlance) is having sex.

The former is more apropos, as "sex" as a word means being either male or female; "gender" technically pertains to type, but not exclusively to being male or female.

(Ah, the things one picks up while reading books. They have no pertinence to regular conversation, and they make one seem condescendingly erudite, but nonetheless words have particular origins that do not change merely because most people are not aware of them.)

However, the point here is not that the root word is interpreted to mean both. Such is the way language goes; words get used in multiple contexts that differ in connotation. It happens. It's part of what keeps things interesting.

Each term made from a different suffix is strictly associated with just one meaning of the root word, without the variation the root itself can have. Sexy never alludes to gender, and sexist never pertains to copulation.

Further, the derived terms retain a specific positive or negative connotation. Sexy is always good; sexist is always bad. That's not so much intrinsic to their meanings but from the way those terms have been associated. Sexy gets overused, particularly in advertising, to the point of being almost without impact; sexist suggests a prejudicial insult, but technically could describe an innocuous statement that pertains to something about gender ("Women have ovaries" is sexist in the context of being about a physiological aspect of one sex, but it is unlikely to be thought of as a sexist statement; there's no judgment attached).

And of course, if one takes sexist and inserts another e between the i and second s one gets back to sexiest, and we have come full circle. Sort of.

Something that is without question about all this: Ruminating about the language in this way is not sexy.


And now, some abject pandering to the snarky contingent of the internet:

Something else that's probably not likely to elicit an argument: It's lucky for Kate Beckinsale that she is known as the "sexiest woman alive" because with movies like Whiteout she's unlikely ever to be known as "Academy Award winner."



Go ahead.  Let me have it.  Click the "thoughts on this" link below and go wild.

Monday, October 26, 2009

Uniform disorder

When a few years ago I discovered the Uni Watch website I was somewhat surprised—albeit pleasantly—that there were a great number of people out there who were similarly afflicted with this minor disorder I have: noticing what sports teams are wearing.

Well, everyone who watches sports notices that the teams are wearing uniforms; that's how we know who is on which team.  But not everyone pays that much attention to the specific aesthetics, or analyzes the changes to the uniforms, or ruminates on whether those changes are improvements over how they were before.  Only a particular type of fan goes that far, and this is a site for us.

Sometimes when introducing a highlight on SportsCenter (or other sports shows) they will mention when a team is wearing a special uniform, whether it's a brightly colored alternate jersey or a "throwback" version of a vintage design or a tribute to a particular event, but it's merely a brief allusion (and, I suspect, mostly to help the viewer understand why the teams don't look like they usually do, not so much to spotlight the differences).  This site takes it to another level, focusing not on how the teams played but how they looked playing.

You may wonder why anyone would do that, but let's stop for a moment and reflect on the fact that before every entertainment awards show there are hours of "red carpet" coverage where the "fashionistas" dissect what the celebrities show up wearing, and that's on television. This is merely one little website. But if it helps to draw an analogy, let's say it's a tiny bit like Project Runway for heterosexual males.

(Stereotypes invoked, yes, but tell me I'm wrong.)

The thing is this: Given the massive amount of money sports teams make from merchandising, these uniforms are a huge business. It's all well and good that fashion designers get their work spotlighted at the aforementioned awards shows, but it's not like that translates much to the average person; not only are the gowns prohibitively expensive for many women but there are few opportunities for them to be worn, period. However, you're going to see a lot of people walking around sporting a cap or jersey in emulation of what they saw an athlete wearing on a field or court.

And where would hip-hop be without baseball caps? It's de rigueur. It's fashion.


More than you need to know—well, not that all of this doesn't fall into that category, but hey, you've read this far:

As to why others who visit the site are afflicted I cannot say but for me it started early. As a child I invented my own football league, with teams of my own creation, and designed uniforms for all of them. (For a couple years I even came up with a schedule for the teams, and each one was assigned to a corresponding NFL team. The winners were determined by whose NFL analog scored more points. Yes, I had way too much time on my hands back then.)

("Back then"?)

So, yes, I'd been paying attention to these details since I was a lad, and even progressing into adulthood had not extinguished the inclination. I hadn't designed any uniforms since those formative years; the impulse to create had waned but the tendency to notice what others had created remained. That manifested itself merely in looking closely at highlights or photos from games at the beginning of the season to see if there were any identifiable modifications since the previous year's uniforms.

To be honest, I don't visit the site daily or anything, and much of the time I only skim the story for something that catches my attention. I'm not as gung-ho about it as those who post the stories or those who comment on them. I have left very few comments; I tend to drop by in the evening, long after the dust has settled about whatever topic was discussed during the day, and generally whatever opinion I may have about it someone else has probably already stated.

It is only one of a number of areas in which I have some interest. Let's leave it at that.

But it wasn't until I discovered the Uni Watch site that I realized I was not alone.


Anyway, the reason this comes to mind now: Recently the site organized a survey, open to its readership, seeking rankings of the home uniforms for each NFL team. In the survey one could give each team a score of 0 to 5 (with 5 being the best). At the end the results were tallied to see which team got the highest average, the second highest, and so on through the lowest. In theory this would determine which teams were considered to have the best and the worst uniforms.

To give you some idea about the readership (or, at least, what I think the makers of the survey expected of the readers), the survey featured just a list of the team names with buttons for 0 through 5 next to them; nowhere on the survey itself were there photos or diagrams or descriptions of what the respective uniforms looked like. Yes, there were links to see the uniforms back at the original post that announced the survey, but I suspect the idea was that anyone who'd choose to participate would simply know.

And yes, I knew them all off the top of my head.

I didn't spend a lot of time on the survey. I knew I could sit there and really dwell on each one, but I didn't have that level of inclination. Whatever score came to mind I selected and moved on to the next team. And I noticed that a lot of teams I gave 3's; their uniforms were fine but not spectacular. There was one team whose uniforms had bothered me ever since they started wearing the current design a few years ago, and that was the only 0 I gave: the Buffalo Bills.

And if you really want to know why they bothered me so, feel free to ask. In short, they're way too busy, with too many similar shades of blue. They look as though the owner (or whoever chose them) couldn't decide on a reasonable color scheme.

When the results of the survey were announced recently—well, when I finally got around to visiting the site and discovered they'd been announced—I wasn't so much concerned with which team's uniforms were "the best"; I didn't have a strong opinion about that (and besides, I believe that tends to be too heavily influenced by which team someone considers his favorite, which makes it less than objective). I skimmed over the listed top ten but dwelt longer on the teams listed as the bottom five.

Now, having read the site and sometimes the comments on and off for a while now, I had a reasonable idea about which teams were going to be on that inglorious list. Still, when I saw which team was noted as having been rated the absolute lowest, I felt a twinge of satisfaction. The survey-taking portion of the site's readers ranked the Bills as the worst.

It's a pathetic validation, I know, but a tiny validation it was.

Now, I should admit that I actually ranked the other four teams on this bad list relatively high, so on the rest of them I was out of step with the consensus, but on the one about which I felt the strongest (albeit negatively) I was not alone. And is that not what the internet is supposed to be? A window to a world where one is not a freak by one's self, but part of a group of freaks?


The survey of readers on the Uni Watch site regarding the NFL team uniforms showed that those who follow what the teams wear closely tend to be traditionalists. The top two vote-getters—the Bears and the Packers—adorn themselves in uniforms that have remained virtually unchanged for decades. The rest of the top ten are all teams that have stuck with their look for a long time (or, in the case of the Jets, Giants, and Chargers, have gone back to uniforms reminiscent of their old ones). The teams that fared the worst—the Vikings, Seahawks, Jaguars, Bengals, and the aforementioned Bills—have all changed and incorporated some level of attempted innovation (albeit rather poorly in some cases).

People like what they grew up with, and people like whatever their favorite team wears, but what gets the highest marks are what pays homage to the game's history.

That the respondents rated so poorly uniforms that incorporated what one could consider innovative elements (with the notable exception of the Bills, whose uniforms really are a mess), such as the Vikings' vertical stripe running from the shoulder down to the pants and forming a horn not unlike the one on their helmets, suggests that those who filled out the survey (who have strong opinions about the uniforms) tend to be conservative in their ideas about uniforms. That's hardly surprising, given that sports tend to appeal to the conservative side of people; what is most appealing about sports, on some level, is that history is regarded with such admiration, and by its nature admiring history puts a premium on conserving the traditions established in that history.

The great irony of the results showing a preference for sticking with how things were is that if the teams never changed their uniforms the site would have nothing to report except when a new team was added. There'd be only so long that the fans could debate a given team's uniforms before there was nothing further to say; whether the unis were great or awful would be established (whether a consensus or not) and that would be it. It's only because teams' management (for whatever reason, but most likely merchandising potential) have decided from time to time to change the designs that the Uni Watch fans have had a wealth of material to discuss.

And it allows for the instances where the teams wear the vintage "throwback" uniforms to fill with enjoyment those who preferred those looks (even though, frankly, those throwbacks often look dated and archaic, in my humble opinion—but clearly I am in the minority here, so that's to be expected).

And let's face it: Does anyone wish the Broncos had kept those brown uniforms they apparently started out wearing?

Friday, October 23, 2009

The comfort of bad news

One recent morning when I turned on the TV, the news programs coincidentally featured stories about gruesome incidents. One channel had a reporter talking about a fatal car accident, another had the anchor talking about a shooting spree in a gym, and another had an aerial shot from a helicopter showing houses engulfed in flames. The only place I could find something innocuous was to switch over to SportsCenter, and that was only safe because the local team had won their game the night before.

Good news is no news.

It's not so much that good news is intrinsically uninteresting; it's that good news frightens us.

I'll elaborate.

Reporting good news rings of hubris; it's almost beseeching the Fates to bring about something bad. When something bad has happened, the proverbial other shoe has fallen, and as ridiculous as it seems, it almost allows for breathing easier; we know what has gone wrong.

With good news, we're left uneasy, concerned about when it will turn to bad news.

There's no logical reason for this, but somehow it seems accepted on a tacit level even by the ardently scientific. Something deep down says: Karma is a bitch. Stay humble.

Bad news sucks, but we're better prepared for it than the good stuff.

Oh yeah, we have issues, but at least we have the morning news to distract us from those...

Wednesday, October 21, 2009

Seventh inning stretch: Talkin' baseball (divinity)

Please sit down for this rumination on our national pastime... even if you don't care for baseball...

Saturday night the Angels lost to the Yankees in an extra inning game that lasted over five hours, putting them down two games in the best-of-seven league championship series.  That's what the box score will report.  The highlights showed how the Angels got the lead and then had their "closer"—the pitcher whose role is to pitch the last inning and "close down" the other team with overpowering stuff—give up a home run that put the game back in a tie.  The play that will be forever associated with the game is the error made by the Angels' second baseman in the bottom of the 13th inning that allowed the Yankees' winning run to score, which the analysts could blame on the bitter New York cold.


The Angels were playing the Yankees in the championship series because they'd defeated the Red Sox in the earlier divisional round of the playoffs.  That was noteworthy because the Angels had never beaten the Red Sox in the playoffs previously.  Going back 23 years, every time Boston played California/Anaheim/Los Angeles (of Anaheim) in the post season they'd won, including eliminating the Angels the last three times they'd been in the playoffs.  The Red Sox certainly seemed to "have the Angels' number" when it came to these post season match-ups.  However, this year the Angels defeated the Red Sox, not only sweeping that series but clinching it by scoring three runs with two outs in the top of the 9th.  Getting that proverbial monkey off their back seemed destined to occur.

Not that there's any logical reason why the events of the past have any direct influence over the current situation.  Certainly a player who had been on the team before and suffered those losses might experience some psychological effect on his confidence, but given how many games these professional athletes play that shouldn't be much of a factor.  It's superstition, and as someone I used to know was apt to say, the only person who made anything off superstition was Stevie Wonder.

More so than any other sport, baseball thrives on these superstitions.  Certainly the fans in Boston know about that, given the longstanding belief in the "curse of the Bambino," where the fact that the Red Sox didn't win a World Series between 1918 and 2004 was attributed to some superstition regarding the trade of Babe Ruth from Boston to the Yankees back in the early part of last century.  There was the infamous Bill Buckner flub in the 1986 World Series, where the Red Sox were on the verge of winning and an error allowed the Mets to pull off a come-from-behind victory.  The heartbroken Boston fans had to blame it on something, and attributing it to supernatural forces going back to before anyone on the field had even been born seemed as reasonable as anything.  It couldn't be just dumb luck.

Of course, the Red Sox were in that position to lose the '86 Series because they'd come back to defeat the aforementioned Angels in the league championship.  There the Angels were at one point a single strike away from going to their first World Series and ended up blowing it.  There had been a very real likelihood at many points that the Sox would not have even been there to lose to the Mets, but they appeared to have fortune on their side—but fortune that did not extend from the playoffs to the Series.

And maybe there really are forces operating beyond our human comprehension that influence these events.  Maybe there's some storyteller in the sky who knows the drama of a streak of losses in a big game, or to a particular opponent, that makes for something easy for people to understand (and easy for sports reporters to milk for air time or column inches).  And eventually that story plays out with the team breaking the streak. 

The Red Sox won the Series in 2004, with talk of the curse having been broken.  When they won another title in 2007 Bostonians seemed to have forgotten there ever was a curse.

This year's story line featured the Angels breaking their curse with the Red Sox.  That was the fortune allotted them by the proverbial baseball gods.

However, that doesn't make it the only story line in the 2009 post-season.

Between the regular season and playoffs the Angels had been very successful against the Yankees over this past decade.  They're one of the few teams with a winning record against the Bronx Bombers during that time.  In certain ways the Halos have had the Yankees' number in a manner similar to what the Red Sox had over them.

And even before the Yankees jumped out to a 3-to-1 lead in the series, I couldn't shake the feeling that this year's story may feature the Yankees breaking their pattern of losing to the Angels.  They haven't won a World Series since 2000, which may not seem that long but by Yankee standards that's a hideous drought.  Or at least it's something that the New York-biased sports media will be able to spin into a story.

Does any of this make any sense?  Of course not.  It's far more logical to suspect the Yankees will win because of their strong pitching staff and batting order laden with All Stars; they did win more games than any other team in MLB this season, so by all rights they should be favored to win the most games in the post-season.  However, that's not how it works with these intuitions; they're gut feelings that merely get justified by analysis.


Make no mistake:  I am not rooting for the Yankees.  Not in the slightest.  I was raised an Angel fan, and although I had to stop being one after the heartbreak of 1986 (I was in the stadium at the game where they were one strike away and then blew it--as I mentioned way back in this post) I certainly continue to root for them.  I just can't believe in them.

And that's the nature of all of this.  My superstition may be that if I get my hopes up and believe they'll win that undermines all possibility of it happening.  In 2002 when the Angels did win their first (and only) World Series title I was pleased but at no point did I think they'd actually do it.

It's sheer folly to claim any credit (in a reverse sort of way) for them succeeding that year because I refrained from having any confidence.  There's no absolute pattern of victory since they fell from my grace; they've lost plenty of times even when I had no faith in them.  And just to reiterate: I do not claim to be a fan; as noted, that stopped 23 years ago (out of emotional necessity).  At most I can claim to be a follower.  The "real" fans would be completely in their rights to declare me banished from the kingdom (although I'd say I left of my own accord).

But I'd be lying if I claimed there wasn't a little part of me that harbored a tiny suspicion that I need to maintain my lack of confidence in them—not because I can influence the outcome in a reversed way, but out of deference to the team I once actively supported; there are no guarantees that my giving up on them will make them win, but there seems the likelihood that my believing in them will eliminate that possibility.

Which is absurd, of course.  However, it seems to operate on some level of my psyche that cannot be completely overwhelmed by intellect, that is not utterly dismissed due to its illogic.

It's probably a self-defense mechanism, and suggests that my lack of belief is really delusional; deep down I never stopped being a fan but I had to convince my conscious mind that I did because I couldn't take the disappointment over and over.  Maybe so.

But it remains the case that one cannot prove that there isn't a connection between me having faith and them being thwarted by the baseball gods.  Not that the baseball gods are just looking to screw me but that they do regard my ostensible stand in the matter. 

Is that any crazier than the fans screaming their heads off at the TV, their hats turned backwards (in the "rally cap" position) believing that they bring about a positive outcome?


It's tricky not to at least concede the possibility that for some team it's just their year, that they are being smiled upon by the forces of fortune.  No matter how bad the situation seems in the middle, how hopeless victory appears, somehow they rally back and win.  Just this decade alone two obvious examples come to mind. 

Back in 2004, when the Red Sox won that curse-breaking title, they had to first win the American League pennant by coming back against the Yankees.  In the best-of-seven championship series they lost the first three games—a hole out of which no team had ever climbed—and then they pulled off the unthinkable, winning the last four games in a row. 

Two years before that, the Angels had finally made it to the first World Series appearance in team history.  However, going into game 6 they were down three games to two against the Giants, and by the middle innings of that game they were behind and it looked to be over.  But then they rallied not only to win that game but to dominate game 7 and earn their rings.

And there are plenty more examples out there.

It's hard to look at the odds they overcame and think mere random chance brought about these results.  Not that it's impossible, but certainly our proclivities leave us inclined to think there's a distinct possibility some kind of divinity for the great American pastime had a hand in the outcome.

If nothing else, attributing it to "it was their year" allows the losing team's fans to feel less bad.  Their team wasn't fated to win, and thus it wasn't that their team didn't play well enough or that they as fans didn't root hard enough; it was out of their control, and hence isn't something over which they should beat themselves up.

And when it continues to not be your team's year over and over, such as with the Chicago Cubs (who haven't won a World Series since 1908), it's comforting to blame it on a curse (although in this case it involves a goat, not a trade of Babe Ruth). 

Whether it's genuinely up to some higher sports power or merely dumb luck that a delusion attributes to non-existent forces, one thing is non-debatable:  That's part of the game.

And the beauty of baseball: There's always next season, when one can hope that the story the baseball gods wish to tell has one's team celebrating in October (and possibly in to November). Not that I'll be hoping, but you know what I mean.