Thursday, December 31, 2009

Last post of whatever the heck we're calling this decade

With the days winding down where the date above them ends with a 9, the window for recapping the year that's about to end is, itself, coming to an end. Within a day or so of New Year's Day it seems like the appetite for such things dries up, and then people are focused on where the new year is going rather than dwelling on where the previous year went.

This year presents an opportunity that's greater than the typical year-end recaps, as it is perceived as the end of a decade. Well, of course it's the end of a decade; a decade is a period of ten years; on any given day it's the completion of a window of time that started ten years previous. Obviously, today it will be true that the ten-year period that started January 1, 2000 will come to a close, and as such, that appears to justify devoting air time and column inches to making lists of bests/worsts/whatever in that period. With 24-hour news, specialty networks, and the internet there are certainly plenty of outlets that need filling.

Being the first decade of the century it does seem as though the counting question that divided people ten years ago at this time can still come into play. Namely: Does the decade start at 0 and run to 9, or start at 1 and run to 10?

As we approached the end of 1999, in addition to people stockpiling supplies in fear that programmers had not corrected their code pertaining to the use of two-digit years, the main argument was whether January 1, 2000 constituted the commencement of the new millennium or whether such an event actually fell to the first day of 2001. The answer was: It only matters to people who give a crap about delineating history on a base-10 structure.

Clearly common parlance has now declared that the new period started in 2000, whether that's right or wrong. Even though that makes the first decade of the previous millennium only nine years long, the adjustments to the calendar that have occurred over the last couple thousand years really make nit-picking about such things rather moot.

That the year is the length that it is makes some sense to the extent it reflects the relative position of the earth in its orbit around the sun, but as that period of the earth's orbit doesn't match 365 of the 24-hour periods we call days there's the need for extra days in leap years, so even what we call December 31 does not always correspond with where the earth was in its orbit the last time we called a day December 31. However, it's the system we have. All references to a given day are what we choose to interpret as having significance, but at least there's some system in place to try to keep a modicum of order. Grouping years into arbitrary periods like decades (and centuries, and even millennia) is pure semantics; if one wants 2009 to be the end of a decade it's the end of a decade.

When 2020 rolls around it will be far enough away from 2000 for the millennium-or-not debate to be old news, and it will be the start of a period where the second half of the date will all be pronounced the same for ten years straight—that is, two-thousand twenty, two-thousand twenty-one, two-thousand twenty-two, etc.; it will be what will later be called "the Twenties" and become the first easily referenced decade of the twenty-first century (and will be followed by seven more such distinguishable decades, to close out the hundred years that commenced in 2000). We're completing a decade of years where the first part of the date was the same—two-thousand—but as that part will remain the same through the century, it does no good in providing a handy nickname for the period completed at the end of today.

I've heard some attempts at branding this decade "the aughts," hearkening back to the turn of the previous century, but I'm skeptical that in the era of Twitter such an antiquated term will catch on, especially as it hasn't happened yet (despite some trying to make it so for half of the expiring decade).

When VH1 resurrected their I Love the... pop culture retrospective series (after making multiple versions of I Love the '70s, I Love the '80s, and I Love the '90s) for the first eight years of this decade they clearly punted on coming up with a term, wussing out with I Love the New Millennium. Had they boldly named it "I Love the Aughts," or, really, anything that didn't involve the entire next thousand years, then this would all be resolved, but instead we're in a bit of a vacuum about the determining factor. We'll just have to wait and see what term will be adopted.

I suspect "the first decade" might prove the most easily understood when in the future people attempt to mention this period, but admit it lacks sufficient panache to get a TV series named after it.

The coming decade will have a majority of years ending in "teen," and "the Teens" likely will end up being their moniker when all is said and done, with years ten, eleven and twelve remaining implicitly included.

The impending ten years, with the second half of the full date finally hitting double digits, will also prove an interesting period, pronunciation-wise: While the single-digit years lent themselves only to being spoken as "two-thousand one," "two-thousand two," etc., and not "twenty-o-one," "twenty-o-two," etc., the Teens could see a shift to "twenty ten" over "two-thousand ten" (and so on). Personally I prefer "two-thousand," being a mere extra syllable over "twenty," but I have no illusions about exerting general influence in such matters. (Or really in any matters, but I digress.)

Of course, the triple-syllable "eleven" we'll hit the following year will almost certainly make prefacing that with "twenty" less of a mouthful, and that probably will be the end of "two-thousand" until perhaps 2020, when "twenty twenty" may sound too much like (what probably will be an outdated) nighttime news program. Not that it's likely there'll even be news programs then, but some of us will still be around from the time when such things existed, and may find it sounds weird speaking the date that way.

I'd lay pretty good odds that 2021 will be "twenty twenty-one" even if the year before it was "two-thousand twenty," and the "twenty" will supplant "two-thousand" as the common reference for the remainder of the century.

And not to sound morbid but I don't expect to still be around when next we have this what-to-call-the-first-decade conundrum. Unless, of course, reincarnation is what awaits me in the afterlife, in which case my future self may be having this same rumination 100 years hence, in whatever incarnation computers and the internet have taken.

Note to theoretical reincarnated future self: If the content of the current internet somehow persists into the 22nd century and you come across this, it could prove what the afterlife is. Of course, if the expansion of what's on the 'net continues to increase as it has, and then there's 100 more years' worth of blogs and photos and YouTube videos, finding this will prove an even greater proverbial needle in a universe-sized haystack. (That expression may mean nothing to you by that time, but you should be able to look it up in whatever has become of Google.) Unless you happen to be searching for "world's largest corn dog" (presently #3!) or "romanesque broccoli"—the two posts of mine that show up reasonably high in the lists of results by search engines in the early 21st century—it's highly unlikely that you'd ever find that these words existed. (Which, considering that hardly anybody here in the 21st century can find these words, is the realistic expectation.) Especially in light of the greater likelihood that in 2109 the media (whatever that is at the time) will be too busy recapping the bests and worsts of that decade.

Whatever they're calling it.

Happy 2110.


And for my reader back in 2009: Happy Old Year's Day!

Sunday, December 27, 2009

Closing down the holidays

No one loves the timing of Xmas as much as the media. Falling a week before the end of the year, it creates a dead zone of focus, where the public doesn't really expect much in the way of reporting. Magazines can fill their pages with recaps of the year that's coming to a close, which they can assemble ahead of time so the staff can take time off. The government takes a break so there's not much to keep tabs on there. The perception is that everyone just wants a pleasant respite from the usual crap for a week or two, which is almost certainly the case.

However, were there not this window created by the proximity of Xmas to December 31 for this collective breather to occur the media might not have this opportunity to fall back on retrospectives and best-of lists. It's not that they wouldn't still do that stuff, but if they had to continue with active reporting during the latter half of the twelfth month they wouldn't be the only cover stories on the newsstands.

Of course, conversely, were it not for the proximity of New Year's Day to Xmas one wonders whether that holiday would be what it is. Okay, let's not even pretend it would be. Having January 1 off would be just another day not at the job if it didn't carry the association with "the holidays," and, most important, with it being the end of that period. After a month of having "the holidays" dominate everything it becomes more necessary than usual to blow off proverbial steam.

It celebrates the return to normal, which is definitely necessary. Something must signal that, okay, really, the decorations need to come down. And you get some college bowl games while enjoying the day off for no other justification than you had to change the calendar.

Thursday, December 24, 2009

Breaking it down, Xmas edition

In the popular holiday song "Santa Claus Is Coming to Town" it's pretty easy to see how opening with "You'd better watch out" and later a line like "He sees you when you're sleeping, he knows when you're awake" could be somewhat creepy, with Big Brother-esque overtones. However, the closing of that couplet that ostensibly softens the potential for paranoia—"He knows if you've been bad or good, so be good for goodness' sake"—is completely undone by that first part. The assertion that he has you under constant surveillance renders the potential for "being good for goodness' sake" impossible; any good done is not performed for its own sake but in order to avoid making the sort of mistake that Santa's purportedly ubiquitous gaze would get you on the "naughty" list. Any otherwise altruistic gesture becomes selfish because, at least on a subconscious level, it's done to avoid the negative consequences associated therewith.

And what sort of error in behavior could bring one to that disgraceful status? According to the opening stanza, crying alone is sufficient ("you'd better not cry"). Sure, the next part alludes to pouting, but as that's mentioned separately crying becomes its own item on the list, independent of any connection with pouting. So if you're sad you'd better just suck it up because only stoicism gets one on the "nice" list.

If anyone actually paid attention to and followed the words of the song he'd be emotionally stunted and generally distressed about Christmas. He might find himself unwittingly compelled to deconstruct the lyrics in a manner not unlike this. And who'd want that?


Speaking of holiday song lyrics that just slide by, what about this one from "Winter Wonderland": "We'll have lots of fun with Mr. Snowman until the other kiddies knock him down" (emphasis mine). It is simply taken as a given that the joy brought about by putting all the effort into making a snowman and having whatever constitutes "lots of fun" with it shall be brought to an end by uninvolved children who seek destruction out of what seems merely mischievous but almost certainly suggests deeper psychological issues—ones that really shouldn't be glossed over so glibly, even if "down" is a convenient rhyme for "clown" (which the prior line pretends the snowman to be).

Kids may be kids, but to take down a snow sculpture—even an amateur one—with such wanton disregard for others is not something to be tacitly encouraged in popular song.

The kids are already paranoid from Santa watching them; we needn't be turning them into delinquents as well.


No, "Winter Wonderland" makes no overt reference to Xmas or any other holiday, but really, when else do you hear it? It's a "holiday" song by association if not intention.

And it really has no application to areas like Southern California where there's little potential to make snowmen. Perhaps a sculpture in the style of a sand castle, but not actual snow.

Of course, the sunshine has enough songs about it; that the snowy landscape of winter gets one is only fair, even if it is just trying to romanticize the bleakest season.


Heck, the reason Xmas (the capitalistic appropriation of Christmas) is as big as it is probably stems from being in winter. Sure, the marketing possibilities are attractive, but one could concoct a holiday that takes advantage of consumerism and put it in any month and get a decent level of involvement. To have it really take off, one needs to have it take place at a time of year when the darkness exceeds the light, when cold reigns, when people are forced to pretend that getting bundled up is fun.

Had it started in the southern hemisphere, during their winter, and then fallen in the northern hemisphere's summer I see little chance it would have the same popularity amongst those above the equator. Not that it would have no following, but it would be on par with, say, the U.S.'s 4th of July, or perhaps Halloween—good, but not something that people anticipate for a month beforehand.

People don't need a respite from long, sunny days in the same way that they do from gray, snowy days. Sure, sweltering summer heat can suck just as much as freezing winter cold, but it's never so hot that they close schools (much to the chagrin of children who may have to attend classes during those days).

And imagine the Christmas carols that would come from that.  Bah humbug.


Merry Christmas, or Merry Xmas, or just happy Friday tomorrow.

Tuesday, December 22, 2009

Illuminated quiz

What can one tell from the egregious, Griswold-esque displays of Xmas lights?

a) who is most excited about the holiday
b) who has the greatest need to compensate for feelings of inadequacy in other areas
c) who will have the largest electricity bill come January
d) all of the above
e) none of the above
f) more of a couple of the above but not so much one of the other
g) not quite enough of all of the above or not quite enough of none of the above but somewhere in between
h) what was the question again?


And here's an example of a not-quite-egregious amount of lights, from a house down the street in my neighborhood.

Sunday, December 20, 2009

Jingling the bells

This time of year it's a bit difficult to escape Xmas music when one is out in public. That it's playing in shops is to be expected, of course; it serves as an advertisement for the holiday shopping season by reminding us of the excitement we felt as children, listening to these traditional songs over and over while parents decorated the house and we waited to see what Santa would bring us (unless we were Jewish, or Muslim, or our parents were Jehovah's Witnesses...).

Even with the ubiquity of this music out there I still spent two full hours one night this week downloading Xmas tracks; my wife wanted to increase our holiday music selections, so I set about the task of making her happy. However, it soon started taking me down my own Santa Claus Lane.

Despite having Bing Crosby drilled into my ears for decades—actually, because of having had Bing Crosby drilled into my ears for decades—I still downloaded his version of "Jingle Bells" with the Andrews Sisters. As these songs go, you really can't go wrong with Bing. The man is Christmas, music-wise. I mean, when I pause to think about it, I cannot think of a single Crosby song that is not holiday-related. I'm sure the man had a lucrative singing career beyond just that, and I imagine I must have heard at least one song he did outside that genre, but his association with this time of year has utterly washed away whatever my mind may have been able to retain about the rest of his oeuvre.

This year, with eMusic's mid-year acquisition of the Sony catalog, the site had the "real" version of "Jingle Bells" (the one that gets played on the radio) rather than some poor-sounding re-creation (which was all they had last year at this time). That's the thing about these songs: There are specific recordings that were the ones we heard on the radio, over and over, and those are the ones that we want to keep hearing. It's fine and well if some newer artist recorded his or her take on the song, but when it comes to Bing we only want that particular one with the Sisters backing him up. All other versions that may have been put on tape with him singing may as well be destroyed; as far as this works in our mind they're as worthless as a three-dollar bill.

(Last year, when I searched the site, the only Bing Crosby available was some live recording that had an eerie echo quality that sounded like it had been recorded in an empty mansion. Before this year's mission my wife made a point of noting that she didn't want any "spooky" versions.)

The main song she sought, the one she noted as her favorite holiday song, was one she had to hum. "You know, ding-ding-da-ding, ding-ding-da-ding, ding-ding-da-ding…"

"You mean 'Chorus of the Bells'?" I said. (Of course, by that I meant "Carol of the Bells." Hey, at least I was close.)

Searching the eMusic site for that track name rendered hundreds of hits. None included the New Age rock version of Mannheim Steamroller, but I did find an actual orchestral recording of Leonard Bernstein conducting the New York Philharmonic. (I also learned that it was originally known as the Ukrainian Bell Carol, composed by a Ukrainian composer named Leontovych.)  And I found another version performed in Caribbean style, and another done a cappella, and another with merely harps, and a funky rendition by Shawn Lee's Ping Pong Orchestra.

Sampling merely a tiny fraction of all those versions is part of what took so long, and that's a big part of the joy and pain of seeking out Xmas tunes: There's so many of them.

Digging into the plethora of Xmas music revealed that all the songs one hears over and over on the radio and in stores, etc., are merely the tip of a proverbial holiday iceberg. You may think you know that every artist and his brother has recorded a Xmas album, but until you see how much is out there you have no idea. After a while I started to wonder, as someone who'd heard these songs growing up, if I was contractually obligated to record an album myself. Then I worried that if I kept searching I'd discover that I already had.

There's a reason why places like Starbucks have CDs featuring a sampler of popular Xmas songs next to the register. (Well, there's a reason beyond the obvious, that they know it will sell.) It may not include every favorite tune one remembers but acquiring it with mere moments of effort (presumably glancing at the package while waiting for one's coffee) is the ideal amount of time one should spend getting some Xmas tracks for one's collection in order to maintain that "holiday spirit." To explore the depths of all that's out there (or even all that's on a single music website) runs the risk of being crushed under an avalanche of carols.


Perhaps it would be different with some glasses of "holiday cheer" before embarking on such a task. In this case, maybe friends only let friends drink and download.

Ah, holiday traditions of the future.

Thursday, December 17, 2009

Chestnuts roasted

If you live to be 93, Mel Tormé doesn't want you to have a Merry Christmas.

In one of the most covered songs of all time, his "The Christmas Song," the "simple phrase" of wishing the aforementioned Merry Christmas only extends "to kids from one to ninety-two." If you happen to live to see a 93rd year, the writer of the song apparently thinks you've had enough merry Christmases.

Come December... well, it seems "the Velvet Fog" is offering another simple phrase for kids who are over 92: It hasn't been said but distinctly implied, it truly sucks to be you.

Go sit quietly for another seven years and maybe you'll get a mention on the Today Show.


Personally, I think you 93+ folks deserve as good a holiday as all those young whipper-snappers, but that's just me.

Friday, December 11, 2009

I went to college. It was okay.

More unnecessary rumination about college that probably should not be shared publicly. Will I ever learn?

I'm not proud to admit this, but I don't look back at my years of attending classes at a state university as having really taught me all that much. For the classes I took only out of obligation, what I had to get from them to pass I pretty much forgot after the semester ended, and for the classes in my area of focus it was more a matter of just proving that in high school I'd learned how to read and assemble an essay. As regards "creative" writing, what I got was a reminder that it took putting some effort into making the story good (something I already had figured out), that the professor I kept getting (coincidentally) really didn't give much of a shit, and that most people who take those classes aren't doing so because they have talent but because they think the class will be easy. I'm not suggesting it was an utter waste but it was hardly a transformative experience.

There were classes where I did learn some things that stuck with me, although not for any utilitarian reason: the music survey courses I took as electives (an overview of the Romantic period in classical, and the history of jazz). Having a general interest in music (but not having sufficient talent to be a musician), and already having an appreciation for classical and jazz (both of which I listened to during high school, as good background for studying), I found the subject matter genuinely interesting. By no means did a couple semesters of high-level glimpses of these areas turn me into a scholar who could lecture on the topic, but they did allow me to vaguely identify the distinction between bebop and big band swing, to have a rough idea of what the forms of a symphony were (or at least how many movements there were), and to pronounce tricky composer names like Dvorak. Ultimately, it allowed me to participate in a conversation on such a topic if one came up at a cocktail party, to avoid seeming like a completely uncultured philistine.

Which, if one gets down to it, is pretty much the practical application of most of what one gleaned from the classes one took at college outside of one's major.

I'm not suggesting that these topics have come up terribly often, but that the subject matter stuck with me at all, so that I could at least have a rough idea what someone else was talking about if the topic did arise, is more than I can say about most of the rest of the classes I took (and that includes ones in my major… alas). That I actually enjoyed the classes while I was taking them—more than can be said for a lot of others I sat through—is a nice bonus.

And unlike the material I purchased for other classes (which was either sold back or is sitting on a shelf or in a box), what I bought for those classes still comes up sometimes on the iPod (well, the jazz at least; we've discussed the issue with classical before).

But by no means was going to college a waste. Having a degree, if nothing else, proves I can stick with things, see them through, and most important, put up with the bullshit inherent in the system.

That, ultimately, is what one really needs to learn. And how to not seem like a completely uncultured philistine at a cocktail party (such as your spouse's office holiday party, which you may be attending this time of year).

Thursday, December 10, 2009

Learning... take off, eh

My years of university classes, and thousands of dollars spent on tuition, did not imbue me with a lasting knowledge of, say, what distinguishes a Manet from a Monet (other than how to pronounce each name correctly), even though I did take an art history class. A few months of repeated viewings of the movie Strange Brew (a spin-off from SCTV, featuring Rick Moranis and Dave Thomas as their characters Bob and Doug McKenzie, and Max von Sydow as the villainous brewery operator) during my high school years has left me with the ability to quote lines from it still to this day, despite not having seen it in over 20 years.

In fact, there was a time, before I even started college, when I could recite the entire movie, verbatim, from memory. My mind, thankfully, has not retained that level of recall, but it's certainly got more lingering in there from the many afternoons my friends and I watched that videocassette, over and over, than I gleaned from the periods I spent in that classroom on campus, looking at slides of famous paintings. I do recall that the sculptor who did The Thinker—name started with an R, I think—also was big on ballerinas.

I'm not a complete philistine.

If only the art history involved Canadian comedic actors perhaps I would have remembered more about it.

Sunday, December 06, 2009


Accepting that on the 'net no one will be convinced by your argument alleviates the time-consuming task of composing an argument; you simply state your opinion, with little or no support, and know that those who already agree will continue to agree and those who do not will continue to disagree.

It's liberating.

Thursday, December 03, 2009

A couple weeks ago, in a galaxy far, far away: More rambling

As I did three years ago in the middle of the first week of December, it's time for a meandering and completely unnecessary post touching on George Lucas' most popular works, one that succeeds in being not geeky enough for the geeks and too dull for everyone else. Ah, tradition.


A couple weeks ago, while flipping through channels—by which I mean: scrolling through the guide for the satellite-based TV system; there's no more simply going from one channel to the next without knowing what I'm going to find when I get there, but I retain that "flipping" expression as a pleasant anachronism—I came across a program on the History International channel called Star Wars: The Legacy Revealed, wherein scholars discussed the philosophical themes of the Star Wars saga. Footage of experts on areas such as mythology was interspersed with actual scenes from the movies (so clearly it was sanctioned by Lucasfilm). Why wouldn't it be? What filmmaker would not want his work analyzed in such a context, to be compared to classic works going back thousands of years? One could almost feel George Lucas plugging in the elements while reading Joseph Campbell to create the structure for his mythos, so nothing would be more apropos.

It's not that there is no foundation for such analysis of the story of how Anakin Skywalker turned to evil; certainly that has some similarities to the story of Lucifer's fall from grace, and his desire for power making him willing to adopt the Dark Side is not unlike a Faustian deal (both points made by those interviewed for the program). The transformation of the Republic into an oppressive empire seems obviously based on the rise of the Nazis in Germany (another point made in the program). Heck, it's a History Channel show, so there's almost an implicit requirement that either Hitler or the Civil War be mentioned.

Whether Lucas' work warrants this level of scholarship is not worth disputing; the mythological pieces are definitely there to be found, and I suspect intentionally so. Star Wars is contemporary mythology in its structure.

However, even in the bit I watched (and I recall seeing it previously, as it originally aired a couple years ago) I found myself having the following reaction during the portions where the scholarly talking heads were discussing Anakin in the same breath as classic works like Paradise Lost, followed by a scene from the film where Hayden Christensen acts with the skill level of a junior high production: If only the execution of the overall saga had lived up to that mythic structure.

It's not that the themes aren't there to be mined, and in A New Hope (1977) and The Empire Strikes Back (1980) the movies turned out pretty well. However, the story in those was all about Darth Vader, galactic badass, not whiny Anakin Skywalker; there was no melodramatic overacting to be done from behind the mask.

On the subject of Anakin being whiny, this was addressed by perhaps the best quote in the entire special. Clerks director Kevin Smith was interviewed and noted that there was no incongruity between how Anakin was and how Darth Vader was; the petulant youth is precisely who would grow up to be the merciless tyrant, he posited, and although it's complete pop psychology I must admit I think it trenchantly accurate. (Would Hitler have been so inspired to seek power had his ego been assuaged by his attempts at being an artist?)

For those who were paying attention, you'll notice I only mentioned two of the six films as having been done well—and probably not coincidentally they were the first two made. That's not suggesting that the other four were crap by any means, or that New Hope and Empire were without their flaws. However, I'm not here to go off on the tangent of specifics about how well or poorly the others were done; that's not the point. My lingering perception was merely that only the first two made struck me as living up to the quality level of storytelling that is associated with the works to which the saga was being compared in this special.

But here's the thing: Even though I am critical of 2/3 of the films in the overall saga (most specifically the ones that are identified as Episodes I – III), I'll still stop and watch at least part of them if I come across them on TV. They provide a modicum of entertainment (which may be based more on the potential that the two good ones established for the saga than on their content), even knowing precisely how the story turns out.

And not only will I stop and watch those movies but I'll stop and watch a special that touches on the philosophical themes in the movies… even though I've seen that before.

So clearly Lucas did something right, and maybe the credit balance built up by 1981 was enough to coast along from then on out. Or maybe the focus changed from storytelling to special effects and sound mixing. The look of the CGI and the THX sound in Episodes I – III are pretty phenomenal, even if the performances from the live actors leave something to be desired. Maybe that's good enough.

Of course, I can't help but wonder whether, if the series had started with Episode I and been the same story-wise, there would some 30 years later be a special made about it on the History Channel.

Tuesday, December 01, 2009

Coming back

The concept of reincarnation, with one's essence/energy/soul carried from one corporeal state to the next, does seem to imply some level of administration (if nothing else) to be performed by (what would easily be identified in our mode of envisioning the universe) a deity. It seems a bit too complex a system to operate without some bit of oversight.

One might wonder: Why would this deity set up a system whereby beings keep running the same race (so to speak) over and over? However, the more confounding query, it seems to me, would be: Why would this deity only give us one chance to run this race? The only way to grow, to advance, to develop is through repetition, through practice; a single lifetime only allows so many opportunities for growing as a being.

Obviously the tricky bit about the concept is that we seem to come into the world with the proverbial tabula rasa; what was the point of gaining the experience from previous lifetimes if we just forget it when we start over? And there I think the knowledge comes back to us slowly, in gradual ways, and often not in ways we consciously grasp.

Yes, that is a convenient explanation, but it's no more convenient than any other philosophy, is it?