Thursday, January 29, 2009

It is what we need to know

Yesterday morning I was watching some local TV programming as I got ready for work, and the news cut to a story of an auto accident with a tipped vehicle. Their helicopter provided aerial footage of the scene, and as is commonplace with television news, a graphic ran along the lower portion of the screen to explain the visual to viewers with the sound down (which often is the best way to watch such shows, given the vapid words coming out of the mouths of the hosts).

And that graphic featured the text: Car On It's Side.
And because I had the camera nearby (being weird that way), I grabbed it and was able to snap the shot above to document it.

At this point, you're probably thinking, Oh great. Doug's harping on a silly grammatical error again (and, admittedly, there would be some basis for thinking that, given all these posts from the past). But even I've lost interest in simply noting instances of this, so I assure you: That's not why I mention it now.

What made this noteworthy was this:
Within a few seconds, the graphic changed to read: Car On Its Side (documented with this follow-up shot below).

Someone in the newsroom caught the error and corrected it before the guy in the copter could finish explaining that there were no serious injuries but that traffic was going to be even worse than usual on the affected freeway.

Someone working in the local media demonstrated an understanding of the difference between a contraction and a possessive pronoun, and considered it important enough to make the change to get it right. And that deserves to be applauded, not derided. (Seriously.)

(Someone should do a story about that on the news.)

Maybe there's reason for optimism in this world that has nothing to do with our new president.

Tuesday, January 27, 2009

What a drag it isn't getting old

Over the weekend I was speaking to someone on the phone who followed up wishing me a happy birthday with a joke about how I am now older than dirt.

I quipped back, "Yeah, finally. Now I can embrace being a curmudgeon and tell that young dirt to get off my lawn." And I even shook my fist at nothing in particular to accentuate the moment even though the person on the other end would have no way of seeing. That's how committed I was.

And an obvious indication of how committed I need to be.


With age may come wisdom, but clearly I haven't gained enough to refrain from admitting such groaners to the internet-reading public.

Sunday, January 25, 2009


You can go look at pictures now. That's an acceptable way to pass the time.
The useless photo site.

Thursday, January 22, 2009

Lost in the waiting

Alright, we have a new president, so it's time to move back to important topics, like television.

Bookending the inauguration were the returns of two critically acclaimed series, Battlestar Galactica (last Friday) and Lost (last night). I watched both premieres live (as opposed to how a lot of TV is seen these days, recorded on a DVR and viewed some time later), and now that both are over, I'm left with two totally different experiences. And it has nothing to do with the content of either episode.


I should interject here that this entry will not attempt any analysis of these series. That's better left for fan message boards filled with glib conjecture and the capricious dispensing of the term "douche bag"; if that's what you want, we can take this up off-site.

No knowledge of either show is necessary if you do not follow them, and no spoiler warnings are needed if you do. They're certainly shows that elicit discussion, both having mystery-type elements to be figured out, but here we'll be discussing something else.

However, a bit of personal background is necessary...


With BG, I started watching with the miniseries that kicked off its reimagining back in 2003. Given the cheesy source material of the eponymous late-'70s series (which I watched in childhood), I had low expectations, but it far exceeded them. It proved to be quite good, and when the full series commenced in 2004 I watched it each week with anticipation. It only got better.

Because I viewed each episode either on the night it aired or within a day or two, there was always the better part of a week to wait for the next one. Whether I wanted to be patient or not was irrelevant; waiting was mandated by the particulars of television scheduling. This carried on through its first three and a half seasons, so it was something to which I was accustomed.

So when the premiere of the second part of the fourth season finished airing last Friday night, I looked forward to next week's episode certainly, but the wait was no different from what I'd done for the better part of this decade.


With Lost, which started the same year the BG series did, it was a different story: I did not watch it.

Okay, as any reader who has made it this far is probably screaming "What?!" at this point, I'll offer this explanation for those who must know why Lost didn't make my regular viewing list at the time (and for those who don't care, skip to the next tilde):

When the television series Lost premiered, I had been watching J.J. Abrams' previous show, Alias, since its beginning in 2001. However, by 2004 I thought Alias had devolved to the point where it displayed only a shadow of the promise it showed when it started. I still watched, perhaps out of habit, perhaps because I wished to give it the chance to rebound. That patience ultimately was not rewarded, as the show ended weakly; it proved not to deserve the benefit of the doubt.

Fairly or not, I recall attributing this failure to creator Abrams' essentially forsaking Alias for Lost. He was a creator, but he was not a finisher.

I did give the (then new) Lost the courtesy of viewing the first episode (as it premiered on the same night that I was already sitting through the previous series). And I thought I gave it a fair shot. Maybe I subconsciously saddled it with the baggage of being to blame for the downturn of Alias, but such is, I suppose, the danger of the network touting who had created it; they didn't want me coming in to it with a blank slate of expectation, and thus I did not.

So I watched the premiere, not necessarily with an open mind, and I specifically recall being unimpressed (by whatever criteria I employed at the time). I didn't watch it the following week; I'd given its creator one chance before, and I didn't feel any compulsion to grant him any further opportunities to disappoint me.

As the TV season progressed I heard the buzz about it. I saw critical acclaim for it in the media. I was unmoved. It wasn't that I doubted it was a good show at that juncture; I was skeptical of it still being good at the end.

Fool me once…

However, this is not a diatribe against J.J. Abrams. That much I offer just to allow the reader to understand why I didn't watch Lost during its first four seasons.


And now back to our topic...

In the late autumn of 2008 my fiancée borrowed the DVDs for the first four seasons of Lost from a friend of hers. She had been discussing the show with the friend at work, and the friend raved about it.

My fiancée only sat through the last couple of seasons of Alias with me, joining the story after Abrams left to focus on Lost; she found it compelling. And it was compelling. I'm not saying it was an altogether worthless piece of crap; the writers knew how to craft a cliff-hanger ending to each episode that made one want to know what happened; they were certainly capable of that much.

(At the end of each episode she would cry out "I hate this show!" It wasn't that she genuinely despised it; she hated having to wait a week to find out what happened next.)

Perhaps she lacked my disdain for Abrams because she came in to Alias after it was already on its downward slide, and thus thankfully lacked the perspective to see it as I did; for her, it never got worse.

So as I was saying, in the autumn of 2008, despite my reservations, we started watching the first season of Lost, four seasons behind everyone who viewed each week when it had aired in the intervening years.


[I feel compelled to explain that I consider a compelling show to not necessarily be a good show; a good show is one you watch not merely to see where the story goes but because the story is so well done that you'd watch it over and over, even when you know what happens. (And not merely because, eh, there's nothing else on.) A compelling show can be good, but the one does not ensure the other.]


With the intent to catch up before the season 5 premiere, we devoted a few weekends almost exclusively to watching episode after episode, sometimes viewing half a season in the span of a single day.

When one episode ended, there was no waiting a week to see how the cliffhanger would be resolved; the next episode came up immediately (or within minutes if we needed to change discs). When season 1 ended, there was no waiting months to see what happened next. Heck, even in the middle of an episode there was no waiting through commercials. It could not possibly have fed the need for immediate gratification any better. (Sometimes my fiancée would cry out "I hate this show" semi-ironically when getting up to switch out discs, in sardonic homage to the Alias days.)

And although I am not ready to let Abrams off the hook, I will admit that part way through watching the first season's episodes I was doing so willingly. Lost does not rely as much on merely being compelling as did Alias; it is clearly a superior show. (Whether it ultimately proves to be "good" remains to be determined after the series concludes. And given that it seems Abrams has little to do with the show at this stage, that may bode well for it.)


And now we have watched the two hours that kicked off the new season. It was hard enough sitting through commercial breaks, but now the questions posed at the end will not be immediately resolved. We are forced to (gasp!) wait until next week.

Yes, this is no different than what everyone who has been watching for the past four years has had to endure, but they've had four years to get used to it.

We're not ready for this kind of transition. Maybe ABC could sneak out the next episode early, and then slowly move to a weekly schedule so we can acclimate to it.

Perhaps we should have waited until the next two seasons were over and out on DVD.


Oh, and one last thought: The premieres for both Galactica and Lost were prefaced with recap shows that featured producers talking about the shows in an attempt to allow new viewers to jump in as of the latest season. I found both more annoying than helpful, because both shows are best discovered by watching them unfold, not by being told what you should have picked up along the way about their themes.

If you haven't started watching either Galactica or Lost, I recommend holding off on starting until the complete series DVD sets are available (and then having a marathon viewing session), but even trying to catch up with the DVDs as we did is vastly superior to just watching these recap shows.

SciFi and ABC need to accept that they're unlikely to get new viewers at this stage, and should simply be happy if the viewers they already had stick with them. And don't put pedantic producers on screen; let the shows speak for themselves.

Tuesday, January 20, 2009

Can music save your inaugural soul?

During the Inaugural Concert held Sunday at the Lincoln Memorial, Jack Black introduced one of the performers, prefacing what was to come as representing what one might hear on the radio while driving across the country. Then Garth Brooks came out and started singing "A long, long time ago / I can still remember how the music used to make me smile..."--the intro to Don McLean's opus, "American Pie."

However, he then skipped over the rest of the first verse and jumped to the chorus. So, I suppose if your radio was tuned to a station where the record was scratched you might hear it exactly as performed, but hey, the song runs over eight minutes in its entirety, so a bit of editing was necessary. He then proceeded through the second verse (including the couplet "Do you have faith in God above / If the Bible tells you so?") and one more extended chorus, getting even Obama himself to sing along.

In 1972 (when the song hit #1 on the charts) there were probably more people who recalled the proverbial "day the music died" (when, in 1959, Buddy Holly perished in a plane crash along with Ritchie Valens and the Big Bopper). Now it's probably better known as that song Madonna butchered and the one used in the Chevy commercials. It's still one of the best rock and roll songs ever written.

And because McLean refused to state exactly what it was about, it has been left open to interpretation--of which there are many. However, the consensus is that it's a tribute to Holly (as noted in The Annotated American Pie) with references to other big-name artists who came before and after, including Bob Dylan and the Beatles. Some choose to interpret it as having political overtones ("when the jester sang for the king and queen" perhaps referring to President Kennedy), but others find the song to focus too much on music for such meaning to have been intended. But such is the beauty of art: It is whatever the listener wants it to be.

One thing that is certain: The listeners want it to be a rousing party sing-along, despite the fact that they're singing the couplet:
"And good ol' boys were drinking whiskey and rye / Singing 'This will be the day that I die'."

While those seem overt references to possible alcoholism and death, that's only an obvious interpretation if one is, you know, paying attention to the words. A popular interpretation is that the closing line alludes to singing Buddy Holly's hit, "That'll Be The Day," whose chorus finishes with "That'll be the day that I die." So it could very well not be about actually dying but about reminiscing about the early days of rock while enjoying a cocktail. But somehow I doubt all the thousands at the concert have done the research, so I'd guess some of them were mouthing the words without dwelling too much on their meaning.

Because they're under no obligation to interpret those lines (or any of the rest of the song) as somber (even though, even without knowing much about music history, the lyrics do tend to be less-than-upbeat). It's got "American" right there in the chorus--so it's patriotic! And "pie"! Who doesn't like pie? And what kind of pie could be better than American pie? And then we get a nice reference to a popular American automobile with an internal rhyme with "levee" (which, other than with Led Zeppelin, is a word that doesn't get a lot of use in rock songs). Then there's the easy-to-remember "dry" and "rye" and "die"-ending lines.

The song has six verses. Six. And they're not short verses either. The song is long and the only part that's easy to remember is that chorus. Everyone somehow recognizes how brilliant the song is, even without being able to quote much more than the lines of the chorus, so it has transcended its rather maudlin-seeming origins and become a good song for a celebratory event (such as what this concert was supposed to be, presumably).

To dwell on the allusions to booze and shuffling off the mortal coil with a more literal interpretation would almost certainly ruin the only portion of the song that's really accessible to a broad audience.

And as Americans, we are free to completely overlook that and interpret it as being as jubilant as the Isley Brothers' "Shout!" (into which Brooks segued after "American Pie").


But I have to admit: If I'm driving across country and after "Shout!" on comes Garth Brooks' "We Shall Be Free" (as happened during the medley he performed during the concert), I'm changing the station. Preferably to something playing some Stevie Wonder.


Saturday, January 17, 2009

A rose by any other title

It's only a few days until the inauguration. That means there's a closing window of time left to say anything about race; starting Tuesday we enter post-racial America (Tracy Morgan said so during the 30 Rock acceptance speech at the Golden Globes last week), and the topic will be passé.


There's a worthwhile reference to these definitions from the U.S. Census Bureau, which states the "categories are sociopolitical constructs and should not be interpreted as being scientific or anthropological in nature."

Thus, conceivably we are about to enter "post-sociopolitical construct America." But that doesn't exactly roll off the tongue.


Credit where it's due (and dialing the tone down to somewhere less tongue-in-cheek):
I found that Census info from this post on the Center for Media and Democracy site which questioned whether the election of Barack Obama really indicates we've gone past race (sociopolitical constructs) when he is identified as "the first African-American president," with the "one-drop rule" (any amount of black ancestry at all makes one black) being applied even though his parents would have checked different boxes on a Census questionnaire.


That's something that had been on my mind for a while. Back in the period just after the election in November someone I know (who was a strong Obama supporter) posed the sincere question about why he was generally identified as "black" or "African-American" when (by societal standards) he had one parent who met that criterion and one parent who did not. Why was his skin color so important and not, for example, his ethics?

While I thought the question a valid one, I made no effort to address it directly. First, its massive scope encompasses the history of our nation, sociology, psychology, anthropology, and genetics in ways where I could not even feign a shred of expertise to speak authoritatively. Second, and more important, I don't believe there is an answer; there are as many answers as there are people.

What I responded (which I offer more because I feel I've obligated myself to reveal it than because of any pressing need for you to know it) was more or less this:

You don't have to understand why people call Obama the first African-American president; you should understand that it is important for them that they do so, and you should respect that. At the end of the day, no one is hurt by someone else calling him black. Those who call him black must respect that there are others who may call him bi-racial because it is important for them to acknowledge his mixed heritage.

There's only one term that everyone will need to agree on:
Mr. President.


But that was just my thought on the matter, offered as representing no one other than myself (and offered with the hope of it being respected whether it is agreed with or not).


(Of course, there are those who will employ an abjectly derogatory term other than black or bi-racial; for them there's no hope.)

Thursday, January 15, 2009

Can you hear me now?

Infants, toddlers, and small children all have one obvious trait in common: They have no idea how loud they squeal with glee when they are really happy.

It makes perfect sense; if a bit of fun warrants a slight exclamation, then a lot of fun must be commemorated with a much louder vocal burst.

They also share another trait: an abject lack of awareness of the difference between doing so in a playground or backyard and doing so in an enclosed public space like a bus or plane or train.

And accompanying parents are utterly powerless—perhaps due to fatigue, and perhaps due to the realization of the futility of trying.

Such squeals are more tolerable than cries of sadness, as they are interrupted by laughter.

Just barely.

Tuesday, January 13, 2009

Putting up with it

The trouble with having a live-and-let-live attitude is that it means one must let those who hold an everybody-should-be-exactly-like-me belief keep that belief; forcing a more tolerant outlook on those others would turn one into a hypocrite, but not requiring it of those intolerant others could result in the loss of one's option to be tolerant of them.

Maybe Ann Coulter really does sleep very well, without having any paradoxes in her beliefs to keep her up at night.

Sunday, January 11, 2009

Why can't I find a video like that?

From the things-I-really-should-stop-trying-to-write-about department:

Starting just before New Year's Day (or, rather, on Old Year's Day), VH1 Classic started showing what it considers to be "classic" music videos in alphabetical order by song title. Apparently they intended to show 2,009 of them, as the series was called "2009 for 2009."

Although showing that many videos took a week, I admit I only flipped by from time to time, not really watching full videos but pausing for at least part of a song. And from that pattern of viewing I discerned that "classic" appeared to comprise everything from the proto-videos of the early '80s through the full-fledged productions of at least the mid-to-late '90s (and possibly later).

Because the videos were arranged by title rather than by genre or year, there were some interesting transitions. One that caught my attention was Pearl Jam's "Jeremy" (you'll need to follow this link to see it; it appears they don't want this video embedded)...

...followed by Rick Springfield's biggest hit, "Jessie's Girl":

The songs came out roughly a decade apart—"Jeremy" in 1991, from Pearl Jam's debut album, Ten (although the video didn't come out until 1992); "Jessie's Girl" from Rick's Working Class Dog in 1981—but they seemed to be from different centuries. Obviously, the art form developed quite a bit between those years, and the technology and production values advanced significantly during that time period, but that wasn't quite it.

I was in my early teens in 1981, and I remember hearing Rick Springfield on the radio a lot, but I didn't see the video for a few years (until visiting someone who had MTV). By 1991 I was in my mid-20s and had cable, so I saw the video when it was new and still far from "classic." On top of that, I purchased Pearl Jam's CD. But the difference didn't seem to be merely that "Jessie" was associated with my teen years and "Jeremy" with adulthood, or what I remembered from those ages.

I didn't think it was that Rick Springfield could be more easily dismissed due to making it big as a soap star and having his musical popularity peak and wane before Pearl Jam's even started (while Eddie Vedder and company seemed to take the rock & roll integrity path). It wasn't that the subjects differed so greatly (one song being about lusting after a friend's girlfriend and the other about a school shooting).

For a moment it seemed to be that early '80s production was inferior, but even that didn't quite pan out; later during the weekend, when the channel had gotten to the M's, I saw The Knack's magnum opus, "My Sharona."

That song originally came out in 1979, and I'm old enough to remember it from pop radio at that time as well. While it did get a resurgence in 1995 with its inclusion in Reality Bites, and the video shown was the version featuring footage from that movie (and yes, the footage of the band from its original days does look dated), there was something about the song itself that stood up despite now being 30 years old. Presumably it had roughly the same sort of production that Springfield got for his song, but it didn't seem anywhere near as old as "Jessie" (even though it was a few years older).

After I ruminated on it for a while, another distinction hit me. "Jeremy" was now about 16 years old, but it didn't seem that dated; when "Jessie" had been around 16 years old (1997 or so) it already seemed old. And the reason was simple: It just wasn't as much of a quality song.

Oh sure, "Jessie's Girl" was reasonably catchy. I'm not suggesting it was a bad song. I'm not saying I don't have it in my music library (I do). It just wasn't as good from a musical quality standpoint. "Jeremy" stood up better because it was a better song, regardless of how it was produced. It's not that "Jessie" was more of a trite pop song; it was just an average pop song. And not that "Jeremy" was Pearl Jam's best, or the best song of its year (although it had that excellent intro and outro), but it had... that intangible element of "higher" art which holds up over time.

Now, let's be clear: It's not that I don't like "Jessie's Girl"--because I do--but well, it is what it is (or was what it was). And to be honest, I'd probably rather listen to "Jessie" than "Jeremy" if I were looking for something to hear. (But of the three songs mentioned above, I'd pick the somewhat lascivious "My Sharona," with its killer guitar hook and thumping beat.) "Jeremy" almost certainly would get better treatment from music critics, but frankly, it is something I need to be in the proper mood for.

And that is probably the best way I can describe the distinction between the "higher" art that seems more timeless than the "pop" art that gets indelibly linked with the time in which it gets created.

But that's merely the conclusion I drew about it so I could sleep. Please feel free to draw your own. You need your sleep, too.


One last thought, to be fair: Although Pearl Jam may have written a better song, I doubt they would have been even half as good as Rick on General Hospital (had they tried their hands as midday thespians).

It all balances out in the end.

Casting light

Shameless self cross-promotion compels me to direct you to the useless photo site because I've posted a bunch of photos there recently.

It takes a lot less time to glance at pictures than to read all these words I post here. I'm just saying.


Thursday, January 08, 2009

Getting over it

There's a saying that recommends doing something each day that scares you to spur personal growth. However, if we're talking things that truly inspire fear, conceivably there's only a finite number of somethings that would qualify.

Now, admittedly, it's likely there'd be plenty of things, so it would take a while--years, possibly decades--but still, it stands to reason that if you kept on the suggested pace of one frightening task per day, eventually you would run out of things to be afraid of, and you would have to abandon the routine.

And missing doing the something each day--that probably would scare the crap out of you.

Maybe you want to pace yourself and just do the scary things every other day. And as the list of fears grows shorter, perhaps spread it out to once a week. Thus, either you'd never run out of things, or when you did get to the end you wouldn't quite be going cold turkey.


Yes, that saying undoubtedly is best not deconstructed (or whatever the heck that was I did above). However, I like to think that doing so contributes to my personal growth. (No, I don't expect it to work for anyone else.)

Wednesday, January 07, 2009

Be vigilant

Recently I noticed a charge on my credit card statement that I didn't recognize. So I got on the phone and dialed my credit card company to report the suspicious activity.

They advised me that the first step was to contact the company to which the charge was attributed and ask that the charge be reversed. Which I did, and when I heard back from them they claimed to have no record of any transaction, but they were polite in their reply. (And no, I didn't give them my card number. I may not be the sharpest tack in the box, but I'm not a complete idiot. Or at least I wasn't in this instance.)

So I called my credit card company and disputed the charge. I explained the situation to the initial representative, who then transferred me to the dispute department. I was on hold only a few minutes, and when I was connected, the representative had already been advised of my situation by the first one, without my having to rehash the story all over again. I provided a few more details, and she issued a refund to my account. To be on the safe side, the account was closed and I'd be issued a new card with a new number in five to seven business days. I was only on the phone for about 10 minutes.

In short, dealing with the credit card company was not an arduous ordeal at all. Unlike most of the calls I've made to companies in the past when I've had a problem to report.

Sure, I'm kind of left with a slight unease because it went too well, waiting for the proverbial other shoe to drop, but for the moment I have nothing to complain about.

I just thought I'd offer an optimistic post for a change.


But do keep an eye on your statements. The scam was charging only $11.89, which conceivably would have been a small enough amount that many may have missed it or let it go. But doing some research online I found many others who were similarly charged over the past six months, so it sounds like someone has been committing this fraud for a while, and conceivably is still out there (and maybe even blaming it on unrelated companies).

Yes, it's disquieting that it's possible someone got my credit card number, and I'll be watching all my accounts very closely, but by paying attention I may have nipped this in the bud. At least I'm choosing to believe there's a possibility that this isn't the prelude to something horrible.

It's a new year. I'm keeping pessimism at bay. (We'll see how long this holds up.)

Tuesday, January 06, 2009

Take that

[As it's Epiphany, and some may still be getting presents, perhaps you'll put up with one last vaguely holiday-related post, pertaining to gift-giving. To the extent that white elephant gift exchanges count.]

The seeming incongruity (not irony) of a white elephant gift exchange is that although all the items are supposed to be undesirable, the participants who choose later than others can take an undesirable item that someone has already selected. On the surface, the rationale for such an action would be that the person taking the already-revealed item is too cowardly to risk unwrapping a not-yet-revealed item and discovering it to be even worse than what is out there. Alternatively, it indicates that one or more of the participants don't understand the nature of the game (so to speak) and offered up gifts that were not, in fact, undesirable. However, assuming that everything available follows the spirit of the activity, there is another reason: It's fun to take away something from someone.

Certainly there is a hierarchy of sorts in such an activity, whereby certain gifts are not as undesirable as others, and the contest ultimately can be construed as seeing who ends up with the worst item; that scenario makes the taking of an opened item from someone a defensive move, especially if one is amongst the last to select (and most items have been revealed)—at least if I get that I know it won't be the worst one, the taker may subconsciously think.

That is a subconscious thought that seems justifiable; it's just "playing the game" to a great extent. However, it avoids the delightfully evil aspect of the act. Perhaps only unconsciously, one must accept that it's one of the few situations in life where being a complete asshole is perfectly acceptable. Picking a gift that is still wrapped may indicate that the one choosing has a more hopeful streak than one who takes what someone else already picked, but it's also boring. There are so many times in life when a considerate person sees jerks who don't give a crap about anyone get away with their behavior and has no recourse but to accept it. And yes, it's psychologically advanced to be of a mindset whereby seeing the inconsiderate suffer no consequences causes one no distress (but that's also the sort of mindset that would not construe any gift to be undesirable, and therefore belongs to one unsuited for the exchange in the first place), but that's not a state I suspect many of us have truly achieved (much as we may like to think we have).

It's petty and stupid but damned if there isn't some satisfaction from making another suffer in a tiny, ultimately insignificant way.

Let's face it: Much as we may like to believe there is justice in the universe, we so rarely see evidence of it. We fancy the notion that those inconsiderate jerks who've wronged us got their comeuppance eventually, but we didn't witness it if it came, so there was no catharsis. But to be that jerk, for just a moment, in a situation where the person we "wrong" ultimately won't give a crap, may allow for a tiny bit of venting (sort of).

Unless of course we just choose a still-wrapped gift and take our chances with whatever that ends up being. There is that option as well.


At the department's holiday party gift exchange last month, there were definitely a few moments where a gift was "stolen" from someone and done so with glee. In a few cases, a steal caused a chain-reaction of "theft," where the person who just had their crappy gift stolen turned and stole someone else's crappy gift, and that person then stole someone else's crappy gift, continuing until someone who didn't give a crap picked a still-wrapped item. It was a fascinating study in sociology, if viewed from the correct perspective.

The things humans will do.


When my turn came around, I grabbed the gift from one of the guys who happened to be standing near me. It was a "Bendy Pirate"—which is pretty much what the name suggests (a plastic stick-like figure whose body and limbs bend and who was decorated with ostensible pirate paraphernalia, including a removable peg leg—really). After a few others' turns the curving corsair was stolen from me, and having already achieved my catharsis from the initial steal, and because the game had been going on for over 30 people's worth of turns, I just grabbed the smallest wrapped box from the spot by the tree where the remaining gifts were.

I opened it. It proved to be a package of Gas-X. For infants.

Yes, baby medicine. It wasn't some clever appropriation of the packaging with something else inside. It was exactly what the package indicated.

Not surprisingly, that proved to be a gift no one stole, whether with a remaining turn or in the chain reaction after having a gift stolen.


While it is not uncommon for the items offered up by the participants to be re-gifted crap from the previous year, and frankly the standards for what is acceptable are pretty low, I cannot imagine next year I'll be re-gifting that.

Baby medicine is almost ridiculously tacky, but opened, year-old baby medicine crosses a line.

Even if I knew someone with an infant child (who might actually have use for it), I don't think I'd be comfortable giving it away now. The explanation of why I had it would be more trouble than it would be worth.


And for the record, for the fifth year in a row, no one participating in the exchange brought an elephant, white or other color.

[Astute readers may recall I've written about this gift exchange previously. Yes. It's a never-ending source of inspiration. Well, once a year.]

Sunday, January 04, 2009


Find out what this is by going to the useless photo site. Unless you have something better to do.

My good side

Preface to today's tale: I'm not one to dwell on age. I don't feel as old as I am, and it has been remarked by others that I don't look as old as I am, but in the end, I am as old as I am. And it's not what could be considered "young," but that's not something to be dreaded as far as I'm concerned.

One of the perks of being not young is not getting carded when going into a bar or buying alcohol. Some may fancy the gesture, but I find it merely an inconvenient delay toward my aim. Generally, one look at my face may not reveal my true age to the bouncer or clerk, but it easily indicates I'm above the requirement, and that's fine by me.


A short while ago I ran down to the supermarket to procure some items my fiancee needed for her dinner menu. It wasn't a full shopping trip, so I had only a few items, and to bypass the lines at the cashiers I went to a self-serve checkout aisle to make my purchase.

Included in my items was a bottle of wine, and thus when I scanned it the overly cheerful voice associated with the machine called out for me to show my I.D. to the attendant. However, the attendant was absent at that moment. I glanced around for a few seconds, seeing no one to verify my age, so I turned and put the bottle in the bag.

At that moment, I heard a voice from behind me say, "You're okay." I glanced over my shoulder and noticed it came from the attendant, who was already checking the I.D. of another customer at another self-serve station who required similar attention.

The thing is: She didn't even look at my face. I was turned away from her completely, and I had on a hat and a coat that partially camouflaged my head.

Apparently I've reached the point where even from behind (pretty much just the back of my neck) I look not young.

And I have to admit, I'm not sure how I feel about that.


Next time, I may just wait in line for a cashier. At least they have to look at my front side.

Thursday, January 01, 2009