Thursday, January 31, 2008

A simple nation

Last week while driving around I heard a bit of an NPR show where some political pundit talked with the host about how the combination of Hillary and Barack (yes, if using one of their first names I must use both of their first names) on the primary ballot presented a challenge for African-American women, as they didn't know whether to vote for their gender or for their race. He went on to suggest Latinas had an easier time backing Hillary (due to race not playing a role, only gender), but that Latinos didn't necessarily follow Barack.

The pundit painted the electorate with these broad strokes. And I found myself particularly disquieted, because, for reasons that likely defy explanation, I don't believe everyone is nothing more than what their race and gender make them out to be.

I'm not saying race and gender are nothing when it comes to politics; I'm just saying it's not everything.

However, the specifically disquieting thought is not so much that the way people vote can be predicted so easily along those lines, but that by refusing to believe it's really that simple, I am even more unlike most people than I thought I was.

It's not that I have particular issue with being unlike most people. Frankly, I take some slight comfort in the notion. However, given that this is a republic, it makes it even more unlikely that any candidate represents me even slightly. And there's nothing to make the candidates want to appeal to me, as an individual, because I'm not easily fit into a bloc of voters whose racial or sexual characteristics align them in ostensible groups.

By being an individual, it seems I am left out of the political process. Or at least of no interest to pundits on radio shows.

This is why I don't listen to such programs.

Wednesday, January 30, 2008

Necessarily humbled

Years ago I composed something at work where I hyphenated an adverb and an adjective. I don't even remember what the two words in question were, only that the first ended in "ly" (and thus I figure it must have been an adverb). When I had a co-worker (with whom I got along well, and still do) look it over before it was distributed, he pointed out that in American English we don't hyphenate an adverb ending in "ly" with the adjective it modifies (unlike, as I was thinking at the time, our British friends).

[This wasn't it, but the best my mind can muster at the moment as an example: "The wrongly accused man was given a financial settlement"; not "The wrongly-accused man…."]

My co-worker didn't present the correction in any sort of condescending way. He wasn't questioning my intelligence; he was doing exactly what I had asked him to do: find anything that wasn't correct. He knew me well enough to know it was a mere momentary mental slip, and I knew that he knew that.

Nonetheless, I remember that in the moment he mentioned it I blurted out an excuse for why I may have done it—something about having been reading a lot of books written by Brits, where I must have seen such hyphenation. However, I'm pretty sure that was not why I did it. Frankly, it was probably an exaggeration, if not an out-and-out lie. He nodded in acknowledgment, clearly not having needed any explanation. The error was easily dismissed and easily corrected in the document, and I'm sure my reputation in his mind suffered no damage.

Clearly I felt compelled to offer some explanation to justify the error even though none was warranted or necessary. It served only to assuage the damage I had inflicted on my own ego, even though I obviously didn't delude myself well enough for the excuse to really explain anything. In the end, it merely exacerbated the shame in my mind, as I transformed a minor style faux pas into an abject mistruth.

I've made plenty of other mistakes in the years since that incident (although never that one again), but those don't stick in my mind like this one. With those I undoubtedly offered appreciation for the note, made no excuse, made the correction, and moved on. Likely due to the lingering shame over how I handled this incident, I learned that the best way to get over mistakes is to keep ego out of it as much as possible.


Only someone who was pretty well-informed about language would have even caught that sort of style error, so he was a good candidate for proofreading. That was probably also why it had such a powerful effect on me: His was an opinion worth valuing.

Likely the reason my ego was so impugned was that I should have known the rule. Heck, somewhere in the recesses of my brain I probably did know it but didn't think of it when I was writing. The error didn't make me any worse a person than any of the myriad other mistakes I've made. Consciously I know that. However, subconsciously I knew I should have known better.

And now I do.

Well, I've never tried to hyphenate an adverb and adjective since that incident. I haven't moved to the UK or anything like that. (I need to figure out whether it's really stylistically appropriate there first.)


To see more interesting shots of what is pictured on the right, check out this.

Song lyric du jour

"This glimpse of brilliance is better than a long look at mediocrity."
- Ha Ha Tonka, "St. Nick on the Fourth in a Fervor"

Thursday, January 24, 2008

Don't you remember you told me you saw the movie?

With its inclusion in the movie Juno, Sonic Youth's cover of "Superstar" has been getting regular airplay recently on all three of the commercial alternative radio stations here in L.A. (Indie 103.1, KROQ, and the new 98.7). Which is fine by me; I think it's a good version. (This post on Aquarium Drunkard allows for hearing it if you're not hearing it on the radio.)

Of course, I thought that back in 1994 when I had the If I Were a Carpenter compilation CD on which it originally appeared. Back then I don't recall even hearing it get played on the radio once.

That a critically lauded and commercially successful (and now Oscar-nominated) movie featuring the song led to its getting back on the cultural radar is hardly surprising, nor is it intrinsically bad that it took such a thing to get everyone on the bandwagon. (Where would the Shins be without Zach Braff?*)

When I hear the song get back-announced on the radio, whatever on-air personality is at the microphone consistently alludes to it only as part of the Juno soundtrack (on which, yes, it does appear), with seemingly no awareness of the song's origin.

It ends up coming across (in my admittedly jaded mind) with the implicit message that the song was recorded for the Juno soundtrack, and that it's a new song. I'm not saying that I've heard that explicitly stated, but from the way it was spoken about that was how I interpreted the on-air personality's level of understanding about it.

Just a short while ago I heard it played on Indie 103.1, and the on-air personality (TK) discussed a great deal of the history of the song (not merely the Sonic Youth version) that I barely knew or didn't specifically know: that it was written by Leon Russell, and that Richard Carpenter saw it performed on The Tonight Show with Johnny Carson and thought it would be great for Karen's voice (leading to the Carpenters recording their version, which is generally the one people think of). He then alluded to Thurston (Moore, of Sonic Youth) thinking it would be great for the Juno soundtrack.

TK's little back-announcement was nicely structured, and I concede it was intended to be more entertaining than informative, but the fact that the Sonic Youth version was released over 13 years ago does undermine his last bit; I'm highly skeptical that any member of Sonic Youth actively sought to get the song on the soundtrack.

As anyone who saw the movie and was paying attention knows, the song is not merely played over a montage; it is something the characters actively discuss in dialogue. Jason Bateman's character even specifically mentions the compilation CD on which it appeared, holding the disc in his hands as it appears briefly on screen.

I guess it's only obvious to those of us who actually knew of the CD that it wasn't a fictionalized thing in the movie but something from the reality outside the film. (I may have been the only person in the theater who, at the moment when the CD was mentioned on screen, thought: Oh yeah, the Sonic Youth song was pretty much the only really good one on the disc; no wonder I ended up selling it to a used CD store years ago.)

I know it's expecting too much of contemporary on-air personalities to know much about the music being played (although on Indie they tend to be better than most commercial stations). Still, I can't quite shake the slight disappointment stemming from knowing that the only research necessary to learn the origin of the song is to actually see the movie--the one for which it proved so well-suited that the writer put it in the script--which they indicate they have done, recommending it as a good movie after talking about the song.

Maybe, like me, they were distracted by someone behind them in the theater failing to refrain from talking during the movie...

* I'm sure the Shins would be fine, actually. They may not have gotten played as much on the radio, however.

Filling space (future regret)

The media has questioned whether the American public is ready for a black president or for a woman president, but I think it's the media who isn't ready for the American public to be ready for either, because then they have no stories to fill their pages, TV and radio time, and websites.

The media gains nothing from the public simply being okay with it.


In the interests of fairness, it is worth noting that to allude to "the media" and to make assertions about what "the media" does (which, given that "media" is technically plural, means that from a strict grammatical standpoint it probably should be "what 'the media' do"; I digress) is conceivably just as egregious a use of generalization as what I suggested the media does (do) with generalizing.

Perhaps the term "the media" is supposed to carry the implicit caveat of meaning "certain members of the group identified as 'the media'" without literally meaning every single person who could be considered included in that group.

Granted, as I've done above, the term "the media" is rarely used except in a pejorative sense, to indicate that something objectionable is being done by individuals who write for newspapers, magazines, TV, radio, or websites. I don't think I've ever heard a sentence like "The media is doing a great job of reporting on this story." Any compliment is directed at a specific publication or outlet: "The Times did an excellent exposé on government corruption."

Individuals can do a good job; collective terms only screw up.

Generally speaking, that is.


The point of journalism (the specific medium in the above-mentioned media) is ostensibly to chronicle important events; the real purpose of journalism is to provide a story that sells.

Some may consider that a cynical attitude. I consider it merely an explanation, for example, of why the stories about the presidential campaigns are structured as they are.


I re-read this Chuck Klosterman piece recently, and for some reason it seems to vaguely apply to something I've mentioned above. If nothing else, it's much more cleverly written than what I've done above.

Wednesday, January 23, 2008

Song lyric du jour

"Nothing wrong with failure; everybody does it."
- Robyn Hitchcock and the Venus 3, "The Authority Box"

Tuesday, January 22, 2008

Xtreme photo viewing

A bunch of pictures (including this one) have been added to the useless photo site. (Scroll down through the December 22 post to see all of them--if you dare!)

Monday, January 21, 2008

The Romanesque Broccoli Project explained

Back on the first Sunday morning of the new year, my fiancée ventured out to the local farmers market and came across the following vegetable:

She purchased the unusual plant, but not because it was touted as being delicious. No, she bought it primarily because she thought it would make for an interesting photographic subject for me.

I kid you not.

She said it was called Romanesque broccoli*. (I'm led to believe the name derives from the plant originating in Italy, at least four centuries ago.) The buds form natural fractals; it simply grows that way.

Again, I kid you not.

She put it in the refrigerator.


This presented a bit of a new situation for me as a pseudo-photographer. I typically find and shoot subject matter as it is; I don't tend to artificially (so to speak) arrange what is to be photographed. However, with this, I had to figure out how to put it into some situation where its unique appearance could best be captured.

So I took the easy way out: I waited until the next morning, when the sun would be coming through the kitchen window.

I got up a bit early so I could attend to the task before work. And I opened the fridge and regarded the subject, and I thought, Hey, that doesn't look bad in the light from the back of the fridge.

So I moved around some of the other items on the top shelf (so they wouldn't take focus away from the vegetable) but left the broccoli itself simply sitting on the paper towel where it had been all night. I set up the camera on the counter, propped the door wide open, and took 11 shots over 6 minutes.

But I only repositioned the broccoli once, so all the shots were basically the same, just with different camera settings, slightly greater zoom, etc. The best shot from each of those two positions inside the fridge can be seen in this posting on the photo site.

And yes, the light inside the fridge was exactly the same in the morning as it would have been all day the day before. Waiting served me not at all with these.

Eventually I could no longer justify letting the cold air continue escaping from the fridge, art or no art. (Pretending for the purposes of this entry that there was any artistic element to this.) I pulled out the subject and put it on the counter where the light from the sun outside would be coming in.

Of course, unlike the day before when it was very clear, that Monday morning was partially cloudy, so the light fluctuated from nice and bright to muted and diffuse over the course of the next 16 minutes, as I took another 37 shots with varying amounts of illumination, shot from multiple different angles, repositioning the broccoli at least twice.

I then had to hurry and put it back in the fridge and rush out to just barely get to work on time.

Now, two weeks later I have finally posted only 7 of those 37 here on the photo site (suppressing a self-indulgent notion to put up more), which you can click over to and judge the results for yourself.


What happened to the Romanesque broccoli?

That night my fiancée steamed it.

And seasoned it with black pepper and French grey sea salt.

It served as a delightful side dish with roast beef. It tasted like... well, a broccoli version of cauliflower. Which, considering how much more expensive than regular broccoli (or cauliflower) it was, means I really need to be happy with how the photos turned out. As models go, it was ridiculously cheap.


* Doing further research just now I have found it's also more properly called Romanesco broccoli, or (not surprisingly) fractal broccoli. What's somewhat surprising, however, is that it is not, in fact, broccoli; it is actually a type of cauliflower. But it's not like scientific reality plays a role in how such things get named.


Interesting meta side note: In doing the aforementioned research, I found that in a Google search for "romanesque broccoli," one of the broccoli posts I had put up on the photo site an hour earlier was already included in the results.

On just the second page of results.

This makes me disinclined to go back and change those posts to the more proper "Romanesco broccoli," as I'm sure my posts would end up way farther down in such results.

It seems better to be more likely found by people searching for its more colloquial name than get lost in the shuffle amongst those who really know what it's called.

Wednesday, January 16, 2008

Hopping for hope

I must make it clear up front: I am not a fan of so-called reality television. That's not to say I never watch any shows that fall into that category; it's to say that I tend to do so with emotional distance (if not outright ironic intent), where I really don't care about what happens to the people on the show. That's probably more due to the way the folks featured in these shows tend to be self-absorbed and generally unlikable than to me being a scripted-TV snob.

This season, for the first time, I have watched The Amazing Race. Initially this was because my fiancée was watching it while I was in the room. However, over time I did get caught up in it, because the premise does make for captivating television, with teams of "regular people" traversing the globe and performing tasks in order to get to a particular spot before other teams.

What intrigues me more than the obvious element of seeing who'll successfully reach the end without coming in last is watching what must be genuine moments that get captured by the camera. Because each team is constantly filmed while engaged in the competition, it's impossible for them to be on their best behavior every second. While anything can be manipulated through selective editing, I do get the impression that enough of the participants' real personalities come through over the course of the season. And from that perspective, I couldn't help but conclude that some of these people really are seriously awful human beings.

What's further intriguing is that they clearly don't grasp this about themselves. I am at least vaguely aware of my negative side (I like to think), and therefore I consider myself wise enough to not go on such a show; I feel no need to have my moments of glib frustration broadcast every week to an audience of millions. (I prefer to occasionally have them be read by an audience of occasionally double-digits.)


Of course, some part of me likes to believe there is some modicum of justice in the universe, and that part wished for the non-awful people to succeed over the awful ones. That often was not the case, but the aspiration existed nonetheless.

Watching the episode from this past Sunday (the last one before the finale), I considered three of the four remaining teams non-awful and one awful. The awful one was a dating couple who fought every episode, bitching at each other at virtually every opportunity. They were also risibly ignorant at times (for example, when in Taipei, Taiwan, they alluded to liking Thai food as though it was from there; they were not being ironic).

And at one point in the middle of the episode, I said aloud to my fiancée that I could live with any of the other three teams winning. As long as it wasn't them, it would be okay.

At the end of the episode, the awful team came in last and was eliminated from competing in the finals. And both of us jumped up and down in genuine glee that they were out of the race.


We leapt up from the couch, simultaneously, without premeditation; it was a genuine and, yes, emotional response. Granted, on the surface it's not admirable that we reveled in this moment of defeat. (And it became impossible to retain the glee of the moment, as shortly after they were told of their elimination, they wandered away—with the camera following them, of course—and the man began to cry. Part of what made them awful was their self-absorption, so in a way this was entirely in character. However, it transformed them from personalities to be despised to personalities to be pitied; I still felt no empathy, even though I could understand the immense disappointment of the moment.)

Deeper, however, their defeat represented the triumph of (relative) good over bad. The teams that qualified for the finals were the ones who had done better at overcoming their pettiness and cooperating (or at least that's how it got edited). Even if only in the way it came across through production-staff manipulation, it represented at least the appearance of justice.

And that was worthy of jumping up and down for.


I'm not suggesting this has transformed me into a fan of the reality show genre or anything. It may help me understand why other people are: It's not mere WGA-less entertainment; at times it's a tiny beacon of hope, in a semi-metaphorical sense.

Tuesday, January 15, 2008


A few weeks back we saw the movie Juno, which we enjoyed. At least, that's the way I choose to think of the experience.

I shall explain.

We saw it at the Arclight, which I generally have found to be a place where people who really want to see movies go (as opposed to some place where people go to kill time by seeing a movie). The seats are fairly comfortable, the sound and picture quality are good, and if it weren't so expensive we'd probably go there more often.

The thing is this: There was a couple seated behind us in the theater who were completely incapable of not whispering during any portion of the film that didn't have dialogue. It's not that they were speaking loudly enough that I could make out what they were saying; my fiancée, sitting next to me, didn't even notice them. And they were conscientious enough (in a manner of speaking) to stop as soon as the characters on screen started speaking, but without fail they carried on some tiny conversation whenever there wasn't conversation happening on screen.

I didn't say anything to them during the movie, because I figured if their parents didn't teach them to shut the fuck up in that situation there was no chance that I was suddenly going to elicit behavioral modification. Also, it's not like they were just gabbing throughout the whole film; they were only partially audible, and not interfering with dialogue, so they didn't quite cross that line I have.

After a while I made a little game of it. Whenever the scenes would transition and there'd be a little musical interlude, I'd start counting in my mind: Five, four, three, two, one. I lost track of how many times they started whispering right as I got to "one," but it was more than once.

And we made it through the movie to the end credits, at which point they got up and left right away. We stayed to watch the credits. I never got a look at them.

Here's the thing: Although I didn't let it bother me too much during the movie-watching experience, the lingering association in my mind when I see ads for the movie (which have become way too ubiquitous now) is not primarily of the story or the acting, or even of the soundtrack; my first thought is about those people whispering behind us.

In retrospect, they ruined the movie for me. It wasn't so much that it was ruined at the time, but now it is.

My fantasies now are not of going back in time, asserting myself, and asking them to be quiet. No, I fantasize about kicking their damned teeth down their throats, and possibly stomping on their throats.

That is completely inappropriate. I know. But the thing is: I'll never get the chance to inflict physical violence upon them. I wouldn't know them if they were standing next to me.

It was quite intentional that I didn't get a look at them. That way there'd be no temptation to confront them later.

So I appease myself with inappropriate thoughts. It's not admirable, but it's better than inappropriate action.

However, I pity the fools who pull that shit near me in the future…

Ooo, the thoughts I'll think about them.

Monday, January 14, 2008

Ho Lee Night

A few more lingering holiday-timed shots have been added to the useless photo site, if you're not sick to death of that stuff.

Thursday, January 10, 2008

Oh, fudge

[PG-13 post below.]

How did I learn profanity? Obviously, somewhere along the course of my life I learned words like "shit" and "fuck" despite never being taught them in school. I imagine I learned them the same way Ralphie Parker did in A Christmas Story: from my old man. My father didn't pepper his conversations with swear words like a proverbial sailor, but he would utter them when he got really upset, or smashed his thumb with a hammer, or some such scenario. It wasn't a calculated response; it was merely how he'd learned (probably from his old man) to subconsciously react in moments that (in a manner of speaking) required it.

I have no specific recollection of hearing my father use such terms, but based on how the parts of my life I do remember better went, I have to imagine that's the likeliest initial exposure I had to them. Obviously, forces beyond my parents further reinforced my using them at moments when I got really upset, or smashed my thumb with a hammer, etc.

While it was never my father's specific intention to "teach" me these words, over the years he did help me through modeling (of a sort). I'm sure at some point, like Ralphie, I must have uttered one of the words in his presence and gotten in trouble for it, so I consciously learned something. However, obviously the lesson was not that I should never use those words, but that I should be careful when using them.

Those words had power. (There's a clever South Park episode about that.) By yelling them (or perhaps merely thinking them, or maybe even typing them) at a moment of heightened negative emotion or sudden physical pain, it mitigated the emotion; it vented some (not all, but some) of the anger or frustration I experienced (even if that frustration was directed at my inability to swing a hammer and strike only the nail, not my thumb). That may not be an ideal coping mechanism, but it's far from the worst reaction one could have. Focusing one's negative energy into an uttered word (or string of words) can release that negative energy.

What's interesting is how specific terms became codified as the "bad" words (identified primarily by the way they get bleeped on broadcast television). Conceivably anything uttered that achieves the goal of venting the frustration in that moment would serve; clearly that's why words that sound similar to the codified profane terms are said in moments when there is a lessened need for venting (such as when the nail you're holding with the thumb that's about to get smashed with the hammer slips out of your hand, falls to the ground, and slides into a drain; a simple "shoot" is sufficient in that situation, without needing a full-on "shit").

However, these variations permitted in public fail to carry the same catharsis, because they aren't the real thing when it comes to cursing; those words simply indicate a desire to suggest swearing was allowed in the circumstances at hand while demonstrating regard for social decorum. In moments where the need for release is too great (or when one is alone), only ejaculating (it has other connotations) a vile, unmitigated swear word serves the purpose.

And thus, one must be cautious about not overusing those words. Were it the case that I'd learned to casually toss some form of "motherfucker" into every other sentence, it would have lost any cathartic power before I finished junior high school. However, because my father conveyed through his actions that those words should be reserved only for circumstances that absolutely require them, I learned how to maintain their strength.

Frankly, I worry slightly about children whose parents never swear in front of them. Obviously, it's not good to let it run rampant when the kids are in the room, and it shouldn't be directed at the children, but if the parents always stifle that reaction when the children are present, the kids will have no choice but to pick it up on the playground, from children whose parents empowered them better, in a roundabout way.

I'd rather my child be the one doing the teaching in that scenario than the one doing the learning. At least there's some modicum of control exerted over my own offspring.

Good parenting, like anything else, is relative. Thanks, Dad.


Yes, the above eschews the use of profanity to insult others when they are the source of the frustration for which release is required. In that case, the use of swear words as epithets (or portions thereof) depends on one simple criterion: Can you take the guy in a fight, or at least outrun him? If not, then the outward appearance of restraint trumps the need for venting. (It's not like your average dipshit is going to suddenly wise up by being called a "dipshit" to his face.)

That's the sort of lesson that is, unfortunately, more often learned the hard way (typically on the playground).


Parents: Please don't allow your children to learn by reading this post.

Monday, January 07, 2008

Drink it in

another useless photo site has some more December photos posted, if you haven't visited there in a while (and wish to tacitly encourage my photographic inclinations).


Another posting that would have been more appropriate last month, but hey, better late than never...

I am not bragging when I say: I am a reasonably good dancer. I am not great, and when it comes to formal (non-freestyle) dances I really can only pull off basic swing moves, but I have a decent grasp of rhythm, a good ear for the beat and bassline, and I am not embarrassed by anything I do on the dance floor. (The last one is the key.)

I admit I am "good" partly because I have a sense of rhythm, yes, but also because many people are not "good" when it comes to dancing. That is almost entirely due to inhibitions that prevent them from being comfortable, and therefore prevent them from having the necessary spirit to be considered "good" by others.

So, basically, because I am not hideous, and because I am willing to do what others are not, I get to be "good" by those standards. Which is how it should be.

I have danced at all the office holiday parties I've attended (which would now be seven). Anyone from the office who has been to one and looked at the dance floor has almost certainly seen me. And those people talk about me in terms of me being good (in a non-ironic way) to people who have not seen me.

This causes these uninitiated people who attend a party where I am and who are inclined to dance to want me to dance, too. Which I do, and which I do well enough to more than fulfill the expectations created by the aforementioned talk by the people who've seen me before.

Again, I'm not reinventing dance or anything. I'm just able to do it without shame.

Nonetheless, almost without fail, each year on the Monday after the party, the people who had beseeched me to dance before the party feel compelled to join in touting my dancing prowess to people who weren't there.

This is flattering, certainly, but it always raises a question in the back of my mind, which I never ask.

Why wouldn't I be a good dancer? I mean, why is the default position that I would be not good?

They speak in such glowing terms to be complimentary, I know. I appreciate that. However, the tone suggests that they are actively surprised to discover I am as good as I am. Why should it be surprising? Because I am a heterosexual, Caucasian male? I concede that demographic does not have the best reputation when it comes to strutting stuff on the floor, but is that not in a way just as insulting as (for example) thinking that all black people like fried chicken, and being surprised by discovering one who doesn't?

Why can they not simply speak of it in terms of it being something I, as an individual, can do? Why must it be this seemingly amazing defiance of stereotype?

That's not how they're thinking, I know. But the thing is: I don't get the impression they're thinking about what they're thinking.

I merely hope that when they make the movie about this, Woody Harrelson doesn't play me. That's what I'm getting at.

I think.


And now, here's a photo to completely undermine everything I stated above:

Have a field day in the comments, folks.

Thursday, January 03, 2008

You don't know me at all... lucky you

I don't tend to write about work, but here's another tale from my department holiday luncheon a few weeks ago.

In addition to the white elephant gift exchange mentioned in this post, the luncheon includes a game called "Something you probably don't know about me," which goes like this:

On slips of paper everyone in attendance writes down a fact about himself/herself. The slips are folded up and put in a basket. Someone picks one and reads it aloud, and then people guess who they think wrote it. If within the first three to five guesses (depending on how many the judge decides to allow) someone correctly identifies the writer, that guesser gets a point; if no one guesses correctly, the writer gets a point. At the end there are prizes for the people with the most points.

So, it's something of a variation on To Tell the Truth.

Thus the fact one chooses to write down should be something that others would not expect of one. Typically the facts involve childhood incidents or mildly-embarrassing-but-endearing anecdotes. Ostensibly the game allows everyone to get to know their co-workers a little better, but there's little genuine insight offered in the tepid revelations. Still, it is generally amusing.

As each person has only one piece of paper, the game also becomes something of a test of memory, to see if one can recall who has been revealed and who is left. With a large group, however, it tends to prove too unwieldy to remember; names get called out for people who were just revealed, even though those people should be easily dismissed.

In past years I've always had mine identified before the allotted number of guesses elapsed. That's partially because people in the department pretty well figure it's possible I've done just about anything; my name gets called out many times before my slip is even read, so it's only a matter of time until it's the one read and the guess is correct. Also, I generally lack the ability to camouflage slight embarrassment: my face goes flush, and my skin tone is pale enough that the blush stands out rather obviously. Thus after my fact was read aloud, it became a simple matter of which person would glance at me, see the unconscious response on my cheeks, and know to yell out my name.

There is another, more subtle aspect of the game. It isn't obvious, but when it comes to disguising one's fact and avoiding being guessed, it is almost as important as selecting a fact that seems very unlike the way one is perceived. This year I took advantage of it.

When my slip of paper was read, I was the first to make a guess, loudly speaking the name of someone I'd pre-selected as a feasible target and pointing my finger at her in a playfully stern way. Of course she denied the fact was about her, and I then glanced around the room, feigning a search for another candidate. Four other people made guesses, naming various others around the room, all of them wrong. For the first time in several years of playing the game, I didn't get guessed.

I finally played the game.

It's hardly an impressive bit of psychology to grasp that by making a guess I was making myself seem off-limits as the originator of the fact, but it was effective. The reality was that for the entire game up to that point, the only times I'd been guessed were during rounds where I didn't make a guess myself. This merely continued to prove true when the slip really was mine.

I will admit that I did get a little too excited when I essentially got away with my ploy, blurting out "Suckers!" when I got a point for my deception.

Hey, if you're going to play, you should play to win. Of course, I was so busy concocting my scheme that I wasn't able to guess the originators of any other facts correctly, so that point was the only one I got the entire game, and I didn't come close to winning overall.

You can't win 'em all, but for one moment I did (kind of). Which is better than I usually do.

(Pathetic, party of one: Your table is ready.)


I will reveal what my "fact" was upon request.

Tuesday, January 01, 2008


"'To Serve Man'--it's... it's a cook book!"
- from The Twilight Zone episode "To Serve Man"
(aired last night during the Sci-Fi Channel's TZ marathon--yeah, I'm quite the party animal)