Showing posts with label dougression. Show all posts

Monday, August 20, 2018

Music in the '50s to the '80s, and the '80s to today: Running to stand still?

Last year U2 toured in honor of the 30th anniversary of their album The Joshua Tree.

I remember 1987 (I was just out of high school) and how much songs from that album dominated rock radio back then. It was almost nauseating, and I was someone who liked their music (but wasn't a huge fan).

I also remember what I thought of songs from thirty years earlier (1957) at that time (1987). I'd listened to plenty of the oldies station in my younger days to be familiar with the likes of Jerry Lee Lewis' "Whole Lotta Shakin' Goin' On" or the Everly Brothers' "Wake Up Little Susie" or Sam Cooke's "You Send Me" (I could go on). I liked all that music as well, but being from more than a decade before I was born it did seem... old. Again, it was very good, but... old.

Here's the thing: A little more time has now passed since 1987 than passed between 1957 and 1987. However, when I think about the music of 1987 now, it doesn't seem old. Intellectually I grasp that it is, but it doesn't seem old in the way 1957's music seemed old in 1987--even though that same oldies station now plays mostly songs from the '80s (and wouldn't consider playing anything from the '50s).

Saturday, September 07, 2013

Coming down with a case of... the Vapors (they weren't just turning Japanese)

Back in 1980 (at age 12) I was coming off liking the Village People and started listening to pop radio, and I remember hearing and really liking some of the unusual hits that flitted onto the charts, in what I didn't realize at the time was New Wave.

A track that I enjoyed then and continued enjoying ever since was the Vapors' only hit, "Turning Japanese." The riff, the beat, the not entirely explicable chorus, that "Think so, think so, think so" part in the final run of the chorus: it was marvelous. And frankly, I have to say I still think so (think so, think so)—not out of nostalgia, but because it remains an excellent slice of power pop.


I never got their record, or even the single, but I listened with anticipation for when the radio station might spin it again. (Yes, kids, there was a time when everything was not immediately available on the internet.) Eventually it turned up on virtually every '80s hits CD that came out, so finding that band's one hit proved easy by the end of the decade.

And for the intervening decades, even as my musical horizons expanded, my relationship with the band remained unchanged. I honestly cannot say why.

Recently I finally came across the band's "best of" collection (another digital trip to rediscover a missed past) and heard other songs by them, only 33 years after the fact. After several listens I have enjoyed the rest of their (still available) material, such as what I have learned was the b-side of the "Turning Japanese" single, a dynamic six-and-a-half minute live track called "Here Comes the Judge."

Wednesday, May 22, 2013

Under the Punk Covers: Love the Damned (Splitting the Dickies)

A few weeks ago I was on that popular social network site, and because at some point in the past I had imprudently confessed a preference for early punk pranksters the Dickies, a link appeared in my newsfeed. It was to the video (clearly done on no budget) for their 1979 cover of the theme from the '60s kids show The Banana Splits, "The Tra La La Song," which featured singer Leonard Graves singing into a banana instead of a microphone. It is best that their success never relied on such visuals on the small screen--see for yourself below.


Clicking through to the hosting site's page I saw videos posted for others, including the Damned's cover of "Alone Again Or" (which, I was vaguely reminded, featured the British punk pioneers dressed in Spanish garb out in the middle of a desert... for some reason).


Their faithful rendition of the classic 1967 Love track still holds up after 26 years, in large part because the song is so good (and certainly not because of that 1987 video).

The funny thing about that, when I think back to that time, is how when I heard it (on KROQ) I was unaware it was a cover.

Wednesday, January 18, 2012

Here we are now--entertain us: How social media salvages the Golden Globes

Although we don't go to a lot of movies anymore (and, thankfully, have gotten over any sense of feeling obligated to see a bunch of nominated films in the early part of the year), the wife and I do still watch the major awards shows. And somehow the Golden Globes telecast still meets that criterion.

For better or for worse, the ceremony where the Hollywood Foreign Press Association divvies out its little trophies still qualifies as a topic in the pop culture conversation during the lead-up to it and in the days following it. And while both the Globes and the Oscars are, in the end, a frivolous exercise in celebrities being feted for not having had to get real jobs, the power of movies and TV even amongst those who couldn't care less about awards is still pretty strong. We like to be entertained, and at least to an extent we like to see worthwhile entertainment commemorated.

What that has to do with these awards shows clearly becomes less clear.

Tuesday, January 12, 2010

There's no running away from the song in my head

In this recent post I discussed my brain's proclivity for having random songs pop in to my mind. (Go ahead. Read that one first if you haven't already. I'll wait. Done? Okay.)

Two mornings after the one mentioned in that post my mind gave me another such tune: Bon Jovi's "Runaway."

(Yeah, we'll touch on the specific track at the end of this post.)

So what pattern is emerging? The previous post noted Nirvana's "Love Buzz" (1989) as my brain's song du jour, and as "Runaway" came out in 1981, it appears songs that were released during these past two decades are too recent for my mind to have entered them in to those crevices from which these musical moments emerge.

(As further proof I must mention that during the night after the "Runaway" incident, as I brushed my teeth, my brain commenced with Felony's 1983 new wave hit "The Fanatic".)


~

Perhaps there is something about the chemical development of the brain that makes the experiences of a particular age range more indelible. When really young the brain isn't developed enough for those memories to stick as well, and after a certain age the brain probably deteriorates to some extent. It's not that the brain cannot still think and reason, but the capacity to integrate memories diminishes.

Let's acknowledge something else: When it comes to remembering music there's almost certainly a part of the equation that comes from the state of one's life emotionally; in one's teens and twenties one's emotional development probably makes the importance of the songs one hears then carry more weight regarding what gets stored in those crevices of the brain than what one hears in one's thirties and later. It's not that one cannot hear new songs and like them very much, not that one doesn't consciously know those songs just as well as the ones from earlier days, but the unconscious organization in the grey matter is not putting those songs on this out-of-nowhere playlist. Sure, a song that was recently heard can linger in the short term memory easily, but that's not the brain pulling it up from the depths; it's merely parroting what's on the surface.

Enough vague generalization. This started with what happens in my mind, and that's all I can pretend to know.

Was music more important in my life priorities when I was younger? Of course. How I feel about it now is not even close. It's not that I don't consider it somewhat important; it's still something to which I devote a reasonable percentage of my free time; I still acquire new music for the collection, but I don't follow the press, don't debate it, don't feel compelled to try to turn others on to what I like, nor do I spend time analyzing it and arranging it in year-end (or decade-end) lists. Seeing the bands perform live is not as urgent (in fact, I don't think I attended a single concert all of last year, although the wedding—both getting ready for it, and then later paying it off—played a role in that).

In short, it's not that I don't actively enjoy music still; it's that I merely enjoy it, without being as passionate about it. I'm not saying it's better that I'm not as obsessed with it (and on some level I lament that I don't still have that relationship with music... insert wistful sigh here).

And that, I proffer, makes it less likely for the songs from this era of my life to worm their way into those recesses of the brain from which the unconscious will pull out something for the personal soundtrack in moments when there are no other distractions.

Of course, it's possible that no one else in the world experiences this. Perhaps others' minds don't have an inclination toward having something "playing" in them when they're standing in front of the bathroom mirror, brushing their teeth or combing their hair; it's possible their brains just think about whatever they're apt to think (or not think at all). Or maybe they keep some source of sound—the radio, the TV, an iPod, something—going during all waking hours so their brains have some other focus, music-wise, and never encounter this scenario.

Some people cannot quiet the voices in their head; I can't stop the music.

~

Okay, now about the whole fact that it was "Runaway":

The mildly discouraging aspect of the morning's particular selection is that it wasn't even culled from tracks that I owned at one time. It's not that I hated the songs of the boys from New Jersey; I merely wasn't that into them. I heard them on the radio, but not because I sought out their tracks; it was nothing more than exposure due to occasionally regarding what was happening on the pop charts.

I vaguely recall coming across the video for "Runaway" on VH1 Classic while flipping around channels one night (under the somewhat specious auspices of a block they call "Metal Mania"; I suppose they were "metal" in some interpretation of that genre) many months back. That undoubtedly planted the seed that, in this manner of speaking, came to fruition earlier today. However, it doesn't explain why such was the fruit it bore on this particular day.

It's a decent enough song, sure, and not one where if it came on the radio I'd jump to change the station or anything. Still, the other noted track (Nirvana's cover of "Love Buzz") was at least one that I do have in the library, one that I actively liked and one which held a more significant role in my relationship with music.

I suppose such is the danger of having years of my life with the radio on. Riding in the car, while doing homework, and certainly as the background while working. Thanks to the radio I know the words to songs I don't even like much, only because I heard them over and over. So technically the random song pulled from the depths could have been something much worse.

That's not really making me feel better about what songs my brain will whip out in the future.

Well, whatever the unconscious throws on the turntable inside my cranium, it seems the one predictable aspect of its selection is this: It won't be something I only got into during the first decade of this century; that stuff simply came to the party too late.

~

Aren't you glad I shared? Now tell me what your brain does to you. Click the link below and fess up.

Saturday, January 09, 2010

Just ain't the way it seems (can you recall my Love Buzz?)

I've written before of songs inexplicably popping in to my head (in this post it was "Arthur's Theme") in moments when it's quiet. It's not having annoyingly ubiquitous songs that I've heard recently (sometimes called "earworm") get stuck in there but ones springing from the depths of my unconsciousness. It's simply what my brain does sometimes. I've come to accept it, and now to be intrigued by what the mind will pull out.

~

One morning this past week the song that ran through my mind for no discernible reason while I was getting ready (i.e., one that I had not recently heard or had cause to think of): "Love Buzz" (the Nirvana cover of a Shocking Blue song).

What of that?

~

The change of the calendar puts the album on which Nirvana included "Love Buzz"—1989's Bleach—at over 20 years old.

A child born the same day that Bleach was released (not that it was a milestone at the time, but sticking with the point of reference my sub-conscious chose for me) will this year be old enough to join me in a bar by the middle of the year. (If I still went to bars, that is.)

It was bound to happen (assuming I lived that long) that the world would reach the twenty-first anniversary of that album's debut, but it's something that seems remarkable, even though it's remarkable only in how ordinary it is. Every marker for a particular time will eventually be twenty-one years gone. In the first week of 2031 the days we're experiencing now will be two decades and a year in the past. Somehow I doubt that those days that precede my 63rd birthday will strike me the same way if some random song that came out (or, by our current frame of reference, will come out) in 2010 pops in to my head.

I was about to ponder if this is the sort of reaction that someone who remembered the first wave of Beatlemania in 1963 felt when 1984 came around, but the parallel would only exist if that person were in his/her early twenties (give or take) in those first years of the '60s, and thus in his/her forties when L.A. hosted the Olympics and Reagan got re-elected. (Of course, 1984 would not be as apt to inspire reflection as would the beginning of what is considered a new decade, but perhaps this hypothetical person was also a fan of George Orwell and would take that year as a time to ponder the passage of time, as I'm more or less doing now.)

It's not merely the passage of any period of twenty-one years; it's the passage of those twenty-one years from more or less one's twenty-first year until the twenty-first anniversary of that year.

In 2000 a song from my adolescence would not have elicited some reflection on the twenty or so years that had elapsed. No, it requires the perspective of taking those first steps in to adulthood, and then looking back at them from the first steps into middle age. (I'm not suggesting I feel like I'm on the precipice of middle age, but what other aging milestone term is there?) The forties are an age where one doesn't feel that old, but they're the first decade of one's life in which one can look back 20 years and see events that transpired while one was, technically, an adult.

We'll get in to that a bit more in another post. (Yeah. Something for you to look forward to.)

~


That I focus on the legal drinking age in the U.S. as my benchmark I cannot entirely explain. It's not as though alcohol has ever played a significant role in my life. Maybe it's more the implication about being able to walk in to a bar or a liquor store—without a fake I.D.—that connotes some significance, some tiny level of genuine maturity. (I know when I was 21 I was still pretty much an idiot, but I was far less of an idiot than I was at 18.) Maybe it's more a matter of hitting 21 being that point where one finally gains the full benefits (to the extent that being able to buy booze is a boon to anyone) of adulthood. (No, getting a discount on one's car insurance at 25—assuming one started driving at 16—is not worthy of being a milestone.) Twenty-one is young, but somehow it's more than merely being a year older than 20.

In any case, it's almost certainly this change: Over these past few years when I've met people who were in their early twenties—who were old enough to purchase a drink—and I did the math in my mind, I would have that moment where I thought, When you were born I was still in high school, or, When you were born I was in my early college years; now it will be, When you were born I was already getting in to this bar, and now here you are standing next to me ordering a beer, too.

People older than me are undoubtedly looking at that with a tiny bit of condescension, thinking, Guess what, Doug, you're old, and I would note that in my mind I've been old; this is merely a different phase of old.

~

The beauty of aging is that it sneaks up on you. Some might consider that a detriment, but the wonderful aspect of 21 years elapsing on you is that the daily routine kept you distracted from noticing it happening. Otherwise you'd never have an old song (one that in your mind you still considered newish) pop into your head and have a nostalgic rumination on how those intervening years have gone, and how glad you are that you're not still an idiotic 21-year-old, but take comfort that you still remember that song like you were.

~

Have a thought on this? Notice the link below for sharing. It might get that song out of your head.

Thursday, December 31, 2009

Last post of whatever the heck we're calling this decade

With the days winding down where the date above them ends with a 9 the window for recapping the year that's about to end is, itself, coming to an end. Within a day or so of New Year's Day it seems like the appetite for such things dries up, and then people are focused on where the new year is going rather than dwelling on where the previous year went.

This year presents an opportunity that's greater than the typical year-end recaps, as it is perceived as the end of a decade. Well, of course it's the end of a decade; a decade is a period of ten years; on any given day it's the completion of a window of time that started ten years previous. Obviously, today it will be true that the ten-year period that started January 1, 2000 will come to a close, and as such, that appears to justify devoting air time and column inches to making lists of bests/worsts/whatever in that period. With 24-hour news, specialty networks, and the internet there are certainly plenty of outlets that need filling.

Being the first decade of the century, it does seem as though the counting question that divided people ten years ago at this time can still come in to play. Namely: Does the decade start at 0 and run to 9, or start at 1 and run to 10?

As we approached the end of 1999, in addition to people stockpiling supplies in fear that programmers had not corrected their code pertaining to the use of two-digit years, the main argument was whether come January 1, 2000 it constituted the commencement of the new millennium or such an event actually fell to the first day of 2001. The answer was: It only matters to people who give a crap about delineating history on a base-10 structure.

Clearly common parlance has now declared that the new period started in 2000, whether that's right or wrong. Even though that makes the first decade of the previous millennium only nine years long, the adjustments to the calendar that have occurred over the last couple thousand years really make nit-picking about such things rather moot.

That the year is the length that it is makes some sense to the extent it reflects the relative position of the earth in its orbit around the sun, but as that period of the earth's orbit doesn't match 365 of the 24-hour periods we call days there's the need for extra days in leap years, so even what we call December 31 does not always correspond with where the earth was in its orbit the last time we called a day December 31. However, it's the system we have. All references to a given day are what we choose to interpret as having significance, but at least there's some system in place to try to keep a modicum of order. Grouping years into arbitrary periods like decades (and centuries, and even millennia) is pure semantics; if one wants 2009 to be the end of a decade it's the end of a decade.

When 2020 rolls around it will be far enough away from 2000 for the millennium-or-not debate to be old news, and it will be the start of a period where the second half of the date will all be pronounced the same for ten years straight—that is, two-thousand twenty, two-thousand twenty-one, two-thousand twenty-two, etc.; it will be what will later be called "the Twenties" and become the first easily referenced decade of the twenty-first century (and will be followed by seven more such distinguishable decades, to close out the hundred years that commenced in 2000). The decade we're completing, by contrast, shares its first part—two-thousand—with every other year of the century, which does no good in providing a handy nickname for the period that ends today.

I've heard some attempts at branding this decade "the aughts," hearkening back to the turn of previous century, but I'm skeptical that in the era of Twitter such an antiquated term will catch on, especially as it hasn't happened yet (despite some trying to make it so for half of the expiring decade).

When VH1 resurrected their I Love the... pop culture retrospective series (after making multiple versions of I Love the '70s, I Love the '80s, and I Love the '90s) for the first eight years of this decade they clearly punted on coming up with a term, wussing out with I Love the New Millennium.  Had they boldly named it "I Love the Aughts," or, really, anything that didn't involve the entire next thousand years, then this would all be resolved, but instead we're left in a bit of a naming vacuum. We'll just have to wait and see what term will be adopted.

I suspect "the first decade" might prove the most easily understood when in the future people attempt to mention this period, but admit it lacks sufficient panache to get a TV series named after it.

The coming decade will have a majority of years ending in "teen," and "the Teens" likely will end up being their moniker when all is said and done, with years ten, eleven and twelve remaining implicitly included.

The impending ten years, with the second half of the full date finally hitting double digits, will also prove an interesting period, pronunciation-wise: While the single-digit years lent themselves only to being spoken as "two-thousand one," "two thousand two," etc., and not "twenty-o-one," "twenty-o-two," etc., the Teens could see a shift to "twenty ten" over "two-thousand ten" (and so on). Personally I prefer "two thousand," being a mere extra syllable over "twenty," but I have no illusions about exerting general influence in such matters. (Or really in any matters, but I digress.)

Of course, the triple-syllable "eleven" we'll hit the following year will almost certainly make prefacing that with "twenty" less of a mouthful, and that probably will be the end of "two-thousand" until perhaps 2020, when "twenty twenty" may sound too much like (what probably will be an outdated) night time news program. Not that it's likely there'll even be news programs then, but some of us will still be around from the time when such things existed, and may find it sounds weird speaking the date that way.

I'd lay pretty good odds that 2021 will be "twenty twenty-one" even if the year before it was "two-thousand twenty," and the "twenty" will supplant "two-thousand" as the common reference for the remainder of the century.

And not to sound morbid but I don't expect to still be around when next we have this what-to-call-the-first-decade conundrum. Unless, of course, reincarnation is what awaits me in the afterlife, in which case my future self may be having this same rumination 100 years hence, in whatever incarnation computers and the internet have taken.

Note to theoretical reincarnated future self: If the content of the current internet somehow persists in to the 22nd century and you come across this, it could prove what the afterlife is. Of course, if what's on the 'net continues to expand as it has, and then there's 100 more years' worth of blogs and photos and YouTube videos, finding this will prove even harder than finding the proverbial needle in a universe-sized haystack. (That expression may mean nothing to you by that time, but you should be able to look it up in whatever has become of Google.) Unless you happen to be searching for "world's largest corn dog" (presently #3!) or "romanesque broccoli"--the two posts of mine that show up reasonably high in the lists of results by search engines in the early 21st century--it's highly unlikely that you'd ever find that these words ever existed. (Which, considering that hardly anybody here in the 21st century can find these words, is the realistic expectation.) Especially in light of the greater likelihood that in 2109 the media (whatever that is at the time) will be too busy recapping the bests and worsts of that decade.

Whatever they're calling it.

Happy 2110.

~

And for my reader back in 2009: Happy Old Year's Day!

Saturday, September 19, 2009

This is not about Kanye, Serena, or Joe Wilson: An uncivilized post

[You probably should sit down for this one. Oh, you are already?  Good.]

On the cover of Tuesday's USA Today I noticed (as I passed the dispensary--yes, one of those antiquated devices on street corners where people can spend money to buy a paper version of the website; quaint, I know) a story with the title "What Happened to Civility?"  Accompanying it was a photo of Kanye West stealing the microphone away from Taylor Swift at the VMAs.  Below that were smaller shots of heckler Rep. Joe Wilson during the president's speech, and next to that a close-up of Serena Williams (presumably at the U.S. Open, in the midst of her tantrum).  From what I could see of the article through the glass, it was drawing some connection between these incidents of less-than-respectful behavior and a larger societal problem with a lack of civility.

And it occurred to me fairly straightaway:  People who act in civilized ways—who don't yell out during the president's speech, who let someone accept an award without jumping on stage to tell them they weren't deserving, who take what they believe to be a bad call without losing their temper—do not make the news.  This is not a clever or terribly insightful conclusion, I admit, but it struck me as a primary explanation.  The various parts of the media that report on such things do not put on their cover stories about people who acted in a civilized manner in whatever situation called for it; that was simply expected, and therefore unremarkable.  As Oscar Wilde noted, the only thing worse than being talked about is not being talked about; today the desire for attention—whether it's positive or negative in nature—is almost certainly greater than back in Wilde's day, and it's infinitely easier to attain.  Thus, the ability to develop a taste for it (so to speak) is much easier as well.

I'm not B.F. Skinner, but it doesn't take a doctorate in behavioral psychology to see why what was perceived as civility would seem to have disappeared: it is not reinforced the way that incivility is.

Of course, the likelihood that people really ever were civil is questionable, but now we're veering off into the good-ol'-days conundrum.

But when USA Today has a headline reading "Good Guy Does Right Thing in Tricky Situation," we'll know we've turned a corner.  Or at least will seem to have.


Again, this is all what came to me without really knowing what the article was about.

(And this part probably would have been more clever or amusing if those sort of elements were better in regards to being reinforced in contemporary society.  Unlike in Wilde's day, today cleverness generally gets one misunderstood.)

~

When I got home Tuesday night I briefly looked up the civility article online.  And for our purposes here let's just note it was more or less what I'd surmised.  Thus, as I suspected while having those initial, inchoate thoughts, there was nothing particularly insightful to my reaction.  What level of inspiration I thought I may have had to do something with those thoughts dissipated; they didn't really offer anything that wasn't in what the writer had included.

That would have been the end. This post would not exist.

However, then I made the mistake of starting to peruse the comments that had been left after the article.  I dare not say "in response to" the article, as that would not (in my humble opinion) accurately reflect their overall nature.

The beauty of the internet is that it allows anyone to offer his/her thoughts.  The disquieting aspect of the internet is that it allows anyone to offer his/her thoughts.  (Case in point: what you're reading right now.)

The irony of an article about a lack of civility spurring comments that were, I'd have to say, not terribly civil was noted in only one comment I saw, and that was a rare exception. The majority fell more into the category of complaining about whatever issue was stuck in the commenter's (proverbial) craw.

I couldn't make it through the hundreds (as of the time I post this days later it's in to the thousands) that were left; I can't imagine anyone could unless he/she has a masochistic streak and gets off on reading others bicker back and forth.

However, of the dozens I did make it through, those that struck me as thoughtful, without being merely rancorous (whether coming up with someone or something to blame for this lack of civility, or going off on a tangent and lamenting how another news story wasn't getting better coverage), were few.  There were some, but they seemed a distinct minority.  There were others that noted the newspaper itself was not entirely without culpability in promoting the very scenario in question.  But regardless of one's thoughts on the topic, the only definite conclusion I could draw was that there was no consensus about what could restore an aura of civility.  Nor, frankly, did I get the impression that most who were moved to leave a comment would have really desired that; they agreed with the premise that civility would be good (I presume), but by and large I doubted that a lot of what was posted would have made it through a filter of civility.

The incivility in this context seemed to revolve around the way that most commenters appeared to have utterly ignored most of the other comments.  They made their sweeping generalization and got out.  It was not a discussion where others' points of view were considered; it was yelling "Here's who I blame!" and stopping the conversation at that point.

(The secret to reading the comments on a website may be in one's mind to preface all of them with the phrase "In this wing nut's whacked-out opinion…" and then be amused by the result, or delighted when it proves unnecessary.)


~

"This Golden Age of communication means everybody talks at the same time / And liberty just means the freedom to exploit any weakness that you can find."
- New Model Army, "225" (1989--yes, twenty years ago)

~

Obviously, the comments left after the article seemed to reveal more about the individual pet peeves of those commenting than any sort of pulse of the American public in general.

Perhaps that's what is the problem—not merely with ostensible civility but with people in general:  Rather than accepting that people acting out inappropriately in a public forum (i.e., on TV or some other medium that can be captured, such as our beloved internet) should be seen as representative of those individuals, they draw larger conclusions about society at large from these incidents (even if that is specious at best).  It will be interpreted as accurate if the reader is so inclined, but that doesn't intrinsically make it empirically true.  People want there to be patterns, and thus patterns are perceived. 

(Yes, the irony of making a generalization about making generalizations is intentional. At least I delude myself with the notion that it is.)

~

At the end of the day, I'd argue that this reveals the true American pastime (heck, perhaps the human pastime): being pissed off about something.  That is how our country started, when you get down to it; a bunch of colonists got upset about their treatment.  Without that, the British flag might still be flying over our capital.

Sure, back two centuries ago they took up muskets rather than keyboards, but you get the idea.  The tactics have changed over time, but the ultimate mindset remains kind of the same.

Seen in this light, society isn't really going downhill; it's pretty much the same as it always has been.

Of course, that sort of perception may serve to make one less inclined to be pissed off about the ostensible downfall of civility, which would ruin the great American joy of being upset about what one has perceived as something to be upset about.  So this may not be for everyone.

~

And wrapping up...

Two weeks before the Joe Wilson/Serena/Kanye trifecta spurred this civility debate, there was an incident at one of the town hall meetings about (we'll call it) the health care issue that were prevalent in the news (whether they were really representative of a majority of the public or not), where a fist-fight occurred and a man had the tip of his finger bitten off.

Bitten off.

I'm not sure whether the reason that didn't spur a cover story on USA Today about a dearth of civility was that: a) a digit removed dentally is pretty darned newsworthy regardless of the context, or b) neither the perpetrator nor the victim was in the public eye (at least not in the way a politician or athlete or music star would be), or c) it wasn't done in the same week as two other noteworthy instances.

Perhaps all of the above.

Maybe when it crosses over to the hideous and monstrous, calling it merely uncivil seems a bit inapplicable.  A breach of etiquette is one thing, but something that elicits an I-can't-f*cking-believe-it response is probably construed as quite another.

The only unarguable conclusion to be drawn from a cover story on civility:  A somewhat slow news day.

If a paper can spend front-page space on that, society has not become that uncivilized.

~

Thanks for pretending to pay attention. Especially this long. I'm going to presume you know the link to the comments section is below. Remember, without those comments, you wouldn't have had to sit through this.

~

P.S. Regarding the opposite of "civility," I'm not sure why the adjective is "uncivil" but the noun is "incivility." Gotta love English.

~

And in case you missed Lewis Black's brilliant "Back In Black" rant from Wednesday's The Daily Show which touches on this whole thing far better than I have above, go here and watch it.

Sunday, August 23, 2009

The Per-since-tence of Memory: The "50 Concerts" meme

The Facebook note in which I was tagged included this opening text:
"OK, here are the rules. Test your memory and your love of live music by listing 50 artists or bands (or as many as you can remember) you've seen in concert. List the first 50 acts that come into your head. An act you saw at a festival and opening acts count, but only if you can't think of 50 other artists. Oh, and list the first concert you ever saw (you can remember that, can’t you)?"

To which I must reply: I'm amazed I can think of any concerts I went to (many of which were around 20 years ago), off the top of my head, but recalling which one was first would require consulting records, which the rules forbid, so pardon the hell out of me for being old.

While I followed that "first ones that come to mind" rule, I twisted the rules to list some concerts with multiple artists; the title was "50 Concerts," not "50 Bands You Saw In Concert." Frankly, the people who start these memes really need to put a bit more effort into naming them, in my humble opinion.

Anyway, that said, here's what I came up with in a little over 10 minutes of concentrated effort one night this past week while riding the train home, in absolutely no particular order:

Anthrax
Afghan Whigs
Nirvana
Screaming Trees
Soul Asylum
Red Hot Chili Peppers
The Replacements
Paul Westerberg
Old 97's
Rhett Miller
Alice In Chains
Lilith Fair
(really; yeah, I went with a woman; no, I wasn't the only male)
Gin Blossoms
Luka Bloom
Deacon Blue
Edie Brickell & New Bohemians
Beat Farmers
Mojo Nixon & Skid Roper
Pixies
The White Stripes
Tori Amos
New Model Army
The Waterboys
Love Battery
Robyn Hitchcock (solo and with the Egyptians)
Jimmy Buffett
Buddy Guy
Long Beach Blues Festival
Ramones
R.E.M.
Neil Diamond
Neil Young
Mudhoney
The Monkees
Weird Al Yankovic
The Kinks
The Who
The Jazz Butcher
Love & Rockets
The Godfathers
Phish
Feist / Spoon
Dramarama
The Smithereens
The Dickies
Dinosaur Jr.
The Damned
Al Green
Sonic Youth
U2 / Public Enemy / Sugarcubes
X
Oingo Boingo

p.s. Yes, that's 52.

~

Now that I've delineated a list of artists I saw play live for a Facebook meme about that, I think a distinction needs to be drawn between going to a concert and happening upon a band. Going to a concert implies intent; you were going to the venue with the express purpose of experiencing the artist's live show. That's different from going to a bar or coffee house or restaurant or record store and there happening to be a live band playing; the point of your venture was to drink or eat or hang out, and the music was coincidental. If your story is "Yeah, we went to this bar and so-and-so was playing, and it was a great show," then you didn't see a concert. If you went to the bar because you wanted to hear music performed but had no idea who would be playing, you didn't see a concert. At least not for what could be included in this list.

Those are the restrictions I placed on myself, whether they were necessarily required or not. The only "rules" noted implied one shouldn't take too long composing the list. I'm not sure whether that was supposed to make it a test of one's memory, or whether there was some kind of psychological conclusion one could draw from which concerts one remembered, or if the person who came up with the idea knew that establishing a time limit might make it more likely people would participate; it wouldn't seem too daunting a task. People will squander tremendous amounts of time online, but asking them to devote more than 15 minutes of concerted effort toward such a task conceivably could elicit an I-don't-have-time-for-that reaction.

The point of the memes (to the extent I can discern one) is interactive participation, presumably trying to draw in those who don't otherwise bother by expressly soliciting their input (that is, by "tagging" them, so they get a notice). As to what one is to gain from looking at a mere list of artists' concerts a friend attended, without indications of where the concerts were, or how many times the friend may have seen the given artist, or which shows were good or not-so-good, or any stories about the shows, I'm not sure. Maybe the list is intended as a first step. I suppose it could be more a matter of supplying the raw data and allowing others to react to it as they will. In comments left or other notes those details could come out.

Me, I learned my lesson about trying to get people to participate in a way that would be seen by a bunch of others—others who come from disparate parts of my social life.

~

Years ago I used to send out occasional emails to pretty much everyone in my address book (obviously before the days of even Friendster). I'd only do it a limited number of times a year (and, for the inexplicably curious, or those wishing to go down Memory Lane, those messages are archived here; check out everything listed from 2000, 2001, 2002, or 2003). Generally, if I sent a message to 100 people I'd get well over half of them to reply back with at least a hello. (That's way better than any blahg post or Facebook status update has ever gotten.)

Then one time I sent out a message where I asked people to just hit Reply and type the first thing that came to mind for one minute. No stopping to think, just typing (even if gibberish) and then hit Send.

However, I threw in: "If you're feeling daring, hit Reply To All." (Yes, I had all the recipients in the To line.) And some people did just that.

Suffice it to say, the first thing that came to mind was… not exactly for public consumption, especially when some of my older relatives were in the mix.

That was the last time I sent out a message where I didn't BCC all the recipients.

~

Facebook sometimes seems like having all your friends and family in the To line.

I suppose people tend to grasp the more public nature of what they may post on another's wall with that forum, rather than blithely clicking one button on an email—especially when essentially dared to do so—but nonetheless I don't solicit that sort of thing in the time I spend logged in to the site.

And I sure as hell wouldn't tag anybody. As previously noted, that is, quite literally, asking for it.

Monday, March 30, 2009

Moving around my mind

My fiancée recently moved in to my place. I live in a two-bedroom condo, so in theory there's enough room for both of us. However, in the days leading up to that, as I tried to prepare the domicile for cohabitation, I found it to be worse than getting ready to move to a new place would have been.

With an ordinary move, one gets a bunch of boxes and one starts packing. Everything is already in a modicum of order; kitchen items are in the kitchen, bedroom items are in the bedroom, etc.; it's a lot of physical effort, but by and large the primary mental effort comes from deciding what items one will need between the commencement of packing and the actual moving of the boxes, and not packing those until the end.

With the scenario of having a home already filled with items, I found it difficult, but not so much because of the obvious dilemma: figuring out how we'll fit everything. It's easy enough to throw out or donate a significant amount of my stuff to make room.

The problem I found myself having was deciding what to do first.

It doesn't sound tricky, I know, but I found myself walking into the spare room and seeing one thing to do and starting on that, but then that made me think of another task over in the hall closet that seemed better to tackle before the first one, so I would ditch that and walk over to begin the next task, but then another idea sprang to mind that struck me as an even better way to go, and so on. And every time I'd move something I'd realize how overdue much of the "bachelor pad" was for a thorough cleaning, and it seemed that this transition (in addition to everything else) would be the ideal time to do that. However, I was attempting most of this in the evenings after I got home from work and ate dinner, so most nights it was a bit late to get too deep into that.

The thing about this: It seems like indecision, but really it was a lack of focus. It's not that I didn't know what I needed to do; I knew too well what I needed to do, and all of it came to mind, but not in any productive order. It was a sort of ADD for moving.

Which brings me to my actual topic.

In ruminating on this dilemma I started to think about how calling "attention deficiency" a "disorder" suggests that the brain's natural inclination is to remain focused on one thing at a time. However, that doesn't seem a realistic view of the mind's baseline for operation.

Heck, from a survival standpoint, it seems like being able to quickly shift from one item of attention to another is beneficial. Our ancestors who were too focused on advancing their attentive skills were possibly the ones who didn't notice the lion sneaking up on them.

Certainly the ability to concentrate is worthwhile and beneficial in our industrialized world, but that doesn't mean that paying attention is necessarily something we just do; conceivably, paying attention is a developed skill.

Thus it's not a disorder when one lacks that ability; it's a developmental deficiency. Which is a problem, certainly, but it's not a disorder. There are obviously actual disorders which cause what is called ADD, but it's not right to call the mere lack of attentive ability a "disorder."

However, as ADD is already taken, we cannot re-appropriate that to now mean "Attention Development Deficiency," so I propose we rearrange the terms as "Deficient Attention Development," or DAD.

A disorder is an ailment that's difficult to overcome; a deficiency is merely something there isn't enough of, so it carries the superior implication that it can be resolved by simply getting more. Not that the "more" that needs to be gotten is easily acquired, but the important thing there is the hint of empowerment that it can be acquired at all; a "disorder" sounds like something largely out of one's control.

Now let's imagine someone saying the following: "His problem is DAD."

See? Rolls off the tongue. And pronounced as a single-syllable word, to the untrained ear it seems to lay the blame for his problem squarely on his parent, which may or may not be true but is almost certain to be accepted by the listener as plausible.

Assuming the listener had developed the ability to pay attention long enough to hear all the way to the end of the sentence.

At this point I presume everyone is too afflicted with DAD to remember how I started this and expect me to resolve the ostensible dilemma about getting the condo ready to receive my fiancée's belongings.

But if you must know: All her stuff has been moved in, and maybe a third of it is unpacked. If you can't figure out why more of it isn't done, please re-read the paragraphs above.

Unless you have something better to do.

~

They're called "Dougressions" for a reason, people. If you were expecting them to stay on topic, you really weren't interpreting the pun that well.

We'll try to come up with an acronym for that at a later time.

Wednesday, February 11, 2009

Crossing the line

A few months ago I ruminated in my usual tongue-in-cheek way on how, during the opening sequence of the Colbert Report election night special, the word conventionally spelled "judgment" appeared in the alternative spelling "judgement." Afterward I left a comment on the Indecision 2008 site with a link back to my post, and I got quite a number of hits from that. A few visitors from there even left comments. One of which was a very not-tongue-in-cheek response to what I'd written.

However, I am not here now to write about an overly nitpicky criticism; I most certainly can take it, or else I wouldn't put this out into the world. No, it's something that the person wrote in an effort to lend credibility to the criticism. The person identified him/herself (I don't know the commenter's gender) as a "grammar Nazi."

And I imagine when you just read that it scarcely elicited a reaction from you. It's a term I suspect many of us have heard enough times that it's almost commonplace. In our modern society, to be concerned with traditional grammatical rules (and, by implication, spelling, syntax, and other elements of language addressed by the MLA or the Chicago Manual of Style) draws a certain level of disdain from those who (shall we say) are not as concerned with it, and what was undoubtedly intended pejoratively at first has since been adopted by those who were its target: grammar Nazi.

Obviously it can be empowering for a subjugated group to take an insult and turn it into a badge of honor (of sorts); it strips the word of its power. For example, gays employing the term "queer" to refer to themselves is different than a homophobe saying it about them, and works to take it away from the intolerant.

And while I am of the attitude that, Hey, people can do whatever they want, I do think it's interesting how half a century after actual Nazis perpetrated the Holocaust the term "Nazi" has been co-opted to suggest merely that one is strongly devoted to a particular area—all of which has nothing to do with actual genocide.

While I concede the Nazis do not deserve to have their name "respected" (and used only in the original context), it started me wondering whether, decades from now, other terms that are presently associated with hideous evil would be similarly appropriated for innocuous insults and then co-opted by the insulted as that semi-ironic badge of honor.

I doubt anyone was nonchalantly throwing around the term "Nazi" in the late '40s; it may well have been employed as a charged slur, but it still held the provocative aspect of the original meaning.

(Of course, that "grammar Nazi" would not have been used in the late '40s would be as much a function of the stance on grammar at the time as of the exposed nerves about World War II. But I digress.)

To this day, I doubt anyone who actually lived through one of the concentration camps would use "Nazi" in any context other than to describe the monsters who tortured and killed their people; it's only a generation or two later that those who remember the Nazis as the cartoonish villains of Raiders of the Lost Ark could place the term in a context where it's divorced completely from the atrocity but still linked with strict adherence to an agenda. But such is the nature of the passage of time; what used to be awful gets supplanted by new awful and is nostalgically redefined as almost quaint.

So, as I said, I began to ponder whether, say, the middle of the 21st century would find users of message boards employing a term like "grammar terrorist" to put down someone who points out when a previous commenter has included an apostrophe in the possessive of the pronoun "it." By then would "terrorist" be stripped of its current abject evil association?

And my next glib thought was: Of course not. By the middle of this century grammar will be relegated to the same level of importance as studying Latin has now—a quaint amusement for the erudite. So few will remember what the grammatical rules were that there'll be virtually no one left to pompously correct the overwhelming majority. And those few will be the first ones rounded up by the literal grammar terrorists, who will actually be terrorists against grammar, during the coming revolution.

And the pseudo-punchline was: So enjoy these grammar Nazis now, while you still can; your grandchildren will only remember them as a footnote in egregiously poorly written history books.

[Insert rimshot here.]

~

Subsequent to when I started composing that silliness, I did a modicum of research (which is to say I Googled something), which for me is a pretty fair amount. And that revealed one key thing: That hypothetical future is now.

The term "grammar terrorist" has already been used. There's a blog with that as its name already in existence. And there's another blogger also identifying himself as "The Grammar Terrorist" (so there seems to be a slight competition for the title).

In any case, my sardonic rumination about how, fifty years from now, "terrorist" might be applied as blithely as "Nazi" is today has proven to attribute five decades too many to the process. The former is not as common as the latter, but there are those for whom the risk of association with the Bush administration's favorite vilification already holds no concern.

(Of course, neither "grammar terrorist" site has been updated in over a month, so perhaps it proved to carry more of a backlash than either writer realized when choosing such a moniker.)

So, apparently, for those on message boards who seek to find a term that could be so shocking that it would be legitimately insulting, it appears that they may have to invoke really vile terms; for example: "grammar child molester."

That, I must admit, strikes me as a label that even one with the most ironic sense of humor would pause before willingly adopting. "Nazi" and "terrorist" may be innocuous by now, but "molestation" should still carry at least some stigma. Granted, it's crossing a line, but presumably "Nazi" and "terrorist" were on the other side of that line at some point, too. "Molester" may actually take the 50 years I speculated about with "terrorist" to lose its hideous association (and only if at some point soon child molestation is completely eliminated).

In any case, it does appear that particular term is still available as a website name*.

Why I'm striving to help people on message boards and website comment sections find novel ways of being offensive to each other is a question I cannot even begin to answer. It's probably insulting of me to presume they need the assistance; I should already hold total confidence in their natural depravity when it comes to being able to put down those who put down their grammar.

My sincere apologies to them. And to anyone who read this.

(I can only hope there will be some quality responses left in the comments.)

* Which may be one of the last elements proving our society has not collapsed completely.

Tuesday, January 20, 2009

Can music save your inaugural soul?

During the Inaugural Concert held Sunday at the Lincoln Memorial, Jack Black introduced one of the performers, prefacing what was to come as representing what one might hear on the radio while driving across the country. Then Garth Brooks came out and started singing "A long, long time ago / I can still remember how the music used to make me smile..."--the intro to Don McLean's opus, "American Pie."



However, he then skips over the rest of the first verse and jumps to the chorus. So, I suppose if your radio was on a station where the record was scratched you might hear that exactly as performed, but hey, the song is over 8 minutes long in its entirety, so a bit of editing was necessary. He then proceeds through the second verse (including the couplet "Do you have faith in God above / If the Bible tells you so?") and one more extended chorus, getting even Obama himself singing along.

In 1972 (when the song hit #1 on the charts) there were probably more people who recalled the proverbial "day the music died" (when in 1959 Buddy Holly perished in a plane crash along with Ritchie Valens and the Big Bopper). Now it's probably better known as that song Madonna butchered and the one used in the Chevy commercials. It's still one of the best rock and roll songs ever written.

And because McLean refused to state exactly what it was about, it's been left open to interpretation--of which there are many. However, the consensus is that it's a tribute to Holly (as noted in The Annotated American Pie) with references to other big name artists who came before and after, including Bob Dylan and the Beatles. Some choose to interpret it as having political overtones ("when the jester sang for the king and queen" perhaps referring to President Kennedy), but others find the song to focus too much on music for such meaning to have been intended. But such is the beauty of art: It is whatever the listener wants it to be.

One thing that is certain: The listeners want it to be a rousing party sing-along, despite the fact that they're singing the couplet:
"And good ol' boys were drinking whiskey and rye / Singing 'This will be the day that I die'."

While those seem overt references to possible alcoholism and death, that's only an obvious interpretation if one is, you know, paying attention to the words. A popular interpretation is that the closing line alludes to singing Buddy Holly's hit, "That'll Be The Day," whose chorus finishes with "That'll be the day that I die." So it could very well not be about actually dying but about reminiscing about the early days of rock while enjoying a cocktail. But somehow I doubt all the thousands at the concert have done the research, so I'd guess some of them were mouthing the words without dwelling too much on their meaning.

Because they're under no obligation to interpret those lines (or any of the rest of the song) as somber (even though even without knowing much about music history the lyrics do tend to be less-than-upbeat). It's got "American" right there in the chorus--so it's patriotic! And "pie"! Who doesn't like pie? And what kind of pie could be better than American pie? And then we get a nice reference to a popular American automobile with an internal rhyme with "levee" (which, other than with Led Zeppelin, is a word that doesn't get a lot of use in rock songs). Then there's the easy to remember "dry" and "rye" and "die"-ending lines.

The song has six verses. Six. And they're not short verses either. The song is long and the only part that's easy to remember is that chorus. Everyone somehow recognizes how brilliant the song is, even without being able to quote much more than the lines of the chorus, so it has transcended its rather maudlin-seeming origins and become a good song for a celebratory event (such as what this concert was supposed to be, presumably).

To dwell on the allusions to booze and shuffling off the mortal coil with a more literal interpretation would almost certainly ruin the only portion of the song that's really accessible for a broad audience.

And as Americans, we are free to completely overlook that and interpret it as being as jubilant as the Isley Brothers' "Shout!" (into which Brooks segued after "American Pie").

~

But I have to admit: If I'm driving across country and after "Shout!" on comes Garth Brooks' "We Shall Be Free" (as happened during the medley he performed during the concert), I'm changing the station. Preferably to something playing some Stevie Wonder.

Sunday, January 11, 2009

Why can't I find a video like that?

From the things-I-really-should-stop-trying-to-write-about department:

Starting just before New Year's Day (or, rather, on Old Year's Day), VH1 Classic started showing what it considers to be "classic" music videos in alphabetical order by song title. Apparently they intended to show 2,009 of them, as the series was called "2009 for 2009."

Although showing that number of videos took a week, I admit I only flipped by from time to time, not really watching full videos, but pausing for at least part of the song. And from that pattern of viewing I discerned that "classic" appeared to comprise the proto-videos of the early '80s through the full-fledged productions of at least the mid-to-late '90s (and possibly later).

Being arranged by title rather than by genre or by year, it made for some interesting transitions. One I saw that caught my attention was Pearl Jam's "Jeremy" (you'll need to follow this link to see it; it appears they don't want this video embedded)...

...followed by Rick Springfield's biggest hit, "Jessie's Girl":


The songs came out roughly a decade apart—"Jeremy" in 1991, from Pearl Jam's debut album, Ten (although the video didn't come out until 1992); "Jessie's Girl" from Rick's Working Class Dog in 1981—but they seemed to be from different centuries. Obviously, the art form developed quite a bit between those years, and the technology and production values advanced significantly during that time period, but that wasn't quite it.

I was in my early teens in 1981, and I remember hearing Rick Springfield on the radio a lot, but didn't see the video for a few years (when visiting someone who had MTV). By 1991 I was into my mid-20s, and by that time I had cable and saw the video when it was new and still far from "classic." And on top of that, I purchased Pearl Jam's CD. But the difference didn't seem to be merely that "Jessie" was associated with my teen years and "Jeremy" with adulthood, or what I remembered from those ages.

I didn't think it was that Rick Springfield could be more easily dismissed due to making it big as a soap star and having his musical popularity peak and wane before Pearl Jam's even started (while Eddie Vedder and company seemed to take the rock & roll integrity path). It wasn't that the subjects differed so greatly (one song being about lusting after a friend's girlfriend and the other about a school shooting).

For a moment it seemed to be that early '80s production was inferior, but even that didn't quite pan out; later during the weekend, when the channel had gotten to the M's, I saw The Knack's magnum opus, "My Sharona."

That song actually came out in 1979, and I'm old enough to remember it from pop radio at that time as well. While it did get a resurgence in 1995 with its inclusion in Reality Bites, and the video shown was the version featuring footage from that movie (and yes, the footage of the band from its original days does look dated), there was something about the song itself that stood up despite now being 30 years old. Presumably it had roughly the same sort of production that Springfield got for his song, but it didn't seem anywhere near as old as "Jessie" (even though it was a few years older).

After ruminating on it for a while, another distinction hit me. "Jeremy" was now about 16 years old, but it didn't seem that dated; when "Jessie" had been around for 16 years (1997 or so), it already seemed old. And the reason was simple: It just wasn't as much of a quality song.

Oh sure, "Jessie's Girl" was reasonably catchy. I'm not suggesting it was a bad song. I'm not saying I don't have it in my music library (I do). It just wasn't as good from a musical quality standpoint. "Jeremy" stood up better because it was a better song, regardless of how it was produced. It's not that "Jessie" was more of a trite pop song; it was just an average pop song. And not that "Jeremy" was Pearl Jam's best, or the best song of its year (although it had that excellent intro and outro), but it had... that intangible element of "higher" art which holds up over time.

Now, let's be clear: It's not that I don't like "Jessie's Girl"--because I do--but, well, it is what it is (or was what it was). And to be honest, I'd probably rather listen to "Jessie" than "Jeremy" if I was looking for something to hear. (But of the three songs mentioned above, I'd pick the somewhat lascivious "My Sharona" and its killer guitar hook and thumping beat.) "Jeremy" almost certainly would get better treatment from music critics, but frankly, it is something I need to be in the proper mood for.

And that is probably the best way I can describe the distinction between the "higher" art that seems timeless and the "pop" art that gets indelibly linked with the time in which it was created.

But that's merely the conclusion I drew about it so I could sleep. Please feel free to draw your own. You need your sleep, too.

~

One last thought, to be fair: Although Pearl Jam may have written a better song, I doubt they would have been even half as good as Rick on General Hospital (had they tried their hands as midday thespians).

It all balances out in the end.

Thursday, January 08, 2009

Getting over it

There's a saying that recommends you do something each day that scares you, to spur personal growth. However, if we're talking things that truly inspire fear, conceivably there's only a finite number of somethings that would qualify.

Now, admittedly, it's likely there'd be plenty of things, so it would take a while--years, possibly decades--but still, it stands to reason that if you kept on the suggested pace of one frightening task per day, eventually you would run out of things to be afraid of, and you would have to abandon the routine.

And missing doing the something each day--that probably would scare the crap out of you.

Maybe you want to pace yourself, and just do the scary things every other day. And as the list of fears grows shorter, perhaps spread it out to once a week. Thus, either you'd never run out of things, or when you did get to the end you wouldn't quite be going cold turkey.

~

Yes, that saying undoubtedly is best not deconstructed (or whatever the heck that was I did above). However, I like to think that doing so contributes to my personal growth. (No, I don't expect it to work for anyone else.)

Wednesday, December 31, 2008

In with the Old

As I've thought virtually every year-end in recent memory, I am intrigued by the notion of New Year's Day being a major holiday. I grasp that it has traditions, and it carries the association of optimism or rebirth, but January 1 is ultimately arbitrary. It's not like the Chinese new year, which is tied to the lunar calendar; in Western culture there's no astronomical event that coincides with the day on which the year starts. And of course, it's due to Pope Gregory XIII that it's not in March any more.

Now, to be clear: By no means do I mind having the day off. All I'm saying is that when push comes to shove (as inevitably it will on New Year's Eve), what is being celebrated is that the digits on the calendar are changing. Woo-hoo! It has a 9 at the end rather than an 8! Let's have a parade! Give everyone the day off, because they haven't had one since last week!

Every other holiday is at least about something. It may not be much of something, but there's something—religious observances, historical events or figures, rodents who ostensibly predict the weather. New Year's Day is about only that—it's the first day of the year, which by relative standards is still considered by most to be "new" at that point. So it's certainly an accurately named occasion. But why is it a major holiday? Say all you will about the hope of renewal and it being a good excuse to resolve to better one's self, but let's face facts: it's because we get the day off.

Not only do we get that day off from work, but for many, we get off work early on the day before. It has become expected that office employees need extra time to prepare for getting blitzed out of their minds; leaving work at the usual time would be insufficient.

But while we're on the topic of New Year's Eve, along with Christmas Eve a week before, it's one of the two days a year when we don't have the day off but get off early. And as anyone who goes into work on December 31 can attest, we're probably not getting a lot accomplished before we get to head out from the office, so it's a rather pointless reason to even go in at all.

Therefore, we need to get another day off.

Analyzing this, the obvious conclusion is that we only get full days off if "eve" is not in the name. Thus, if we get the name "New Year's Eve" changed, perhaps we can get it declared a full-fledged holiday on its own, and hence a full day off work.

I suggest "Old Year's Day"—and I'll concede up front that it lacks panache, but what are we celebrating? It's the last day of the "old" year (by the same relative standard that makes the next day "new"). It's not having a catchy name that results in a holiday (as we've already determined, it's not like "New Year's Day" was all that inspired); it's the lack of "eve"—and that much is accomplished in the name.

So on this last day of 2008, allow me to wish all my readers (likely for the first time) a Happy Old Year's Day.

~

It has to start somewhere. This could work if we spread the word, people. If you know anyone with some pull in the matter (even the Pope, if you have him in your contacts—historically, that position has proven to have some influence over holidays pertaining to the calendar), please forward a link. Everyone will be grateful when we all have Old Year's Day off.

Of course, then everyone will start partying on Old Year's Eve, and employees will start expecting to get off early then as well, and then we'll need to concoct a non-eve name for it, but we'll cross that bridge when the ball drops on it.

Wednesday, December 10, 2008

Best that you can do

This morning I was doing dishes, with no TV or radio on in the background. I was left to my own brain for entertainment, and what popped into my consciousness?

"Arthur's Theme"

Yes, Christopher Cross' soft rock song from the movie Arthur. Well, not the whole song. Just the chorus: "If you get caught between the moon and New York City..." and so on.

Here's the thing: It's not as though I heard it recently and it got stuck in my head; I hadn't heard that song in longer than I could remember (years, possibly decades). I never owned the song, either on vinyl, cassette, 8-track, CD, mp3, or piano roll. I don't even recall seeing the movie Arthur in its entirety.

So, at some point in my youth I heard that song on the radio. That much I'm certain was the case; it was pretty popular and got airplay in its time. And it's not a bad song, by any means, but at no point would I consider myself to have been a fan of it.

And something else worth noting: I have over 26,000 songs in my music library, almost all of which I've heard more recently than when last I heard "Arthur's Theme" and most of which would rank higher in what I like than that song.

But when the moment arrived that any of those songs could have been referenced by my gray matter, instead came... a chorus ending with the line "Best that you can do is fall in love." And it kept repeating over and over, because my mind never paid enough attention to the rest of the song to know any of the verses.

What this says about the state of my sanity is best not discussed further. However, it does seem to indicate that the advent and ubiquity of the portable mp3 player, while allowing me almost constant access to the songs I like, has less influence over my idle brain than did pop radio from decades past.

I'm not sure whether that is due to a profound difference in the format in which the music is presented or due to the greater influence of experiences from youth over experiences of the years after youth.

But assuming it's the latter, this means that a young person growing up today (in the era of the iPod) who decades from now is unfortunate enough to have a moment of quiet for his brain to fill will be more likely to get a song that he used to have on his iPod than a song he recalls from the radio.

And if he happened to have "Arthur's Theme" on that iPod from his youth, then that will undoubtedly cause a rip in the time-space continuum that will destroy the known universe. So, with apologies to Mr. Cross, we need to eliminate all mp3 copies of that song and keep children from being exposed to it, just to be on the safe side.

It goes without saying that I would have been well-served by a lobotomy, but the time for that to intercept my moment of getting lost between the moon and New York City has passed. But I will pledge to humanity and any other beings in the universe that I will always keep some source of background noise on at all times, so there'll be no future opportunities for obscure pop songs to invade my idleness and start this wormhole of potential devastation. I know it's crazy, but it's true.

Or everybody could chip in for the lobotomy.

Thursday, November 13, 2008

Broken

Last weekend on the new CNN show D.L. Hughley Breaks the News, host D.L. Hughley held this segment with Dr. Drew Pinsky:


D.L. alluded to being thrown for a loop by the election results. He was pleased, certainly, but his world view was based on a paradigm that no longer applies. In a strange way he missed the sadness of the cynical take on the way things were that had been the foundation for his attitudes. It seemed even a fucked-up status quo was simpler to deal with than a new not-as-fucked-up world.

For me, it's also a weird and unprecedented scenario. Nothing as profound as what D.L. experienced, but something that threw me for a bit of a loop.

My streak has ended.

The streak was that I'd never previously voted for a presidential candidate who actually won. (Four years ago I offered this post where I first mentioned the streak.) Sure, some of those years were what some would call throwing away my vote, but I'd argue that in the last four presidential elections I would have thrown away my vote regardless. There had never really been a nominee who elicited in me a sense of being someone I wanted to vote for; there was merely one candidate who was worse and needed to be voted against.

However, that's not representation. That's the proverbial lesser-of-two-evils. And while many would argue that's what the system tends to be, it doesn't make it less of a waste to vote merely to keep out the one more feared.

I will note that, being a Californian, it was always a virtual given that the state's 55 electoral college votes would be given to the Democratic ticket, whether I voted for them or not. If I voted in the evening, shortly before the polls closed, it was not uncommon for the news to have already called the state. In that case I could vote for a third party, which was closer to doing something I believed in; I did fancy the notion of developing a viable third party, and it seemed only by them getting a noticeable percentage of the ballots would that ever be likely to happen.

And really, how could I pass up the chance to cast my vote for Ross Perot? I mean, that was a once-in-a-lifetime opportunity to vote for someone who was quite possibly insane.

Anyway, starting in 2000, I'd attempted to keep W. out in two consecutive elections (not because I was sold on the Democratic candidates but because... well, do I need to elaborate?) and had that backfire. It seemed like if I voted for a candidate he was doomed not to be inaugurated. I'm not saying I was a jinx, but there seemed a distinct pattern developing.

I was therefore reluctant to sincerely vote for Obama. I didn't want to screw up the first time I felt like I was really voting for someone. However, I could not pass up this chance to vote for a candidate who I felt came as close to representing me as I could ever recall.

I never believed he'd win, even though all the pundits said he was a lock.

I'm not suggesting that my lack of belief is what contradictorily propelled him to victory. I don't like to think I have that level of influence, and I'm certain I don't. Still, I had a particular take on my relationship with the political system that was long established, and even if it wasn't a source of happiness it was something to which I was accustomed. I'd watched Bush the first, then Clinton (twice), then Bush the second (twice), all with a sense of eh-there-it-goes-again.

And this time I had to adjust to I-can't-believe-it. However, on election night itself, I was rather emotionally distant, mostly because I'd been hearing for weeks that McCain could not win (Obama could lose, but McCain couldn't win). So after all that saturation, at the point of watching Jon Stewart confirm that outcome it was somewhat like watching the favored team win the Super Bowl and more than cover the point spread, with the end result never being in doubt. However, over the days since then, I've had more chance to ruminate on how amazing it all is.

No, I don't mean just that a non-white male got elected; I also mean that I could favor a candidate (non-ironically even) and not have it blow up.

It's a new world, and it's going to take some time to get used to.

Monday, November 10, 2008

Pop-ularity

From the sit-down-for-this-one department:

I was listening to some TV On The Radio tracks I downloaded. They were from the band's material on the Touch and Go label, and I could tell they were good. Previously I had heard their previous major label album (Return to Cookie Mountain), which featured a successful single ("Wolf Like Me") that didn't sound like the rest of the album. However, that tends to be their sound: not all the tracks sound the same. That isn't a bad thing from an artistic standpoint.

Their tracks aren't generally "catchy," so they take some time to grow on one (or at least on me); overall I was not inspired to listen to the Cookie Mountain disc more than once after we got it. It was not bad, by any means; it simply didn't strike that chord with me right off like some songs do--songs which are almost certainly of lesser artistic quality.

However, it is very easy for me to listen to the TV On The Radio tracks and detect that there is talent behind them. I hope they come up in random playlists for a long time.

[This post on Aquarium Drunkard has some live TOTR tracks that are quite good.]

What intrigues me is that in their fall music preview*, Entertainment Weekly did mention TOTR's latest album, Dear Science. I accept that here in the early 21st century such bands are not relegated to underground publications, and that such a situation hasn't been the case in nearly two decades. Still, EW has such a mainstream, populist association for me (not that what they cover is necessarily restricted to that, I know consciously, but that doesn't change the connotation in my mind) that a band I'd consider as non-mainstream as TOTR seems outside of what EW should spotlight. That's probably more reflective of the better marketing that such artists can get these days than anything.

So, in short, it's a different world than this probably fictitious one that is the basis for my associations. I fully admit that I tend to think that to be immensely popular is to be egregiously mediocre. (And that's not knocking egregious mediocrity; I happen to like a lot of songs which are that. I'm just identifying the relative artistic quality.) Thus, there's a certain level of difficulty for me to reconcile the non-mainstream and artistically worthwhile with being discussed in the same way as the mainstream and less artistic work that tends to be popular.

That's kind of stupid, I concede. I should be pleased that what is more worthwhile is getting its due, rather than the same ol' drivel being crammed down the throats of the masses. However, I came from a situation where there was that distinction, and it became something of a badge of honor. I can remember going to concerts in the late '80s and early '90s (my heyday for that) with my friends and looking around at the others in the crowd and thinking, Look at these poseurs. You can tell by the way they're dressed that they're not as into this band as we are. Clearly they're just here because the band happened to get a fluke hit on the radio. They probably can't even name a song off the band's last album.

The thing was: Those same people were probably looking at my friends and me and thinking the same thing about us.

It was very much a matter of being able to identify who was sincere in their appreciation of that which we took very seriously as opposed to who was merely jumping on a trend (and who undoubtedly would be jumping off as soon as that novelty wore off). When the "quality" artists were not household names, not being mentioned on TV, not something that the average person would hear about, knowing about those bands carried a tacit fraternity. Those people knew the secret handshake in a manner of speaking. Popularity was what the unworthy achieved, by allowing themselves to be sold to everybody.

That's not how it really was, of course. That was the sour-grapes attitude we unconsciously adopted to make up for not being in the mainstream. Everyone probably starts out wanting to be popular; smart people eventually figure out that being popular is as much a curse (if not more of one) as a blessing.

I imagine that's what was too much for Kurt Cobain to reconcile. (Note: I was merely a Nirvana fan, not a Nirvana fanatic, so please don't jump down my throat for not having read every book about him—or, to be honest, any book about him—in the wake of him shooting himself.) I get the impression that Kurt came from that same mindset, and that's what made it particularly difficult for him, given that he more or less single-handedly (well, with the help of Krist and Dave and Butch Vig) and quite unintentionally played the key role in changing what the mainstream media covered regarding the world of music. He should never have become that popular; that wasn't the way these things worked out, and that was what everyone had been comfortable with previously.

Yes, it is entirely convenient to attribute the shift to the public reaction to Nevermind (and in particular to "Smells Like Teen Spirit"). But I was 23 in 1991, and I remember quite well what was covered by non-music magazines (representing the mainstream here) before Nirvana's major label debut came out versus what is covered by them now, and I know that album is what everyone (whoever that is, yes) says made the difference. I agree, based not on hearing it over and over in rock documentaries but on my personal experience. It could be completely wrong, but I don't think there's any actual "right" or "wrong" in this scenario; it is whatever one perceives it to be.

I digress, of a sort.

Anyway, I alluded to how it was likely that everyone aspired to popularity initially, but that some eventually gave up, either because they accepted it wasn't going to happen or because they saw through it and realized it wasn't all it was cracked up to be. However, there was another way that could go: They could actively sabotage the likelihood of gaining mainstream popularity.

Behold the Replacements, the patron saints of underachievement. Leader Paul Westerberg wrote some phenomenally good songs, and a watered-down version of their sound was taken to the mainstream by the Goo Goo Dolls (who were admitted 'Mats devotees), so the possibility of them getting more popular with the non-college rock crowd clearly could have happened, were it not for one thing: they stuck a metaphorical middle finger toward that.

By the very late '80s their label was trying to get them exposure by having them open for Tom Petty and the Heartbreakers on a nationwide tour. It's a difficult position for any band to fill that opening slot and try to win over another artist's fans, but Paul and the guys would come out on stage drunk (and sometimes in drag) and actively insult the crowd between songs--when they actually finished their songs.

And apparently that doesn't tend to make people want to buy your albums or request your songs on the radio. Funny that.

Sure, they recorded some songs that seemed pretty clearly intended to be radio-friendly (not that their better songs were radio-unfriendly, by any means), perhaps seeming a good idea while in the studio or to appease the label who was paying for the recording, but in the end they went out of their way to shoot down mainstream success.

I have long held the opinion that those of us who are big fans kind of like that fact. It may be on a subconscious level, but we appreciate that when we allude to the Replacements that we never have to make the follow-up statement about liking them before they got big; they never got beyond the realm of being relatively well-known in the world of those who aren't that well-known. Those 8 albums of theirs remain perfect because if you own them, you probably still listen to them; you didn't buy them because they were "hot" at one time (because they never were). They weren't spotlighted in EW or such magazines as something to look forward to.

Thank goodness.

I sincerely believe that would have kind of ruined it.

I have to imagine that had the 'Mats gotten as big as, say, R.E.M. (with videos in heavy rotation on MTV rather than just on 120 Minutes, playing amphitheaters rather than just concert halls, being a band whose name virtually everyone at least recognized, etc.)—and Paul Westerberg is a better songwriter than Michael Stipe, so it wouldn't be out of the question—then instead of just putting out a couple raw early albums, four really good albums, and two more that were Paul sliding into a solo career before officially breaking up and cementing their rock legacy as a tremendously influential band who never sold out (not that their label wouldn't have loved that), they would have transformed into something that all of the diehard fans would have stopped liking.

We were all underachieving as well, and on an unconscious level we almost certainly felt represented by them.

That's not to suggest they were shooting their prospects for big-time success in the foot just to keep from losing their audience; they simply weren't any more comfortable with being that successful than their fans were.

Of course, that's just my take on it; I could be wrong. And I'm sure I'll be told that I am. (That's why there's a Thoughts on This link below.)

* Yes, the EW issue in question came out weeks ago. I put this post on hold until after the election stuff subsided.

Friday, October 17, 2008

It's not the end

I wasn't sure whether to write some thoughts on the Angels' recent playoff demise (winning more games than any other team this season and a favorite to go to the World Series, then a pathetic performance in the Division Series against the Red Sox) or to write about politics, and I realized it's kind of the same thing: events that get one's hopes up but almost cannot avoid disappointing when all is said and done.

You can say that's jaded and cynical, and I'd allow myself an extraordinarily juvenile moment to reply: No shit, Sherlock.

Cynicism is not admirable by any means, but it has never let me down like blind optimism. I'm just saying.

The only time the Angels failed to let me down in the post-season was in 2002 when they somehow harnessed the power of the "rally monkey" (a brilliant albeit inexplicable marketing ploy) to win the World Series. To this day, the only reason I think that happened was because I had absolutely no belief that they could win; they'd broken my sports heart 16 years earlier, when I was in the stadium to witness the moment when they were one strike away from going to the Series and blew it, and from that moment on I had no faith in them.

And in anticipation of your question: This season, with as well as they did before the playoffs, I did start to think they were almost a lock to go to the Series (where they'd lose to the Cubs—go ahead and laugh; it's funny). Am I to blame for their poor showing against Boston? Let's not be ridiculous. I don't have that level of influence over the universe (at least, dear goodness, I certainly hope not), but I have noticed that me getting my hopes up (or, more specifically in this case, taking as likely the outcome of an event) tends to result in the outcome being the opposite of what I was expecting.

I know how ludicrous that sounds. Believe me, I do. I admit this with no small level of trepidation. Again, I'm not saying it's me who's making these things happen; I'm saying I've noticed a pattern, and one where it's entirely possible—likely, even—it's mere coincidence. Nothing would make me happier than to see this pattern end, but obviously, I dare not get any sort of hope up about that happening.

Please don't try to convince me otherwise, okay? I assure you it won't work.

Which brings us to the upcoming election. (Why? Because I have no better segue.) As I have already intimated (or perhaps even out-and-out stated) I plan to vote for Obama. Listening to what I've heard from him over the past year he seems to represent me better than what McCain has become (and e-fucking-gad Sarah Palin is a...). I'm not going to contribute to Obama's campaign or put a sign up in my front yard, but I do plan to cast my ballot for him next month.

It boils down to this: I do not forget that he's still a politician.

He has to be.

I can support him without deifying him. And frankly, I worry about the people who have done so. It's not that I don't think he'll win (I try to have no expectation one way or another—which should be a comfort to Obama supporters, given what I mentioned about the Angels above); I just think if he becomes president, he'll simply prove to be… well, president.

In that scenario, I see it this way: When it's all said and done, the history books may state he was a very good president, but given these sorts of lofty expectations beforehand, there's little chance that his perceived return to earth won't seem a disappointment to those who placed him upon that pedestal.

"Change" is merely a modest variation on the political status quo, and that's as much as we can possibly handle. The opposite of the "same thing" is not change; it's an armed coup d'état. And no matter how bad our government is, I for one don't need that much of an overhaul.

In any case, one thing is certain: Barack Obama will not let me down.