
You’re Not as Insightful as You Think You Are

I can find a lot of problems with the world around me, and those who know me understand that I can easily go on some long-winded rants about them. (Your prayers for those closest to me are appreciated.) But it’s not a one-way street from my perspective; I’m just as interested in hearing other people point out similar tendencies in those around us, make humorous-but-truthful complaints about the modern world, or share some other related observation, serious or funny.

But since “it takes one to know one,” and because I spend way too much time in my own head mulling over these kinds of things, I also get really, really bugged by generic, poorly-thought-out, boring, and often flat-out incorrect observations that are brought up time and time again by people who think they’re being clever or insightful. They’re not. They’re the intellectual equivalent of tourists at the Tower of Pisa.

Fair warning: this is going to get a little mean. But I’ve been called out for being stupid and unoriginal plenty of times in my life and I know it’s for the best in the end . . . and I also know it’s much easier to take such criticism to heart when you can quietly evaluate yourself while doing something private like reading a blog post, instead of having to tuck your tail between your legs in front of people whose opinions of you matter to you. So if you get offended here because I’m nailing you perfectly, consider it a favor and join me on the endless journey of constant self-evaluation. Now, without further ado:

YOU ARE NOWHERE NEAR AS INSIGHTFUL OR CLEVER AS YOU THINK YOU ARE WHEN YOU . . .

. . . go grammar-nazi on people “misusing” the word “literally.” Because I’ve never understood how so many people are so sure about the only proper usage of “literally” but have never learned the word “hyperbole.” Perhaps its hyperbolic use became so prevalent not too long ago that many people, only hearing it in that context, assumed it meant “figuratively,” and were shocked to learn the opposite. When I was a kid, I recall thinking the word “barely” meant “not quite” instead of “only just.” Somehow I must have misunderstood its use in a sentence I heard and carried that with me for a while. Then I learned what it really meant through the course of a conversation and haven’t forgotten. It happens. But the thing is, if someone were to somehow use the word “barely” in a hyperbolic or facetious statement, I’d get their meaning. I wouldn’t write them off as an idiot because they didn’t strictly adhere to its definition.

This example is first because it very clearly demonstrates the Lake Wobegon Effect among two-cent intellectuals that grates on my brain like fingernails on a chalkboard. That’s actually a common theme with these: people who think they’ve gotten one step ahead of the crowd and want to show it off, when in fact they’re completely wrong. I’ve spoken before about how hard it is for me to be in a conversation with someone when I know that the information they’re sharing is incorrect; this is that, except magnified a few thousand times because so many people do it.

I’d have liked to take the time to further expand on why using “literally” as hyperbole is perfectly acceptable English, but I’d really only end up quoting this video anyway because they’ve done far more research than I have, and are a much more credible source.

___________________________

. . . talk about how stupid fat people are when they drink diet soda. Because that tells me that you just don’t know anything about diet soda drinkers. Sure, the word “diet” is in the name and that implies weight loss, but at the end of the day the people who stick to diet soda are people who prefer diet soda over regular sodas; their reasoning is not “I need to lose a few pounds,” but instead, “regular soda is just too sweet for me.” I’m serious. Ask around. I’ve known dozens of strict diet soda drinkers (most commonly Diet Coke®), and none of them ever popped open a can saying something akin to “gotta lose five pounds,” nor are they all or even mostly overweight. Is that to say that no one ever drank diet soda with the uninformed intention of dieting? Of course not. But I would suggest that reasoning is so uncommon that it does not warrant discussion unless you’re speaking to a person who has just admitted to thinking that way.

Okay, I admit that’s pretty anecdotal. But I hold that the person rolling their eyes at a heavy woman drinking a diet soda has less real information and far more conjecture to back up their view than I do mine.

While we’re on the topic, let me also call out all the liars who talk about seeing that “fat person,” usually a woman, in line in front of them at McDonald’s who orders half the menu (with everything Super-Sized), adds on a Diet Coke®, and justifies it by explaining, “I’m watching my weight.” That didn’t happen. It didn’t happen when you were standing in line; it didn’t happen when you were working there; it didn’t happen that one time someone else told you about it. Some stand-up comedian at some point made that up, and it got repeated so much that some people began to think they witnessed it. Sure, fat people at McDonald’s have ordered diet sodas with their meals, but they didn’t throw in the “I’m on a diet” line; refer to my previous paragraphs for their likely reasoning. (There’s also a tangent I’ll only mention in passing about how easily so many people assume that this is true because they believe fat people are stupid enough to think they can lose weight by drinking something labeled “diet” while eating a 1500-calorie meal.)

___________________________

. . . rant about how Christopher Columbus was a really bad dude and/or did not actually “discover” the Americas.

“Chris Columbus killed thousands of natives!”

“Chris Columbus was the first serial killer!”

“Chris Columbus didn’t prove the world was round; everyone already knew it wasn’t flat!”

“You can’t discover a place where people already live!”

“And so on!”

We all know already, okay? Most of us under the age of 35 or 40 learned this in high school or earlier; most of those older than that have definitely heard about it at some point in the last 20 years. “And yet we still ‘celebrate’ Columbus Day!” you protest. Except I would argue that if it’s so hard to get people to actually memorialize anything on Memorial Day or thank a veteran on Veterans Day, no one is celebrating Chris Columbus on Columbus Day. It’s an excuse to have a day off from work and find some great deals on carpets. If you want to lead the charge to eliminate it or change the reason for the holiday, be my guest, but I’m tired of the internet (especially my inbox and newsfeed) filling up every October with white-guilt-laden lectures about “what you don’t know about Columbus.”

Let’s also not forget how ridiculous it is to judge people who lived hundreds of years ago based on modern morals and attitudes. Calling Columbus a “serial killer,” especially within the context of the era he lived, is an egregious misuse of the term. In another 500 years, many (or likely most) of our mainstream philosophies may well be viewed as comical or tragically misguided at best, so let’s be rational when we read our history books.

But I get it. At some point, many people figured out how ridiculous it is that we’re “celebrating” an Italian man who sailed under the employ of Spain, landed in some islands that aren’t even part of our country, was directly responsible for the deaths of thousands and indirectly responsible for millions more, and whose voyages only distantly (and indirectly) led to the founding of our nation; thus he should be just a footnote in our history books and not an American hero. It’s just that we’d all be better off if we assumed everyone knows all of that (and more) instead of acting like we’re interrupting everyone’s regularly-scheduled programming to bring them some breaking news. (And you know what actually might be interesting? Learning why America turned Columbus into an American folk hero in the first place; far less cynical, wouldn’t you agree?)

___________________________

. . . say something intended for kids is terrible for kids.

Over the past couple of decades we have steadily increased how much we shelter and coddle our children, keeping them from anything that would challenge them, (mildly) disturb them, or cause them to mature a little earlier than we’d hoped. I’m sorry that this will end up sounding like another “kids these days!” rant, but it’s not really debated that phenomena like entitlement and helicopter parenting have been on the rise since the mid-1980s. Very few people consider these things to be good, yet the problems persist. Why? How can we all be speaking against something we see happening all around us while it shows no signs of slowing? I think it starts with paying lip service to the condemnation of things like helicopter parenting, and then actively condemning a “kids’ movie” for having intense scenes that might expect something of the kids (or, alternatively, watching a movie we saw as children and questioning our parents for letting us see it). You know what? The Neverending Story and The Dark Crystal were really scary when I was a kid, but seeing them did not turn me nor anyone I know into a disturbed sociopath or paranoid outcast. Yet I have truthfully heard many people question their upbringing because their parents had no problem with them watching Watership Down.

Your arrogance as an adult has caused you to underestimate what kids can handle, as well as made you into a hypocrite. That doesn’t sound insightful to me.

A great example is when someone tries to shock everyone with how terrible nursery rhymes are by explaining that “Ring Around the Rosie” is really about the Black Plague. Because, first of all, that’s completely wrong, and secondly: what does it matter anyway?

I feel that anyone who brings this up as fact is likely not bringing it up just to impress their friends with some interesting trivia. No, instead the conversation always involves, “Isn’t it terrible that we teach this to kids? Preschoolers sing this!” I guess I’m weird for not being concerned that kids may be singing a song at recess that has a deep, hidden meaning about a centuries-old tragedy that they’d never figure out, especially if they have never heard of the Black Death.

Maybe it’s because I grew up in an era of especially dark children’s entertainment, but when someone objects to themes in a piece of children’s programming being too intense (be they hidden or obvious), I feel that person should immediately be written off as having nothing valuable to say about anything. (Cough)

___________________________

. . . commit the logical fallacy of tu quoque. Because it’s one of the lesser-cited logical fallacies, but one of the most common (if not *the* most common) today, certainly on the internet, and it’s a complete non-statement that adds nothing to whatever it’s rebutting.

Let’s say that in the course of an argument in a comment section, Person A makes a case for why the Oxford comma is entirely unnecessary by saying, “u dont even needs the coma its juts bad grammer lol.”

In response, Person B, an Oxford comma apologist, retorts, “Why should anyone think you’re right about something like the Oxford comma when you don’t even know how to construct a proper English sentence?”

My tendency is to agree with Person B on his support of the Oxford comma, and I also agree with the sentiment that Person A has no ground to talk about grammar when they obviously don’t understand grammar in the first place.  However, Person B has made no case whatsoever as to why the Oxford comma IS necessary.  They have simply observed the hypocrisy of Person A.  Nothing was added to this debate or topic as a result.

Here’s a recent real-world example.  Not long before writing this, one of the stars of the popular reality show Duck Dynasty, Phil Robertson, was suspended by the network A&E for stating his beliefs about homosexuality in an interview.  Naturally the internet erupted in a firestorm over “freedom of speech” vs. “tolerance.” At least one person in my Facebook newsfeed responded to the outrage by posting an old image of the Dixie Chicks protesting the war in Iraq, with the following text: “The same people who censored and protested the Dixie Chicks right to free speech opposing the war 10 years ago . . . are the SAME people fighting for the Right of FREE SPEECH today. Ironic.” This is an example of tu quoque.

The tricky thing about tu quoque is that any given instance of it almost always sounds like it’s a great point.  “Hmm, you know, it is mostly conservatives upset over the suspension of that Duck guy, and it WAS conservatives who boycotted the Dixie Chicks back in the George W days . . . interesting.”  Except what is the point?  Let’s take the time to break down what this actually says.

So people upset over the controversy surrounding Phil Robertson’s suspension are, according to this assessment (which we will presume true for the sake of argument), guilty of hypocrisy because years ago they spoke out *against* the right to free speech of some once-beloved musical artists. What does this tell us? It tells us exactly that, and only that.  These people are hypocritical.  Fine.  Except the issue at hand is not whether or not these people were consistent with their positions on inalienable rights, but instead the issue is whether or not Phil Robertson had the right to say what he did vs. if A&E was right to suspend him for it.  The person who posted that thing about the Dixie Chicks gets to walk away thinking that they’re bright and profound for calling out the hypocritical conservatives, but they neglected to actually discuss the issue at all.

This was my thought process in reaction to the Dixie Chicks post, and it essentially is my thought process every time I see this logical fallacy: “Do they think that free speech is WRONG? Well, I doubt it, but that’s certainly the implication. If they believe free speech is a good thing, why are they condemning people for supporting it? Shouldn’t they be glad that these people finally came around? Wait . . . are they saying that because conservatives didn’t respect free speech 10 years ago, now no one has the right to stand up for free speech? This is very confusing.”

And so on.  I could write on this error for days if I let myself, though I would stop covering new ground pretty quickly; so we’ll leave it at that.

___________________________


BONUS ENTRY!

Here’s one more that doesn’t exactly fit this theme but is definitely related.

It’s about when someone makes an error or is a “two-cent intellectual” in a mass-distributed form of media and is called out for it. Instead of taking a step back and partaking in my beloved activity of self-evaluation, they justify their clear errors or poor judgment by saying, “It’s just entertainment.”

People do this to defend the poor science behind the cited-as-health-gospel documentary Super Size Me.  “Spurlock set out to make an entertaining documentary and he succeeded.”

People do this when they make a YouTube video filled with generic and/or incorrect “interesting facts” and their commenters hold them accountable for it. “So I’ve seen a bunch of comments on this video from people over-analyzing some of these facts. . . . Stop over-thinking everything and just have fun. . . . It’s meant to be entertainment. Treat it that way.”

People do this when The Onion makes a joke that’s considerably tasteless and people complain.  “It’s The Onion! It’s not meant to be taken seriously!”

However, the logic is as poorly thought out as the content.  You have presented some kind of media to the world–be it a YouTube video or a documentary or a humorous article; be it your original creation or something you’re sharing with others.  Except the information does not pass close scrutiny.  Now you’re in trouble.  You’ve been revealed to be someone who just takes things at face value because they are either easy or support your biases, or to be an outright liar, and it’s kind of embarrassing.  So what’s the response?  “Hey! This is not meant to be some kind of academic journal entry or anything! It’s meant to be fun!”

There are two problems with this, and both demonstrate how you thought out your defense about as much as you thought out the information in question.

First, the entertainment value of something presented in the spirit of, “ZOMG! YOU’RE NOT GOING TO BELIEVE THIS THING I’M GOING TO SHOW YOU!” is heavily influenced by how true it really is. (I wanted to say “the entertainment value of factual content is directly proportional to how true the facts are,” but I’ve seen too many factually-sound PBS documentaries to try to support that claim.) If you give me ten facts about H.G. Wells that are supposed to blow my mind, and it turns out every single one of them is made up, I’m not entertained by what you’ve presented. At best I’m apathetic, but if it’s actually ME we’re talking about, I’m irrationally bugged for weeks. Regardless, I am not entertained, Maximus.

Second, if the content of said media is truly “just entertainment,” then nothing you’ve presented has any bearing on anything and is as worthless as burnt paper. If you lecture me for eating McDonald’s because you saw a movie about a guy who ate nothing but McDonald’s for 30 days and his health fell apart, and I can easily demonstrate to you how every conclusion he came to in that movie was exaggerated or fabricated, and you retort, “None of that matters–it was an entertaining movie!” . . . doesn’t that seem silly to you? The reason that information was provided in the movie was to tell people that eating fast food is far worse for them than they thought; so when people who understand the concept of too many variables can show that the information is wrong, its entire existence is suddenly unjustified.  It has no value anymore except as a reference piece in a class on how to be a snake oil salesman. How does, “Well, it entertained me!” justify that?

I’m not sure, and I unfortunately know that this post isn’t going to have any effect on these things recurring in the future, for me or otherwise. But I guess as long as they do, I’ll always have something to complain about. And that’s something in and of itself.


My Peculiar Pet Peeves

I’ve got a long list of pet peeves like I’m sure many do, but I’ve noticed a few in myself that seem to go beyond noisy soup eaters and people who take too long to go when the light turns green.  Here’s some exposition on them.

1: Calling something a “Pet Peeve” when it is bigger than just a minor annoyance.

I believe that a genuine “pet peeve” should be something that, in the end, doesn’t really matter.  It’s something that bugs you but not necessarily anyone else, and if that pet peeve never happened again, the world would not necessarily be a better (or even worse) place.  A great example from my own experience: I get very passionately upset when discussing the dismissive and arrogant attitude Chicagoans have toward the-rest-of-Illinoisans.  But I don’t call that a pet peeve because on some level (however small in the scheme of things) it matters.  I can argue a point as to why Chicagoans shouldn’t be that way, and I know of many Chicagoans who could and would argue back.  So when someone mentions that a person driving too slowly in the left lane on the freeway is a “pet peeve,” I get peeved, because that’s actually a bit of a safety concern (not to mention a legal one).  But note that my annoyance at this ultimately doesn’t matter; hence its place at the top of my list.

2: Traffic reports on the radio made in the second person.

Have you ever noticed this?  I didn’t live in a place that even needed traffic reports until about seven years ago.  Then, when I started paying attention, I noticed that some days I simply got the information and went about my day.  Other times I would listen and notice myself getting uncomfortable with what was being said to me.  Why?  There are a few traffic reporters here in the Seattle area who do their reports in the second person.

“And here’s Kimmie with traffic.”

“Well it looks like you’re having a really tough commute this evening.  You’re stuck in a three-mile backup on northbound I-5, and you’re slowing down on southbound, as well, as you’re distracted by that accident across the divider.  I-90 doesn’t look great as you’re coming off of Mercer Island, and you’re backed up pretty bad heading south on 405.”

Ugh.  It even made me cringe to write that.  It bugs me because in an attempt to make the traffic report *more* relatable by giving it in the second person, it has become *less* relatable.  Why? Because, no, I am not stuck in a three-mile backup on I-5 North.  I’m very likely somewhere else.  And if I AM in a three-mile backup on I-5 North, then your traffic report doesn’t help me very much, does it?  I already know I’m stuck and now your poorly-thought-out narrative style makes it sound like you’re mocking me.  I need a traffic report to tell me what things are like in places where I will *be*, not how they are where I *am*.  Do I get the same info either way? Absolutely.  But we’re not talking about logic here.

3: “Baby” instead of “The Baby” or “Your Baby” or “Our Baby,” etc.

I first took notice of this years ago when I spent a lot more time watching television, and there would be a commercial for some baby product. The commercial narrator would say something like, “. . . so that it doesn’t irritate baby’s skin,” or “. . . and its gentle formula makes it easier for baby to digest.”

Well, round about 13 months ago I got caught in the “all-babies-all-the-time hurricane” myself, and this issue only compounded itself.  It’s as if all the having-babies and raising-babies industries and communities forgot about articles and adjectives.

“You just need to do what’s best for baby.”

“. . . and that will give you more time to spend with baby.”

“Do some research on what things you prefer to have in toys for baby.”

People, I implore you.  What is so wrong with saying “the baby,” “a baby,” and ESPECIALLY “your baby”?  “Baby” is not my child’s name.  “Baby” is what she is.  We don’t do this for other things . . .

“This will be the perfect gift for man.”

“Life can be a little rough when dealing with teenager.”

“. . . and its gentle formula makes it easier for guinea pig to digest.”

. . . so why “baby”?  I don’t understand, and it’s annoying.

4: People I don’t know talking to me about my food.

In a broader sense, I just don’t like it when people I don’t know talk to me, period.  But I concede that in most situations it’s good for me to step out of my comfort zone and be forced to interact with people.  I think that’s good for all of us.  What would things look like if we weren’t so cold to strangers every day?

That said, I cannot stand it when strangers start small talk about the food I’m eating or heating up.

“Oh, that looks pretty good.  Whatcha got there? Is that chicken in that?”

What the heck? Who are you and why do you think I’m okay with you putting your nasty vision all over my food?  I’m standing by the microwave to heat this up so I can eat it and continue working and go home.  There’s no reason for us to have any interaction about my LUNCH.

Useless small talk is always bad, but when it centers around a very private thing like my nasty-looking, cold, chicken curry and rice in a Ziploc container, it’s infinitely worse.  What response are they expecting?  Do they want a bite?  Because they can’t have one.  Do they want me to discuss how I made it?  Should I bring up the stores where I got the ingredients?  Or maybe my inspiration?  I’m very certain they’re not asking so they can make it themselves.  Am I now obligated to return compliments on THEIR food?  Should I bring up that I’m not sure which pepper slice is the one I dropped on my dirty kitchen floor, but it’s in there somewhere?

The worst example of this happening was years ago as I was heating up some KFC (freaking KFC!  Does day-old fried chicken and gluey mashed potatoes warrant a conversation?).  A guy started blasting me with questions about it–“Is that regular or extra crispy?” was one of many inquiries–and then worked his way into asking if I’d ever been to Cleveland, Ohio.  When I said no, he proceeded to give me a specific location (as in street name and neighborhood) of a “great fried chicken place” that I should check out if I’m ever in the area.  And then he left.  To this day I wonder if he recalls that conversation as one of those cringe-inducing embarrassing memories.  I honestly kind of hope so.

As a quick disclaimer–if I know you, I don’t care if you comment on or ask questions about my food.  Seriously.  Don’t be afraid.  Chat away.  You see, to speak to another person about their food, I believe there should be an established relationship.   It is not small talk material.  Make a comment about my funny shirt, complain about the smelly work fridge, ask if I know where the extra salt is kept.  I don’t care.  Anything that works in a passing manner, but if you’re going to talk about what I’m eating or about to eat, we’d better be working toward some meaningful interactions sometime in the future.  Otherwise it’s like striking up a conversation with a stranger at a urinal.

5: People who can’t or refuse to make eye contact with you during a casual conversation.

This is the weird thing . . . I’ve never been able to find anyone else who notices this the way I do, but I can think of at least three people (none of whom have any connection to each other beyond knowing me at some point in time) who, when getting really into the point they’re making in a conversation, look off far to the side and hold their vision there.  It’s hard to describe in words.

This is a normal conversation:

[Image: diagram of normal eye contact]

And this is what the conversation looks like when I’m talking with the people that do this:

[Image: diagram of averted eye contact]

I *know* this has to be something of a common thing.  Someone reading this will know what I’m talking about, or start noticing it.

6: When trivial pieces of information, which I know to be untrue, are brought into a conversation and I have to decide to be a jerk and correct them or to lie and pretend I don’t know they’re wrong.

Did you know if you soak a steak in Coca Cola for a week it’ll dissolve?

Did you know that Johnny Depp finished scenes as the Joker for The Dark Knight after Heath Ledger died?

Did you know that Washington State Unemployment determines how much you make on unemployment by picking a paycheck from the previous year at random and giving you a percentage of that?

Isn’t it crazy how fast the Die Hard movies have come out?  I mean, the first one came out in the late 90’s!

None of those statements is factual, and each of them has come up in conversations that forced me either to kill the conversation by informing the other person of their error (which inevitably and awkwardly brings the larger interaction to a halt) or to proceed with the conversation feeling like a complete and utter fraud because I’m pretending to be impressed by information that I know to be false.  This is the sad side of being someone so fascinated by such useless, trivial things.  I learn about them, I read more about them, and then I learn what’s true and what’s not.  As a next step, I then become fascinated not by useless and trivial things, but by misconceptions about useless and trivial things, which increases the probability that this happens to me.

But there’s a different side to this.  Sure, immersing myself in knowledge of common misconceptions about a variety of trivia topics puts me in a self-imposed position to get annoyed by people who don’t know as much as I do about something.  But sometimes, it’s the other person.  Sometimes the other person either makes something up, listens to something that is made up, or doesn’t know how to be properly skeptical about things they read in email forwards, and then spits it out as fact.  And then the conversation gets *really* awkward, because I now know that I will not believe anything they say for the remainder of the time I know them.

And then sometimes I don’t know WHERE they get their information.  Like that thing about Die Hard.  I’m not exaggerating that quote.  It’s paraphrased, yes, but not stretched at all.  I met a guy who expressed his wonder at “how fast Hollywood put out all the Die Hard movies” as he discussed that summer’s release of Live Free or Die Hard, and when I asked him to clarify what he defined as “fast,” he pointed out that the first movie was released in the late 90’s.  1996 at the earliest, he was sure.  And he refused to listen to me when I told him I remember very clearly watching the first Die Hard on TV in the early 1990’s.  (It came out in 1988, for the record; I’ll never forget that fact now.)

I actually spent a whole day talking to that guy, and he was this pet peeve incarnate.

7: The phrase “Wow! Small world!”

Because it’s not.  Spend some time on Google Maps and really take in how small your house or apartment is next to the nearest body of water you can find.  Then scroll out and take in how much space there is on Planet Earth and really ponder how NOT small this world is.  When an amazing coincidence comes up, it’s okay to really marvel at how crazy that kind of connection is.  Because there are a lot of people.  And there are far more ways that coincidence could have NOT happened, as billions and billions do daily, than for it to have happened.  Let’s be amazed.

The Stupid “Socialism” Experiment

One of the radio shows I listen to at work featured the following video and praised it as smart and clever, if not genius:

The message is, of course, that the things you earn in life are yours and no one should be forced to give those things up against their will to assist people who didn’t work as hard as you did, and ended up with less. As the end of the video states, this is a thinly-veiled commentary on the “immorality” of Socialism.

Except it’s really, really stupid.

I am not a socialist, I am not a communist, I am not even a liberal; I just cannot stand poorly thought-out analogies by people so cocky about their “message” that they haven’t even stopped to think through what they’re talking about.  Nothing in this video makes sense once you take the time to lay out why a GPA is absolutely nothing like money, and why presenting the crazy, unfair idea of redistributing higher GPAs to failing students is therefore not the same as presenting the idea of redistributing mass wealth to people dying of starvation.

Many students signed the petition because (I think) GPA redistribution sounds logical and compassionate at face value to someone with left-leaning viewpoints.  But I’m not going to call them out for being gullible; it’s hard to catch all the holes in something like this when you’re on the spot and on camera. Some people did try to point out how idiotic the idea is, but it’s just as hard to pick it apart in all its ludicrousness in those same few seconds.  So I’ll take the time here.

Please take note and remember: I’m not here to advocate socialism or the redistribution of wealth as good ideas (I really don’t think they are); I’m here to demonstrate that you cannot walk around a campus talking about redistributing GPA scores and think you’re making some irrefutable argument about anything other than your own lack of analytical thinking.

1. No one inherits a GPA.  Yes, I get that not every rich person inherited their wealth, and more than a few people born rich became poor through their own bad choices somehow, but that’s not the point.  MANY people DID inherit wealth, and even those born into some money who went on to become successful and gain even more wealth were able to do so because of what they started with.  No one gets a good GPA because their great-great grandfather carried a 4.0 a hundred years ago.  Some can afford not to work, which gives them more time for study, sure--but I defy you to find me statistics showing that kids who have to work through college get lower GPA’s on average.

2. GPA’s are not a resource or commodity.  A GPA is simply a numerical system created to easily demonstrate a student’s academic standing.  Money, on the other hand, is limited.  And if you’re like some of the commenters on that YouTube video who want to say, “If wealth isn’t infinite, then how come the Fed can keep creating currency?”, come here so I can slap you (it’s stuff like that which has kept me, a notorious flame warrior in comment sections, from ever getting into it on YouTube).  The fact that wealth and money are finite is the very reason it’s bad that the Fed keeps printing more money!  They’re not creating more wealth--they’re devaluing what we already have!  The point here is that the reason some students have GPA’s so low they can’t graduate is NOT because all the GPA points were taken by those with 4.0’s.  They have low GPA’s because, for one reason or another, they didn’t make good grades.

3. A student with a 4.0 redistributing their points does a lot more damage to that one student, and a lot less good for the other students, than a billionaire giving away a fraction of their wealth.  AGAIN--I’m not advocating the redistribution of wealth, but (discussions about the dangers of coming into sudden wealth aside) if everyone’s favorite go-to rich guy Bill Gates took $7.625 billion (12.5% of his net worth) and distributed it evenly to five poor people, Gates would suffer far less damage, and those five people would see far more benefit, than if a student with a 4.0 gave up 0.5 points, with 0.1 going to each of five different students.  There’s technically no cap on total possible wealth, but obviously GPA has a cap at 4.0 (or maybe 5.0 if you go somewhere weird).  You might want to hit back at me with something like, “But the video isn’t actually about redistributing GPA; it’s about how ridiculous it is to insist that wealth be taken from those who have and given to those who don’t.”  Except the analogy cannot hold up because, even in just this one regard, GPA and wealth are such different animals that you can’t logically say that doing A, which some people think is good, is essentially the same as B, which is obviously unfair.  A and B are not comparable.

4. Every student earns their own GPA, for themselves.  When I worked for McDonald’s, despite all of the long hours, the late nights, the frustrating customers, and the disgusting food and building, I was not doing much for myself.  Of every dollar I put into the till, I got a fraction of a penny.  The vast, vast majority of it went to the guy who owned the local franchise and to the McDonald’s executives.  And I’m not even saying they shouldn’t have; they put a lot more time, a lot more effort, and a lot more risk into that business than I did--but the other side of that coin is that they never would have earned a cent without people like me keeping the restaurant running and bringing in income.  Now compare that to grades in college.  There is not now, nor has there ever been, a college student who puts in hours and hours of study time, working on papers, pulling all-nighters, and never missing classes, so that the majority of their GPA points go to make the “top 10% of students” have even better transcripts.  GPA is essentially a lone venture, whereas your wealth depends on other people as well as yourself.

5. To expand on all of these--if you’re able to get accepted into a college and you put in the work and the hours necessary, or even more, it is VERY hard to flunk out.  Almost impossible.  But if you go out into the world and work hard for a company, or put a hundred hours a week into your own business, you can still fail--and statistically, failing is pretty likely.  This is, I think, where this whole “redistributing wealth is like redistributing GPA’s” thing falls apart the most.  It makes the horribly flawed assumption that people who are poor are poor because they didn’t try hard enough.  That could not be further from the truth.  The makers of this video and the holders of this perspective want to push the idea that financial success is directly correlated to the amount of effort put in (like a GPA), but that leaves out things like social class, family wealth, education level, geographic location, and even the year one was born (yes, I’ve read Outliers).  In college, the students who are the most dedicated and work the hardest get the highest GPA’s.  In the world of money and wealth, the vast majority of workers who are the most dedicated and work the hardest usually top out at a comfortable middle-class status.

I’m convinced that when the students behind this video were told by the people who spoke up that the GPA redistribution plan was dumb, they thought they were hitting their point home.  What they didn’t realize is that the part being called stupid wasn’t the idea of redistribution, but the idea of comparing GPA to wealth.  I find it unfortunate that despite the gaping holes in their little “experiment,” they’re going to be patting themselves on the back for years to come.  I think that’s what upsets me the most--I tend to be a bit more conservative overall, and come from a conservative family and background.  So when I see people that I, by default, consider my “brethren” (however distant the relation) pull a stunt like this, I get upset, because they’re poorly representing a perspective that I otherwise think has merit.  Probably.  I’ll get into what I think the serious difference is between the hard-working wealthy people in this world and the actual “1%” some other time.

Childish Things

The words going through my head today are more introspective and autobiographical than usual; I recently read some Donald Miller, so that might be part of the reason.  I’m going to write about what has been one of the largest crutches of my life, and most recently was the cause of my abandoning my blog once again.

Prior to age eight, I had a small handful of run-ins with video games.  Once was with an Atari 2600 at the house of someone my family was visiting.  Another time, one of my younger brothers and I got to play another Atari 2600 when a slightly-older-than-our-parents couple watched us for a day or two.  (They had a pinball machine, too.  That place was cool.)  I remember we played the obscure Atari title Maze Craze until we were dreaming about it.  Yet another time we were at a family friend’s house, and she had a teenage son with the mother of all video game consoles, the 8-bit NES.  She let us play it, but we had no idea how to properly use one, so we were swapping cartridges with the power still on, and of course not holding “reset” while turning off The Legend of Zelda.  When he came home he was remarkably calm, especially considering we’d ruined some of his games.

. . . no, it was like that when we turned it on.

There were no doubt a few more instances thrown in prior to the Christmas of 1988, when my two brothers and I tore open what seemed like the biggest box we’d ever seen, under the loving and weary smile of my recently widowed father.  It was what we’d asked for: a Nintendo.  But this wasn’t just the Nintendo; it was the Power Set.

It came with the Power Pad and three games on one cartridge: the standard Super Mario Bros. and Duck Hunt, plus World Class Track Meet for use with the Power Pad.  I remember that Christmas was a Sunday, too, because I know we didn’t get to go straight to playing it all day; we hooked it up and then had to head straight to church.

I don’t think anyone would ever fault a dad for getting his boys what they really wanted for Christmas on what would likely be the saddest single holiday of their lives.  But in the years that were to come, my dad would come to regret it nonetheless.  And now, as an adult, I do, too.

My life for the years that followed mostly centered around video games.  Every birthday and Christmas that followed until mid-high school was the time to get a new game.  In grade school I did the trading thing with my friends at school.  It was safe for the most part, but one day I did notice that my copy of Super Mario Bros. 2 was missing, and I never saw it again.  (I don’t know who I lent it to or who swiped it, but my money’s on Mike Strader.) That was the only casualty, though.  At least for my games; I can’t speak for the games owned by my brothers.  By middle school, my closest friend was also a big video game fan, and he had a lot more stuff than I did.  So my leisure time was games, and my social time was games.

My life revolved around playing video games so much that if I wasn’t playing them, I was talking about them, drawing pictures based on them, or just plain daydreaming about them.  One time I really wanted the game Final Fantasy so I let myself become so obsessed that I read and took notes in the strategy guide I’d gotten a hold of for it, and even at one point spent an entire day of forced chores mumbling “Final Fantasy Final Fantasy Final Fantasy” etc., under my breath.  Yes, it was as nuts as it sounds.

Several months later I bought it, and in the few years that followed I beat it probably 20 times or more.

The toll it took on the academics of my brothers and me was so great that my stepmom would sneak the controllers away at the start of the school year and we wouldn’t see them again until June; unfortunately, games weren’t the only issue there, but they were a very large part regardless.  In the summers from around 1991 until 1996 or 1997, my brothers and I worked out this incredible compromise to keep from fighting over the NES: a rotating, hour-by-hour schedule each day.  Written down on paper.  On one hand it kept us from fighting (mostly); on the other, most of those summers, sometimes days upon days, were spent in front of the TV in the basement.  To be honest, we preferred it.

My ruined fourth semester of college was also due in large part to having free and open access to video games at any time, with me sometimes staying up until 4 a.m. playing my PlayStation (I had just gotten a TV for my room for Christmas).  Not long after I dropped the classes to avoid bad marks on my transcript, my parents kicked me out of the house.  When I got a place of my own months later, with no responsibilities other than work, I’d sometimes go 16+ hours playing video games.  Fast forward a few years to my first apartment with roommates, the summer prior to my second and final year attending Southern Illinois University, and I’d go even longer.  I once spent so many waking hours doing nothing but playing Grand Theft Auto III that one day, when I realized I’d left my phone in a gas station across town the night before, I got into my car, headed out to go get it, and caught myself going 80 miles an hour on a 30 mph road.  I’d driven that way in the game for so many days that it was instantly natural when I was actually behind the wheel.  I think I first noticed the serious danger of the situation when my reason overrode my instinct to run a slower car off the road.  Another time I practically locked myself in my bedroom and didn’t see my roommates except to eat and go to work so I could play Ocarina of Time; it took me about two weeks to finish.  In the years that followed, I had more than a few instances like that--game game game work eat sleep game.

So it continued on like that throughout my 20’s.  While sometimes I’d go as long as six months without touching a video game, without really even thinking about it, I’d always eventually get my hands on a new one, or get the urge to revisit an old favorite, and I wouldn’t walk away for months.  And that cycle was something I was content with and wasn’t a big deal until I got married.  The funny thing was, after I got married, I never did reach a point of boredom with gaming to where I’d put it away with nary a thought for a few months.  I couldn’t stop playing them.  That, coupled with entire evenings wasted in front of the TV for no good reason, led to the idea of the year-long “media fast” that my wife and I did.  It took a while for us to really get to the point that we didn’t sit around staring at a wall with nothing to do, but once we did, it was great.  I read tons of books, practiced guitar hours a day, ate dinner at the table.  Great stuff.

But of course it ended.  Funny thing--take a look at when my last blog post was.  September 20, 2011.  That was approximately one year after starting the fast.  At first, I had no desire to go back to playing games.  Dona and I had already begun to enjoy evenings watching Seinfeld DVD’s over dinner, but I could take two episodes at the most before I was done.  I really didn’t feel like playing any of my FPS’s on Steam, and while I did sit down to try SimCity 4, the enthusiasm for that died within minutes.

“I’ve beaten this,” I thought.  After more than two decades of my life given over to ultimately useless, digital pursuits, I spent an entire year staying away alongside my wife, and had little desire to return.  Then I remembered a game I’d heard about in the year previous.

I knew that Minecraft was supposed to be addictive.  In fact, the first thought I had about playing the game was not a welcome one--I even went on Facebook asking people to talk me out of it.  But it was no use.  I tried out the free version of the game on Minecraft.net and subsequently spent something like 36 hours on it over the following three days.  I’m not stretching that number.  Since I could save nothing and had started over thrice, I accepted the inevitable, paid for the game, and downloaded it.  At first I tried to restrict myself to an hour a day.  Then that became an hour on weekdays and three hours per day on weekends.  Within two weeks that was completely thrown out, too.

I would go to bed thinking about the game, and wake up thinking about it.  At work I would spend my lunch breaks watching Minecraft videos on YouTube.  I would get home and fight every inch of my being to resist going straight to the computer.  I’d sit down to play guitar, but get irritated over the smallest monotonies, and eventually just put the guitar down and turn on the game.  Eventually I wouldn’t even bother with the instrument at all.  It’s scary for me to remember what it felt like to turn on the game after hours of actively resisting the desire--complete euphoria.  My wife, never one to keep her thoughts to herself, was constantly on me about playing too much.  Not so much because of how it affected her, though that was certainly a factor, but because she could see so clearly what it was doing to me, and how Rational Braden would be very upset at the sight of it.  She was not treated kindly in response, I’m sorry to say.  Actually, “sorry to say” is microscopic compared to how bad I feel about it now.

I had spent the months before Minecraft thinking about which books to read next, how to structure guitar practice time, what to write about, and even early thoughts on how to start a business.  Once Minecraft came into the picture, I began spending almost literally every waking moment making plans for super railways, massive underground fortresses, mapped-out continents and oceans, and Nether-based transportation systems.  Perhaps most tragically, I actually found myself wishing for unemployment again, or for my wife to take a weekend trip somewhere, so I could spend days playing without interruption.  I had more than one weekend where I put in over 30 hours between Friday night and Sunday night.

Eventually I began to admit to myself, somewhere under the surface, that there was a problem, but it took the better part of two months to face a truly scary reality: I would have to delete Minecraft and make a personal commitment to never get sucked into games again--even if that meant never playing another game for the rest of my life.

It’s interesting how that played out, because one morning just a few days after realizing that, I got up from my living room chair, sat down in front of my computer, and deleted everything I could.  I was still waking up a little, which thankfully impaired my ability to rationalize.  I did have some momentary freak-outs, since I was essentially deleting weeks upon weeks of “work” in that virtual world I’d lived in, but I went through with it and removed my access to my Minecraft account to the best of my ability (deleting it fully is not an option Mojang offers).  Even if I were to get myself back into my account, all my progress would still be gone.  That actually becomes less of a hindrance the longer I go, because the longer it’s been since I played the game, the less I care about what I was working on before.

But with all that said, I don’t miss the game at all and have no desire to return.  I didn’t like how that time felt.

Yet I’ve not completely beaten my video game bug.  I deleted Minecraft sometime in January, and due to some of my other games not running well in Windows 7, I was able to keep things under control for a couple weeks.  By the end of February, though, I was on Steam playing some of my inexpensive, dated first-person shooters, from Jedi Knight II: Jedi Outcast, to Jedi Knight: Jedi Academy, to nearly the entire Half-Life series, from Half-Life: Source to Half-Life 2 and its subsequent two episodes.  I’ve spent the last week working on achievements in Portal.

So what keeps me here?  Why do I keep coming back?  For one, it’s time easily spent on accomplishing what feels like a lot.  In the course of two weeks, I went from being a recently-hired theoretical physicist at an Arizona-based research facility to assisting a group of rebels fighting a human-alien dystopian oppression.  I’ll grant that reading books is ultimately a “healthier” activity, but no matter how well prose is written, you can’t get immersed in a story the way you can in a well-designed video game, and Valve is undeniably among the best in the field in that regard.  Anyone who’s been awed at how connected they get to Alyx Vance, or furious they get at Wallace Breen, knows what I’m talking about.  One of the most exciting story-based moments ever for me was the start of Half-Life 2: Episode One, when the Vortigaunts interrupt the G-Man and block his way, and he looks at you and says, so seriously, “We’ll see about that.”  It’s hard to really appreciate out of context.

I love creativity; I love a great story.  Video games have evolved to the point that I can have both to my heart’s content and accomplish little else . . . the twist being, of course, that my heart will never be content and satisfied with them because I’m not sure that those innate passions were meant to be fully satisfied.

So here we are again.  Two days ago I deleted Steam and all of my progress of the last couple weeks.  It was a little easier than Minecraft, mind you, because all my Steam games are stories that I’ve finished.  I want to stand on top of a platform and declare that I’ve written off video games for the rest of my life, and that my passions for creativity and great stories will be channeled into music and reading and writing--but I can’t say that.  If Half-Life 3 or Episode 3 ever comes out, or when/if Portal 2 gets really cheap, I’m not going to last very long.  I just hope I can keep myself clear long enough to actually accomplish some things in the meantime.

Hey Man, Quit Wasting That Gibson!

Let’s open with a story.

Years ago a friend and I worked in a department store.  My friend was working one day and had to help cover the registers.  It was shortly after Christmas, probably late December or January of whatever year it was (1999-2001).  My friend said he was ringing up a mother and her obnoxious 10/11/12-year-old son.  The son was whining about her not buying him something he wanted, and my friend got the impression that this kid often whined his way to getting his way, but the mother was, albeit sheepishly, resisting this time.  “No, I said!  We just had Christmas,” she said to him.  He returned, pouting, “Yeah, but I didn’t get nothing.”  Mom seemed a little annoyed: “A Gibson Les Paul is not ‘nothing.’”

My friend checked with me later, “Hey, are Gibson Les Pauls expensive?”

Yes, friend.  They are.

Pictured: Cha-ching

Wrapped up in that story is the essence of what I want to address here:  I really can’t stand seeing people own very nice (and very expensive) guitars (or any musical instrument, really) but never really USE them.  It is simultaneously irritating and stupid.  And understand that this isn’t just about Gibsons (though they’re the most commonly abused, as I’ve seen), but any nice, high-end guitar or equipment.

Why?  Because those guitars were designed and built by people whose PASSION is the guitar.  You can’t be wishy-washy about the instrument and still make and sell one whose sound millions of guitarists melt over.  Sticking with Les Pauls for this example--first of all, that’s the guitar designed by Les Paul.  The man was a walking legend by the time he was 30.  He pioneered the solid-body electric guitar.  He played professionally until he died at the age of NINETY-FOUR.  The guitar he designed has become synonymous with greats like Jimmy Page, Frank Zappa, and Pete Townshend.  You don’t get a job in the American factory that builds $2000+ Les Paul guitars because you’re a layman needing work and you filled out an application.  You have to be an artisan.  It’s the same idea for any other high-end guitar, whether made by Gibson, Fender, Gretsch, Paul Reed Smith, or anyone.  Well . . . anyone but Jay Turser, but one really shouldn’t bring up Daewoo when talking about muscle cars.

I’ve known people with really nice guitar equipment who barely learned how to play and really didn’t care to advance much.  Look--if you don’t want to advance at guitar or any instrument, that’s your choice, but to have nice stuff and let it collect dust is shameful.  It’s like someone buying a professional-grade mixer and just using it once a month to beat eggs.  Imagine being a professional chef, or even just a very enthusiastic cook and foodie, visiting someone’s home, seeing an amazing $700 piece of equipment sitting on the counter, and learning that they really only know how to cook spaghetti and scrambled eggs and don’t care to learn anything more; when you point it out, they chuckle, “Oh, yeah--that.  It’s nice, but I usually just order out, really.”  It’s close to the same thing for me when I see that Gibson ES-335 sitting next to that 2×12 Orange combo amp in a corner of the room you never go in.  (Note: I’ve never actually seen THAT, but you get the point.)  There’s a certain amount of honesty with ourselves that we should all have, enough to understand that we don’t need $2000+ of stuff if we’re going to use it twice a year.  That guitar and amp would be happier in the hands of someone who appreciates them, and you can go drop $200-400 from the sale on a Squier Telecaster and an 8-inch Peavey amp.  Everybody wins!

Now, to clarify . . . if you have that $700 mixer and don’t know how to cook or bake very well, but you got the mixer with intentions of doing and learning more–go get ’em.  So when a beginner picks up something like a $1200 Fender Strat, I still think it’s a bit of overkill for such early stages, but if they’re really going to work at it, I’ll happily keep my mouth shut.  Like the guy that I recently learned about (through sources I will not reveal in my blog) that spent $3500 on a Les Paul and is a total beginner.  Stupid?  Probably.  But if he sticks with it, what can I say against him?

Well, that’s all on a personal level.  I have to KNOW someone before I’ll notice wasted guitars in their home, and if that’s as far as this annoyance went, it wouldn’t be worth its own blog post.  But it keeps going . . .

😦

Okay, so the Jonas Brothers are very over-bashed, in my opinion.  Not because they’re actually talented (from the little I’ve heard, I don’t believe they are), but because before Justin Bieber came along, they were the popular flavor for the internet to hate.  So please understand that I’m not jumping on, nor trying to revive, that bandwagon.  It’s that I’ve seen dozens of pictures of these kids around the interwebs, and in so many of them they’re “playing” guitars I’ve dreamed of owning for a decade.  Like lil’ scrunchy-face up there.  (And if you didn’t know what a Gibson ES-335 was when I mentioned it earlier--that’s it, in the hands of a child.)  They don’t really USE them . . . do you think he even touches that Bigsby arm, except maybe to move it out of the way?  It’s all for show, and that’s a waste.  But then again . . . the Jonas Brothers are owned by Disney, so they have the money to throw around.  What about bands that AREN’T funded by multi-billion-dollar corporations?

A few months ago a friend commented on a video of the Plain White T’s song “Boomerang” (a band whose style reminds me, in the worst way, of that song “Never Let You Go” by Third Eye Blind; gross).  I had the video embedded when I wrote this, but the account has since been removed.  He said that it’s ridiculous that three guitarists are all playing the same chord in the same voicing.  He’s right.  I’d add that it’s also ridiculous that bands like this bother to buy such expensive equipment (they were playing a Les Paul, a Gibson SG, and a custom acoustic of a brand I didn’t notice) when they’re just going to play power chords and not try to do much else.

. . . for example . . .

I guess if you’ve earned the money, there’s not that much wrong with it (plus you can write it off your taxes if you make music for a living), and that leaves me with not much to say against it . . . except respect what you’re holding!  After having your band recording and touring for years, wouldn’t you want to improve your skills to improve your sound?  No?  I guess that’s just me.

A while back I saw a show with four bands.  The second of the four, Moneen, was who I went to see, and the opening band, Moving Mountains, actually stole the show in my opinion.  There were two Fender Telecasters and two Gibson ES-335’s between those two groups.  I should point out that the ES-335’s were VERY used.  Whether those guys bought them new or not I don’t know, but calling their appearance “weathered” is putting it lightly.  In both bands, the guitars were put to very good use; they were clearly loved and played often.  None of those guys is necessarily a hair-metal-virtuoso-level guitarist, but they’re really good players who do a lot with their instruments.  You can click that link above if you want to look into Moving Mountains; here is Moneen showing skill and comfort with their instruments:

Then the first of the two headliners got on stage.  Eisley.  I have nothing inherently against Eisley.  Actually, after hearing the song in this video I might look into them a little more.  But take note of the guitarist NOT singing . . .

Since I’ve seen this band live, I can assure you that ALL of their equipment is top-of-the-line.  I was actually a little weirded out by how un-weathered their stuff was, but maybe they’d just done a shopping spree before the tour.  But did you watch the second guitarist?  That’s what she did for most of the show, too.  I’m not saying she shouldn’t be in the band; I am saying you don’t need to spend thousands of dollars on guitar equipment if you’re going to play bare-bones basics.  In principle, it is a waste and, to some degree, an insult.

Then there was the headlining band:  Say Anything.  I don’t have much good to say about them in general so there’s not much to say about their equipment.

I think the last and primary point I want to drive home with all of this is that we should remember that a guitar is a musical instrument, made to make music.  It’s become such a symbol of so much else that even players like myself lose touch with that reality.  But what would happen if people learned to enjoy actually playing that expensive instrument they bought instead of refreshing Facebook or turning on Black Ops?  I can write some other time about the idea that being good at an instrument doesn’t mean you have to join a touring rock band, but as I pull together everything I’ve said in this post, that’s a large part of what I’m saying.  I think, anyway.  Or maybe I’m just jealous.

Five Non-Original Movies I Want to See Made

You’ve heard it all before, over and over again.

“Hollywood doesn’t have any new ideas!”

“It’s all the same stuff!”

“I liked Avatar better when it was called Dances With Wolves / Pocahontas / The Last Samurai / Fern Gully.”

“ANOTHER remake of a classic 80’s movie?  Why ruin that one, too?”

I tend to agree overall.  A modern take on a nostalgic television show is fun cinema; well-done movies based on characters once thought to be un-filmable are awesome; wiping away poorly-done franchises and starting anew is like a breath of fresh air; and any one of those movies that’s good is worth wanting a sequel from (when the story supports it).  But it can all get dramatically overdone.

It’s part of how “Hollywood” works.  The film industry isn’t first and foremost an artistic medium; it’s an industry–a business–that seeks to find customers the same way Amazon and Honda seek to find customers.  “What do people want?  Give them that!”  Sad, really, because unlike Amazon and Honda, movie producers aren’t very good at figuring out what people actually want.

Or maybe they are, and people are just idiots.

But where 20 years ago the idea of remaking a classic show was fun, and ten years ago the idea of a movie for every superhero or 80’s cartoon you could dream of was mind-blowing . . . it’s all gotten really old.  I’m mildly excited about the Avengers movie next year, and appreciate the way they’ve worked up to it, but it feels five years too late.  Rebooting Spider-Man is shameful.  Of all things, The Karate Kid did NOT need to be retold (I’d joke that Back to the Future is next, but I would not put that past them).  And why in the world couldn’t they let Pirates of the Caribbean be a surprisingly decent movie without sequels?  Or FOUR movies?

I know that original ideas are risky in a business where a small film budget would feed an entire third-world country for a decade (think about that the next time you go to see a crappy romantic comedy), so I’m going to propose some films I’d like to see made based on other material that aren’t simply raping the same old corpses.  Please forgive my graphic imagery.  Let’s begin.

The Wizard of Oz

Blasphemy right off the bat, right?  You’re probably conjuring comparisons to the new Willy Wonka movie, or Alice in Wonderland (both Burton films . . . hmm . . . okay, I declare that neither Johnny Depp nor Tim Burton is allowed within 100 feet of this movie).  But you’re forgetting how amazing the new True Grit was, or how fun Ocean’s 11 is.  You’re also probably not thinking about how greats such as The Ten Commandments and Scarface are technically remakes, too.

But Wizard of Oz?  Judy Garland is as synonymous with Dorothy as Julie Andrews is with Mary Poppins, Stallone is with Rambo, Mark Hamill is with Luke Skywalker, or *cough* Schwarzenegger is with Conan the Barbarian.  Yet I think this could be great, even though, no matter how well it’s done, some people will act as if the director has tried to rewrite the Bible.  Actually, I think they’d be less concerned about the Bible.

How to do it:

They should add as little as possible to the story, and return to the book as the primary source.  Drop the songs, too.  I guess already the comparisons to Charlie and the Chocolate Factory are opening up, but bear with me.
They should NOT make the story dark or gritty; nor should they leave it with a for-kids-only cheesy, cartoony air.  Make it a true family film–one the kids can enjoy but one mom and dad and teens and young adults and child-free advocates can all get through without rolling their eyes once.  In other words, don’t overdo it, but don’t be afraid that the little ones might get scared.  That’s life.

They should bring back parts like the four jumping across the river, and the Lion rushing his friends through the poppies.  Put the green glasses on the characters when they’re in the Emerald City, and bring back the magical hat that controls the flying monkeys (thus making the monkeys neutral at worst, not evil).  Include the porcelain village and the headbutting dudes, and the other parts not in the old movie.  Mention the four corners of Oz, discuss the surrounding desert.  The only things that should be added or changed are parts to transition better between scenes, as the book feels rather disjointed from chapter to chapter.  And in the end, it wasn’t all a dream.  Let Judy keep that plot change.
They should NOT
add useless and boring subplots.  Like scenes of the family of the Wizard when he still lived in America, or cut-aways to the all-community search for Dorothy back in Kansas.  Stick with the original point of the story–if it’s too short and needs to be expanded (unlikely), then do that by fleshing out the four main characters.  Make sure it comes across, unlike in the old movie, that Dorothy’s three companions all had what they sought all along.  Don’t waste time on back stories of the munchkin village.

Most importantly–make NO inferences that Wicked is canon.  At all.  Have your own opinion of that book/play, but honestly I say that the whole “The bad guy’s actually misunderstood and the good guy is actually spoiled, arrogant, and manipulative” storyline is over-done and contrived.  Moving on.

They should cast an actual young girl in the role of Dorothy, instead of a teenager with a golden voice.  The Tin Woodman (yes, you’d need to call him that so people can distance this film from the classic) would need to be a guy in a suit (but a really good mask, maybe CGI–no silver facepaint).  The Lion will likely be CGI, but the Scarecrow would probably work well as some kind of puppet.  With CGI when necessary.
They should NOT take the George Lucas approach and do everything on a blue soundstage and CGI even the munchkins.  They especially should not even toy with the Robert Zemeckis ping-pong-balls-on-quality-actors method. They should take the Peter Jackson approach and do everything as low-fi as possible unless it absolutely won’t work without more current technology and methods.  (But that’s not a Wizard of Oz thing, really, that’s an every special effects movie ever thing).  They also should not ignore how awful Aslan looks in the Narnia movies–so watch that.

Dragonlance: The Chronicles Trilogy

One of the biggest issues I’ve had with the types of “geek” movies that have come out non-stop for the last 10+ years is that they’re really only focused on material from a few genres.  Mostly superhero comics.  Beyond that, 80’s Saturday morning nostalgia–most of which, let’s be honest–would work well as a superhero comic (and many have).  But there’s one genuine nerd genre that’s almost completely ignored:

Fantasy.

It was my main one from middle school all through high school, too.  “But they made Lord of the Rings, Narnia, and now they’re making The Hobbit!” you say.  Oh, absolutely.  And they’re great (mostly).  But LOTR and Narnia are like the Batman and Superman of Fantasy.  They’re the big names, the genre definers.  What I want now are the Iron Mans, the Green Lanterns, the Watchmen.  Those lesser-known-by-the-general-masses-but-still-completely-awesome stories.

Also, there’s something that both Tolkien and Lewis don’t have nearly enough of–

Freaking Dragons

Dragons are what any sane geek loves most about the Fantasy genre.  Only Tolkien and Lewis can get away with such a low dragon-to-fantasy ratio.

Don’t–I repeat DO NOT–start thinking about crap like that Dungeons & Dragons movie from a decade ago, and don’t immediately think of Dragonheart, either (though you are allowed to say, “I am the last one!” to your heart’s content).  Where the former is so bafflingly bad you question the IQ of those behind it, the latter is FIFTEEN YEARS OLD.  It predates Titanic.  A guy I had an adult conversation with last week was finishing up kindergarten then.  I guess there was Reign of Fire, too . . . but that movie was so forgettable that I added this sentence in a week after finishing this post.  With how far special effects technology has come since 1996, I feel anyone who likes an exciting movie should start lobbying for more dragons.  And what better place to start than Dragonlance?  It’s still only a slight step into obscurity from LOTR, heavily story- and character-based, and it has lots of great visual and action potential.

They should grow a pair and do it as three movies with a full-scale production and huge budget with the best special effects they can muster.
They should NOT accept CGI that looks like it belongs in the lava scene from Aladdin.

They should focus on story and character.
They should NOT let the special effects take over.  What makes a great movie great and worth watching over again is story and good characters.  It’s why Spider-Man 2 was great and memorable, where Fantastic Four: Rise of the Silver Surfer was not.  Dragonlance has those things, but I can see them being overlooked far too easily in favor of cheap thrills.

They should make a tasteful trailer that hints at story and teases with known, key scenes.
They should NOT do that crap with thumping dance music and quick-fade cuts to a heartbeat and shots of Goldmoon’s inevitable skimpy outfit.

They should NOT cast a woman as Tasslehoff.

They should move quickly lest we hungry fans be bombarded with more crap like this or this.

Uncle Tom’s Cabin

This is where I shift from desperate movie and geek culture fan to “stuff that Braden would really just like to see.”

Over my most recent bout of unemployment, I took full advantage of living one block from a library branch and read a ton of stuff.  One book I read was Uncle Tom’s Cabin, a fact that still baffles me, not only because I read the whole thing but because it’s currently my favorite novel.  The thing is, though, there hasn’t been a large-scale film production of it since 1927.  There have been some independent films and a made-for-TV movie back in 1987, but nothing that’s as memorable as a movie based on this book should be.

It would be a pretty daring feat to make this movie, though.  Being a 160-year-old story based around a controversial social structure (to put it lightly), it’s been twisted and abused and deformed by all sides involved.  Many black Americans have rejected it because of the tired stereotypes it started, and because of the softening and twisting of the characters in “popular culture” done since the end of Reconstruction.  Many white Americans have rejected it because they refuse to believe slavery was ever as bad as the book actually insists it was.  Others avoid it at all costs because they either find it too controversial, or they were forced to read it in high school and are still bitter.

I’ve only seen small clips of two of the movies made (the 1927 silent film and the 1976 independent film, both of which you can find in their entirety on YouTube), and I’ve read the Wikipedia section on the 1987 movie, and while the 1987 one seems to be far more accurate, the makers of that film bragged about leaving out scenes like when Eliza skips across the ice on the Ohio River.  Why would you leave that out?

They should tell the story as it was meant to be told, exactly as it’s told in the book, with the message it’s had for 160 years.
They should NOT water it down to appease those that still try to suggest that slavery wasn’t all that bad.  Nor should they toss in parallels to civil rights or gay rights or modern social agendas.  Nor should they alter any characters to avoid “stereotyping” black or southern Americans, since those stereotypes were set after (and because of) this story.

They should leave in every exciting scene, and every emotional scene, and every challenging scene.  Let audiences experience a STORY and not just a message, yet not walk away forgetting the impact this story had on America.

Maybe it’s idealistic, but I see a great production of Uncle Tom’s Cabin being a wonderful exercise in racial unity between blacks and whites.  Maybe that’s silly.  Maybe.

Final Fantasy VI

Something that everyone knows has never been done well is the video game movie.  The big problem with them is that the source material doesn’t lend itself well to interesting movies.  Games are mostly about action or thinking about strategy in some fashion, and it’s hard to get that to translate well to film.  And in the event that there IS a story to tell, rest assured that some arrogant Hollywood exec is going to screw it up.  So why do I wish for this one?

The story of Final Fantasy VI (originally renamed Final Fantasy III for its American release) is one of my favorite stories of all time.  It’s very Star Wars-esque with hints of Jason Bourne and even Batman . . . but really that’s just the tip of the iceberg.  There’s so much there.  The plot alone, once you strip out the hunting for treasure chests, bonus side quests, level grinding, and long boss fights, could easily fill an (interesting) two-and-a-half-hour movie.  I’d bet you actually would have to do two films.

Hey, if The Hobbit can do it, so can Final Fantasy VI.

There’s so much character development that would need to be done it’s insane . . . because you could only reduce Umaro, Gogo, and Mog to bit parts or cameos.  Strago could play a smaller part, as could Gau, but Relm is important as (spoiler alert) it would definitely need to be revealed that Shadow is her father and we would need to care.  But Cyan and Setzer have very moving stories that need full attention.  Then you have the most important ones: Locke, Terra, Edgar, Sabin, and Celes.  They carry the biggest part of the plot.  But hey, I’ve said before that character carries a good movie, and this would need that.

They’d just need to be careful that Kefka isn’t an alternate version of The Joker.  That could be very easy to mess up.

The biggest problem, honestly, would be what to name it.  You couldn’t reasonably call it Final Fantasy VI.  There’s no 1-5, and as much as I love FFI, FFIV, and FFV, and enjoy FFII and FFIII, I don’t want to sit through movies based on those just to get to this one.

Wow.  The more I think about this, the more I think it would absolutely work.  The writers don’t have to fill in a story to make up for the loss of all the gameplay time.  The characters are deep and moving.  The plot is interesting and engaging.

I call dibs!

A Three-and-a-Half Hour Biopic on Chicago

Haha!  I thought of this originally as a joke, but of course now I think it’s awesome.  Rock/Music biopics don’t typically fail, critically or commercially, but they can be awkward and jumpy.  I think, first of all, we all need to plan to spend a large portion of our afternoon at the theater when going to see a rock biopic of any nature.

That said, I’m willing to bet one could make a really awesome movie based on Chicago, but for a band with a nearly-45-year history, and just over 20 of it really interesting, you need to pick a smaller section to focus on.  I think it would be best to focus on the span from just before their formation in 1967 to right when they make it back on top of the charts again in 1982 with the release of Chicago 16.   It’s perfect.   You’ve got humble beginnings, existing for music, pop chart success, struggles with pop chart success, sacrificing art for pop chart success, climaxing at the accidental death of guitarist Terry Kath, falling out of favor with the music-listening public, and then climbing back on top.  Sure, there’s another 8 years, at minimum, that’s interesting to tell, but you’ve got to draw the line somewhere.  And that gives it a happy ending–but you could do the last scene and last shots at a huge, sold-out arena as they play “Hard to Say I’m Sorry/Get Away,” and their faces are filled with an unmistakable mix of relief and joy, plus an uneasy caution.

I say the best way to do the music is to take the Walk the Line route, and let the actors sing the parts.  The trickier part is getting them to play the instruments . . . you can’t fake musical mastery like all those guys had/have.  Well, no mountain worth climbing is easy.

. . . man, that’s a good idea.

Dibs again!

How Being Lazy Got Me Where I Am Today

Five years ago today I set out on my life’s second-greatest adventure.  The first is getting married, in case you didn’t figure that out on your own.  But on August 2, 2006, I left my college town and home of four years, Carbondale, Illinois, for Seattle, Washington.

Ahh, home. Well--"home" is actually about seven miles back and to the left of where this photo was taken. But I'm sure you knew what I meant.

It’s a pretty big deal to me that I’m here at all.  I could bore you for a long time and discuss how there was a time when I had so much uncertainty about my future that I was kind of afraid I’d end up in some small Illinois town for the rest of my life.  (No hate if that’s where you are and prefer it–it’s just that I really didn’t).  But that wasn’t what God had for me–turns out I just had to listen to him and take some risks.

What’s interesting is how I ended up being here in the first place.  I sometimes wonder if there were a single, seemingly trivial event or action in my past that set me on the path to not only moving to Seattle, but getting married to my wife.  For a while I thought it might have been if I’d not heard punk rock in the summer of 1995 . . . but I might have heard it and loved it at a later time.  Or maybe if I’d gotten a car other than my Mazda 626 in January 2002 . . . but who knows what car I would have ended up with otherwise?  It all seems like I was headed this way no matter what–but the other day a memory struck me and I realized that my entire present hinged on a single moment of laziness and apathy over thirteen years ago.

Early in the summer of 1998 I was at Lincoln Land Community College taking a series of placement tests so I could get signed up for classes.  I wrote an essay and did some basic math and answered some science questions and probably some on history.  The last test I had to do was algebra.  Well, I was kind of tired of taking tests, you see.  I had an unlimited amount of time for each question presented to me by the computer, and I had the option to re-do any question before completing the test, and I may have even been told if I got an answer right or wrong when I submitted it.  The point is that this should have been an easy test for anyone who knows how to do algebra (which I did).  But I was bored and feeling lazy and wanted to get home to probably play video games or something, so I just selected random answers for the last two-thirds of the questions.  I got most, if not all, of those wrong.

The result of this act of impatience and laziness was monumental.

I was given a class schedule of Monday, Wednesday, and Friday from 10 a.m. to 3 p.m., no breaks.  I was placed in a zero-credit math class due to my performance on that assessment test.  The math class was at 11.  In this math class, I became better acquainted with a guy named  Aaron, who was the saxophone player in a local, Christian ska band.  Within a week or two, I also met a guy nicknamed “Skippy,” who I got to know that semester as well.

Well, Aaron asked Skippy and me if we wanted to play in a swing band he was trying to start up.  Skip also played saxophone, and trombone was my primary instrument at the time.  We both said yes.  By the end of that semester, the band we formed had dissolved but a strong friendship between Skip and me had formed–so strong that my circle of friends had almost completely changed by that point from what it was the previous summer (high school friends, mostly) to people I met through Skip or people I met at college, along with Skip.  Point being, my social life, interests, and activities after that semester would be shaped because I was friends with him.

Around nine or ten months after that band fell apart, Skip borrowed a bass guitar from a mutual friend and started playing.  He called me on his first night with it and said, “We’re starting a punk band.”  I had been playing a little bit of guitar for a few months and was sure I could handle power chords in punk songs, so I claimed guitar and vocals for myself, and then mentioned to Skip another friend of mine who played guitar and could join us (also named Aaron, but not the Aaron from before).   The three of us started the band in the fall of 1999, had a drummer by early 2000, and the band came apart in early 2001.  But the four of us in the group were still good friends.

So good, in fact, that Aaron and Skip simultaneously decided to attend Southern Illinois University at Carbondale starting the spring semester of 2002, and they made sure they were roommates.  When the fall semester of 2002 came around, it was my time to move on to a university.  I had picked SIUC precisely because Aaron and Skip were already going there.  I originally wanted to go any place OTHER than SIUC.  Carbondale is next to Murphysboro, which is where my dad’s entire family lives.  Sure, I love all of them, but I originally wanted to attend school in a place I wasn’t familiar with and had no connections to.  To start fresh.  However, being the social person I am, of course I followed friends when that opportunity presented itself.  And almost as if to secure my decision, I had some serious money-for-school problems arise by the summer of 2002, so it was a bonus that I had grandparents and extended family that lived in the area.  But the decision to attend SIUC began with my friends going there, and the fact that I still got to go despite money issues was secured by the fact that I had family in the area.

Well, the fall semester started and–skipping the gory details–my friendships with Aaron and Skip didn’t survive the first few weeks of school.  It was a tumultuous four months with lots of uncertainty, but by the start of 2003’s spring semester, I was fully involved with a local church in Carbondale now called “The Vine.”

By the next fall semester (2003, still), I had met a pretty young lady at my church named Dona, and we started becoming friends.  Right about the same time, the lead pastor of The Vine announced he felt led by God to start a new church in Seattle.  By the following summer, he and a large team of people from our church moved to the Pacific Northwest.  Not quite a year-and-a-half later, I felt strong conviction from God to move to Seattle, too, and be part of that church.  As I said before, I moved on August 2, 2006 and arrived two days later.

Nearly eleven months later, under almost entirely independent circumstances, Dona also moved to Seattle.  We were engaged just over a year later and married two-and-a-half months after that.

So there you have it.  I am married to the woman of my dreams and live in Seattle all because I was lazy one summer day at a community college and didn’t feel like finishing an un-timed test.  Had I finished that test honestly, I guarantee I would have passed it.  I knew algebra well enough to get most of the questions right, and those placement tests weren’t designed to be extremely harsh.  Therefore I wouldn’t have had any reason to get to know (the first) Aaron any better, and wouldn’t have met Skippy under the same circumstances, if at all, and wouldn’t have been close enough to Skip to be the first person he called when he wanted to start a band.  Therefore Skip and (the second) Aaron would very likely not have met and wouldn’t have been roommates at SIU, an arrangement that provided encouragement for me to follow them there.  Had I not ended up in Carbondale, I would not have started attending The Vine, thus not being around for, knowing of, or even caring about the church plant in Seattle.  And, what I submit to be most important of all: I would never have met my wife.  Funny how that works.

Oh, sure, you could argue that all of that stuff would  have happened one way or another because it was God’s plan, and I would agree . . . but it’s fun to look at it this way, and to make a case about laziness having a positive impact on someone’s life.