I’m proud of modern gaming and its ability to tell real stories. This is why I feel it is important to approach these tales on their own terms: to settle yourself within the shell created by the story’s plot and look at the point it demonstrates within.

This “Part II” post is in truth the third part of a sequence that began here talking mostly about Star Wars. Despite appearances, not everything in life is about blowing up the Death Star. There are lessons to be learned: per my abuse of Shakespeare’s quote on brevity, the “point” is the soul of the story, and the “plot,” genre, setting, and million-dollar-budget special effects are the limbs and outward flourishes. Getting stuck on blowing up the Death Star is foolish when an important lesson about family is waiting for you.

This is particularly relevant to videogames, where, among other things, the player might personally be trying to blow things up. Videogames are more than just entertainment (games are culture), and the example from my last post, Paul Gresty and The Frankenstein Wars, was the “Part I” to set up today’s “Part II.” It was a story about second-class citizens (stitched-together Frankensteinian soldiers), with the added hook that you the first-class general could become one of them. This should be chilling.

Let’s start a bit earlier than that.

I used to think it a shame we couldn’t program things like “personality” on top of Doom. These games were little more than straightforward violence when they came out. However, as I’ve mentioned before, this changed when technology advanced and games like Looking Glass Studios’ Thief (fan site here) became possible. As Thief’s creators discussed in their “making-of” material, they had many ways of transmitting little packets of information to the player, like written notes and overheard conversations. These weren’t possible in a Doom world of grunting monsters and painted-texture walls.

I was proud of stealth games. Years later, I came to be proud of horror games: not only could a game tell a story, it could tell a story that affected you. I developed a larger library of horror games than I would have expected, given that I made it all the way through the ’80s and ’90s without owning a single horror movie. And games have only advanced since then.

So they can affect you, but what does this say about conveying real meaning? If Star Wars has messages for us, what about the next super-horrifying first-person shooter? I like this section of an interview with Ken Levine from the development of Bioshock, which can be found within the Cult of Rapture’s podcast section:

“I always think there are three ways that gamers deal with games that have a lot of story in them. There’s the way that a lot of gamers deal with them, which is like ‘Dude, where’s the next thing I’m gonna shoot, you know?’ And that’s cool and we totally support that, you know, people want to play the game and, like ‘Oh, Ryan, he’s the bad guy I guess, he’s this, you know, dude.’
“Then you have the people who want to listen to what he says, and get the vibe of what he says, and say ‘Oh, he’s trying to kill me because X, Y, and Z’ where the first type of gamer would just be like ‘Oh, this dude’s my enemy and I want to kill him. And he wants to kill me.’
“And then you have that third level kind of gamer who really wants to find every audio diary, and analyse every ghost sequence, and look at every scene in the world, visual scene, read every poster, read all the story text that comes (we have a whole encyclopedia as it were in the game of all the objects), think about all the connections between all the characters. And we really support all those levels. There’s a huge amount of depth of story, an almost novelistic depth of story in the game if you really want to get into it. But also if you just want to get in there and shoot, you know, that’s cool too, but I think especially the people listening to this podcast might be interested in some more of the deeper aspects of the story, and it’s there in spades.”

That’s one thing. Notice how well it overlaps with Jonathan Blow’s speech titled “Design Reboot” (which I keep linking with the lovely animation of choice quotes by Superbrothers):

“Why do people play games? . . .
“1. Games can provide entertainment/fantasy/escapism. . . .
“2. Meaningful artistic expression. . . .
“3. A means of exploring the universe.”

Ken Levine’s first level of gamer deals only with Jonathan Blow’s first point of gaming. The first level of gamer needs nothing other than Doom, needs to do nothing more than blow up the Death Star.

Which is not as good a thing as Ken Levine implies.

I’ve argued before about morality in game design. Jonathan Blow is also known for saying:

“When millions of people buy our game, we are pumping a (mental) substance into the (mental) environment. This is a public mental health issue. We have the power to shape humanity. How will we use it?”

Notice what happens when we go above Ken Levine’s first level, up to where “people want to listen to what [the villain] says”:

Luke Skywalker listens to the Emperor on the Death Star and you understand the conflict now: in truth, it comes down to how your family problems are losing you all of your friends. The player in Bioshock listens to Andrew Ryan and you understand the interplay of power, corruption, freedom, and tyranny. (Jonathan Blow criticized the latter’s success, but at least they tried.) In The Frankenstein Wars, the “villain” is the sibling: care to understand your sibling any?

It makes it all the more clear why we need games like The Frankenstein Wars: the player of Doom kills hordes of enemy soldiers, but have you ever considered being the one to send soldiers to their death? Is it a justifiable cause that makes you do it, or is it pride?

What about being sent to war yourself? What about becoming the expendable grunt that will be slaughtered with hyper-realistic 3-D-modeled armament?

What if life and death mattered to you?

Which brings me to a final point in gaming that makes me proud. One would think that big-budget explosions and hyper-realistic weaponry meant the death of real storytelling; obviously we would all be distracted by the explosions. But in today’s world of gaming the storytellers continue to make the push. There were stealth games, there was the horror genre, and now . . . there’s the “walking simulator.”

This term was coined for Dear Esther, by thechineseroom. By labeling it with the joke-that’s-also-serious of “walking simulator,” it’s made clear that there’s nothing to do in the ordinary sense: the entire experience is immersion. You absolutely and without a doubt cannot play these games and expect surface-level escapist excitement to sustain you.

So gaming is getting us about as far as it can from Ken Levine’s first level of gamer; or, at least, the independent game developers are. And even Bioshock had a real point to convey, even if it had its flaws.

If Star Wars still speaks to us beneath the special effects, it’s good that videogames — which have become one of the biggest sources of media for public consumption these days — are just as capable in their own ways. The viewers just need to understand that the story is there at all, that when they get beyond their opinions about the outer shell — the plot — they might gather an important point that makes their life all the better.


I said before — no, wait, I said it over and over — that there is a distinction between the point and the plot. The novel you are writing may have a surface structure with a complex plot, but underneath it you still have some point you’re trying to get across. Or, perhaps, one you do get across, whether you’re astute enough an author to manage it or not.

The same logic applies to giant pop-culture stories like Star Wars: there are familiar, yet important, lessons about family to be learned beneath all the blaster fire. Goodness, the same logic applies to videogames. You may not like faerie tales, or big special effects movie extravaganzas, or big special effects videogame extravaganzas, but there’s still a point underneath that you could learn.

But why do it this way? I argue that it’s important to approach a tale on its own terms: to settle yourself within the shell created by the story’s plot and look at the point it makes within.

You’ve been in this situation: you’re enthusiastic about a story you just read/watched/heard, you want to tell somebody else, and the person you’re telling shoots you down with “I didn’t like it; it’s just about [insert quality here],” such as “it’s just about stupid dragons,” or “it’s just about the special effects,” or “it’s just about blowing up the Death Star.” Here your associate is getting hung up on the surface. And you try, goodness but you try: “It’s not about that! Once you get into it, you see it’s about so much more!”

Well, the issue is that it has all these layers; it has both a plot and a point. So, why place all this complex structure on top of the point? Why build a Death Star? Why not . . . make literally every movie into a realistic depiction of modern people in your hometown having family trouble like the Skywalkers do?

It should be obvious from that alone why we need the abstraction.

But here I’d like to reference the words of Paul Gresty. Followers of my blog know about my work on Kickstarter projects. Paul Gresty is the author of Arcana Agency: The Thief of Memories, and, most recently, I’ve been working with him on Fabled Lands: The Serpent King’s Domain.

He has other projects including The Frankenstein Wars. You can guess “what this one’s about.” Obviously, as the creators put in their tagline, it’s about:

. . . war and horror, heroes and villains, and the soul of humanity at stake!

But I also like to keep quoting Neil Gaiman, who once put in The Sandman:

“Never trust the storyteller. Only trust the story.”

What’s hiding in the story? Regrettably, neither Paul nor I can find the exact quote anymore, but Paul once described The Frankenstein Wars like this:

[You ask the implications of a world where soldiers are stitched together from corpses. Soldiers become second-class citizens, eminently expendable, and so treated as “lesser.” However, the terror has another layer: with a single bullet, you could join them. You too could be the second-class citizen.]

There you have it.

By stepping into the story world — which is to say, literally any world other than a realistic depiction of modern people in your hometown — you can explore ideas beyond your expectations. If you in the real world are a “first-class person all the way,” how could you possibly be a second-class citizen? Why would you care? Goodness, why would you read a realistic depiction of real-world second-class citizens? Too depressing — or, if you are an egotist, too irrelevant.

But then you read The Frankenstein Wars and you are forced to think about it from a perspective you’ve never known.

Let us pause for a moment to appreciate the reason all fantastic literature in all of history exists.

No, really. By this mark we have justified all fiction, all theater, all campfire stories, all make-believe, if only you listened closely enough to find a point. The theatrics enable you to learn life lessons and develop empathy in situations that would be impossible if you restricted yourself to your existing down-to-earth life.

Hence I encourage approaching a tale on its own terms.

This applies to other things, too.

I’ve said before that I bought Unreal Tournament for the story. It sounds impossible, since surely all gaming is about the flashy special effects, and first-person shooters even more so. Not so; and I just might have another entire post to make about how proud I am of today’s gaming and its ability to express story.

Which I believe will be my next post.

When you approach a story, such as Star Wars in my post title or anything else, the common way to understand it is “this story is about war” or “runaway technology” or “dragons” or “school competitions” or “piracy on the high seas with historically inappropriate accents.” As that, it can be compelling. It can compel you never to look at the story again, if, e.g., you hate pirate stories.

This is a darn shame.

To a large extent, it is because the reader/viewer/listener missed the point. Star Wars is a helpful example because it is so well-known and people typically have a clear like-or-dislike attitude toward it. It’s difficult to avoid forming an opinion when culture shoves it down your throat every day . . .

But thanks to that suffusion, the ideas in it are everywhere. You know what the “Death Star” is. Some real-world people claim to be “Jedi.” And all these people seem to forgive that the movies were fallible creations like any other; they just wanna play with a light saber woo!!!

That’s all the surface.

Neil Gaiman had a helpful observation:

“Fairy tales are more than true: not because they tell us that dragons exist, but because they tell us that dragons can be beaten.”

The Death Star is your “dragon.” Or maybe Governor Tarkin is, who knows. Is anybody surprised — anybody — when the hero(es) win(s) and the Death Star is destroyed? Of course not. So why would anybody — anybody — read another fairy tale, or watch another space-fairy tale like Star Wars, when you know how it’s going to end?

Because there is something else behind it. How did you get to that end?

I have repeated over and over across these posts, starting when first I observed Sir Terry Pratchett’s words on it, that there is a distinction between the point and the plot. Sir Terry mentioned how fans tried to give him ideas for stories. “You should do a pirate story!” they’d say. “Alright,” Sir Terry would reply, “what would it be about?”

He couldn’t just do a pirate story. Saying “pirates!” would tell you the plot, but what was the point? What would the life of piracy reveal? What moral, message, feeling, would you the reader take away in the end?

It’s as with those quotes from Jonathan Blow I referenced when discussing games as culture:

“Why do people [enjoy stories]? We already know one of the answers is pretty obvious.
“1. [Stories] can provide entertainment/fantasy/escapism. . . . But if this is all that [stories] were, I would be intensely dissatisfied. Because fantasy and escapism is not fulfilling to me. At the end of the day, I want to feel like my life has meaning.”

Sir Terry never just did a pirate story, as there was no point. He couldn’t just write escapism.

I like the structure of Shakespeare’s quote about brevity, so let me make this overlong. The “point” is the soul of the story, and the “plot,” genre, setting, and million-dollar-budget special effects are the limbs and outward flourishes.

Blowing up the Death Star is essential to the plot. As is beating the dragon. How do you, the hero of your own story, get there? By believing in yourself; letting go; reaching out with your feelings. Funny how that lesson sounds like it can be applied elsewhere. Where else?

Well . . .

Star Wars movies continued beyond just the first one, and a casual plot point about Luke Skywalker’s father became more involved. Seriously involved. An entire prequel trilogy addressed the Skywalker family ancestry, and these days a new sequel trilogy is being released to follow them further. Not to mention Rogue One (Optional Subtitle) A Star Wars Story. And they keep blowing up the Death Star. How does this work?

In a sense, every Star Wars movie is about blowing up the Death Star (or its equivalent). But that’s not what it’s about, i.e., that’s not the point. Really, every Star Wars movie is about family.

In the original trilogy, reconciliation with the lost parent becomes key to the Skywalkers. Lo and behold, a new Death Star shows up just in time to be the one thing keeping them apart. Conveniently, the most personal heart-to-heart conversation is held while standing in the thing — this giant unfeeling monstrosity that is “destroying all your friends.” Reconciliation saves the relationship (all of them!), and, as a pleasant bonus, halts the hero’s own journey to “the dark side.” Pretty good deal when we thought we were talking about space stations.

In Rogue One, yet again, the story is about the loss of a parent. What’s getting in the way? The Death Star. What’s it doing there? Well, it’s kinda the parent’s fault. Sorry about that. “But I was doing it all to protect you!”

Those words . . . huh. I thought we were talking about space stations.

Plot, genre, setting, and multi-million-dollar-budget special effects? Not nearly as important as the people in the audience noticing those key words and remembering them when somebody off the screen says them. Somebody like themselves.

So Star Wars is about believing in yourself to surmount your obstacles; about family in all its conflict; heck, it’s about faith, too, and we could talk a lot about world religion in its relationship to the Jedi. And therefore someone could ask a reasonable question:

“Why didn’t you SAY SO?”

Why all the fantasy? Why all the special effects? Why didn’t you “just” make a story about those things that you’re saying are oh-so-important?

I will address this . . . in my next post.

In my last post, I started a discussion that had previously gone undiscussed. Sadly, to conclude it I must make a post rife with opinions and preferences. I created this blog to explore storytelling and get out my creative thoughts, and thus it might surprise some that I complain about the present post for being “not concrete,” but that is the difference between building a story with a solid framework and arguing whether an actor did “a good job.”

One could say de gustibus non disputandum est, but that’s just too bad when all forms of entertainment are culture. And Star Wars is certainly culture.

Star Wars is a surprisingly divisive topic for such a beloved part of popular entertainment. But perhaps this ties well with my point: I argue that the Star Wars prequel trilogy was made with the same standards as, and with great fidelity to, the original Star Wars trilogy. It’s just that the reception of a story depends also on the audience. Many have words to say on this topic, from Avner the Eccentric to William Shakespeare:

“The kinder we, to give them thanks for nothing. / Our sport shall be to take what they mistake: / And what poor duty cannot do, noble respect / Takes it in might, not merit.”

Many felt disappointed by the then-new Star Wars prequels. In a time gone by, when I had only seen the first prequel, I was “not allowed to speak” in the grander conversation about the trilogies and did not get to explain these take/mistake matters to my associates. Surely the movies made mistakes: when watching the original trilogy a lifetime prior, I, as a child, simply didn’t notice mistakes in the originals. But then, I didn’t understand the movies anyway. Which shapes my segue back into the essay . . .

The legacy, the attempted legacy, and my grasp of the legacy

Consider how the six Star Wars movies have been received by children. Over the years, when I saw discussion on the internet about “my favorite movies” and “my children’s favorite movies,” there would be notes about “Well, my KIDS seem to like ALL the Star Wars movies, even the prequels.”

People said that the new trilogy was flawed, but, as I argued, the original trilogy was as flawed. The plot was as weak: things happened . . . just because they happened. You had to be charitable to overlook the mistakes and enjoy all the fun adventure. Charitable like a child. There are nonetheless some rare individuals who dislike the original trilogy, and I’d bet they just had different standards for what charity they’d give.

My conclusion was that people were feeling equally uncharitable after all the other stuff George Lucas had done. Remember that he had revised the original trilogy around then, which was fine when it came to certain visuals, but, well . . . the question of whether Han Solo or Greedo shot first has been so polarizing that it has been taken as a defining feature of George Lucas’s betrayal.

And then . . . the midi-chlorians.

This was one of the first and biggest complaints about the new prequels. In that era, I still hadn’t seen the other movies, so I still “couldn’t speak” to people who had formed such strong opinions. But I had my guesses as to what happened.

When George Lucas told us about midi-chlorians in The Phantom Menace, people felt betrayed: “How could you explain away the Force?” Well, I doubt he was trying to. A generation ago, “the Force” spoke to our spiritual needs, and every religion in the world pointed to Star Wars and said “Look! Look! That’s how our religion works!” Since then . . . we have become obsessed with DNA.

What if he was trying to duplicate the reception of the original trilogy? What if he was trying to ride the wave of public sentiment and help us enjoy the Force MORE? Sure, yes, I agree that it didn’t work: we continue to want something spiritual that cannot be explained by the physical. But how could he know that a little insertion of science WOULDN’T go over well in a society that now revered science?

So these thoughts were bouncing through my head for years. YEARS. What truly happened in the other two prequels? Opinion seemed to be that the second one was pathetic, and the third was only interesting in that it was “darker.” Still, I wasn’t “allowed to speak” until I saw them.

In truth, what would I really have to say if I didn’t have specific examples to present from the whole prequel trilogy?

Time to watch the prequels.

So I watched the prequels

The Phantom Menace is still pretty bad and I don’t think I can bring myself to watch it again. The problem is that I don’t seem to care: the movie did not have content to engage me. Next up:

Star Wars: Episode II – Attack of the Clones. I am stunned. Here the trainwreck of negativity must come to an end: I ask the public why people complained about this movie. Is it the title? Let’s not forget that this was the same series that gave us “the Death Star.” And remember when Harrison Ford was in this other movie about “the Temple of Doom”? That’s right: this era gave us names that make us wince today. Now that we understand we are here to be charitable, to work WITH the movie, shall we see what happens?

The prequel starts off with a bang. No, I’m not making a joke about the explosive used for the assassination attempt: I’m talking about the second assassination attempt immediately thereafter. The scene engaged me, drew me in. It was only near the end that I realized I was watching a Star Wars-style re-enactment of a classic samurai/ninja scene: there was the loyal samurai (Jedi) protecting the sleeping noble lady by slicing the venomous creature away from her bedside. In the dark. Without hitting her.

And then Obi-Wan Kenobi jumped straight through Venetian blinds to catch a droid in midair. “Oh, that’s right: Jedi are amazing.”

It kept going. It also kept making direct parallels to the original trilogy. Do you remember Princess Leia in Jabba’s barge killing her captor with the very chains that bound her? Yeah, Padme got to fight her executioner with the very chains that bound her. And it wasn’t just a blind repetition of the original, but a new event that fit within the scene, not looking out of place.

What was the problem? Apparently, one complaint was that Anakin and Padme had unrealistic interaction. I disagree. I feel that Anakin’s presentation was of somebody struggling with the Dark Side. My only complaint is that, when Padme said to stop looking at her like that, he should have been shamefaced: we have quite enough presentations of relationships as harmful to women (remember: Twilight) that we don’t need more casual disregard.

And if I were to complain about any acting, then it would be Christopher Lee’s. I’m led to understand that he is a movie legend, but in this movie? Not so much, despite being given the opportunity to do both a Darth Vader impression AND an Emperor Palpatine impression (yet more efforts to make the prequels parallel the originals). When he fired Force lightning, did his hand even shake? It’s like he was depending on the special effects to make him look good. I absolutely did not believe that he had the power.

On the other hand, Palpatine’s performance was suitably chilling. All he had to do was put up the hood and he was the Emperor (-to-be). I hope it was as much fun to reprise the role after all these years as it looked.

You don’t have to like Anakin and Padme’s acting; just as I don’t have to like Count Dooku’s. But this movie actually ENGAGED me, giving me reason to be charitable where it was weak. In other words, it was a normal movie, and I’d hope that we can stop whining now. What else?

Right, there’s another one

I’m writing this immediately after watching Star Wars: Episode III – Revenge of the Sith. Yes, yes, per all the reviews, “it’s darker.” However, these words seem to have been said in an effort to make up for how “we all know that George Lucas is terrible at making movies and just had a fluke with how good the originals were.” Again, I have to say that’s not so: he’s ALWAYS been this bad/good at making movies.

And, again, the sheer number of parallels to the original trilogy is stunning. They even got John Williams to rip off more classical music for them. Antonín Dvořák’s New World Symphony? I recognized it; did you?

Lastly, it seems very likely that George Lucas reacted to the backlash against The Phantom Menace by minimizing the parts that didn’t go too well. When I heard reference to the midi-chlorians again toward the end, I realized that the whole of the two final prequels had been cleansed of the things. Further, the idea of the Dark Side being able to prevent death . . . and invoking the biology-based midi-chlorians to do so . . . well, follow me here. This is good storytelling:

For one part, we have a balance in the narrative: the Jedi are unaware that the Sith may have the power to cheat death, but the Sith are unaware that the Jedi may have “Force ghosts” (Palpatine expected Yoda to leave a corpse). For another part, the Sith may have biology, but the Jedi have spirituality. That is, after we were disappointed by the arrival of midi-chlorians in the first prequel, only the Sith came to care about them in the later prequels: the Jedi spoke only of the spiritual matters that we in the audience wanted in the first place.

Of course, now I just want to see Sith-powered “Force zombies.” But I’m getting ahead of myself.

In any case

I come back right where I began: it’s fairly obvious to an adult re-watching the original Star Wars trilogy that these movies are flawed, but we in the audience are charitable and actively make up for the flaws. Now that I’ve watched the remaining prequels, years after displeasure with George Lucas has broken from its fever pitch, I’d argue that they are fine. And flawed. And incredibly faithful to the original trilogy. And I like them.

Now we are on the brink of a new Star Wars movie (or more). Once again, we have preview material showing a menacing character with a modified light saber. Light quillons? Awesome, I can’t wait to see that in action. Other people seem cautiously optimistic as well.

I hope it turns out to be a good movie; and, particularly, because it will have a number of repeat actors, we can expect it to have a certain fidelity to the original. Perhaps with George Lucas holding less of a prominent role, it will even have fewer “blunders” and “betrayals.” But if, once it is released, it nonetheless does something weak or foolish, I ask that the audience keep a little perspective. Please remember that we’re talking about Star Wars here. Who’s more foolish, the fool . . . or the fool who STILL buys overpriced tickets and waits in long lines because “wow, it’s Star Wars”?

P.S.: Now that Star Wars: Episode VII – The Force Awakens has been released upon the galaxy, I am pleased to see that it meets all my hopes and expectations as outlined herein. It is a “perfect Star Wars movie.”

This is an essay perhaps 10 years in coming. Not so much about gaming as about storytelling in general. It turns out that there are many factors in the reception of a story: we in the audience might think of ourselves as objective viewers, but no, we too are part of the experience. Our expectations and understanding and more. Or as Avner the Eccentric said:

“You thought you could just come and sit and be the Broadway audience. No. You’re the audience, and you’ve got work to do.”

I’m prompted to speak because a new Star Wars movie is coming soon. Are you excited? Remember, though, that people were excited about the prequel movies, and that turned out a bit complicated. So here today’s essay begins.

It all started in conversation with friends somewhere 10 years ago. I had watched the first of the new Star Wars prequels — excuse me, I had watched Star Wars: Episode I – The Phantom Menace. But I hadn’t watched any of the others. General public sentiment was that the prequels were terrible, and, going on The Phantom Menace alone, I had to say I was disappointed. Not disastrously, though, and I had hopes for the rest: after all, it was obvious the effort George Lucas had put into making that first prequel match the original trilogy in style.

So my view was that the two trilogies were basically the same sort of movie. I spoke with the group about that. Then it happened:

“Have you seen the other two?” “Well, no.” “Then don’t talk about it until you have.”

Of course. What was I thinking? It is literally impossible to have an opinion as an outsider. It doesn’t matter if you’ve seen the advertisements, watched the “making of” material, read reviews from informed experts, and followed conversations from everyday people, you can neither think nor speak of a creative work unless you have absorbed it from start to finish.

You know, like Twilight.

. . . People, this is why the Vampirely blog exists. You don’t have to roll around in poison ivy to get the impression that it is poisonous. And once you’ve torn yourself away from horrified fascination at that blog (do check it out if you haven’t), you may recognize this sort of statement as part of a larger double standard: “You must be an insider/outsider to be ALLOWED to speak,” with either format used depending on who is speaking. I could go on yet another essay about THAT.

But I want to have the conversation that I missed 10 years ago. Because hey, guess what just happened? That’s right. I WATCHED THE OTHER TWO PREQUELS.

Thesis: The Star Wars prequel trilogy was made with the same standards as, and with great fidelity to, the original Star Wars trilogy

I’m not trying to convince you, unknown reader, that the prequels were any good: just that, if you think they were flawed, be advised the original trilogy was just as crippled with flaws. And I’m not trying to convince you that your beloved original trilogy was any bad: just that, if you think it was wondrous, be advised that the prequels were filled with as many wonders.

Follow me on a journey of discovery.

The essayist in the days before the prequels

I loved the original Star Wars movies. Ta da! I’m already on your side, aren’t I?

Except there’s no guarantee that you love the original trilogy. There are quite a few people who believe they were terrible. How can this be? Simple: people have different tastes, different preferences, and different mistakes they will forgive.

Have you ever seen, read, or performed in A Midsummer Night’s Dream? The play ends with a remarkable play-within-a-play, a device for which William Shakespeare was rather famous, but one he put to surprising use here. He explicitly taught his audience the “right way” to deal with a play they didn’t like: to make it better in their head. To make excuses for the mistakes and help tell the story.

I was to find this important shortly.

On the arrival of The Phantom Menace

As with most people (except those who disliked the original trilogy; see above), I was excited to hear that George Lucas was going to fill out the trilogy of trilogies. And, wow! Look at that preview material! A light saber quarterstaff? Creative!

Somewhere around the release, I enjoyed a “making of” feature that they showed on TV, and it too was surprising. I was impressed — deeply impressed — with how dedicated they were to fidelity to the original trilogy. And there was a line about one specific detail of the movie that I only half-heard, and I’d really like to remember it better: something about “capturing the sneer.” I will return to this in a moment.

What did I think of the movie?

Wow, those computer animated characters were annoying. Yoda was so much more expressive as a puppet than as a 3-D model. There were stupid jokes where they didn’t belong and our beloved droids were just comic relief.

In the plot, it seemed things happened . . . just because they happened. I couldn’t feel like anything important was going on. And then, however much importance was placed on the light saber quarterstaff in the previews, they killed its wielder and got rid of the element I liked! Hopefully they had plans to make things more interesting going forward, because they didn’t have much left going for them.

Still . . .

There was something about “the sneer.” Watto, that slaver . . . it looked like he had a sneer scanned straight from the face of the bartender of the Mos Eisley Cantina. Did he? I can’t find any evidence online that this was a fact, but it stuck with me. There were other similarities, too.


Time to re-watch the originals.

On the realization that the original trilogy was not given to us from the heavens

I enjoyed the original trilogy, but of course I had seen it most when I was a child. Eventually I got around to re-watching those three movies.

It was remarkable. For one, I was reminded that the original had jokes throughout. As a child, I perhaps had taken it too seriously, just as many children from a slightly earlier generation mistakenly thought that Batman and Get Smart were all serious.

I also saw more parallels than I ever realized. There was even a bit of an embarrassing moment for me. At the end of Star Wars: Episode VI – Return of the Jedi, when Darth Vader had his helmet removed, I found myself at a loss for words and exclaimed “It’s Anakin!” Well, yes, of course it was Anakin: but what I meant was that I felt an instant visual connection between that actor and the young child that George Lucas carefully found to represent him in The Phantom Menace.

I also saw the problems.

. . . Wow, but the original movies were flawed. Did you realize that? Yes, you probably did . . . unless you took William Shakespeare’s advice and made up for their failures yourself.


You are in a snowspeeder on Hoth. You want to shoot down an Imperial Walker. What do you do?

If you’re a fan, then I’m sure you’ll immediately have an answer like this: “Their armor’s too powerful to get through, so first you have to topple them, sort of stretch out their neck. When you do that, the strained neck is a weak point where you can blast through their armor.”

Great. Then it’s a darned shame this “neck” stuff was never explained in the movie, isn’t it?

Seriously, go watch Star Wars: Episode V – The Empire Strikes Back. Do it. All you hear is somebody exclaim how their armor’s too powerful to get through, and then, a few moments later, someone blows up an Imperial Walker. If you blink at the wrong moment, you won’t even realize that the shot is fired at the stretched-out neck. YOU in the AUDIENCE have to piece together the explanation and make up for the flaws in the original. Or, as in the case for many of us, we small children never understood the plot anyway and we had to ask our parents why things happened.

Things happened . . . just because they happened.

Or how about content that not even fans can defend? Take Luke Skywalker’s training in Star Wars: Episode IV – A New Hope. You’ve heard this happen: the nervous laugh in the audience. Okay, so, he lowers the blast shield on his helmet and blocks the laser shots, but Mark Hamill’s acting is lacking. He just sort of wiggles the light saber prop around and then pops back up on his heels like a little child re-enacting the same scene. You simply cannot believe that he has learned anything about the Force, and, consistently, I’ve heard a disbelieving laugh from any audience with whom I shared the experience.

So where does this leave us?

At a good place for a break. I will allow a recess for you to digest the above, then it will be the return of the author in my next post. After all, there are a few more movies to consider together, and perhaps you need a moment to re-watch them before a new one comes out . . .

I love storytelling. I feel that other people should care about storytelling, too, because it is so very important to us: it is a fundamental substrate of human existence. A lesson I learned from Neil Gaiman’s The Sandman is that a story is a metaphor for life and life is a metaphor for a story. All our forms of entertainment, including the modern invention of videogames, are culture, and “culture” is just shorthand for (among other things) “who we are, what we do, and what we enjoy.” Stepping sideways into music, there are always the words of Amanda Palmer in her Ukulele Anthem:

You may think my approach is simple-minded and naïve
Like if you want to change the world then why not quit and feed the hungry
But people for millennia have needed music to survive
And that is why I promised John [Lennon] that I will not feel guilty

This sort of reasoning contributes to my general enjoyment of all sorts of games regardless of format. (Though I am aware that each medium has its own qualities that may be employed to good result in the artistic creation. But such brings its own discussion.) For years, when broaching this topic with people I would refer them to my videogame collection: most everything I owned was there because of the storytelling. This included an oddity or two . . . which will come in two paragraphs as the main reason I am posting today.

But before I get there, first let me note the timeframe of this collection. Around the turn of the century/millennium, videogame technology had advanced to the point where real storytelling was possible. I took particular note of Looking Glass Studios’ Thief (fan site here). Years prior, in seeing early first-person games and all their straightforward violence (see Wolfenstein 3D and Doom), I’d imagined the development of a game where enemies had personality, a real life. Your actions might be violent in the end, such as assassination, but this hypothetical game would have computer-controlled characters do such things as sleep, talk, and get angry. They’d HAVE background, instead of BEING background. Then Thief came along and did exactly that (minus much assassination).

The feeling that the world is not “just background,” but that it is alive and filled with living, breathing people, is what many gamers such as renowned author Sir Terry Pratchett enjoyed in the Thief series. I agreed. Thus it was that I bought Unreal Tournament for the story.

It sounds impossible. This is the first-person shooter that made “frag” into a (gaming-) household word. And to this day I have never met another human being who realized that Unreal Tournament HAS a story at all. But I did. Why? And how? Simple: I was the only human being I knew who bothered to read the character backgrounds presented before each match. Thus I saw that the world of Unreal Tournament is one filled with living, breathing people; one where the enemies have personality, a real life. There is even a little mystery about who and what the final enemy of the game is supposed to be. I liked this, and I felt that the background enriched my experience as I played through the high-quality first-person frag fest.

Ken Levine, during the development of the game Bioshock (which is very violent but also has extensive story), discussed how the goal was to ensure the game worked on three levels. On one, the story could be ignored beyond “okay, so, that’s the boss” and it would be a good action game for people who wanted it. On another, the story would be integrated well enough that gamers could observe “oh, I see what motivates these people” intermixed with the gameplay. Then on yet another, of course, the story would be there for people to devour in its entirety, poring over each log and line to understand the world.

The fact that a batch of “mindless enemies” can be so interesting leads me to now, where I’ve decided to run with this and develop a story world (the same thing as a game world) based around fighter background information. Part brainstorming, part game design, and part just having fun as always. And I will do it in my next post.

Variety is – Part II

August 9, 2014

As I explained, variety is the fundamental substrate of human experience. There is nothing to discuss if there is no variety. Adding the right amount of variety and “visual interest” to a piece of art can make it better. If videogames are art (a part of culture), surely variety has some sort of role in game design.

For today, I had planned to describe its importance across a sample of games, but then wound up with a post mostly about Thief. I’m not talking about Thief’s more recently-released reboot of the series; no, I’m talking about the original games that left such an impression on gamers that they continue to rank highly in “best of all time” lists, and prompted the recent reboot. The original Thief, and then Thief II, are excellent examples.

But let me start with two others. One day I was interested to see gamers comparing Hellgate: London and Diablo II. The Diablo series made an impact on gaming history, and the first sequel arguably improved on many aspects of the original. When Hellgate: London was presented, it was as an amazing new game from some of the original Diablo creators. Surely it should have been a competitor or even replacement for Diablo II. Why wasn’t it?

The comparison looked to variety, and its sinister counterpart “boredom.” In Hellgate: London everything was gray. Gray concrete, gray pavement. Travel from one place to another and you could hardly tell anything had happened. But in Diablo II there was always something of interest. Across the four regions of the game you went from green and brown, to brilliant yellow, to dark green, blue, and gray, to brilliant red and black. Even if you didn’t play the game and simply saw it on someone else’s computer screen, things were always changing.

So in that light let me talk about Thief and Thief II. (Fan site here.)

When it was being developed and released, the original Thief offered two demos, both of which allowed you to do a little housebreaking. My long-term gaming consultant, my brother, played these with me and in the end we knew we wanted the full version. Then we got it and I, for one, was stunned: every single level introduced something new, sometimes drastically new, extending far beyond housebreaking.

My brother and I came to divide it into “thievery” levels, with normal housebreaking, and “Indiana Jones-style” levels, where you found yourself leaping around and mantling surfaces in a 3-D dungeon avoiding traps and nightmares. So, taking just the first four levels, there came “thievery / Indiana Jones / Indiana Jones / thievery” with each teaching the player new game mechanics. This is important: a progression of content matters not just for the sake of interest, but for player learning. The beloved game Portal is studied — yes, studied — for its ability to guide the player through learning tasks in an engaging and enjoyable way, introducing new content at the right speed and with the right tools for understanding.

Even when Thief got back to “normal thievery” in level 4, it struck a high point of artistry with a complex and memorable cityscape — that is, variety in sights, sounds, and setup. (And if you really want variety, I haven’t even mentioned the Escher level yet.) I adored all of this and I hold up Thief as the prime example of variety in a videogame.

Enough so that it could be too much. Note that the levels 1 and 4 I just described . . . were the demo levels. Between the two in the full version is a system shock (mildly-punful joke intended) of dungeon-diving. My brother observed it’s a good thing he played the demos first, as the knowledge that “more normal thievery levels are coming” is the only thing that got him through having to completely recalibrate his expectations for 2 and 3. Nowhere in all the advertising for Thief did the developers prepare the audience for such variety.

Still, I loved it. When a demo for Thief II became available almost immediately afterward, I played that too. I personally wasn’t impressed by the new mechanics so I didn’t push to play the full version. But then I came to see people listing Thief II, not Thief, as their pick for “best of all time.” An endorsement by Sir Terry Pratchett of all people (a fellow who enjoys playing videogames while writing) finally compelled me to try Thief II.

Now . . .

I could dive into the level design in Thief II. An entire level of picking up tiny coins two by two? Waste of the player’s time.

But I was struck by something else: the developers eliminated variety. Instead of engaging in “normal thievery” and “Indiana Jones” in alternation, you go from levels 1 through 8 robbing from the same human guards in the same geometric buildings over and over again, with a little variation as the Mechanists increase in prominence. Any monsters are just for flavor, entire mechanics (“holy water”) are absent, and even new mechanics (“secrets”) are largely ignored. Despite the standard fantasy concept of “trapped treasures,” for almost the entire game you need not worry about traps. You almost never mantle a wall or climb a rope. You almost never see magic. You never see the light of the sun, period.

Even as Thief II attempted to mimic the original with an explorable “cityscape,” the streets had none of the variety. Buildings were all the same color and architecture, regions made all the same sounds.

At the start of this essay, I discussed the “visual interest” that the successful Diablo II has while the struggling Hellgate: London does not. Doesn’t this mean that Thief II, by visuals alone, should have been a flop?

It would seem that “breaking into houses and beating up on human guards” itself scratched an itch. The popularity of Thief with Actual Killing (I mean, Assassin’s Creed) makes it obvious that people will buy such games anyway, and so variety isn’t everything. But even so, I’ve kept shaking my head and wondering about Thief II: what do people like so much?

Well, I’ve spoken with people on that — people like Sir Terry Pratchett. At the 2009 North American Discworld Convention, he discussed what Thief games give us: an immersive experience. A sensation of being in a living world, where you can look out over the city and watch the people going about their evening. As such, he and his friends hold up one level as the prime example: all Thief II players will immediately recognize level 10, the “Angelwatch” level, as the one where you cross the roofs of the vibrant city and feel life around you. This isn’t the low-variety “cityscape” I mentioned above: Angelwatch is the cityscape concept done correctly.

Okay, follow me here:

Angelwatch was the demo level released shortly after the original Thief. Full of buildings with vastly different colors and architecture. Spread with different types of humans and peppered with different monsters; not for flavor, but where they belong. As far as I could tell, it was designed when the creators were still using the principles from the original Thief, and as such exhibited more variety than any other level in Thief II.

“And so variety isn’t everything,” because apparently people-who-aren’t-me think that Thief II surpassed the original. But what people remember about it isn’t the boring picking-up-two-coins-at-a-time level: it’s the living city level, imbuing the whole experience with a vibrance it would lack otherwise.

Isn’t it funny how variety improves even the “best of all time”? Thus, yes, variety has a valuable role in the storytelling and artistry of modern games, just as it does in any other aspect of life. A lesson to remember.

Variety is – Part I

July 27, 2014

Let us take a moment to appreciate that the URL for this post shall forever be “Variety is Part I,” no dividing punctuation.

Variety is more than “the very spice of life” (per William Cowper). Variety is the fundamental substrate of human experience. And, from there, it should come as no surprise that variety in art style or movie visuals or videogame content is important for “spice.”

So let me give you a “generalization alert” here: I’m about to draw parallels between things that people already know. In this case, I’m comparing storytelling and art to the basic human experience. Can you handle such mind-boggling generalizations?

Consider vision science, i.e., sensation and perception, i.e., that part of psychology concerned with how your visual system works (among other senses, depending on focus). It’s not enough to ask “How do we see things?” because the very question makes an assumption: that “things” are what we see. It’s more accurate to say we derive the existence of “things” after more basic calculations. At the most basic level . . . we are change detectors.

Change is information. Turn your screen black for a moment and look at your blurry reflection. If you needed to summarize what you saw, how would you do it? State “There’s an inch of horizontal forehead, then two more inches, then another two-and-a-half”? No, that’s a waste of breath. More effective is to note “Here’s a line; on one side of the line it’s my skin color, while on the other side it’s my hair color,” and suddenly both hair and forehead are understood. Pick another line and you get the edge of an eye, for instance.

It is these edges, these changes from one state to another, that define what we see. Conveniently, basic eye anatomy is designed to detect edges. Look it up online or take my sensation and perception class if you need more explanation: it’s a fact of the eye that we seek and emphasize change. Not just change across space but also change across time, as, of course, the motion of an object is also part of perceiving the “thing.”
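The "change detector" idea above can be sketched in a few lines of code. This is a toy illustration, not vision science: the row of pixel intensities and the threshold are made-up values, standing in for something like the forehead/hair example.

```python
# Toy "change detection": summarize a row of pixel intensities by its
# edges (big changes between neighbors) rather than by every value.
# The sample data and threshold are invented for illustration.
row = [40, 41, 40, 200, 201, 199, 60, 61]  # e.g., skin / hair / background

def find_edges(values, threshold=50):
    """Return indices where adjacent values differ by more than threshold."""
    return [i for i in range(1, len(values))
            if abs(values[i] - values[i - 1]) > threshold]

print(find_edges(row))  # just two edges describe the whole eight-value row
```

Two numbers (the edge positions) carry nearly all the information in the row, which is the point: stating "an inch of forehead, then two more inches" is a waste of breath compared to naming where the changes are.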

Now consider the people who take advantage of the powers of vision: artists. I enjoy reading webcomics, and as I’ve taken my daily fill I’ve heard artists using the phrase “visual interest.” For all I know, it’s official art terminology as taught in schools — for all I know, it’s an arbitrary yet convenient phrase picked out of linguistics.

What does it mean? I don’t know, but it’s admired in places like Calvin & Hobbes: the comics are “interesting” because the characters don’t just stand around and talk. Not only are Calvin and Hobbes off dashing down a hill when it’s relevant to the story, but they’re walking along logs and clambering over rocks when it has nothing to do with the matter. Something happens: poses vary, camera angles vary, scenes vary, everything varies; especially in this comic, famous for varying basic panel format not to mention content.

Thus does the webcomic artist, say, speak of how a character design could use another detail here or there for visual interest, but people in many domains use this same idea. I remember stepping into a college dorm and hearing the phrase “you need to put posters on the wall or something” over and over again. The posters themselves didn’t have to be good, and, in some rooms, wow were they not; but people seemed to expect something in the blank space, here and later in life. Your porch looks better with a potted plant; the walls look better with a painting; the floor looks better with a rug; something to “break up” the flat expanse. To form an edge and make a change, else it’s all the same everywhere and therefore, by definition, nondescript.

It seems variety is just to be expected in art as in life. What about in videogames? Next time I will discuss it in game design using a few examples — with level of interest to be varied.

Culture I say

April 17, 2014

Let’s talk culture.

In college I developed a definition of “culture.” My journey toward it came from odd places: like teachers telling us students that we had to attend “cultural events.” Attendance even had grades attached to it. Here, look at this syllabus for an orientation class at the University of Maine:

“Each student is required to attend two cultural events . . . . Cultural Events may include, entertainment events, lunchtime lecture series, Art exhibition etc. Only one athletic event can be used. Turn in ticket stubs with your name on the back or a short description of the event and your personal reactions on a separate sheet of paper” [sic]

Whoa, really? Why? What’s so valuable about “cultural events” that you can justify requiring students to attend?

Well here, maybe an explanation can be found in this program description from the University of New Hampshire:

“In order to expose students to the broader constructs that frame our societal environment, as well as enhance their worldview and facilitate the acquisition of a global perspective, the McNair Program will provide access to cultural events for participants to attend. These events will include the fine arts, activities of ethnic diversity, and community/geographical events unfamiliar to McNair participants. During the academic year, participation in at least one (1) cultural event is required of all McNair students. During the summer component, all cultural events on the summer calendar are required.”

Oh, now that’s interesting. “Culture” is about “the broader constructs that frame our societal environment.” And yet we’re still talking about (per UM) “entertainment events” and “art exhibitions.” Yes, UNH also gave the example of “activities of ethnic diversity,” but pray tell: what are those? Demonstration of ethnic dance, perhaps? Workshops in making arts and crafts? All the things that make people happy or make their world more livable.

Culture is entertainment. Perhaps entertainment and art, if you feel those are separate categories.

Or at least, culture is entertainment when we speak of “being cultured.” Ask yourself: what is a cultured person? Images come to mind of an upper-class individual quoting Shakespeare. Which, come to think of it, is exactly in line with these college links I provided: once upon a time, universities existed to create “gentlemen,” the properly-cultured individuals of classical education.

But we need not look to upper-class snobs to quote Shakespeare. As you know, the average person is capable of saying that “all the world’s a stage” or complaining “lord, what fools these mortals be.” Culture, it seems, is nothing but a shared geekdom. It is the idea that you have experienced some entertainment (or art) and so have I. It is the assurance that if you ask “wherefore art thou Romeo?” then the people across the way know you’re not calling them Romeo; you’re quoting Shakespeare.

This means that videogames are culture.

Absolutely no way around this. Some people ask “Can videogames be art?” Less-presumptuous people ask “Are videogames art?” because the first question assumes they currently are not. But no one, NO ONE, questions whether videogames are entertainment.

Last time, I wrote about Jonathan Blow’s speech titled “Design Reboot” from 2007 (with the lovely animation of choice quotes by Superbrothers). Here’s some more:

“Why do people play games? We already know one of the answers is pretty obvious.
“1. Games can provide entertainment/fantasy/escapism. . . . But if this is all that games were, I would be intensely dissatisfied. Because fantasy and escapism is not fulfilling to me. At the end of the day, I want to feel like my life has meaning.
“2. Meaningful artistic expression. Coming from a different angle than other media. . . . Music doesn’t feel like a movie or a poem. In fact, if you have a song that is sad and a poem that is sad, the sadness from the poem is going to feel fundamentally different than the sadness of the song.
“3. A means of exploring the universe. . . . Games are formal systems . . . and systems like that are biased toward producing truth (or at least consistency). . . . You can think about mathematics. You start with some axioms that are defined or assumed as true and then you have some rules that you can use to combine those axioms . . . until eventually you end up with something that makes a statement that must be true that you didn’t know when you started.”

Thus we have videogames. Valve’s popular Portal series is quoted by people who share this geekdom: “The cake is a lie.” “We do what we must because we can.” “For science. You monster.” How did we get to this point? Portal is a puzzle game that fully explores its mechanics, granting the player an interesting new “means of exploring the universe” (“thinking with portals”), and then going forward logically. The gameplay engages the audience, as does the humor in the unfolding story; and, as the story proceeds, it explores the humanity (and lack thereof) of the characters in the play. I mean the plot. Thus is it both “entertainment” and “meaningful artistic expression.”

It is culture. The “cultured gamer” has played Portal. Just as you can say the word “Tetris” (no link possible, for it is ubiquitous) and the people across the way know you’re not sneezing.

And now gaming culture has been around long enough that the earliest gamers, predominantly starting from the 1980’s, are now the grown-ups raising children. Just search online and you’ll see bloggers asking when and how it’s okay to introduce children to their own personal geekery (in movies, comics, or games). The people raising the next generation, the people running and spending money on today’s businesses, are people who’ve played Tetris.

So today’s game developers are advised to remember their creation is not “just a game”: games expand our mind and our language, they “frame our societal environment,” and they’d jolly well better “enhance [our] worldview” in preference to shrinking it.

I’ve performed in Shakespeare, and I’ve played in Portal. The great playwrights of past centuries are all dead. Videogame developers aren’t. Are you prepared for history to hold you to the same standards?

Last time, I argued “A well-designed game has the correct amount of choices, elements, mechanics, and so on, with little excess.” It’s not just that “too little stuff” is a bad thing, but “too much stuff” is as well. This flies straight in the face of a casual understanding of gaming.

Standard practice for videogame RPG’s is to release an update with a new character class. Standard practice in collectible card games is to release a new “set” with new mechanics. Standard practice in, well, any game with levels (as in “levels you play”) is to release “downloadable content” with new levels.

The company proudly trumpets “the such-and-such expansion,” “now introducing the so-and-so class!” It is exciting. It is big. It can be very fun indeed! But then, just as with all advertising, it is repeated enough that a million consumers believe “patching a new character class” is exciting, and big, and something every game must do. They believe “more is better.”

So now I argue why that is wrong. Good game design means the correct amount of player choice (or content in general), and well-designed expansions are a real thing. But you still may make a game worse by adding more material.

My examples: classic chess, Jonathan Blow’s Braid, Blue Manchu’s Card Hunter, Nintendo’s Super Smash Bros. Brawl, and Capcom’s Capcom vs. SNK 2.

I already cited Braid as an example of the push to come up with a game mechanic, fully explore it, and then stop before creating pointless “filler.” Now consider chess:

“A choice” in chess means “a move of one game piece.” Basic movement on a square grid could be horizontal or diagonal. Maybe even both. You could move a limited number of squares or an unlimited number. Now, about exploring the possibilities: do we have an unlimited horizontal mover? Yes, the rook. Unlimited diagonal? The bishop. Unlimited in all directions? The queen. Limited in all directions? The king.

Then it gets a little funky with the pawn, and let’s not even talk about the knight, but we’ve run most of the way through available combinations. So if I were to make an “expansion” to chess, could I add any pieces and still have a good game design? Say I added a limited-movement piece that could only go diagonally: would it be exciting, and big, and worth your money to buy? No.
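The "exploring the combinations" argument can be laid out as a small grid: cross the direction sets with the movement ranges and see which cells classic chess already fills. The mapping below is my own informal taxonomy for illustration (the pawn and knight deliberately don't fit it).

```python
# Cross direction sets with ranges; see which cells chess already covers.
# The piece mapping is an informal sketch, not an official classification.
directions = ["horizontal/vertical", "diagonal", "both"]
ranges = ["limited", "unlimited"]

pieces = {
    ("horizontal/vertical", "unlimited"): "rook",
    ("diagonal", "unlimited"): "bishop",
    ("both", "unlimited"): "queen",
    ("both", "limited"): "king",
    # ("diagonal", "limited") is the hypothetical expansion piece:
    # a cell the design left empty, arguably for good reason.
}

for d in directions:
    for r in ranges:
        print(f"{d:20s} {r:10s} -> {pieces.get((d, r), '(unfilled)')}")
```

Nearly every cell is taken; the leftover combinations are the ones so weak or redundant that "filling" them would be filler, not content.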

So let’s go to the extreme. Capcom vs. SNK 2 and Super Smash Bros. Brawl grant “a choice of a fighter” (or a team). These games are very popular as the descendants of unique and fun predecessors. They also are the epitome of “just add more characters; more is better”: Capcom vs. SNK 2 has 48 characters, some of which are literally worse or better versions of others; Super Smash Bros. Brawl has 37 characters and the same situation.

Hand a game controller to a new player who is unfamiliar with these series and ask “Who would you like to play?” The result, as per Barry Schwartz, is paralysis. A new player cannot understand how to choose, much less avoid choosing “the wrong character.”

I selected the unusual link for Capcom vs. SNK 2, above, because it shows how the players themselves have put characters into “tiers” and then “banned” the ones that are “too strong for tournament play.” It doesn’t matter how popular these games are, or how good their predecessors may be: the game design has become worse by adding more material.

In the end, design decisions can be assessed by their effect on player choice:

Consider Card Hunter, in beta development, with big decisions still underway. The first “choice” in-game is “a choice of three party members.” Characters come from three races and three classes. If you chose one of each race and class, it would be akin to chess: you could perform every “move” in the game.

Blue Manchu states they will add a fourth class. The impact? It is no longer possible to play every “move” in the game. Your choice becomes “Which one class (at minimum) will you leave out of your party?” This choice is still meaningful: it is far more meaningful than “Which 45 of these 48 will you leave out of your team?” As with most videogame RPG’s, the player may enjoy going back later to play with the class (or classes) left out. It is good design.
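The arithmetic behind that comparison is simple enough to write down. A minimal sketch, assuming nothing beyond the counts given above (3 party slots; 3 classes before the expansion, 4 after; 48 fighters in Capcom vs. SNK 2):

```python
# How many options must any single selection leave out?
# Party/roster sizes are taken from the post; the function is mine.
def options_left_out(num_options, slots=3):
    """Minimum number of options a selection of this size must omit."""
    return max(0, num_options - slots)

print(options_left_out(3))   # 0: a three-member party can cover all 3 classes
print(options_left_out(4))   # 1: with a fourth class, one must be left out
print(options_left_out(48))  # 45: the Capcom vs. SNK 2 situation
```

Going from "leave out 0" to "leave out 1" creates a meaningful decision; going to "leave out 45" creates Schwartz-style paralysis. Same formula, wildly different design outcomes.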

A well-designed game has the correct amount of choices, elements, mechanics, and so on, with little excess. When designing, your challenge is not to figure out new content to throw at the player under the guise of “more is better”: your challenge is to explore your mechanics until you know your content is good. Then you can understand what else might be good.