It’s always nice when two seemingly unrelated interests in your life line up and become useful together. Obviously, I’m interested in game design, and have been since a young age. But in the past five years or so, I’ve been very interested in the writing and talks of many in the skeptic / rationalist movement. Because these people are frequently at odds with religion, and religion is traditionally thought of as a “source” of human civilized morality, they end up doing a lot of talking on the topic of morality itself. Essentially, their argument is that we don’t need religion to be moral. While that is not at all the topic of my article, the way these very intelligent people break down the concept of “morality” turned out to be very helpful for analyzing how morality systems may or may not work in videogames.
So what is “morality”? It comes from the Latin root moralitas, which means “manner, character, or proper behavior”. A lay definition that most are brought up to understand is something like, “morality tells us the difference between right and wrong”. Of course, it’s not a descriptive set of natural laws; plenty of people break the laws of commonly accepted morality all the time.
And while there’s certainly a good deal of evidence for a biological basis of morality (human beings have a natural ability to feel empathy for one another, which has certainly aided our survival throughout our history), people’s sense of morality is not the same from nation to nation or from year to year. So morality is something we apply to our civilization – it is prescriptive. We use it as a tool to create the kind of society that we would like to live in.
How does it do this? It essentially is a motivation system. It attempts to motivate us towards doing “good” actions and away from doing “bad” actions. It’s generally agreed upon that you’ll be accepted into society and possibly even publicly acknowledged for doing “good” deeds, and you will be rejected, imprisoned or possibly even killed for doing “bad” deeds. So it is a system of motivation which we use to achieve a certain goal – both for ourselves, and for society as a whole.
It’s sort of natural to expect that morality systems would find a great home in a videogame system. We already have a character moving around throughout a world; we’re already giving him decisions to make, so perhaps we can motivate him with morality. Particularly in large, open sandbox-type RPGs, it seems like a great idea to put the player in a position where they’ve got to make a decision on moral grounds.
The problem is: videogames already have mechanical motivation systems. In fact, that’s all they are. Look at a classic American style CRPG: Kill monsters to get money and loot. Complete quests to get experience, and either complete quests or reach new parts of the world to advance the greater plot. Kill the final boss to complete the plot. These are all already very strong motivators. Some RPGs, such as roguelikes, have nothing but these mechanical motivators and still work perfectly well in terms of motivating player behavior.
There are essentially two kinds of “moral decision” systems in games. The first is the black-and-white morality system, most prominently featured in something like Fable. In this game, you get more “good points” for doing good deeds, and more “bad points” for doing bad deeds. These good and bad points have mechanical effects, giving you access to new spells or special abilities. Further, the farther you go in one direction, the better off you are from a mechanical perspective. So, in this system, you’re basically just choosing a “character class”, and then you simply have to figure out what all the “good” answers are, and do them. If you instead play “as intended” and make each choice based on your own moral judgment, it’s quite likely that your character will be somewhat grey by the end.
This “Grey” state is mechanically less-than-optimal, in a system otherwise totally all about gameplay mechanisms. In other words, the point of something like Fable is to ostensibly “beat it”, reach the end, defeat some final boss and get some final cutscene. Doing this requires mechanical resources – equipment, experience, spells, etc. The goal of most videogames is a mechanical one by its nature, and so it makes little sense to expect a player to just forget about that and make some choice on moral grounds with no consideration for the effect it has on his mechanical game-state.
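The point-based mechanic described above can be sketched in a few lines. This is a minimal illustration, not how Fable actually implements it; the thresholds and ability names are invented. The thing to notice is that a “grey” middle score unlocks nothing, which is exactly the mechanical penalty for playing by your own moral judgment:

```python
# A sketch of a black-and-white alignment meter. Thresholds and ability
# names are hypothetical; the point is that alignment feeds directly
# back into mechanical power, so "grey" play is mechanically punished.

class Alignment:
    def __init__(self):
        self.score = 0  # positive = "good", negative = "evil"

    def record_deed(self, points):
        self.score += points

    def unlocked_abilities(self):
        # Only the extremes unlock anything; the middle gets nothing.
        abilities = []
        if self.score >= 50:
            abilities.append("holy_smite")
        if self.score <= -50:
            abilities.append("dark_pact")
        return abilities

hero = Alignment()
hero.record_deed(30)   # rescued a villager
hero.record_deed(-20)  # then robbed one
# A mixed ("grey") playthrough: a score of 10 unlocks nothing.
print(hero.unlocked_abilities())  # []
```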
And regardless, it’s very confusing for players to have two motivation systems – essentially, two totally unrelated goals staring them down all of a sudden. Is my goal to get the end cutscene? Or is my goal to be some kind of moral person? I’ll get back to this in a second.
The second type of morality system is the type where it doesn’t keep track of your Good Points & Bad Points, but instead simply takes the plot in different directions based on what you do. Unfortunately, especially when coupled with save-scumming, this basically just has the effect of players re-doing a choice to find out, again, which is mechanically optimal. At the end of the day, these two brands of morality system aren’t terribly different, although I’ll revisit this one at the end of this article.
A lot of people believe strongly in the doctrine that videogames should have loose goals. That if I want to play this for the little morality system, that’s fine. If I want to play it just to “min-max” (focus on mechanical goals), that’s equally valid. And if I just want to mess around and not pursue any goal at all, or if I want to create some unique goal, that’s fine too.
This has the same problem as the one-size-fits-all glove: it sort of fits everyone, but fits no one well. Allowing players to choose their goals on the fly can also be called “unclear goals”, and the problem with this is that the weights in a decision can never be balanced. Balancing weights in a purely mechanical, single-goal game is already extremely difficult. But trying to balance weights in a system where you can’t even know what the goals will be is literally impossible.
You know how most single-player videogames have pretty clear and obvious optimal moves? That’s a consequence of this impossibility. You’ll never create a system where the optimal move is ambiguous without extremely clear goals as a starting point.
Now, if you’re going to respond with, “well, I’ve played a lot of games like that, and they were fun”, don’t bother. Anything can be fun. I’m interested in pursuing guidelines for creating truly great things, and morality systems as we know them are a problem. Similarly, unclear goals are a problem.
I’m not going so far as to say that morality has absolutely no place in interactive entertainment, but I am going so far as to say that morality has no place in things that we usually think of as videogames. There is one kind of system that I could imagine, however, that would allow for morality to function.
My first thought was, “well, perhaps there can be a sandbox situation, and you’re just asked to make moral decisions and then sort of graded on that”. But this is totally broken. Firstly, a morality system needs a counter-weight goal in order for it to be interesting. Whether or not people do the right thing in real life is only interesting because sometimes it benefits you more to do the wrong thing, and equally, sometimes you’ll do things that are not in your own self-interest for the sake of morality. This means there has to be some other resource/goal that isn’t related to morality directly.
So what you could have is some kind of game where there is a mechanical goal, and reaching that mechanical goal is difficult and requires lots of resources and strategic play. However, there are also many other “people” (computer-controlled) who have some amount of control over a large amount of the resources. You have to interact with people to get resources.
By doing some actions, you would make some people reject you and not want to deal with you, or possibly even try to hunt you down. And some actions would have the opposite effect. These effects could be based on what town the person lives in, and even on individual preference.
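The per-NPC reaction idea could be sketched something like this. Everything here – town names, actions, weights, the trade threshold – is invented for illustration; the point is that each NPC weighs your actions by a town-wide norm plus a personal bias, so the same deed costs you different amounts with different people:

```python
# A hedged sketch of per-NPC moral reactions: town norms plus an
# individual bias. All names and numbers are hypothetical.

TOWN_NORMS = {
    "rivertown": {"theft": -3, "charity": +2},
    "freeport":  {"theft": -1, "charity": +1},  # more tolerant of theft
}

class NPC:
    def __init__(self, name, town, personal_bias=None):
        self.name = name
        self.town = town
        self.disposition = 0  # how much this NPC likes the player
        self.personal_bias = personal_bias or {}

    def react(self, action):
        base = TOWN_NORMS[self.town].get(action, 0)
        bias = self.personal_bias.get(action, 0)
        self.disposition += base + bias

    def will_trade(self):
        # Below this threshold the NPC refuses to deal with you.
        return self.disposition > -3

merchant = NPC("Mira", "rivertown", personal_bias={"theft": -2})
merchant.react("theft")  # -3 (town norm) + -2 (personal bias)
print(merchant.disposition, merchant.will_trade())  # prints: -5 False
```

The design consequence is the one the article argues for: because reactions vary per person and per town, there is no single global “good meter” to optimize against.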
Now, this already exists in many American CRPGs such as Fallout. However, the big differences would be the following:
- Saving/Loading not possible. You must not be allowed to know the exact repercussions of your actions, and you must be forced to accept those repercussions, or else asking the player to make choices is simply giving them a chore to do.
- For this reason, the game should be randomized as well, so that replaying it is possible. Locations, NPC moral reactions (and probably character names and appearances) and resources should all be randomized.
- Due to the RPG snowballing effect, in Fallout it’s actually feasible, especially after a certain point, to simply kill every human you come into contact with, totally nullifying the morality system. This proposed game MUST end well before any point like this occurs. You should never get so powerful that you can easily take on several opponents at once. And when you die, you’re dead for good, so even attempting this should be a major decision.
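The randomization requirement in the list above could be sketched as follows. The action names, weight ranges, and NPC count are all invented; the point is that each run draws a fresh “moral landscape”, so knowledge from a previous run (or a reloaded save) can’t reveal the optimal move:

```python
# A sketch of per-run randomization of NPC moral reactions.
# Actions, ranges, and counts are hypothetical.

import random

ACTIONS = ["theft", "lying", "violence", "charity"]

def generate_npc_reactions(rng):
    # Each NPC gets a random weight per action in [-3, 3]:
    # negative means they condemn it, positive means they approve.
    return {action: rng.randint(-3, 3) for action in ACTIONS}

def generate_world(seed, npc_count=5):
    # One seed per run: deterministic within a run,
    # but a new run rolls a new moral landscape.
    rng = random.Random(seed)
    return {f"npc_{i}": generate_npc_reactions(rng) for i in range(npc_count)}

world = generate_world(seed=1)
print(world["npc_0"])
```

Seeding per run also means the world can be reconstructed for debugging without letting the player save-scum their way to the answers.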
In this kind of a setting, I can see morality actually working. In general, you’ll have to try to toe the line in terms of morality, although with a tough enough mechanical goal, and some limit on time resources, you’ll be tempted at some points to take morally grey or even outright immoral actions. This will cause some level of rift between you and NPCs. Perhaps you can do some investigation with NPCs beforehand to try and suss out what you’ll be able to get away with, but your power in this regard should be limited. You shouldn’t get infinite time to talk to all NPCs and exhaust every freaking dialogue tree like you can in most RPGs.
Perhaps some people will call it a “Morality Roguelike”. I think we need to stop calling everything a “roguelike” just because it has “permadeath” (another term I hate, because all it really means is that there are actual decisions to be made) and random content (absolutely essential for a single-player game to have any ambiguous decisions after the first play, and even more important once permadeath is involved). As those two parentheticals explain, these are simply two fundamentally important properties for all single-player games to have.