Consider yourself warned
People are influenced by media of all kinds, whether it’s propaganda, advertising or the powerful results of identifying with certain stories. And yes, we are also influenced through playing games. I believe that many of the world’s problems come from a lack of self-understanding, mainly due to a lack of opportunity for meaningful reflection on values, spirituality, religion and other social ideals. Games are a medium that can provide a simulated context where the player can wrestle with a pre-designed moral dilemma. If you can craft a game that is morally ambiguous and packages a meaningful dilemma in an entertaining experience, you can use it to raise people’s consciousness on a number of issues.
I am interested in how games can make meaningful change in the world, on the level of values, ethics and morality. Yup, I am interested in how games can help educate people. Educational games. When is the last time you read those two words together without wincing? But it’s true, that’s what I’m crazy about. And I know there are a ton of failed educational games out there worth our chagrin, but if you read on, you’ll see that the games I’m thinking about don’t really think of themselves as educational.
My bias, on a personal and professional level, is that I work towards social justice for Indigenous peoples. But to be able to have a meaningful conversation with non-Indigenous people, I need them to recognize certain values and ways of understanding that they have denied or haven’t had the opportunity to reflect on. I want to create games that wake up these values inside of them on a personal level, to wake up their love for the things that I love. From there we can move forward. Video games are my target medium for two main reasons: they model agency within systems brilliantly, and they are presently the most popular medium in many modern countries. I want to hit the biggest audience, and it just so happens to be a medium that is amazing at providing a context for meaningful agency. Let me take a couple of steps back first, though, and talk some wishy-washy philosophy…
“Values may easily clash within the breast of a single individual; and it does not follow that, if they do, some must be true and others false” -Isaiah Berlin
I am of the philosophy that in this present, increasingly connected world, our minds are filled with all sorts of confusion about values, priorities, spiritualities, coffee, games and such. And many people don’t often have the time, energy or, most importantly, the opportunity to deal with all that. We are exposed to all the world’s religions and ideas, and then everybody is tasked with somehow sorting it all out. As if anybody has time to make sense of the universe and all the stuff in it whilst doing everything else society demands of us, like making Christmas cards for our extended family.
What I believe games are potentially good at is providing a safe place to work out the moral and ethical confusion we may have about the world, while not appearing to be places to work out that confusion. Nobody buys a game thinking, “Man, I got a lot on my mind. Better pick up Mario Maker.” But I believe people are attracted on a subconscious level to certain games, and I am also of the belief that some people subconsciously want to make sense of all the confusing stuff in this universe. Nobody(?) likes to be confused, and we seek out medicines for what ails us. We often turn to media and art not only for comfort, but also for consciousness raising. Arrival was good fun, but it also got people thinking, hopefully, about how much impact a theory of language can have on the way we understand… well, everything.
So how to craft a game that provides an opportunity for players to deal with their issues? First off, if there are design principles here, one is that these games need to have moral ambiguity. Most moral games, books, movies etc. have a particular goal for the consumer. The designers/artists of these media often have their own telos, their own end-goal for the player, yet before people even get into the game, they already have that yucky feeling that it is trying to teach them something. It’s like those two guys on your doorstep, about to knock on the door, who pretend to be all about meaningful discussion but are actually trying to convert you.
Don’t try to convert your player! People turn off immediately if they think you are trying to teach them something. They want to figure stuff out for themselves. For the purposes of what we are trying to do in this project, we absolutely must not try to teach, but try only to provide meaningful situations where players have the agency to deal with the issue on their own. Having a specific moral goal in a game is patronizing at best, and ultimately doesn’t allow for critical decision making for the player.
Human beings need to create what philosopher Charles Taylor calls their own “Best Account” of the universe and its ways. Understanding that people need to raise their consciousness through their own personal, contextual understanding is what he calls the ‘BA Principle‘ (see: Sources of the Self). Instead of trying to force people to accept a foreign model of the universe (conversion), give them an opportunity that provides feedback as they work with their own present model, their best account. The thing is, a raised consciousness is not something that can be forced onto people; it must be cultivated within the person’s own consciousness. People aren’t passive receptacles of ethical learning. Oh! It’s also important to recognize that many people do not want to raise their consciousness, at least not consciously. Have you ever met what you considered a close-minded person? Yeah…
What is the main mechanic at work in this design, though? We want to craft the game with a purpose in mind, but that’s different from having a direct goal of trying to implant a certain value that never existed in the player. Inception reference. The purpose is providing a context for meaningful personal (potentially community?) development to occur. In a previous post I outlined how some games present a situation that pits certain significant values against each other, as opposed to pitting good against bad (good vs. bad isn’t really a meaningful moral challenge to players). Getting all philosophy 101, an example could be pitting the value of security against the value of freedom. Or in my case, if my goal is social justice for Indigenous Peoples, rather than trying to craft a game that has someone reflect on the brutality of colonization (specific), or some other important issue, I believe aiming at non-Indigenous players’ values could have much more long-term impact.
The moral dilemma presented in my example is between wanting to progress but needing to recognize cultural and individual differences.
Let’s say universalism is my target problem. Universalism diminishes significant paradigmatic cultural differences, and I consider it a theory that contributes to the assimilation of Indigenous peoples into colonizing nations. Yet no matter how much I might know universalism to be wrong, people need to be able to figure out their own thoughts on this issue for themselves. But how to design for it without appearing all educational and pushy? Make it a fun puzzle game first and educational second, through theme.
My example is somewhat simple. The player’s goal is to encourage robots who start on one part of a map to go to another part of the map. When a robot gets to the goal, there is a new, slightly more challenging puzzle level to play through. The player’s job is not to control the robots, but to learn (at first) about the robots’ idiosyncrasies and help them achieve their goal of getting to the door.
The game space can be one screen big, with a top-down view. On the left side of the map there is a robot, and on the right is the goal. You can investigate, say, by clicking on the robot. Perhaps you can ask it certain questions and find out their fears, motivations, likes and dislikes (or whatever you want to add). Once you find them out, you can try to motivate them to move toward their own goal. Let’s say on the first level, the robot is scared. You might have a small inventory of items and actions you can use. For the sake of keeping it simple, maybe by giving them a stuffed toy, they are no longer scared and go to the door on their own.
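Just to make the mechanic concrete, here is a minimal sketch of that first level in Python. Everything here is hypothetical (the `Robot` class, the `ask`/`give_item` names, the single fear and single item), invented purely to illustrate the help-don’t-control loop, not taken from any actual prototype.

```python
class Robot:
    """A robot the player can question and help, but never directly control."""

    def __init__(self, name, fear, position, goal):
        self.name = name
        self.fear = fear          # e.g. "the dark"; an active fear blocks movement
        self.position = position  # simple 1-D position for this sketch
        self.goal = goal          # where the door is

    def ask(self, question):
        """Investigating (clicking) the robot reveals its inner state."""
        if question == "What scares you?":
            return self.fear if self.fear else "Nothing, I feel safe."
        return "Beep?"

    def give_item(self, item):
        """Items can resolve a fear; a comforted robot will walk on its own."""
        if item == "stuffed toy" and self.fear is not None:
            self.fear = None

    def update(self):
        """Each tick, a robot with no active fear steps toward its goal."""
        if self.fear is None and self.position < self.goal:
            self.position += 1


# First-level scenario: a scared robot on the left, the door on the right.
robot = Robot("Unit-01", fear="the dark", position=0, goal=5)
robot.update()                  # fear blocks it: the robot stays put
robot.give_item("stuffed toy")  # the player helps rather than controls
while robot.position < robot.goal:
    robot.update()              # now it walks to the door on its own
```

Note that the player’s verbs are only `ask` and `give_item`; movement belongs entirely to the robot, which is the whole point of the design.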
I have a ton of ideas for making it an actually viable, complex and challenging puzzle game, but no need to go into detail here. Eventually, as the player progresses to increasingly difficult levels, we might have different kinds of robots that share similar ways of seeing the world, and not until you understand their underlying values can you work out how to help them get where they want to go.
So how does this help Indigenous people? In a very sidelong way. The game is, on the surface, a puzzle game about working with robots, but the underlying value is understanding not only individual differences in people, but collective differences on the level of values. Instead of treating people as motivated through universalism or biological reductionism, the nuance of significant differences is highlighted. By practicing the skill of asking questions and learning about others in the game, players may become better adapted to dealing with those who have powerfully different outlooks than their own. Differences as significant as Indigenous and non-Indigenous worldviews.
There is a risk here, of course. The game might backfire and teach another lesson: how to manipulate people. That is absolutely not the lesson I want to teach. The risk can be minimized, though, through careful design and aesthetics. For example, I try to minimize it by using robots and not Indians, or white people, or whatever. The goal is understanding and empathizing with difference, not controlling and manipulating others for selfish reasons.
It’s not perfect, but is an interesting idea to work with. Let me know your thoughts on it 🙂
Also see this video by Extra Credits about what they call ‘tangential learning’.