In game design, balance is important. If you create a variety of options for the player to choose from, but one is superior to the rest, then the rest might as well not be in the game, because the player will always choose the best one.
Dominant strategies are often an accident. Whether it is a lack of play-testing or an oversight, designers don’t usually put them in on purpose.
But it is easy to see how the existence of a dominant strategy ruins things. Instead of having a lot of choices as the designer intended, the player effectively has none.
In most cases, the player is trying to optimize their play, and the existence of an always-optimal choice means the player is always going to make that choice.
There are also choices that are always terrible. These also might as well not exist, because the player will never choose them over a superior option.
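The idea can be sketched numerically: if one option's payoff is at least as good as every rival's in every situation, an optimizing player never picks anything else. Here is a minimal illustration; the option names and payoff numbers are all invented for the example.

```python
# Hypothetical payoffs for three weapon choices against three enemy types.
# Every name and number here is made up purely for illustration.
payoffs = {
    "sword": {"goblin": 5, "dragon": 2, "slime": 4},
    "axe":   {"goblin": 4, "dragon": 1, "slime": 3},
    "wand":  {"goblin": 9, "dragon": 8, "slime": 7},  # best in every case
}

def dominant_options(payoffs):
    """Return the options that are at least as good as every rival in every case."""
    options = list(payoffs)
    cases = payoffs[options[0]].keys()
    return [
        a for a in options
        if all(payoffs[a][c] >= payoffs[b][c] for b in options for c in cases)
    ]

print(dominant_options(payoffs))  # ['wand'] -- the other weapons might as well not exist
```

An optimizer comparing these tables picks the wand every time, which is exactly why a designer who ships such a table has effectively shipped one weapon, not three.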
In real life, balance is not guaranteed. People make all sorts of choices in all sorts of circumstances.
For many people, these choices aren’t real choices at all.
For instance, who to vote for. While we seem to be gaining a U.S. presidential candidate every other week, eventually the field will get pared down, and our two-party system causes many people to feel like they only have two choices: bad, and worse.
Technically, they have two other choices: not voting, or voting third party. But many feel that these aren’t real choices. One abdicates responsibility, and the other feels like you’re barely doing any better, since the majority of people think they only have two real choices, so your third-party vote ends up having a negligible effect. You feel like you’re railing against the wind because not enough people joined you.
In other cases, the choices might be there; you just can’t take advantage of them.
In some countries, practicing your faith is deadly. Talking about the problems of the government is deadly. Protesting is deadly. You could say that the citizens still have a choice, that they are independent, but it would take unusual courage and strength for them to stand against their oppressors. It’s heartbreaking. The door to the cage might be open, but those armed guards don’t look like they’ll let you walk through it unscathed.
In countries like Greece, bad policies have resulted in the majority of the population paying for the sins of a few major players. The people can’t leave the situation easily, and it is frustrating because the way out of the situation isn’t obvious.
It’s easy to take our independence for granted. People have fought for our rights for centuries, whether it was winning our independence from foreign enemies or our livelihoods and dignity from domestic ones.
People can complain about the President’s policies or the way Congress can’t seem to cooperate to put together meaningful legislation, and they don’t generally need to worry about retaliation from the government.
You can leave a job with terrible conditions and find another, or start your own business, or go on strike and demand better conditions. Yes, some choices here are more painful or terrifying, but not overly so. We as a nation frown upon monopolies specifically because the lack of real choice is seen as harmful. We get concerned when one company seems to be able to set their own terms independent of competition or the health of their workers.
You can change your religion, and aside from sharing in awkward family meals or attempts to make you feel guilty, the consequences don’t tend to result in a shortened life expectancy.
Sometimes the only guards at the cage door are ourselves. Maybe we’re blind to the opportunities, or we don’t have all of the information to make an informed choice, or it takes more effort than we realize, or our circumstances make it difficult, or maybe we aren’t bothering to participate.
But we can fix or change any of those circumstances. We can learn more about the situation. We can make plans. We can get help.
Don’t waste your opportunities. Don’t take the easy route. Don’t go with the weaker strategy in life just because everyone else around you is using it.
Take advantage of your independence. You have choices, and even if it is hard to do so, you can make them.
In almost any endeavor, you can go it alone, or you can get help. You can spend all of your time researching and practicing and tweaking until you figure things out, or you can buy a book or hire a consultant and have someone tell you what they have already figured out after spending years of their life on the topic.
Leveraging the work that has been done by others is a shortcut, and it is perfectly fine to take it. If you want to learn software development, you don’t need to invent your own computer architecture; you can leverage the von Neumann architecture underlying most modern machines. You don’t need to start from first principles. Someone already figured it out, and you can take advantage of it.
This kind of advice is ingrained in our culture.
Don’t reinvent the wheel.
Don’t spend your time doing that task when you can hire someone to do it for you faster and at a higher level of quality, which saves you time, too.
This is the way it has always been done, and it’s the best way we know.
On the other hand, sometimes we advance the arts and sciences by starting over and exploring our assumptions.
In Bret Victor’s talk “The Future of Programming”, in which he pretends to be an IBM engineer from 1973, complete with transparencies and a projector, he talks about the problem of people who think they know what they are doing:
He starts out explaining the resistance to the creation of assembly code by the people used to coding in binary. Coding in binary WAS programming, and assembly was seen as a waste of time and just plain wrong.
He goes on to talk about exciting advances in programming models from the late 60s and early 70s, and extrapolates some tongue-in-cheek “predictions” about how computers will work 40 years in the future, predictions that lamentably did not come about. Today we still code much the same way people did back in the 60s.
Ultimately, he warns that there is a risk to teaching computer science as “this is how it is done”.
The real tragedy would be if people forgot you could have new ideas about programming models in the first place.
The most dangerous thought that you can have as a creative person is to think that you know what you’re doing, because once you think you know what you’re doing, you stop looking around for other ways of doing things. You stop being able to see other ways of doing things. You become blind.
Game design applies here, too. Video games from the 70s, 80s, and 90s were quite varied. People were figuring them out because no one knew what they were. They tried everything.
Eventually some key genres popped out of this period of experimentation, and some control schemes and interfaces became common. It’s hard to imagine real-time strategy games without Dune 2’s UI conventions.
It occurred to me that game design, like any evolutionary process, is sensitive to initial conditions. If you want to stand out, you need to head back in time to the very dawn of a genre, strike out in a different direction and then watch your alternate evolutionary path unfurl.
When people think of a match-3 game, they have something in mind because all match-3 games tend to be similar. Triple Town ended up being quite different, yet it was still recognizable as a match-3 game, and people loved it.
Some people merely need to leverage existing infrastructure. People are using Unity for game development because, much like Microsoft’s XNA before it, it handles all of the boiler-plate for you, and it also provides a lot of the technical tools in an easily-accessible way so you can focus on the development of the game rather than the technical details of making a game.
But some people are pushing what’s been conventionally thought of as possible. Spore, for instance, had to procedurally generate animations for characters that weren’t prebuilt, which meant someone had to figure out how to do so. There was no existing 3rd-party library to leverage. The shoulders of giants here weren’t high enough.
But what bothers me when reading this book is the warning about trying to invent a completely new algorithm on your own. Skiena argues that most problems can be adapted, by sorting the data or otherwise restructuring them, into a form that an existing algorithm can solve.
And he’s right.
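A small sketch of the kind of reduction Skiena means (the problem and code here are my own illustration, not an example from the book): to decide whether any two numbers in a list sum to a target, you don’t need a novel algorithm; sort the data first, then reuse the standard two-pointer scan over sorted input.

```python
def has_pair_with_sum(numbers, target):
    """Decide whether any two distinct elements of `numbers` sum to `target`.

    Rather than inventing something new, we adapt the problem to sorted
    data, then apply a well-known inward scan from both ends.
    """
    data = sorted(numbers)          # reuse an existing algorithm: sorting
    lo, hi = 0, len(data) - 1
    while lo < hi:
        s = data[lo] + data[hi]
        if s == target:
            return True
        if s < target:
            lo += 1                 # need a bigger sum: move the low end up
        else:
            hi -= 1                 # need a smaller sum: move the high end down
    return False

print(has_pair_with_sum([8, 3, 14, 5, 1], 9))   # True  (8 + 1)
print(has_pair_with_sum([8, 3, 14, 5, 1], 2))   # False (no such pair)
```

The naive version checks every pair in quadratic time; sorting first lets a linear scan finish the job, which is precisely the “restructure the data until a known tool applies” move the book advocates.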
But someone had to figure out these algorithms in the first place, right? Someone saw a problem and had no way to solve it, so they came up with a way, optimized it, and published it.
But today I’m expected to just learn what they did and use it, and I feel like I’m being told to stay away from actually trying to figure out a better way on my own, as if all of the algorithms that can be invented have been invented.
And if I just want to solve particular existing problems, it’s probably practical advice.
But if I want to explore an entirely new kind of problem, what am I supposed to do with old assumptions and solutions? Square pegs don’t go in round holes, and I don’t think we want a future where we are taught that round holes are the only kinds of holes in existence.
But part of the reason is that he’s concerned about a lack of opportunity in game development:
…my point is that it’s no longer about just making games. It’s not about games that look good, games that play well, games that have a message, games that are different, games in a popular genre or theme. No, instead it’s all about games that stand out, and games people want. You can’t advertise or market your way to success. Those things help, but only if the game itself has that potential. Almost every successful indie you know has put multiple years into their projects. And for every indie you know, there are hundreds you don’t. It’s not practical to just make games and hope to make a living.
We really did our best with Sunset, our very best. And we failed. So that’s one thing we never need to do again. Creativity still burns wildly in our hearts but we don’t think we will be making videogames after this. And if we do, definitely not commercial ones.
And the problem is that just making a good game is no longer enough. The job of the modern indie developer is to make a good game & put it in front of millions of people.
And I think that means that we need to change how we think of indie game developers. From basement coders to people who understand marketing & business. After all, what we’re doing is running small businesses.
It sounds like the easy days are behind us, and it is going to take real work from now on to not only make a good game but also to do the ugly, messy things that it takes to run a business, such as marketing and sales.
But wait…hasn’t this always been the case?
I remember reading about the swelling supply of games on the Indie Gamer forums ten years ago. Someone was nice enough to keep track of the releases from week to week, as well as the top games, and eventually a conclusion was reached: if so many games are getting released every day, and it takes you anywhere from months to years to make a game, that’s a lot of competition you have to wade through to get noticed, and that’s only if you don’t count the many games released AFTER you’ve released yours.
So marketing and promotion were seen as key differentiators. People dedicated to these roles popped up because there was a big opportunity. Game developers wanted to work on games and outsource their marketing.
And this was back during the popularity of Flash portals, before the modern mobile era.
Talin says there are lots of reasons for failed products: crappy products, crappy marketing, crappy distribution, crappy placement at the stores, etc.
But, ultimately it usually comes down to the fact that not enough people wanted to play your game. Especially in this day and age when you can put your game up just by uploading it to some file website. If your game is truly something tons of people get addicted to it will spread around this new wired world. If on the other hand people don’t want your game nothing is going to make them want it.
People were still using shareware to market their games back then.
So, yes, the tools to make games today are easier to access than ever, which means anyone can make games, which means anyone is making games.
It’s crowded, and it is hard to stand out.
But it has always been a business, and most of the serious indie game developers knew this fact. It isn’t some new revelation. The tactics might change, but the understanding that you needed to do market research and get people to know your game even exists was always there.
I don’t like clichés, but “If you build it, they will come” isn’t a viable, sustainable strategy for a game developer. It hasn’t been one in a very, very long time. Maybe when the first personal computers were being released, and you had almost no competition, then sure, having the only game in town might work.
And if you are only interested in making games as a hobby, then go to town. Make the games you want to make and see if people might enjoy them. Maybe you’ll make some pizza and beer money as a bonus!
But if you are interested in a sustainable living making games on your own, it’s hard because you aren’t just making games anymore. You’re doing market research. You’re doing product management, which is different from product development, which is different from project management. You’re doing contract negotiation, hiring, firing, accounting, accounts receivable, accounts payable, and more.
And if you are doing it by yourself, you still wear all of those hats even if you neglect a number of them.
But none of this is really new. It’s just an awkward truth that has to be learned by each generation.
I was at a baseball game last night, and I was disappointed.
It wasn’t just because the Iowa Cubs blew an early lead and lost in the end. It was because while I expect major league players to give up on running out plays at first base, I expected the minor league players to try harder.
In baseball, if you get a hit and think you won’t even get a chance to run to second base, you are allowed to overrun first base. That is, you don’t need to keep your foot on first base to stay safe. You can run past it, and so long as you don’t indicate that you’re going for second base, you just need to focus on getting to first base before the opposing team can make the force out.
In little league, we were taught that even if it looked like the other team was easily going to field the ball and get it to first base before you could get there, you run as fast as you can. They might make a mistake and throw it over their teammate’s head. They might panic because it could be close. It’s baseball. Anything can happen in baseball.
And yet, I watched time and time again as the minor league players kept slowing down before getting to first base, as if it was a foregone conclusion that they were out.
From my seat in the stands it might have been hard to tell, but it looked like a number of those plays were closer than their lack of urgency implied. If they gave it a bit more effort, if they hadn’t given up, how many base hits would they have had that night?
Worse, all of those people were in the stands, many of them children. They’ll see this example and take it with them to tee-ball or little league. And why not? It’s what the real baseball players were doing.
And that’s what was more disappointing than the loss. It was the example being set.
When Clint Dempsey tears up the referee’s notebook, he’s setting a bad example. He’s supposed to be the international veteran in that game, yet he acted like a child who isn’t happy that his parents are telling him there are rules he has to follow.
When you show up chronically late to your job, you’re setting an example (by the way, Self, that was directed at you).
When you yell and scream at your spouse in front of your children, you are setting an example.
When you post petty, ugly, or hateful things on Facebook, you are setting an example.
And these examples send messages to people, mainly “This is how a real ______ acts.”
Fill in the blank with “baseball player” or “software engineer” or “Christian” or “partner in a loving relationship” or any role or position you can see someone holding.
How are you acting in your roles in life? If you were a stranger witnessing your actions day to day, would you be proud of the example you’re setting?