
Part 45: Genetic Algorithms

December 15, 2011

I'm still integrating pieces of my demos and recovering from my pulled shoulder muscle, so there's nothing new this week. Instead, let me drag out a bit of old code and show it to you.

I read a book called "Artificial Life" by Steven Levy, back when it was published in 1992. It was my first introduction to the idea of "simulated evolution", also known as the "genetic algorithm." In the book, he described a neat little demo of the algorithm, which I reimplemented, first as a C++ program for Windows 3.1, then much later as a Java applet. If you've never heard of genetic algorithms, read on, since it's pretty neat stuff.

The Critters

Imagine we have a simple critter. It has a position on a grid and a direction it is facing (N, S, W, or E). It has only one bit of information about its world -- is there food in front of it? It can do only four things -- move forward (eating any food there), turn right, turn left, or sit. It has 5 bits of memory, implemented as a state number from 0 to 31, and it can change state after every step of the simulation. Its goal in life is to eat as much of a test trail as possible during the simulation.

The genome of this critter is a table with 32 entries, one per state. Each entry has the action to take (move, left, right, or sit -- 2 bits) and the new state (5 bits), once for when there's food in front and once for when there's no food. That's 14 bits per entry, packed into two 8-bit bytes, for 64 bytes in the whole genome.
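
If you want to see the layout in code, here's a rough sketch of how the genome might be packed. This is just an illustration for this post -- the names and the bit order are my assumptions, not the demo's actual source.

    // Sketch of the genome layout described above. Each of the 32 states has two
    // rules: one for "no food ahead", one for "food ahead". A rule is an action
    // (2 bits) plus a next state (5 bits), packed into one byte.
    class Genome {
        static final int MOVE = 0, LEFT = 1, RIGHT = 2, SIT = 3;

        // 32 states x 2 cases = 64 bytes; index = state*2, +1 if there's food ahead
        final byte[] rules = new byte[64];

        static byte pack(int action, int nextState) {
            return (byte) ((action << 5) | (nextState & 0x1F));
        }

        int action(int state, boolean foodAhead) {
            return (rules[state * 2 + (foodAhead ? 1 : 0)] >> 5) & 0x3;
        }

        int nextState(int state, boolean foodAhead) {
            return rules[state * 2 + (foodAhead ? 1 : 0)] & 0x1F;
        }
    }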

You can easily design the simplest useful creature. You need just one state and two rules. In state zero, if there's food, move forward and eat it. If there's no food, turn right. In both cases, stay in state zero. This creature will follow an unbroken trail of food and eat it all. If there is a break, it will sit and spin, since the move-forward rule will never fire.
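
With the hypothetical packing above, that simplest critter is just two table entries:

    // The minimal trail-follower, in the sketch encoding: in state 0, move forward
    // (and eat) if there's food ahead, otherwise turn right; always stay in state 0.
    Genome follower = new Genome();
    follower.rules[0] = Genome.pack(Genome.RIGHT, 0);  // state 0, no food: turn right
    follower.rules[1] = Genome.pack(Genome.MOVE, 0);   // state 0, food:    move and eat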

To be more general, the creature would have to implement some kind of search pattern for when the trail breaks. Changing state would allow it a simple memory, and it could wander around a bit until it picks up the trail again.

Fig 1: A sample critter

Figure 1 shows the genome of a sample critter (on the right), the trail it is trying to follow (in blue, on the left), and the path it actually takes (in green, on the left.) The current position and direction of the critter are shown with the red arrow.

The critter starts in state 0, pointing east, at the top-left corner of the grid. The highlighted state machine entry says that if there's no food there (which there isn't), move forward and go to state 2. You can follow through the entries and see what this fairly random critter does in subsequent steps of the simulation. It runs for 200 steps. Entries that are never used are in gray.
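
A single step of the simulation is about this much code. Again, this is only a sketch built on the genome class above; the grid size and the edge wrapping are my assumptions, not necessarily what the demo does.

    static final int WIDTH = 32, HEIGHT = 32;      // assumed grid size
    static final int[] DX = { 0, 1, 0, -1 };       // unit vectors for N, E, S, W
    static final int[] DY = { -1, 0, 1, 0 };

    int x, y, dir, state, score;                   // position, heading, 5-bit memory, food eaten

    // Called 200 times per critter; the score is the amount of food eaten.
    void step(Genome g, boolean[][] food) {
        int aheadX = (x + DX[dir] + WIDTH) % WIDTH;     // cell directly ahead
        int aheadY = (y + DY[dir] + HEIGHT) % HEIGHT;   // (wrapping is an assumption)
        boolean foodAhead = food[aheadY][aheadX];

        switch (g.action(state, foodAhead)) {
            case Genome.MOVE:
                x = aheadX;
                y = aheadY;
                if (foodAhead) { food[y][x] = false; score++; }   // eat the food
                break;
            case Genome.LEFT:  dir = (dir + 3) % 4; break;
            case Genome.RIGHT: dir = (dir + 1) % 4; break;
            case Genome.SIT:   break;
        }
        state = g.nextState(state, foodAhead);                    // update the memory
    }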

In the demo, a completely random population of 10,000 critters is created, and the best one is shown after each generation. The controls on this part are straightforward:

  • Uncheck "Trail locked" if you want to modify the target trail. Left mouse button sets a cell, right button clears it. The trail doesn't have to be locked to run the demo -- it just prevents you from accidentally changing the trail when you click on the window.

  • "Default" will restore the default trail.

  • To step through the displayed genome, hit the "Step" button. To start over, hit the "Reset" button.

  • Use the "Best" button to see the best critter of this generation (the default). Use "Random" to look at other critters.

Open a tab here to run the demo.

Evolution

The first generation is completely random, and as you would expect, most of the critters are hopeless. Select some in the demo (use the "Random" button) and you'll see them wander aimlessly or spin in place. The best critter usually makes a fair amount of progress. The successful critters are so simple that they do occur at random. What is more interesting is that you can get this population to evolve better solutions.

Fig 2: Evolution Parameters

The genetic algorithm used here is straightforward. We take the population in pairs and let them compete. The "Compete" parameter (see Figure 2) controls the odds that the winner of each pair survives rather than the loser. The higher the competition, the more often winners survive.
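
In code, the competition round amounts to something like this. It's only a sketch built on the genome class from earlier -- I'm guessing at the exact bookkeeping, and the Critter class (a genome plus a score) and its copy() method are hypothetical.

    // Pairwise competition (sketch). Critters are compared two at a time; with
    // probability 'competeOdds' the loser's slot is taken over by a copy of the winner.
    // Critter is assumed to hold a Genome, a score, and a deep copy() method.
    void competeRound(Critter[] population, double competeOdds, java.util.Random random) {
        for (int i = 0; i + 1 < population.length; i += 2) {
            Critter a = population[i], b = population[i + 1];
            Critter winner = (a.score >= b.score) ? a : b;
            int loserSlot = (winner == a) ? i + 1 : i;
            if (random.nextDouble() < competeOdds)
                population[loserSlot] = winner.copy();
        }
    }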

Next, we breed a new generation from the survivors. The "Breed" parameter changes the odds that one parent will be replaced by its child. To breed state machine A with state machine B, pick a number N from 0 to 63. Copy rules 0-N from parent A, and rules N+1 to 63 from parent B. This is called "crossover" and is inspired by what happens in biology.
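
As a sketch, using the genome class from earlier:

    // Crossover (sketch): the child takes rules 0..N from parent A and N+1..63 from parent B.
    Genome crossover(Genome a, Genome b, java.util.Random random) {
        Genome child = new Genome();
        int n = random.nextInt(64);                                   // crossover point N, 0..63
        System.arraycopy(a.rules, 0, child.rules, 0, n + 1);          // rules 0..N from A
        System.arraycopy(b.rules, n + 1, child.rules, n + 1, 63 - n); // rules N+1..63 from B
        return child;
    }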

Each child may also be a mutation, created by inverting a random bit in the genome. The "Mutate" control changes the odds from 1 in 200 all the way up to 1 in 2.
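
Again as a sketch:

    // Mutation (sketch): with the given odds (1 in 200 up to 1 in 2 in the demo),
    // flip one randomly chosen bit somewhere in the 64-byte (512-bit) genome.
    void maybeMutate(Genome g, double mutationOdds, java.util.Random random) {
        if (random.nextDouble() < mutationOdds) {
            int bit = random.nextInt(64 * 8);
            g.rules[bit / 8] ^= (1 << (bit % 8));
        }
    }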

The other controls here are the main ones you will use:

  • "New" creates a new generation. You can specify the size in the popup window. The default is 10,000 critters.

  • "Next Gen" creates a new generation. The best of the generation will be shown in the trail and genome areas.

These controls are in the center-right of the demo, just above the graphs.

Progress

The two graphs at the bottom (Figure 3) show the progress of evolution as you bash away at the "Next Gen" button. The left graph shows the history of the population, with the generation number on the x axis. It shows the best score, the average score, and the number of unique genomes ("strains") still in the population.

Two critters have the same genome if the rules actually used are identical. They may have other "junk DNA" in the unused rules. This junk might actually be reactivated during breeding, so it's not irrelevant. But the number of strains does indicate the diversity of the population. As evolution proceeds and the winners dominate, you'll tend to see diversity crash.
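
One way to make that comparison, as a sketch, assuming we record which of the 64 rules fired during each critter's run:

    // Strain test (sketch): if critter B matches critter A on every rule that A
    // actually used during its run, B follows exactly the same path, so the two
    // count as the same strain; the unused "junk" rules are ignored.
    boolean sameStrain(Genome a, boolean[] usedByA, Genome b) {
        for (int i = 0; i < 64; i++)
            if (usedByA[i] && a.rules[i] != b.rules[i])
                return false;
        return true;
    }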

Fig 3: Population statistics

On the right, you see the distribution of scores in the population. As evolution proceeds, you'll see bars move to the right. As the population becomes dominated by a few strains, the peak will narrow to a single bar. This graph and the path of the best critter are the most interesting things to watch.

Things to Try

Open a tab here to run the demo. Find the "Next Gen" button at center-right. Click on it repeatedly and watch things change. The trail of the best critter at top-left and the population graphs at the bottom will be interesting.

To inspect a critter you've evolved, step through its genome using the "Step", "Random" and "Best" buttons below the sample state machine.

Next try creating a new population ("New" button), and changing some of the parameters. You'll notice that increasing the mutation rate doesn't matter much. It just creates some noise in the score distribution as low-scoring critters get generated by mutations. However, it can also keep evolution from getting stuck.

Lowering the competition odds can remove all reward from being a successful critter and slow evolution. Increasing competition can force progress more quickly, but at the risk of getting stuck. Giving the critters more steps or a larger population can improve the results, but at the cost of a longer compute time for each generation.

Evolution gets stuck when diversity drops. This tends to happen quickly in this small population. A slightly more successful strain will occur and spread until it dominates. Without variety, all children are basically the same and evolution stops.

To see a pathological case of this, unlock the trail and set the first portion of the trail solid, with no breaks. With this as the target trail, the simple one-state critter described above (move forward if there's food, otherwise turn right) gets a very high score. All it has to do is move forward and eat. If this critter occurs at random, it almost always breeds true (since it only has two rules), and quickly dominates the population.

You can also try changing the trail after a winner has evolved. These critters tend to be very specialized, and if there's very little diversity left, they won't be able to adapt.

The demo is a lot of fun. Unfortunately, I no longer have the book I took this from, so I can't credit the original designer of the state machine. If you know, email me or leave a comment.

The source code in Java is here.

Use in Games

There's a huge amount of material out there on genetic algorithms that I haven't read. What use is this in game design?

If you have a problem where you can precisely evaluate success, but don't know how to get a good solution, this technique might do the trick. You wouldn't be using this within the game. Instead, you'd set up a huge collection of random solutions, breed and evaluate them, and then use the final evolved solution in the game.

For example, if I wanted to design a state machine for animals in the game, so that wolves hunted sheep in groups, and sheep flocked and raised the alarm when they saw wolves, I might use a genetic algorithm. I'd represent the wolves and the sheep with state machines and have codes for the various actions. I'd simulate groups of sheep presented with wolves, and groups of wolves presented with sheep. I'd have thousands of candidate state machines and evolve them.

Or, I might want to create ground cover over a terrain. By evolving state machines that decide when to place a flower on a piece of terrain, I'd have something I could evaluate quickly which is still deterministic (meaning the flower would still be there when you returned to the same spot.)

The hard part of using a genetic algorithm is setting up the evaluation criteria. In this demo, all we wanted was for the critter to follow the trail of food. For the wolves and sheep example, we'd need to decide what a "success" is. We don't want the sheep to never be eaten, and we don't want them all to be eaten. It would be tricky to set up, and would take a lot of experimentation. The end result might have an interesting "naturalness" to it, since evolution tests the state machine with a large variety of inputs.

Since I think this is all very cool, I'd like to find some use for it in the game. But that's for the future. Back to my GUI work!
