Saturday, April 2, 2011

Learning Genetic Algorithms, the max problem

One of the nicest things about Genetic Algorithms is how general they are- notice that the operators we looked at in the last post did not make any reference to the problem they were involved with solving (except fitness evaluation, but that can easily be replaced with whatever we want). The main thing we have to think about when solving a new problem with a GA is how to encode solutions to the problem.

One problem we might want to use a GA to solve is finding the "best" series of choices. For example, we may have many compiler flags, purchase choices, forks in the road, or any other type of yes/no decision to make. We can certainly do more than just yes and no, but for simplicity let's consider the situation where we must make 6 choices. To make it even simpler, it will turn out in our example that to get the best answer, we should always choose yes. The maximum fitness is 6, and the perfect individual will get this fitness by choosing "yes" for each choice.

So how do we write down random symbols that encode a series of 6 choices? Like so- 001010, 111000, 101010, 001011. This will be our population- each individual will choose "yes" for the indices that it has a 1 in, and "no" in the indices that it has a 0 in. Since we want the individuals to always choose yes, the fitness function will just count the number of 1s in the individual.

Remember, the first step is determining fitness- f(001010)=2, f(111000)=3, f(101010)=3, f(001011)=3. Now let's let selection choose a new population- 111000, 111000, 111000, 001011. The way this is actually done is by creating something like a pie chart, where each individual gets a slice proportional to how high its fitness is. A little spinner is spun 4 times, in this case, and the individual that it lands on gets copied into our new population. This is called Roulette Wheel selection.
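The pie-chart-and-spinner idea might look something like this sketch (again, the names are mine, and fitness is assumed to be the count of 1s):

```python
import random

def fitness(individual):
    return sum(individual)

def roulette_select(population):
    # Build the "pie chart": each individual's slice is proportional to
    # its fitness. Spin once by picking a point on [0, total) and
    # walking the slices until we find the one the pointer landed in.
    total = sum(fitness(ind) for ind in population)
    spin = random.uniform(0, total)
    running = 0.0
    for ind in population:
        running += fitness(ind)
        if spin < running:
            return ind
    return population[-1]  # guard against floating-point rounding

population = [[0,0,1,0,1,0], [1,1,1,0,0,0], [1,0,1,0,1,0], [0,0,1,0,1,1]]
# Spin the wheel once per slot in the new population (4 times here).
new_population = [roulette_select(population) for _ in population]
```

Notice that fitter individuals are more likely to be copied, and may be copied more than once- that is exactly how 111000 took three slots in the example above.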

I want to mention that there are other selection methods. One, called Tournament Selection, just holds a series of tournaments to determine who gets copied into the new population. Each tournament consists of a random selection of individuals from the population (normally 2), and the winner is usually the individual with the higher fitness, though some of the time a less fit individual wins instead (normally the better individual wins 75% of the time). An individual might participate in many tournaments, or it might not be chosen for any tournaments at all.
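A sketch of Tournament Selection with the numbers mentioned above (tournaments of size 2, the fitter individual winning 75% of the time- both parameters are tunable):

```python
import random

def fitness(individual):
    return sum(individual)

# Hold one tournament: pick `size` random competitors and let the
# fittest one win with probability `p_win`; otherwise a less fit
# competitor pulls off an upset.
def tournament_select(population, size=2, p_win=0.75):
    competitors = [random.choice(population) for _ in range(size)]
    competitors.sort(key=fitness, reverse=True)
    if random.random() < p_win:
        return competitors[0]              # the better individual wins
    return random.choice(competitors[1:])  # a less fit one wins

population = [[0,0,1,0,1,0], [1,1,1,0,0,0], [1,0,1,0,1,0], [0,0,1,0,1,1]]
winner = tournament_select(population)
```

Filling a new population is then just a matter of running one tournament per slot, the same way we spun the roulette wheel once per slot.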

Now let's do some mutation- 111000, 111000, 111000, 001011 becomes 111000, 111000, 111100, 001011.
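In code, point mutation just flips each bit independently with some small probability (the 0.05 rate below is an assumption- the example above doesn't pin one down):

```python
import random

# Flip each bit with probability `rate`; `bit ^ 1` turns 0 into 1
# and 1 into 0.
def mutate(individual, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit
            for bit in individual]
```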

And Crossover, pairing individuals up randomly- (111000, 111100), (111000, 001011). Let's say both pairs are chosen for crossover, as the rate of crossover is normally set pretty high (70% or so). The first pair gets the space between the third and fourth index as the cut point, and the second the space between the fifth and sixth. (111|000, 111|100), (11100|0, 00101|1), where the "|" is the cut point, becomes 111100, 111000, 111001, 001010.
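One-point crossover is just a matter of cutting both parents at the same place and swapping tails. In this sketch the cut point can be passed in so the example from the text is reproducible (normally it would be random):

```python
import random

# Cut both parents at `cut` and swap the tails to form two children.
def crossover(parent_a, parent_b, cut=None):
    if cut is None:
        cut = random.randint(1, len(parent_a) - 1)
    child_a = parent_a[:cut] + parent_b[cut:]
    child_b = parent_b[:cut] + parent_a[cut:]
    return child_a, child_b

# The first pair from the text, cut between the third and fourth index:
# (111|000, 111|100) -> 111100, 111000.
children = crossover([1,1,1,0,0,0], [1,1,1,1,0,0], cut=3)
```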

We are going to go for more than one generation in this example, so let's start from the beginning- f(111100)=4, f(111000)=3, f(111001)=4, f(001010)=2. Notice that not all that much progress has been made and we certainly don't have a perfect solution (which would be 111111). Well, let's keep going- selection gets us 111001, 111001, 111001, 111001. It looks like one individual has taken over the population- we say the population has converged.

This population has no diversity, which is a very bad thing. Diversity is vital in a GA- crossover can only combine good partial solutions into better ones if the population still contains different individuals to combine. Now we have to rely on mutation to randomly flip the 4th and 5th digits from 0s to 1s in order to discover the best individual. This is one of the contributions of mutation to a GA- it introduces new genetic material into a population that no longer has the diversity to find good solutions through crossover.

I won't go through the details of the rest of the run- imagine that mutation takes several more generations to flip those last couple of indices to 1s, and we find that the resulting population is all individuals that look like 111111. Maybe a 011111 is thrown in there too- mutation can decrease fitness just as easily as increase it.
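Putting all the pieces together, a whole run like the one above might look something like this sketch (the population size, rates, and generation count are assumptions chosen just to keep the example small, not recommended settings):

```python
import random

def fitness(ind):
    return sum(ind)

def roulette_select(population):
    total = sum(fitness(i) for i in population)
    spin = random.uniform(0, total)
    running = 0.0
    for ind in population:
        running += fitness(ind)
        if spin < running:
            return ind
    return population[-1]

def mutate(ind, rate=0.05):
    return [b ^ 1 if random.random() < rate else b for b in ind]

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def run_ga(pop_size=20, length=6, generations=50, crossover_rate=0.7):
    # Start with a random population of bit lists.
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Selection, then mutation, then crossover- the same order
        # as the walkthrough above.
        population = [roulette_select(population) for _ in population]
        population = [mutate(ind) for ind in population]
        next_pop = []
        for a, b in zip(population[::2], population[1::2]):
            if random.random() < crossover_rate:
                a, b = crossover(a, b)
            next_pop.extend([a, b])
        population = next_pop
    return max(population, key=fitness)

best = run_ga()
```

With these settings the best individual is almost always 111111 after 50 generations, though being a stochastic method, nothing is guaranteed on any single run.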

This problem- finding an individual of all ones- is called the MAX problem for GAs. There are many standard problems for different techniques that try to isolate properties of the problem solver (the GA in this case) so that variations of the basic methods can show if they do better than others in the aspects these problems try to showcase.

I sneakily introduced the "yes or no" encoding as 1s and 0s to introduce the most common encoding for individuals in a GA- a binary string aka bit vector aka {0, 1}* aka base two numeral.

I think next time we will move on to Genetic Programming on our long road to RGEP. Good times ahead!

2 comments:

  1. I think it would be interesting to see a simple example of a situation where there are multiple peaks and valleys in the solution space, so that people can see why you need to do more than just hill-climb. Maybe an example of a hill climber getting stuck on a local max.

  2. What a good comment! I will totally do this.
