
I got interested in genetic algorithms (GAs) and want to write my own.
This is the task I want to achieve:
I have a "world" of 16x16 fields. I create 16 bots with random genes. Each genome is an array of 4 numbers from 1-19 (16-19 turn the bot's direction, and 1-15 is the number of fields the bot moves in its current direction). In this world I pick a random target position and try to make the distance from the best (leader) bot to the target as small as possible.

The way I create new generation:

  1. Pick the 8 bots with the lowest distance and put them into the next generation (without crossover).

  2. Do crossover on the 8 best bots picked in step 1 (so I get 8 new bots).

  3. Randomly mutate 2 of the crossed-over bots and finally put them into the next generation. Now I have 16 bots in the new generation (the whole step is sketched below).
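
In code, the setup I have in mind looks roughly like this minimal sketch (the Manhattan-distance fitness, the N/E/S/W direction encoding, and one-point crossover are only assumptions for the sketch; I haven't pinned those details down above):

    import random

    WORLD = 16        # 16x16 fields
    GENOME_LEN = 4
    MOVES = [(0, -1), (1, 0), (0, 1), (-1, 0)]   # N, E, S, W (order is an assumption)

    def random_genome():
        return [random.randint(1, 19) for _ in range(GENOME_LEN)]

    def simulate(genome, start=(0, 0)):
        """Run a genome: genes 16-19 set the direction, genes 1-15 move that many fields."""
        x, y = start
        direction = 0
        for gene in genome:
            if gene >= 16:                       # 16-19 -> turn to one of the 4 directions
                direction = gene - 16
            else:                                # 1-15 -> step forward, clamped to the world
                dx, dy = MOVES[direction]
                x = min(max(x + dx * gene, 0), WORLD - 1)
                y = min(max(y + dy * gene, 0), WORLD - 1)
        return x, y

    def fitness(genome, target):
        """Distance from the bot's final field to the target; 0 is a perfect solution."""
        x, y = simulate(genome)
        return abs(x - target[0]) + abs(y - target[1])

    def next_generation(population, target):
        """Steps 1-3 above: keep the 8 best, breed 8 children, mutate 2 of them."""
        ranked = sorted(population, key=lambda g: fitness(g, target))
        elite = ranked[:8]                                   # step 1: elitism
        children = []
        for _ in range(8):                                   # step 2: crossover
            a, b = random.sample(elite, 2)
            cut = random.randint(1, GENOME_LEN - 1)
            children.append(a[:cut] + b[cut:])
        for child in random.sample(children, 2):             # step 3: mutate 2 children
            child[random.randrange(GENOME_LEN)] = random.randint(1, 19)
        return elite + children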

And the problem is: I only get distance == 0 in about 1/100 of my tries, but I get distances of 1 and 2 quite often (I wait until generation 1000, then give up and try again). Is there a way to improve this, or is it not possible to do better with a GA?

1 Comment

  • show us what you have tried (Commented Mar 3, 2018 at 4:13)

3 Answers


There are a lot of things that are going wrong.

Some general comments

  1. Genetic algorithms are usually a course of last resort for an algorist. You use them when things like Dijkstra (which is most appropriate for your use case), linear programming, specific constraint satisfaction techniques, &c all fail. Presumably, you're using them because you want to explore this area.

  2. People who use genetic algorithms rarely expect them to achieve the global optimum of a solution. "Good" local optima are usually the best you can do. A GA will find these fairly easily, but will have a hard time "zeroing" in on a solution. (Papadimitriou, a computer scientist at UC Berkeley, has shown that evolution doesn't, in fact, maximize fitness, but rather, the mixability of genes.)

Crossover vs. Mutation

Crossover is used to interchange large parts of a genome which is known to work. Mutation refines pieces of a genome. Roughly speaking, crossover helps you combine two good solutions in hopes that this will quickly bootstrap you to an even better solution, while mutation explores the space near a solution.

Crossover can also destroy a good solution by breaking it into two pieces that don't make sense independently, or by combining two pieces that produce a nonsensical output.

In many situations, mutation is sufficient to explore an entire space, albeit slowly. This is the case in your space since score decreases monotonically with distance from your goal. In a more complicated space, crossover may help you jump over barriers between local minima.
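
To make the contrast concrete, here is a minimal sketch of the two operators on the 4-gene genomes from the question (one-point crossover and per-gene re-rolling are just one common choice each, not the only ones):

    import random

    def one_point_crossover(parent_a, parent_b):
        """Exchange large chunks: swap the genome tails at a random cut point."""
        cut = random.randint(1, len(parent_a) - 1)
        return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

    def point_mutation(genome, rate=0.25):
        """Refine locally: re-roll each gene independently with a small probability."""
        return [random.randint(1, 19) if random.random() < rate else gene for gene in genome]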

Putting It Together

My recommendation is that you reduce the amount of crossover in your population over time. Initially, crossover might help you make some rapid gains. But, as time goes on, and especially near the end of your simulation, you'll want delicate refinement. This technique is similar to simulated annealing.
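
A minimal sketch of one such schedule (the linear decay and the 0.9/0.1 endpoints are arbitrary choices for illustration, not part of the recommendation itself):

    import random

    def crossover_probability(generation, max_generations, start=0.9, end=0.1):
        """Anneal the crossover rate from `start` early on down to `end` near the end."""
        t = min(generation / max_generations, 1.0)
        return start + (end - start) * t

    def make_child(parent_a, parent_b, generation, max_generations):
        """Early generations mostly cross over; late generations mostly copy-and-mutate."""
        if random.random() < crossover_probability(generation, max_generations):
            cut = random.randint(1, len(parent_a) - 1)
            return parent_a[:cut] + parent_b[cut:]
        child = parent_a[:]
        child[random.randrange(len(child))] = random.randint(1, 19)   # delicate refinement
        return child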


2 Comments

Isn't selection already supposed to take care of bad individuals? So why reduce the crossover rate in the long run, when it is designed to escape local minima/maxima and for that reason becomes increasingly important as the search progresses?
@Patrick: The proportion of individuals selected against in each generation helps determine the rate of convergence. The article on simulated annealing is worth reading. Also, I didn't make this explicit in my answer, but since OP's objective is convex there are no local minima to escape.

I want to add something based on my experience with GAs. During my studies I found that using pure "elite selection" to create generation N+1 very often also creates a plateau in the solutions: you do move steadily and very fast toward the optimal solution, but you can hit a local minimum and stay stuck there (see the orange curve in the picture).

So what I did: I added a step of randomness (beyond crossover and mutation of the best elements) in which the solution apparently gets worse, but which can produce a jump into a new minimum that may be the global one (your zero, the green jump in the image).

What can you do? Try using 7 of the best elements for generation N+1, and for the 8th pick a random element from generation N (it can even be the worst).

[Figure: fitness over generations; the orange curve plateaus at a local minimum, the green curve jumps out of it to a new, lower minimum]
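
A minimal sketch of that selection step (the uniform random pick is just one way to inject the extra randomness; `fitness` here is any hypothetical score where lower is better):

    import random

    def select_survivors(population, fitness):
        """Keep 7 of the best plus one uniformly random individual (possibly the worst)."""
        ranked = sorted(population, key=fitness)
        elite = ranked[:7]
        wildcard = random.choice(population)   # the randomness that can kick you out of a plateau
        return elite + [wildcard]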



Time to debug evolution!

What do the final solutions (paths) look like? I presume they can only go N/S/E/W; if so, it's easy to get trapped in a local (off-by-one-or-two) solution.

Also, it would be useful to watch how the best solution evolves over time. This can be very insightful (and fun to watch!)
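
For example, a small helper like this, printed once per generation (assuming a hypothetical `fitness` function that returns a genome's distance to the target), already shows whether the population is stuck at distance 1-2 or still improving:

    def log_best(population, fitness, generation):
        """Print the best genome and its distance each generation to watch convergence."""
        best = min(population, key=fitness)
        print(f"gen {generation:4d}  distance {fitness(best):3d}  genome {best}")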

Good luck debugging!

