Cool demo, but from a practical perspective there's absolutely no reason to use GAs to solve a non-linear optimization problem. I know people like them because they have a really pleasing and intuitive backstory, but as a grad student in AI I can tell you that they suck at actually solving anything. Mostly this is because GAs take what is already a difficult, non-linear problem (the problem you're trying to solve) and immediately, explosively, complicate it (what's your mutation rate? what are the chromosomes? how are crossovers handled? how are you deciding the answers to these questions?).
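For concreteness, here's roughly what I mean; a toy GA sketch in Python, on a made-up objective, with every constant pulled out of thin air, just to show how many knobs you have to set before the algorithm even runs:

    import random

    # Toy GA sketch. The objective, the encoding, and every constant below
    # are arbitrary choices -- exactly the decisions complained about above.
    POP_SIZE, CHROMO_LEN, GENERATIONS = 50, 10, 100
    MUTATION_RATE, MUTATION_SCALE = 0.05, 0.1

    def fitness(c):
        return -sum(x * x for x in c)        # stand-in for the real problem

    def crossover(a, b):
        p = random.randrange(1, CHROMO_LEN)  # single-point crossover (one of many schemes)
        return a[:p] + b[p:]

    def mutate(c):
        return [x + random.gauss(0, MUTATION_SCALE)
                if random.random() < MUTATION_RATE else x
                for x in c]

    pop = [[random.uniform(-1, 1) for _ in range(CHROMO_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]        # truncation selection -- yet another choice
        kids = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
        pop = parents + kids

    print(max(pop, key=fitness))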
I think the real problem with most uses of GAs (including this one) is that to the extent that they work at all, they're merely performing a random walk search driven by mutation, and the "crossover" operation is, for all intents and purposes, just functioning as a random-restart of the search, a jump to a new spot in parameter space rather than a useful shuffling of units of functionality.
Sometimes a random local search is exactly what the doctor ordered, of course: it's almost trivial to code, and if you tune the variance over time you can get pretty good results. I'd agree that in general you can usually get better results in less computer time with a more typical algorithm, though you still have to choose the algorithm, implement/integrate it, test it, etc., so you lose a lot of programming time unless it's already part of your framework.
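By "trivial to code" I mean something like this (toy objective, numbers invented for illustration); a random local search where the step variance decays toward zero over time:

    import random

    # Random local search with a decaying step size -- the trivial baseline
    # described above. The objective f and the dimension are made-up stand-ins.
    def f(x):
        return sum(v * v for v in x)   # toy objective to minimize

    x = [random.uniform(-5, 5) for _ in range(10)]
    best, sigma = f(x), 1.0

    for _ in range(10000):
        cand = [v + random.gauss(0, sigma) for v in x]
        if f(cand) < best:             # keep only improving moves
            x, best = cand, f(cand)
        sigma *= 0.9995                # shrink the variance over time

    print(best)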
Genetic algorithms should really shine when it's not clear how to cast the problem as a finite-dimensional optimization problem, for instance if you're trying to evolve some constructive procedure to solve a problem rather than tweak variables to improve performance. You're absolutely right, aothman: most problems, even most problems specifically constructed to show off GAs, don't fit into this category, so there's no real point except that the algorithm sounds cool...
Well, to be fair, the parameter problem is common to all kinds of local search and is an interesting research topic. That being said, I agree that GAs are seldom the most efficient; other kinds of local search are typically better.
For problems where there is a natural way to express chromosomes, mutations, and crossover that captures interesting and relevant information, GAs can be useful as an additional meta-heuristic.
I recently built a game with similar mechanics (http://www.bigblockgames.com/games/goblin/), and this is basically what my first day of development looked like, just sped up between takes.
Except at least this doesn't create many cars unstable enough that the structure starts oscillating before shooting off to infinity. So I guess GA 1, Michael 0.
Currently, it just attracts people to the site (which it's done in a pretty extreme way). We'll probably follow it up with a level pack that's for sale, though.
In other words, we lose a little on each hit, and make up for it with volume!
I've let it run for 10 generations in the background, and keep coming back to be entertained. It's fun to see the cars tackle the course. I've already gotten attached to a few of them - I cheer them on when they make it over a hill and cry with despair when they don't. =)
I let it run overnight, getting up to 60 generations. I cannot tell if the cars are actually improving that much, but the graph says they are. I saw one get to 245, but I missed that screenshot. Unfortunately, a lot of the good cars spawn with their wheels upside down or rotated; I wonder if spawn orientation is evolving along with the car. It would appear not. Here is the best car of the 62nd generation.
I did that too and got it up to the 114th generation. Unfortunately, the car has gotten stuck and the game seems to be going nowhere. So I'll never see the 115th generation...
Sorry, let me clarify. The cyclist in Palo Alto rides a reverse penny-farthing, while the vehicle generated by the genetic algorithm is the traditional penny-farthing.
I've seen that before - there doesn't seem to be anywhere you can see his code, especially his selection function, which I'd find pretty interesting.