May 14, 2020
The goal of this project is to gain experience running and comparing evolutionary algorithms. The benchmark I would like you to use is games, but if you have a specific problem you would rather work on, feel free to propose it; however, it must involve an agent that acts in some way. In this project, you must do one of the following:
- Compare two existing evolutionary methods
- Study the parameters of one existing method
- Test new modifications to an existing method such as a new crossover or mutation method
- Propose and benchmark a new evolutionary method
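If you choose the third option, the new operator can be quite small. As a hedged illustration (not taken from any of the libraries below; the function names and parameter values are my own), here is what a custom crossover and mutation pair for real-valued genomes might look like:

```python
import random

def uniform_crossover(parent_a, parent_b, swap_prob=0.5):
    """Uniform crossover: each gene is drawn from one parent or the other."""
    return [a if random.random() < swap_prob else b
            for a, b in zip(parent_a, parent_b)]

def gaussian_mutation(genome, rate=0.1, sigma=0.3):
    """Gaussian mutation: each gene is perturbed with probability `rate`."""
    return [g + random.gauss(0.0, sigma) if random.random() < rate else g
            for g in genome]

# Example: produce one child from two five-gene parents
parent_a = [0.0] * 5
parent_b = [1.0] * 5
child = gaussian_mutation(uniform_crossover(parent_a, parent_b))
print(len(child))
```

A project of this type would then compare such an operator against a standard one on the same game, under identical population sizes and evaluation budgets.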
The method, software, and platform are all up to you. You will be evaluated on the rigor of your experiments, not on the ambition of the platform or challenge you choose. You must work in teams of 3-5, and you must confirm your team and project plan with me before May 4th.
We will work on the project during class hours on May 7th, 13th, and 14th. Please post your completed video presentations by the end of the day on May 17th.
Here is a presentation explaining the evolution of agents and showing an example:
Here are some open-source evolutionary libraries you can use, though you are also free to write your own implementations:
- CGP implementations
- NEAT and HyperNEAT implementations
- Sferes2 (C++)
- Quality Diversity algorithms in Python
- Evolution Strategies in Python
- GRNs in Python
- MTCGP.jl
- Darwin.jl
- NSGA-II
- FlexGP (Java)
- ECJ (Java)
- GPLAB (Matlab)
- CMA-ES in Python
- MABE (C++)
- eaopt (Go)
- MAP-Elites in Python
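Whichever library you pick, the core loop is the same: sample variations of the current solutions, evaluate them, and keep the best. A minimal sketch, using only the Python standard library (nothing here is tied to any listed package; the sphere function stands in for a game's fitness score), is a (mu + lambda) evolution strategy with a fixed mutation step size:

```python
import random

def sphere(x):
    # Stand-in fitness to minimize: f(x) = sum of squares, optimum at the origin.
    # In your project this would be replaced by an agent's score in a game.
    return sum(v * v for v in x)

def evolve(dim=5, mu=5, lam=20, sigma=0.3, generations=100, seed=0):
    """Minimal (mu + lambda) evolution strategy with a fixed step size."""
    rng = random.Random(seed)
    # Initial population of mu random genomes
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # Create lam offspring by Gaussian mutation of random parents
        offspring = [[g + rng.gauss(0, sigma) for g in rng.choice(pop)]
                     for _ in range(lam)]
        # Elitist selection: keep the mu best of parents plus offspring
        pop = sorted(pop + offspring, key=sphere)[:mu]
    return pop[0]

best = evolve()
print(sphere(best))
```

Note the fixed sigma: a natural parameter study (option two above) would be to compare this against a self-adapting step size, as CMA-ES does.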
Here are some game platforms. Note that some of these, such as the Atari platform, include multiple games; you are not required to evaluate on all of them, just one.