PaGMO / PyGMO
License: Academic Free License 3.0 (AFL 3.0)
Web Page: http://esa.github.io/pygmo/
Mailing List: mailto:firstname.lastname@example.org
- Constraint Handling Techniques in Evolutionary Algorithms Because of the stochastic behaviour of evolutionary algorithms, many constraint handling methods rely on empirical, user-defined parameters to account for constraint violations during the optimization process. Unfortunately, these parameters can drive the optimizer towards local optima and are usually effective only on a single problem. This project consists in implementing problem-independent constraint handling techniques, of varying sophistication, in PyGMO/PaGMO and testing them against a set of typical benchmark problems. From an implementation point of view, most constraint handling techniques can be seen as a modification of the initial constrained problem, algorithm, or population; the implementation will therefore consist in redefining the initial problem/algorithm/population through operators/modifiers.
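One of the simplest such "problem modifiers" is the classic static-penalty transform, which turns a constrained problem into an unconstrained one by inflating the fitness of infeasible candidates. The sketch below is illustrative only (the function and problem are not part of the PaGMO API), but it shows the kind of operator this project would generalize:

```python
def penalized_fitness(fitness, constraints, penalty=1e6):
    """Static-penalty transform: the penalized fitness is the original
    fitness plus a fixed-weight sum of constraint violations, where a
    constraint g_i(x) <= 0 is satisfied when g_i(x) is non-positive."""
    violation = sum(max(0.0, g) for g in constraints)
    return fitness + penalty * violation

# Toy constrained problem (hypothetical, for illustration):
# minimise x^2 subject to x >= 1, written as g(x) = 1 - x <= 0.
def objective(x):
    return x * x

def constraint(x):
    return 1.0 - x
```

The fixed weight `penalty` is exactly the kind of empirical, user-defined parameter the project aims to eliminate; more sophisticated techniques adapt it automatically or avoid it altogether.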
- Expanding the multi-objective capabilities of PaGMO I will expand the multi-objective capabilities of PaGMO. At the moment, only two multi-objective optimization algorithms are available (NSGA-II and IHS). The aim of this project is to extend the library of multi-objective optimization algorithms with algorithms such as MOEA/D, Multi-Objective Particle Swarm Optimization, the Niched Pareto Genetic Algorithm, the Strength Pareto Evolutionary Algorithm, and Pareto Adaptive ε-Dominance.
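All of the listed algorithms build on the same primitive: Pareto dominance between objective vectors. A minimal, self-contained sketch (the function names are illustrative, not PaGMO API) of the dominance test and non-dominated filtering, assuming minimisation of every objective:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated_front(points):
    """Return the subset of points not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]
```

NSGA-II, SPEA and the other candidates differ mainly in how they rank and diversify individuals once such fronts are extracted.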
- Hyperfast hypervolumes for Multi-objective optimization The project involves computing the hypervolume, in higher-dimensional spaces, of a set of hypercuboids sharing one common point (a reference point that is strictly dominated by every member of the set). The hypervolume is a geometric quality measure for a Pareto set: it can be used to compare two populations of individuals, or as a fitness function during the evolution phase of a genetic algorithm. The first scenario allows for exact computation, since time is not as crucial a factor as in the second case (computing the hypervolume repeatedly while the algorithm runs). The project therefore requires implementing both exact and approximate methods. Additionally, a few methods have been proposed that use the hypervolume as an input to the fitness measure; I intend to implement at least one of them during GSoC as well.
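In two dimensions the exact computation reduces to a simple sweep over the front sorted by the first objective, with each point contributing one rectangle bounded by the reference point. A minimal sketch (illustrative, not the PaGMO implementation), assuming minimisation and a mutually non-dominated input set:

```python
def hypervolume_2d(points, ref):
    """Exact hypervolume of a 2-D Pareto front (minimisation) with
    respect to a reference point dominated by every front member.
    Assumes the points are mutually non-dominated, so sorting by the
    first objective sorts them in decreasing second objective."""
    area = 0.0
    prev_y = ref[1]
    for x, y in sorted(points):
        # Each point adds the rectangle it dominates exclusively,
        # spanning from its coordinates to the reference point.
        area += (ref[0] - x) * (prev_y - y)
        prev_y = y
    return area
```

This O(n log n) sweep is the easy case; the project's real challenge is the higher-dimensional setting, where exact computation grows expensive and approximation schemes become necessary.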
- Interactive Cloud Optimisation via Google Talk (XMPP) The goal of this project is to create a cloud of PyGMO servers with XMPP as the transport layer. Users of the cloud would be able to control, submit, execute, and monitor optimization tasks remotely using Google Talk. The cloud itself would balance the load across nodes and take care of job execution issues (timeouts, interrupts, data loss, etc.). An appropriate communication protocol would be designed, covering both node-to-node and user-to-node communication.
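Since XMPP message bodies are plain text, the job protocol could be as simple as JSON payloads carried inside chat messages. The message schema below is purely hypothetical (field names and message types are my own, not part of any existing PyGMO protocol), but it sketches what a submission/dispatch layer might look like:

```python
import json

def make_job_request(job_id, problem, algorithm, generations):
    """Encode a job-submission message as a JSON text body, suitable
    as the payload of an XMPP chat message. All field names here are
    hypothetical placeholders for the protocol to be designed."""
    return json.dumps({
        "type": "job_request",
        "job_id": job_id,
        "problem": problem,
        "algorithm": algorithm,
        "generations": generations,
    })

def parse_message(body):
    """Decode a message body and validate its 'type' field before
    dispatching it to a handler."""
    msg = json.loads(body)
    if msg["type"] not in ("job_request", "job_result", "job_error"):
        raise ValueError("unknown message type: %s" % msg["type"])
    return msg
```

Keeping the protocol transport-agnostic like this would also let the same messages flow between nodes (load balancing, failure reports) and between users and nodes (submission, monitoring).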
- Racing Population Individuals and Algorithms The main objective of this project is two-fold: firstly, to allow individuals (and meta-heuristic algorithms) in PaGMO with stochastic quality indicators to be distinguished efficiently via racing algorithms; secondly, to implement interesting functionality in PaGMO that leverages this racing capability. The most straightforward enhancement is to allow stochastic optimization problems to be solved more efficiently without much modification to the existing algorithms. A perhaps more interesting consequence of the racing module would be an in-house automatic parameter configuration system for PaGMO optimizers / meta-heuristics. This could mean that, in the future, algorithms implemented in PaGMO will no longer require (heavy) manual parameter tuning.
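The core idea of racing is to spend evaluations only on candidates that are still statistically plausible winners. A minimal Hoeffding-style elimination sketch (my own illustration of the technique, not PaGMO code; it assumes minimisation and noisy evaluations bounded within `value_range`):

```python
import math

def race(candidates, evaluate, rounds=50, delta=0.05, value_range=1.0):
    """Hoeffding-style racing sketch (minimisation): repeatedly sample
    a noisy evaluation of every surviving candidate and eliminate any
    whose lower confidence bound lies above the best upper bound."""
    samples = {c: [] for c in candidates}
    alive = set(candidates)
    for t in range(1, rounds + 1):
        for c in alive:
            samples[c].append(evaluate(c))
        # Hoeffding confidence half-width after t samples of a random
        # variable with range at most value_range.
        eps = value_range * math.sqrt(math.log(2.0 / delta) / (2.0 * t))
        means = {c: sum(samples[c]) / t for c in alive}
        best_upper = min(means[c] + eps for c in alive)
        alive = {c for c in alive if means[c] - eps <= best_upper}
        if len(alive) == 1:
            break
    return alive
```

Racing noisy fitness values of individuals, and racing parameter settings of an algorithm, are both instances of this loop; only the meaning of `candidates` and `evaluate` changes, which is what makes the module reusable for automatic parameter configuration.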