Guided Mutation: Estimation of Distribution algorithm + Genetic Algorithms   [Qingfu Zhang's homepage]

Basic Idea:

One of the key issues in the design of evolutionary algorithms is how to generate offspring. The proximate optimality principle, an underlying assumption in most (if not all) heuristics, assumes that good solutions have similar structure. This assumption is reasonable for most real-world problems; for example, the percentage of common edges between any two locally optimal solutions of a travelling salesman problem obtained by the Lin-Kernighan method is about 85% on average. Under this assumption, an ideal offspring generator should produce solutions that are close to the best solutions found so far.

Suppose the current population in an evolutionary algorithm with local search consists of the best locally optimal solutions found so far. A new solution generated by conventional mutation is close (similar) to its parent, but may be far away from other, better solutions, since mutation does not use any global information extracted from the current population. EDAs, by contrast, extract global statistical information from the previous search and represent it as a probability model that characterizes the distribution of promising solutions in the search space; new solutions are generated by sampling from this model. However, the location information of the locally optimal solutions found so far is not directly used in EDAs, so there is no mechanism to directly control the similarity between a new solution and a given solution. The idea behind the proposed operator, which we call guided mutation, is to combine the global statistical information with the location information of the solutions found so far, overcoming these shortcomings of GAs and EDAs.
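For binary-encoded problems, the combination described above can be sketched as follows: each bit of the offspring is either sampled from a univariate marginal model estimated from the population (the EDA-style global information) or copied directly from the parent (the location information). This is a minimal sketch, not the exact operator from any specific paper; the function names and the single control parameter `beta` (larger `beta` means less similarity to the parent) are illustrative assumptions.

```python
import random

def estimate_marginals(population):
    """Per-bit frequency of 1s in the population: a simple univariate
    probability model of the promising solutions found so far."""
    n = len(population)
    return [sum(ind[i] for ind in population) / n
            for i in range(len(population[0]))]

def guided_mutation(parent, p, beta):
    """Generate an offspring from `parent` (a list of 0/1 bits).

    With probability `beta` a bit is sampled from the marginal model `p`
    (global statistical information); otherwise it is copied from the
    parent (location information). `beta` in [0, 1] thus directly
    controls the similarity between the offspring and the parent.
    """
    child = []
    for i, bit in enumerate(parent):
        if random.random() < beta:
            child.append(1 if random.random() < p[i] else 0)  # sample from model
        else:
            child.append(bit)  # copy from parent
    return child
```

With `beta = 0` the offspring is an exact copy of the parent, and with `beta = 1` it is a pure sample from the EDA model, so the operator interpolates between GA-style mutation and EDA-style sampling.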

This paper uses an EA with guided mutation for tuning a heuristic for a complicated network problem. In most combinations of EAs and problem-specific heuristics, the EA works with the heuristic on the same search space. In this paper, by contrast, the EA is used to tune the control parameters of a problem-specific heuristic.

Abstract—Guided mutation uses the idea of estimation of distribution algorithms to improve conventional mutation operators. It combines global statistical information with the location information of good individual solutions to generate new trial solutions. This paper suggests using guided mutation in iterated local search. An experimental comparison between a conventional iterated local search (CILS) and an iterated local search with guided mutation has been conducted on four classes of test instances of the quadratic assignment problem.
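The scheme in the abstract can be sketched as an iterated local search whose perturbation step is a guided mutation: the current local optimum is perturbed by resampling some bits from a marginal model built over an archive of local optima, then local search is reapplied. This is a hedged toy sketch, assuming a binary onemax objective as a stand-in for the quadratic assignment problem used in the paper; the function names, the archive mechanism, and all parameter values (`beta`, `pop_size`, etc.) are illustrative assumptions, not the paper's actual algorithm.

```python
import random

def hill_climb(x, f):
    """First-improvement bit-flip local search on a 0/1 list, maximizing f."""
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = x[:]
            y[i] = 1 - y[i]
            if f(y) > f(x):
                x, improved = y, True
    return x

def ils_guided_mutation(f, n_bits=20, iters=30, beta=0.3, pop_size=10, seed=0):
    """Iterated local search with a guided-mutation perturbation: with
    probability beta each bit is resampled from marginals estimated over
    an archive of local optima, otherwise kept from the current best."""
    rng = random.Random(seed)
    best = hill_climb([rng.randint(0, 1) for _ in range(n_bits)], f)
    archive = [best]  # locally optimal solutions found so far
    for _ in range(iters):
        p = [sum(ind[i] for ind in archive) / len(archive)
             for i in range(n_bits)]
        trial = [(1 if rng.random() < p[i] else 0)
                 if rng.random() < beta else best[i]
                 for i in range(n_bits)]
        trial = hill_climb(trial, f)  # return to a local optimum
        archive.append(trial)
        archive = sorted(archive, key=f, reverse=True)[:pop_size]
        if f(trial) >= f(best):
            best = trial
    return best
```

A conventional ILS would instead perturb `best` with a fixed random move; here the perturbation is biased toward regions where good local optima have already been found, which is the point of the comparison with CILS.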

More papers on Guided Mutation can be found in my publication list.



Last updated March 2006, Q. Zhang