This paper proposes a new method for accelerating the search speed of genetic algorithms by incorporating derivative evaluation and conditional random selection into the evolution process. Derivative evaluation makes a genetic algorithm focus on individuals whose fitness increases rapidly. Like steepest-descent methods, this accelerates the search by enhancing exploitation, but it also raises the risk of premature convergence, in which most individuals approach local optima after only a few generations. On the other hand, derivative evaluation under premature convergence helps genetic algorithms escape local optima by enhancing exploration. When a GA falls into premature convergence, random selection is applied to help it escape the local optimum, although its effect by itself is not large. We evaluated our method on one combinatorial problem and five complex function optimization problems. The experimental results showed that our method outperformed the simple genetic algorithm, especially when the search space was large.
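As one possible reading of the abstract, the sketch below shows how derivative evaluation and conditional random selection could be wired into a simple genetic algorithm. The objective function (OneMax), the derivative weighting based on improvement over a parent's fitness, and the convergence threshold CONV_EPS are illustrative assumptions, not the operators defined in the paper.

```python
import random

POP_SIZE = 50
GENE_LEN = 20
MUT_RATE = 0.01
CONV_EPS = 1e-3  # hypothetical threshold for detecting premature convergence


def fitness(ind):
    # Placeholder objective: maximize the number of ones (OneMax).
    return float(sum(ind))


def evolve(generations=200):
    # Each entry is (individual, parent_fitness); the parent fitness serves as
    # the baseline for estimating a per-individual fitness "derivative".
    pop = [([random.randint(0, 1) for _ in range(GENE_LEN)], 0.0)
           for _ in range(POP_SIZE)]

    for _ in range(generations):
        fits = [fitness(ind) for ind, _ in pop]
        # Derivative evaluation: reward individuals whose fitness improved the
        # most over their parent, so rapidly improving lineages are exploited.
        scores = [f + max(f - base, 0.0) for f, (_, base) in zip(fits, pop)]

        # Conditional random selection: if fitness values have nearly collapsed
        # to a single value (premature convergence), pick parents uniformly at
        # random to restore exploration.
        converged = (max(fits) - min(fits)) < CONV_EPS

        def select():
            if converged:
                return random.randrange(POP_SIZE)
            total = sum(scores) or 1.0  # guard against an all-zero population
            r, acc = random.uniform(0, total), 0.0
            for i, s in enumerate(scores):
                acc += s
                if acc >= r:
                    return i
            return POP_SIZE - 1

        # Standard one-point crossover and bit-flip mutation.
        nxt = []
        while len(nxt) < POP_SIZE:
            i, j = select(), select()
            cut = random.randint(1, GENE_LEN - 1)
            child = pop[i][0][:cut] + pop[j][0][cut:]
            child = [g ^ 1 if random.random() < MUT_RATE else g for g in child]
            nxt.append((child, max(fits[i], fits[j])))  # parent fitness baseline
        pop = nxt

    best, _ = max(pop, key=lambda p: fitness(p[0]))
    return best
```

In this sketch the derivative term simply adds the fitness gain over the parent to the selection score; the paper's actual weighting and its convergence criterion may differ.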