Related fields
Evolutionary algorithms are a subfield of evolutionary computing.
• - Evolution strategies (ES, see Rechenberg, 1994) evolve individuals by means of mutation and intermediate or discrete recombination. ES algorithms are especially suited to problems in real-valued domains. They use self-adaptation to adjust the search control parameters. The de-randomization of self-adaptation has led to the contemporary Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
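Step-size control in evolution strategies can be illustrated with a minimal (1+1)-ES using the classic 1/5 success rule, a precursor of full self-adaptation. This is a sketch, not CMA-ES; the function name and the constants (window of 20, factors 1.5 and 0.82) are illustrative:

```python
import random

def one_plus_one_es(f, x, sigma=1.0, iters=200, seed=0):
    """Minimise f with a (1+1)-ES; step size sigma is adapted
    with the 1/5 success rule (a textbook sketch)."""
    rng = random.Random(seed)
    successes = 0
    for t in range(1, iters + 1):
        # single offspring: Gaussian mutation of the parent
        child = [xi + sigma * rng.gauss(0, 1) for xi in x]
        if f(child) < f(x):              # offspring replaces parent only if better
            x, successes = child, successes + 1
        if t % 20 == 0:                  # adjust sigma from the recent success rate
            rate = successes / 20
            sigma *= 1.5 if rate > 0.2 else 0.82
            successes = 0
    return x

best = one_plus_one_es(lambda v: sum(vi * vi for vi in v), [5.0, -3.0])
```

If more than one in five recent mutations succeeded, the step size grows (the search is too timid); otherwise it shrinks.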
• - Evolutionary programming (EP) involves populations of solutions with mutation and selection and arbitrary representations. It uses self-adaptation to adjust parameters, and may include other variation operations, such as combining information from multiple parents.
• - Estimation of distribution algorithms (EDA) replace traditional reproduction operators with model-guided operators. These models are learned from the population using machine learning techniques and represented as probabilistic graphical models, from which new solutions can be sampled [45][46] or generated from guided crossover [47].
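The learn-model-then-sample loop can be sketched with a minimal univariate EDA in the spirit of UMDA, where the "model" is simply one probability per bit. All parameter names and values below are illustrative:

```python
import random

def umda(fitness, n_bits=20, pop=60, top=20, gens=30, seed=0):
    """Maximise fitness over bitstrings with a univariate EDA:
    fit a per-bit probability model to the best individuals,
    then sample the next population from it (a minimal sketch)."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                       # probabilistic model: P(bit i == 1)
    best = None
    for _ in range(gens):
        # sample a whole population from the current model
        P = [[1 if rng.random() < p[i] else 0 for i in range(n_bits)]
             for _ in range(pop)]
        P.sort(key=fitness, reverse=True)
        if best is None or fitness(P[0]) > fitness(best):
            best = P[0]
        elite = P[:top]
        # re-estimate the model from the selected individuals,
        # keeping probabilities away from 0 and 1
        p = [min(0.95, max(0.05, sum(x[i] for x in elite) / top))
             for i in range(n_bits)]
    return best

best = umda(sum)   # OneMax: fitness = number of ones
```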
• - Gene expression programming (GEP) also uses populations of computer programs. These complex computer programs are encoded in simpler linear chromosomes of fixed length, which are then expressed as expression trees. The expression trees or computer programs evolve because the chromosomes undergo mutation and recombination in a manner similar to the canonical GA. Thanks to the special organization of GEP chromosomes, these genetic modifications always result in valid computer programs.[48]
• - Genetic programming (GP) is a related technique popularized by John Koza in which computer programs, rather than function parameters, are optimized. Genetic programming often uses tree-based internal data structures to represent the computer programs for adaptation instead of the list structures typical of genetic algorithms.
• - The grouping genetic algorithm (GGA) is an evolution of the GA in which the focus shifts from individual items, as in classical GAs, to groups or subsets of items.[49] The idea behind this evolution of the GA, proposed by Emanuel Falkenauer, is that the solution of some complex problems, such as clustering or partitioning problems where a set of items must be split into disjoint groups in an optimal way, would be better achieved by making characteristics of the groups of items equivalent to genes. These kinds of problems include bin packing, line balancing, clustering with respect to a distance measure, equal piles, etc., on which classical GAs have demonstrated poor performance. Making genes equivalent to groups implies chromosomes that are in general of variable length, and special genetic operators that manipulate whole groups of items. For bin packing in particular, a GGA hybridized with the Martello and Toth dominance criterion is arguably the best technique to date.
• - Interactive evolutionary algorithms are evolutionary algorithms that use human evaluation. They are typically applied to domains where it is difficult to design a computational fitness function, for example, evolving images, music, artistic designs, and shapes to fit the aesthetic preference of users.
• - Ant colony optimization (ACO) uses many ants (or agents) equipped with a pheromone model to traverse the solution space and find locally productive areas. It is considered an estimation of distribution algorithm.
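The pheromone mechanism can be sketched with a basic ant system on a tiny travelling-salesman instance. This is a sketch, not a tuned solver; the parameter names alpha, beta, and rho follow common textbook usage, and the four-city instance is a hypothetical example:

```python
import math
import random

def aco_tsp(dist, ants=10, iters=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Minimise tour length with a basic ant system (a sketch)."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]              # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            tour = [rng.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                # transition probability proportional to tau^alpha * (1/d)^beta
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                tour.append(rng.choices(cand, weights=w)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone inversely to tour length
        tau = [[t * (1 - rho) for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += 1.0 / length
                tau[j][i] += 1.0 / length
    return best_tour, best_len

# hypothetical instance: four cities on a unit square (optimal tour length 4)
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(a, b) for b in pts] for a in pts]   # diagonal is never read
best_tour, best_len = aco_tsp(dist)
```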
• - Particle swarm optimization (PSO) is a computational method for multiparametric optimization that also uses a population-based approach. A population (swarm) of candidate solutions (particles) moves in the search space, and the motion of the particles is influenced by both their best-known position and the best-known global position of the swarm. Like genetic algorithms, the PSO method depends on the exchange of information between members of the population. In some problems, PSO is often more computationally efficient than GAs, especially in unconstrained problems with continuous variables.
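The particle motion described above can be sketched with the standard velocity and position update. The inertia weight w and acceleration coefficients c1 and c2 are common textbook defaults, not canonical values:

```python
import random

def pso(f, dim=2, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise f with a basic particle swarm (a sketch)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in X]            # each particle's best-known position
    gbest = min(pbest, key=f)[:]         # swarm's best-known position
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity blends inertia, pull toward the particle's own best,
                # and pull toward the swarm's best
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(pbest[i]):
                pbest[i] = X[i][:]
                if f(X[i]) < f(gbest):
                    gbest = X[i][:]
    return gbest

best = pso(lambda v: sum(vi * vi for vi in v))
```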
• - The memetic algorithm (MA), often called the hybrid genetic algorithm among others, is a population-based method in which solutions are also subject to local improvement phases. The idea of memetic algorithms comes from memes, which, unlike genes, can adapt themselves. In some problem areas they are shown to be more efficient than traditional evolutionary algorithms.
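The combination of a GA with a local improvement phase can be sketched as follows; the operator choices (averaging crossover, Gaussian mutation, random-restart hill climbing) are illustrative, not prescribed by the method:

```python
import random

def memetic(f, dim=3, pop=20, gens=40, seed=0):
    """Minimise f with a GA whose offspring each pass through a
    local hill-climbing phase (a minimal sketch)."""
    rng = random.Random(seed)

    def local_search(x, steps=10, step=0.2):
        # local improvement phase: accept only downhill moves
        for _ in range(steps):
            y = [xi + rng.uniform(-step, step) for xi in x]
            if f(y) < f(x):
                x = y
        return x

    P = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=f)
        children = []
        while len(children) < pop // 2:
            a, b = rng.sample(P[:pop // 2], 2)   # parents from the better half
            child = [(ai + bi) / 2 + rng.gauss(0, 0.1) for ai, bi in zip(a, b)]
            children.append(local_search(child))  # the "memetic" step
        P = P[:pop - len(children)] + children    # worse half replaced
    return min(P, key=f)

best = memetic(lambda v: sum(vi * vi for vi in v))
```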
• - Bacteriological algorithms (BA) are inspired by evolutionary ecology and, more specifically, bacteriological adaptation. Evolutionary ecology is the study of living organisms in the context of their environment, with the goal of discovering how they adapt. Its basic concept is that in a heterogeneous environment there is no single individual that fits the entire environment, so one has to reason at the population level. It is also believed that BAs could be successfully applied to complex positioning problems (cell phone antennas, urban planning, etc.) or data mining.[52]
• - The cultural algorithm (CA) consists of the population component almost identical to that of the genetic algorithm and, in addition, a knowledge component called the belief space.
• - The differential search (DS) algorithm is inspired by the migration of superorganisms.
• - Gaussian adaptation (normal or natural adaptation, abbreviated NA to avoid confusion with GA) is intended to maximize the manufacturing yield of signal processing systems. It can also be used for ordinary parametric optimization. It is based on a certain theorem valid for all regions of acceptability and all Gaussian distributions. The efficiency of NA relies on information theory and a certain theorem of efficiency: its efficiency is defined as information divided by the work needed to obtain that information. Because NA maximizes mean fitness rather than individual fitness, the landscape is smoothed such that valleys between peaks may disappear. It therefore has a certain "ambition" to avoid local peaks in the fitness landscape. NA is also good at climbing sharp ridges by adapting the moment matrix, because NA can maximize the disorder (average information) of the Gaussian while simultaneously keeping the mean fitness constant.
• - Simulated annealing (SA) is a related global optimization technique that traverses the search space by testing random mutations in an individual solution. A mutation that increases fitness is always accepted. A mutation that decreases fitness is accepted probabilistically based on the difference in fitness and a decreasing temperature parameter. In SA jargon, we talk about aiming for lowest energy rather than maximum fitness. SA can also be used within a standard GA algorithm starting with a relatively high mutation rate and decreasing over time along a given schedule.
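The acceptance rule described above, where a worsening mutation is accepted with probability exp(-delta/T) under a decreasing temperature, can be sketched as follows (the cooling factor and step size are illustrative):

```python
import math
import random

def simulated_annealing(f, x, iters=5000, t0=10.0, seed=0):
    """Minimise f ("energy"). Downhill moves are always accepted;
    uphill moves with probability exp(-delta / T) (a sketch)."""
    rng = random.Random(seed)
    fx, T = f(x), t0
    for _ in range(iters):
        y = [xi + rng.gauss(0, 0.5) for xi in x]   # random mutation
        fy = f(y)
        delta = fy - fx
        if delta < 0 or rng.random() < math.exp(-delta / T):
            x, fx = y, fy
        T *= 0.999                                  # geometric cooling schedule
    return x, fx

best, e = simulated_annealing(lambda v: sum(vi * vi for vi in v), [4.0, 4.0])
```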
• - Tabu search (TS) is similar to simulated annealing in that both traverse the solution space testing mutations of an individual solution. While simulated annealing generates only one mutated solution, tabu search generates many mutated solutions and moves to the solution with the lowest energy of those generated. In order to avoid loops and encourage further movement through the solution space, a tabu list of partial or complete solutions is maintained. It is prohibited to move to a solution that contains elements of the tabu list, which is updated as the solution traverses the solution space.
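The neighbour generation and the tabu list can be sketched on bitstrings, where a move flips one bit and recently flipped bit positions are forbidden. The energy function in the example (Hamming distance to a hidden target) is hypothetical:

```python
from collections import deque

def tabu_search(energy, n_bits=16, tenure=5, iters=100):
    """Minimise energy over bitstrings: evaluate all single-bit-flip
    neighbours, move to the best non-tabu one, and remember recently
    flipped bits in a fixed-length tabu list (a sketch)."""
    x = [0] * n_bits
    tabu = deque(maxlen=tenure)          # recently flipped bit positions
    best, best_e = x[:], energy(x)
    for _ in range(iters):
        moves = []
        for i in range(n_bits):
            if i in tabu:
                continue                 # forbidden move
            y = x[:]
            y[i] ^= 1
            moves.append((energy(y), i, y))
        if not moves:
            break
        e, i, x = min(moves)             # best non-tabu neighbour, even if worse
        tabu.append(i)
        if e < best_e:
            best, best_e = x[:], e
    return best, best_e

# hypothetical example: energy = Hamming distance to a hidden target string
target = [1, 0] * 8
best, e = tabu_search(lambda y: sum(a != b for a, b in zip(y, target)))
```

Note that the search always moves, even to a worse neighbour; the tabu list is what prevents it from immediately cycling back.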
• - Extremal optimization (EO), unlike GAs, which work with a population of candidate solutions, develops a single solution and makes local modifications to its worst components. This requires that an appropriate representation be selected that allows individual solution components to be assigned a measure of quality ("fitness"). The governing principle behind this algorithm is that of emergent improvement by selectively removing low-quality components and replacing them with randomly selected components. This is decidedly at odds with a GA, which selects good solutions in an attempt to make better solutions.
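The worst-component replacement can be sketched as follows; the per-component quality function in the example (closeness of each value to 0.5) is a hypothetical illustration:

```python
import random

def extremal_optimization(comp_fitness, n=12, iters=300, seed=0):
    """Improve a single solution by repeatedly replacing its worst
    component with a random value (basic EO; comp_fitness scores one
    component in the context of the whole solution -- a sketch)."""
    rng = random.Random(seed)
    x = [rng.random() for _ in range(n)]
    best, best_q = x[:], sum(comp_fitness(x, i) for i in range(n))
    for _ in range(iters):
        worst = min(range(n), key=lambda i: comp_fitness(x, i))
        x[worst] = rng.random()          # unconditional replacement of the worst
        q = sum(comp_fitness(x, i) for i in range(n))
        if q > best_q:                   # track the best full solution seen
            best, best_q = x[:], q
    return best, best_q

# hypothetical component quality: closeness of each value to 0.5
best, q = extremal_optimization(lambda x, i: 1.0 - abs(x[i] - 0.5))
```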
• - The cross-entropy (CE) method generates candidate solutions using a parameterized probability distribution. The parameters are updated via cross-entropy minimization so as to generate better samples in the next iteration.
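With an independent Gaussian per dimension, the CE parameter update reduces to refitting the mean and standard deviation to the elite samples. Population size, elite count, and initial parameters below are illustrative:

```python
import random
import statistics

def cross_entropy(f, dim=2, pop=50, elite=10, iters=40, seed=0):
    """Minimise f: sample candidates from a Gaussian per dimension,
    then refit mean and standard deviation to the elite samples
    (a minimal sketch of the CE method)."""
    rng = random.Random(seed)
    mu, sigma = [0.0] * dim, [5.0] * dim
    for _ in range(iters):
        X = [[rng.gauss(mu[d], sigma[d]) for d in range(dim)] for _ in range(pop)]
        X.sort(key=f)
        top = X[:elite]                   # best samples this round
        for d in range(dim):
            vals = [x[d] for x in top]
            mu[d] = statistics.fmean(vals)
            sigma[d] = statistics.stdev(vals) + 1e-6  # avoid collapse to zero
    return mu

best = cross_entropy(lambda v: sum((vi - 3.0) ** 2 for vi in v))
```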
• - Reactive search optimization (RSO) advocates the integration of sub-symbolic learning techniques into search heuristics to solve complex optimization problems. The word reactive suggests a rapid response to events during searching through an internal online feedback loop for self-tuning of critical parameters. Methodologies of interest for reactive search include machine learning and statistics, particularly reinforcement learning, active or query learning, neural networks, and metaheuristics.
• - Neural networks.
• - Fuzzy logic.
• - Bayesian network.
• - Local minimum/maximum.
• - Artificial Intelligence.
• - Evolutionary robotics.
• - Emergence (philosophy).
• - Cellular automaton.
• - Santillán Code.
• - Banzhaf, Wolfgang; Nordin, Peter; Keller, Robert; Francone, Frank (1998). Genetic Programming - An Introduction. San Francisco, CA: Morgan Kaufmann. ISBN 978-1558605107.
• - Bies, Robert R.; Muldoon, Matthew F.; Pollock, Bruce G.; Manuck, Steven; Smith, Gwenn; Sale, Mark E. (2006). "The Genetic Algorithm-Based, Hybrid Machine Learning Approach to Model Selection." Journal of Pharmacokinetics and Pharmacodynamics. Netherlands: Springer: 196-221.
• - Carmona, Enrique J.; Fernández, Severino (2020). Fundamentals of Evolutionary Computation. Marcombo. ISBN 978-8426727558.
• - Cha, Sung-Hyuk; Tappert, Charles C. (2009). "Genetic Algorithm for Constructing Compact Binary Decision Trees." Journal of Pattern Recognition Research. 4 (1): 1-13. doi:10.13176/11.44.
• - Eiben, Agoston; Smith, James (2003). Introduction to Evolutionary Computing. Springer. ISBN 978-3540401841.
• - Fraser, Alex S. (1957). "Simulation of Genetic Systems by Automatic Digital Computers, I. Introduction." Australian Journal of Biological Sciences. 10: 484-491.
• - Goldberg, David (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Reading, MA: Addison-Wesley Professional. ISBN 978-0201157673.
• - Goldberg, David (2002). The Design of Innovation: Lessons from and for Competent Genetic Algorithms. Norwell, MA: Kluwer Academic Publishers. ISBN 978-1402070983.
• - Fogel, David (2006). Evolutionary Computation: Toward a New Philosophy of Machine Intelligence (3rd ed.). Piscataway, NJ: IEEE Press. ISBN 978-0471669517.
• - Hingston, Philip; Barone, Luigi; Michalewicz, Zbigniew (2008). Design by Evolution: Advances in Evolutionary Design. Springer. ISBN 978-3540741091.
• - Holland, John (1992). Adaptation in Natural and Artificial Systems. Cambridge, MA: MIT Press. ISBN 978-0262581110.
• - Koza, John (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. Cambridge, MA: MIT Press. ISBN 978-0262111706.
• - Michalewicz, Zbigniew (1996). Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag. ISBN 978-3540606765.
• - Mitchell, Melanie (1996). An Introduction to Genetic Algorithms. Cambridge, MA: MIT Press. ISBN 9780585030944.
• - Poli, R.; Langdon, W.B.; McPhee, N.F. (2008). A Field Guide to Genetic Programming. Lulu.com, freely available from the internet. ISBN 978-1-4092-0073-4.
• - Rechenberg, Ingo (1994). Evolutionsstrategie '94. Stuttgart: Frommann-Holzboog.
• - Schmitt, Lothar M; Nehaniv, Chrystopher L; Fujii, Robert H (1998), Linear analysis of genetic algorithms, Theoretical Computer Science 208: 111-148.
• - Schmitt, Lothar M (2001), Theory of Genetic Algorithms, Theoretical Computer Science 259: 1-61.
• - Schmitt, Lothar M (2004), Theory of Genetic Algorithms II: models for genetic operators over the string-tensor representation of populations and convergence to global optimality for arbitrary fitness functions under scaling, Theoretical Computer Science 310: 181-231.
• - Schwefel, Hans-Paul (1974). Numerische Optimierung von Computer-Modellen (PhD thesis). Reprinted by Birkhäuser (1977).
• - Vose, Michael (1999). The Simple Genetic Algorithm: Foundations and Theory. Cambridge, MA: MIT Press. ISBN 978-0262220583.
• - Whitley, Darrell (1994). "A genetic algorithm tutorial." Statistics and Computing. 4 (2): 65-85. doi:10.1007/BF00175354.