A short history of Evolutionary Algorithms

Human beings once had a tail like dogs and cats. This trait has disappeared over time. Nowadays, it survives as a slightly more prominent bone at the bottom of the spine in some of us. 

17% of the world population has blue eyes. In Western countries, between 10 and 15% of people are predominantly left-handed. 

All of these are genetic characteristics that result from years of biological evolution. 

Natural Selection

According to Darwin, all species arise and develop through a selective process: the natural selection of small, inherited variations that increase an individual’s ability to compete, survive, and reproduce. 

And this same process applies to evolutionary algorithms. 

Natural selection starts with the survival of the fittest individuals in a population. These individuals produce descendants that inherit their parents’ features, which are passed on, in turn, to the next generation. 

A genetic algorithm (GA) is a search heuristic inspired by Charles Darwin’s natural evolution theory. This type of evolutionary algorithm reflects the process of natural selection. 

In this context, the fittest individuals (candidate solutions) are selected for reproduction. The goal is to produce fitter offspring for the next generation. Genetic algorithms are a subset of evolutionary algorithms. 

For humans, natural selection refers to the differential survival and reproduction of individuals due to differences in phenotype, i.e., the observable expression of their genes. It is a crucial mechanism of biological evolution. Heritable changes in the traits of a population carry over from one generation to the next. 

Indeed, variation exists within all populations of living beings. It occurs because random mutations arise in an individual organism’s genome, and their descendants can inherit such mutations. 

Throughout individuals’ lives, their genomes interact with their environments to produce variations. Individuals carrying successful variants of a trait tend to survive and reproduce better than those carrying less successful variants. As a consequence, the population evolves. 

The premise of an evolutionary algorithm (EA) is quite accessible if you are familiar with the process of natural selection. An evolutionary algorithm comprises four overall steps: initialization, selection, genetic operators, and termination.
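The four steps can be sketched as a short loop. This is a minimal illustration, not a prescribed implementation: the bit-string genomes, truncation-style selection, and parameter values below are assumptions made for the example.

```python
import random

def initialize(pop_size, genome_len):
    # Step 1: initialization — a random population of bit-string genomes.
    return [[random.randint(0, 1) for _ in range(genome_len)]
            for _ in range(pop_size)]

def evolve(fitness, pop_size=20, genome_len=8, generations=50):
    population = initialize(pop_size, genome_len)
    for _ in range(generations):  # Step 4: terminate after a fixed budget
        # Step 2: selection — keep the fitter half as parents.
        parents = sorted(population, key=fitness, reverse=True)[:pop_size // 2]
        # Step 3: genetic operators — crossover, then occasional mutation.
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)
            child = a[:cut] + b[cut:]          # single-point crossover
            if random.random() < 0.1:          # mutation: flip one random bit
                i = random.randrange(genome_len)
                child[i] = 1 - child[i]
            children.append(child)
        population = children
    return max(population, key=fitness)
```

With `fitness=sum`, this maximizes the number of 1-bits in the genome (the classic "OneMax" toy problem), and the loop converges to an all-ones genome within a few dozen generations.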

Population and Gene Selection

Natural selection favors some phenotypes. Over long periods of evolution, this process can lead populations to specialize in particular biological niches and may ultimately result in the appearance of new species. In other words, natural selection is an essential process in the biological evolution of a population.

If the parents are fit, their offspring tend to be fitter and have a better chance of surviving. This process keeps iterating, and in the end, we obtain a generation of the fittest individuals.

We wish to find the combination of elements that maximizes the resulting solution’s performance. We accept a final solution in two cases: either once we have run the algorithm for some maximum number of iterations, or once it has reached a given performance level. This stopping strategy is common, though it is not the only way to terminate a genetic or evolutionary algorithm.

How do we evaluate the performance of an evolutionary algorithm? We use a function that objectively measures the quality of each candidate solution. It is called a fitness function. 

It is a particular kind of objective function: it summarizes how close a given solution is to achieving the set aims. In genetic programming and genetic algorithms, fitness functions evaluate how far an offspring is from an optimal solution.
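As a concrete sketch, here is one possible fitness function for a toy problem where candidate solutions are strings and the goal is to match a target word. The target word and the scoring scheme (fraction of matching positions) are illustrative assumptions.

```python
def fitness(candidate, target="earth"):
    # Fraction of positions where the candidate matches the target;
    # 1.0 means the candidate equals the target exactly, 0.0 means
    # no position matches. Higher is fitter.
    matches = sum(c == t for c, t in zip(candidate, target))
    return matches / len(target)
```

For example, `fitness("earth")` returns `1.0`, while a candidate differing in one position scores `0.8`.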

Now let’s take a look at genetic algorithms. A ‘Genetic Algorithm’ (GA) is a search-based optimization technique. It can find near-optimal solutions to ‘difficult’ problems that would otherwise take a long time to solve. 

A “pool” or a “population” of possible solutions to a given problem is generated randomly. A fitness value is computed for each randomly generated solution. 

Some of the solutions generated during this first generation will become parents for the next generation. Therefore, we apply a parent selection mechanism to the population. 

Selection enables breeding, also known as recombination (“crossover”), to produce new children (“the offspring”). Each individual (or candidate solution) in this candidate parent population is assigned a fitness value, and the fitter individuals are given a higher chance to mate and generate fitter offspring. 
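One common way to give fitter individuals a higher chance to mate is fitness-proportionate ("roulette-wheel") selection, paired here with single-point crossover. Genomes are represented as lists for illustration; the function names are this sketch's own.

```python
import random

def roulette_select(population, fitnesses):
    # Fitness-proportionate selection: each individual is picked with
    # probability proportional to its fitness, so fitter individuals
    # mate more often but weaker ones are not excluded outright.
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, f in zip(population, fitnesses):
        running += f
        if running >= pick:
            return individual
    return population[-1]

def crossover(parent_a, parent_b):
    # Single-point crossover: the child inherits a prefix from one
    # parent and the remaining suffix from the other.
    cut = random.randrange(1, len(parent_a))
    return parent_a[:cut] + parent_b[cut:]
```

Because the cut point always lies strictly inside the genome, every child carries genetic material from both parents.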

A mutation is a random variation imposed on some individuals. It contributes additional ‘variability.’ As a result of both breeding and mutation, a new generation of solutions is produced, and the same steps are applied again: selection, breeding/crossover, and mutation. 
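A mutation operator for bit-string genomes can be as simple as flipping each bit with a small probability. The representation and the default rate below are illustrative assumptions.

```python
import random

def mutate(genome, rate=0.05):
    # With probability `rate`, flip each bit independently — this injects
    # the random variability that keeps the population from converging
    # prematurely on one mediocre solution.
    return [1 - bit if random.random() < rate else bit for bit in genome]
```

A rate of 0 leaves the genome untouched, while a rate of 1 flips every bit; practical rates are usually small.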

At each generation, a fitness value is computed for each member of the population. The process stops in two cases: when a given number of generations has been produced, or when a predefined performance threshold (fitness) is reached. 

An Example of Evolutionary Algorithms

Let’s take an analogy. We want to create a 3D model of the earth using genetic and evolutionary algorithms. Initially, we generate nine random 3D shapes: three cubes (a small one, a medium one, and a big one), three spheres (small, medium, and big), and three stars (small, medium, and big). 

The fitness function measures the difference between the actual 3D model of the earth and those nine initial models. Let’s assume further that the three spheres are retained for breeding. They generate ten spheres of variable sizes. 

A mutation is imposed on the spheres, adding the color green to some of them. In this second generation, the fitness function selects the biggest spheres with the green color mutation (because green is closer to the earth’s blue).

Crossover (or breeding) happens between the chosen parents. In the next generation, only giant spheres are generated. A mutation is introduced that transforms the green spheres into blue ones. 

At this stage, one of the offspring may be a big blue sphere close enough to a 3D representation of the earth. The algorithm stops: we have a model for generating a 3D version of the planet, and we can use this model to ‘predict’ the planet’s future state.

Evolutionary algorithms are ideally suited for optimization, classification, prediction, and evaluation problems. As a subset of Machine Learning, itself a subset of Artificial Intelligence, this optimization method is efficient in solving complex problems. It is quite an old discipline in computer science. 

Genetic Algorithms (GA) and Evolutionary Strategies were defined in the 1970s, while Genetic Programming appeared in the early 1990s. TADA, the Machine Learning platform from MyDataModels, provides all the sophistication and complexity of evolutionary algorithms packaged in an easy-to-use tool.







