Introduction

Let’s learn about heuristic and population methods.


The theory of evolution states that the different forms of life we see today are the result of millions of years of evolution, mutation, and adaptation. The specimens whose mutations gave them an advantage over the others prevailed, and those mutations were passed on to the next generation. That’s how evolution explains the existence of complex life forms like ourselves.

We’re not going to start a debate about the theory of evolution, but this is an idea that’s used in the world of optimization. Evolution, as it’s described in Charles Darwin’s theory, is not a carefully designed process that should converge to the organisms we see today. It has no requirements to ensure convergence. It’s just a process that goes on for millions of years and produces a result as a consequence.

In optimization, we can do the same. Sometimes the functions are very complicated or even unknown, so we can’t guarantee any requirements or behavior. We can’t guarantee convergence. Instead, we simply try a lot of candidate points and see whether we’re lucky enough to find the optimal solution.

That seems pretty outlandish. There are too many candidate points (maybe even infinitely many), and perhaps just one of them is the solution. Exploring all of them is infeasible. But we’re not going to explore naively. We’ll mimic the theory of evolution: good candidates will prevail and produce even better candidates for the next generation of points. This way, we hope that at some point in that simulated evolution, we find the solution we’re looking for.
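To make the idea concrete, here is a minimal sketch of a population-based search in Python. It is an illustration under assumed choices, not the method taught later in the course: the objective function `f`, the population size, the mutation scale, and the number of generations are all placeholder values.

```python
import random

def f(x):
    # Example objective to minimize; its minimum is at x = 3 (illustrative choice).
    return (x - 3) ** 2

def evolve(pop_size=50, generations=100, mutation_scale=0.5):
    # Start with many random candidate points.
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Good candidates prevail: keep the better half of the population.
        population.sort(key=f)
        survivors = population[: pop_size // 2]
        # Survivors produce the next generation by mutating slightly.
        children = [x + random.gauss(0, mutation_scale) for x in survivors]
        population = survivors + children
    # Return the best candidate found so far.
    return min(population, key=f)

print(evolve())  # Prints a value close to 3 for this example function.
```

Running this should return a value near 3, the minimizer of the example function, even though the scheme offers no convergence guarantee. That lack of guarantees, traded for broad applicability, is exactly the spirit of the methods we’ll study.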
