1 minute read

Metaheuristics are general procedures that can be applied to a wide variety of problems. They are not problem-specific and often have a rather simple structure: a loop of sampling and selection, as given in Algorithm 1. At each iteration $i$, they maintain a set $S_i$ of interesting candidate solutions $x\in S_i$ from the solution space $\mathbb{X}$. They use these selected solutions in one way or another to sample a set $N_i$ of new solutions. This may happen via a unary search operator, via a binary operator, by updating some statistical model and then sampling from it, or by any other imaginable method. Either way, we get a set $N_i$ of new solutions. Then, $S_i$ and $N_i$ are combined into a set $P_i=S_i\cup N_i$, and the set $S_{i+1}\subseteq P_i$ for the next iteration is chosen from it. Normally, the better a solution $x\in P_i$ is relative to the other members of $P_i$, i.e., the smaller its objective value $f(x)$, the higher its chance to be selected into $S_{i+1}$. This is how trial-and-error works:

Algorithm 1. The normal cycle of metaheuristic algorithms.
Sample set $S_1$ of initial solutions from the solution space $\mathbb{X}$.
For $i$ from $1$ to $\dots$
 Create set $N_i$ of new solutions based on $S_i$.
 $P_i\gets S_i\cup N_i$.
 Select set $S_{i+1}$ from $P_i$ according to some policy.
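The cycle of Algorithm 1 could be sketched in Python roughly as follows. The concrete operators (initial sampling, creation of new solutions, selection policy) and the toy minimization instance below are hypothetical placeholders, not taken from any specific algorithm:

```python
import random

def metaheuristic(sample_initial, create_new, select, f, iterations):
    """Sketch of the sample-and-select cycle of Algorithm 1."""
    S = sample_initial()        # S_1: initial solutions from the solution space
    for _ in range(iterations):
        N = create_new(S)       # N_i: new solutions sampled based on S_i
        P = S + N               # P_i = S_i ∪ N_i
        S = select(P, f)        # S_{i+1} ⊆ P_i, biased toward small f(x)
    return min(S, key=f)        # best solution in the final set

# hypothetical toy instantiation: minimize f(x) = x² over the integers
random.seed(0)
result = metaheuristic(
    sample_initial=lambda: [random.randint(-100, 100) for _ in range(10)],
    create_new=lambda S: [x + random.choice((-1, 1)) for x in S],
    select=lambda P, f: sorted(P, key=f)[:10],   # truncation selection
    f=lambda x: x * x,
    iterations=200,
)
```

Truncation selection is just one possible policy here; any scheme that prefers solutions with smaller objective values fits the same loop.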

Many different algorithms follow this pattern. The most common sub-fields are

Posts

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm – randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the FRLS. We finally list some properties of FRLS as well as some related works.
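As a rough sketch of the idea (not the post's reference implementation), an RLS with FFA could look as follows: a frequency table counts how often each objective value has been encountered so far, and the new solution is accepted if its objective value is not more frequent than that of the current one. The OneMin toy problem below (minimize the number of 1-bits) is a made-up example:

```python
import random
from collections import Counter

def frls(f, mutate, x0, steps):
    """Sketch of randomized local search with Frequency Fitness Assignment.

    Solutions are compared not by their objective values f directly, but by
    how often those objective values have been seen so far: rarer wins.
    """
    H = Counter()                 # encounter frequencies of objective values
    x, y = x0, f(x0)
    best_x, best_y = x, y
    for _ in range(steps):
        x2 = mutate(x)
        y2 = f(x2)
        if y2 < best_y:           # the best-so-far solution is tracked separately
            best_x, best_y = x2, y2
        H[y] += 1                 # update the frequencies of both values ...
        H[y2] += 1
        if H[y2] <= H[y]:         # ... and accept if the new one is not more frequent
            x, y = x2, y2
    return best_x, best_y

# toy usage: minimize the number of 1-bits in a bit string (OneMin)
random.seed(1)
n = 16

def flip_one(bits):
    i = random.randrange(len(bits))
    return bits[:i] + (1 - bits[i],) + bits[i + 1:]

best, value = frls(f=sum, mutate=flip_one,
                   x0=tuple(random.randint(0, 1) for _ in range(n)),
                   steps=2000)
```

Note that the acceptance decision never looks at which objective value is smaller, only at which is rarer; this is what makes FFA so different from ordinary local search.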

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact / globally best solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances. And this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver is exponential in $n$, e.g., on the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of problems with tens of thousands of nodes and approximate the solution of million-node problems with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime to find the exact (optimal) solution is still exponential.
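To make the exponential cost of exact solving tangible, here is a minimal brute-force exact TSP solver that enumerates all $(n-1)!$ round trips; the 4-city distance matrix is a made-up toy instance:

```python
from itertools import permutations

def tsp_brute_force(dist):
    """Exact TSP by enumerating all (n-1)! tours starting from city 0.

    Illustrative only: the factorial enumeration makes the exponential
    worst-case cost of exact solving tangible. dist is a symmetric
    distance matrix.
    """
    n = len(dist)
    best_tour, best_len = None, float("inf")
    for perm in permutations(range(1, n)):       # fix city 0 as the start
        tour = (0,) + perm
        length = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# hypothetical toy instance with 4 cities
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, length = tsp_brute_force(dist)   # → tour (0, 1, 3, 2) of length 18
```

Already at $n=20$ this enumeration would require about $1.2\cdot 10^{17}$ tours, which is why state-of-the-art exact solvers use much cleverer techniques such as branch-and-cut.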

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other global search algorithms in general, statements like the following may be made: "If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search." I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem knowledge).

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This might divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies used in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts, hoping to contribute to a fruitful debate.

Publications