1 minute read

Optimization is maybe the widest possible term for my research field. It comprises all sorts of techniques and problems. Basically, whenever we ask for a superlative, we have an optimization problem at hand. Find the shortest round-trip tour through $n$ cities? That’s an optimization problem. Assign jobs to machines such that all customer orders are completed as quickly as possible? That’s an optimization problem, too. If you ask me, basically all machine learning and AI tasks can be classified as optimization problems as well.

In this very wide field, many algorithms have emerged. As a sub-field, we have metaheuristics, which can themselves be subdivided into several families of methods. Then there are exact methods and many other approaches.

All of them have in common that they exist to tackle some of the many problem classes that emerge in optimization, including the QAP, the BPP, and a variety of other tasks. Of course, optimization also applies to many fields in engineering and design.

Posts

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm – randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the FRLS algorithm. We finally list some properties of FRLS as well as some related works.
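The core idea can be sketched in a few lines. This is a minimal illustration, not the post's implementation: it assumes a bit-string minimization problem (here, counting zeros) and the commonly published FRLS rule, under which a solution is compared not by its objective value but by how often that objective value has been encountered so far.

```python
import random
from collections import defaultdict

def f(x):
    """Toy objective to minimize: the number of zeros in the bit string."""
    return x.count(0)

def frls(n=16, steps=5000, seed=1):
    """Randomized local search with Frequency Fitness Assignment (sketch)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]   # random initial solution
    H = defaultdict(int)                        # encounter counts per objective value
    best = x[:]
    for _ in range(steps):
        y = x[:]
        y[rng.randrange(n)] ^= 1                # RLS move: flip one random bit
        H[f(x)] += 1                            # FFA: update frequencies of both
        H[f(y)] += 1                            # the current and the new value
        if H[f(y)] <= H[f(x)]:                  # accept the less-often-seen value
            x = y
        if f(x) < f(best):                      # remember the best solution found
            best = x[:]
    return best

print(f(frls()))
```

Note that the acceptance decision never looks at $f$ directly, only at the frequency table `H`, which is what makes FFA-based algorithms invariant under order-preserving transformations of the objective function.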

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm for finding the exact / globally best solution for all possible TSP instances needs a time that grows exponentially with $n$ on worst-case instances. And this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of the known exact TSP solvers grows on the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of nodes and approximate the solution of million-node instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime to find the exact (optimal) solution is still exponential.
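To see why exact solving scales so badly, here is the naive exact approach on a made-up five-city instance (the coordinates below are purely illustrative): fix one city as the start and enumerate all $(n-1)!$ orderings of the rest. The factorial enumeration is exactly the exponential blow-up described above.

```python
import itertools
import math

# Hypothetical instance: five cities as points in the plane.
cities = [(0, 0), (3, 0), (3, 4), (0, 4), (1, 2)]

def tour_length(order):
    """Length of the closed tour visiting cities in the given order."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def brute_force_tsp(n):
    """Exact solver: try all (n-1)! tours that start at city 0."""
    rest = min(itertools.permutations(range(1, n)),
               key=lambda perm: tour_length((0,) + perm))
    tour = (0,) + rest
    return tour, tour_length(tour)

tour, length = brute_force_tsp(len(cities))
print(tour, round(length, 2))
```

Already at $n = 20$ this would mean checking $19! \approx 10^{17}$ tours, which is why practical exact solvers rely on much more sophisticated techniques such as branch-and-cut.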

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or metaheuristic global search algorithms in general, statements like the following may be made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem knowledge).

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.

Publications