Optimization
Optimization is perhaps the broadest possible term for my research field. It comprises all sorts of techniques and problems. Basically, whenever we ask for a superlative, we have an optimization problem at hand. Find the shortest round-trip tour through $n$ cities? That’s an optimization problem. Assign jobs to machines such that all customer orders are completed as quickly as possible? That’s an optimization problem. If you ask me, basically all machine learning and AI tasks can be classified as optimization problems, too.
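To make the "shortest round-trip tour" question concrete, here is a minimal sketch that poses it as an optimization problem and solves it by brute force for a tiny instance. The distance matrix is made up purely for illustration.

```python
# Brute-force search for the shortest round trip through 4 cities.
# The distance matrix below is a made-up illustrative example.
from itertools import permutations

dist = [  # symmetric distances between 4 cities
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
n = len(dist)

def tour_length(tour):
    """Length of a round trip visiting the cities in the given order."""
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

# Enumerate all tours starting at city 0 and keep the shortest one.
best = min(permutations(range(1, n)), key=lambda p: tour_length((0, *p)))
print((0, *best), tour_length((0, *best)))  # shortest tour and its length
```

Fixing city 0 as the start removes the rotational symmetry of tours; even so, the number of candidate tours grows factorially with $n$, which is exactly why such problems are interesting.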
In this very wide field, many algorithms have emerged. As a sub-field, we have metaheuristics, which can be further subdivided into, for instance, local search (including, e.g., RLS, SA, and Tabu Search), evolutionary computation methods (such as EAs, GAs, and MAs), and swarm intelligence methods (most prominently, ACO and PSO). Then there are exact methods like branch and bound, as well as many other approaches.
All of them have in common that they exist to tackle some classes of the many problems that emerge in optimization, including the TSP, QAP, TTP, BPP, CARP, VRP, and a variety of other packing, scheduling, and logistics tasks. Of course, optimization applies also to many fields in engineering and design.
Posts
Plugging Frequency Fitness Assignment into Metaheuristics
Frequency Fitness Assignment (FFA) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern underlying metaheuristics. We then discuss the simplest local search algorithm – randomized local search, or RLS for short. We plug FFA into this algorithm and obtain FRLS. Finally, we list some properties of FRLS as well as some related works.
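The core idea can be sketched in a few lines: FFA keeps a table counting how often each objective value has been encountered, and the acceptance decision of RLS compares these frequencies instead of the objective values themselves. The following is a hypothetical illustration on a toy bit-string problem; the function names, budget, and problem are my assumptions, not the post's actual code.

```python
# Sketch: plugging FFA into randomized local search (RLS) -> FRLS.
# Toy objective: minimize the number of zero bits (OneMax-style).
import random
from collections import defaultdict

def onemax(x):
    """Objective to minimize: the number of zero bits in x."""
    return x.count(0)

def frls(f, n, budget, seed=1):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    y = f(x)
    H = defaultdict(int)  # H[v]: how often objective value v was encountered
    best_x, best_y = list(x), y
    for _ in range(budget):
        xp = list(x)
        xp[rng.randrange(n)] ^= 1   # RLS move: flip exactly one random bit
        yp = f(xp)
        H[y] += 1                   # update encounter frequencies ...
        H[yp] += 1
        if H[yp] <= H[y]:           # ... and compare frequencies, not values
            x, y = xp, yp
        if yp < best_y:             # still remember the best solution seen
            best_x, best_y = list(xp), yp
    return best_x, best_y

x, y = frls(onemax, 16, 5000)
```

Note that the only change relative to plain RLS is the acceptance condition: RLS would accept if `yp <= y`, whereas FRLS accepts the solution whose objective value has been seen less often, which steers the search toward rarely-encountered objective values.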
Why you should use Evolutionary Algorithms to solve your optimization problems (and why not).
Some time ago, I discussed why global optimization with an Evolutionary Algorithm (EA) is not necessarily better than local search. Indeed, I have been asked “Why should I use an EA?” quite a few times. Thus, today, it is time to write down a few ideas about why (and why not) you may benefit from using an EA. I tried to be objective, which is not entirely easy since I work in that domain.
Measuring the Runtime of (Optimization) Algorithms
My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example for such an optimization task. In a TSP, $n$ cities and the distances between them are given and the goal is to find the shortest round-trip tour that goes through each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard: no algorithm is known that finds the exact / globally best solution for all possible TSP instances in polynomial time, and unless $\mathcal{P}=\mathcal{NP}$, no such algorithm exists. Every currently known exact TSP solver needs a time that grows exponentially with $n$ on the worst-case instances; for example, the classical dynamic programming approach of Held and Karp runs in $\mathcal{O}(n^2 2^n)$. Well, today we have algorithms that can exactly solve a wide range of problems with tens of thousands of nodes and approximate the solution of million-node problems with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime to find the exact (optimal) solution is still exponential.
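The exponential growth is easy to observe empirically. Below is a minimal sketch that times a naive exact TSP solver on random instances of increasing size; the instance generator and the timing loop are illustrative assumptions, not a rigorous benchmarking setup.

```python
# Measuring how the runtime of a naive exact TSP solver grows with n.
import time
from itertools import permutations
from random import Random

def exact_tsp(dist):
    """Brute force over all (n-1)! tours; returns the shortest tour length."""
    n = len(dist)
    return min(
        sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
        for t in ((0, *p) for p in permutations(range(1, n)))
    )

def measure(n, rng):
    """Build a random symmetric instance with n cities and time the solver."""
    dist = [[0 if i == j else rng.randint(1, 100) for j in range(n)]
            for i in range(n)]
    for i in range(n):          # make the matrix symmetric
        for j in range(i):
            dist[i][j] = dist[j][i]
    start = time.perf_counter()
    exact_tsp(dist)
    return time.perf_counter() - start

rng = Random(42)
for n in range(4, 10):
    print(n, measure(n, rng))   # runtime grows roughly by a factor n per step
```

Of course, for serious runtime measurements one would repeat each measurement over several instances and report distributions rather than single timings, but even this crude loop makes the factorial blow-up visible.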
Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?
Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other metaheuristic global search algorithms in general, statements like the following may be made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular when it is applied to black-box metaheuristics (those that make no use of problem knowledge).
What is optimization?
The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, which are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.
Publications
- Thomas Weise (汤卫思) and Zhize WU (吴志泽): Replicable Self-Documenting Experiments with Arbitrary Search Spaces and Algorithms. Genetic and Evolutionary Computation Conference (GECCO'2023) Companion, July 15-19, 2023, Lisbon, Portugal, pages 1891-1899. New York, NY, USA: ACM.
- Thomas Weise (汤卫思), Markus Wagner, Bin LI (李斌), Xingyi ZHANG (张兴义), and Jörg Lässig: Special Issue on Benchmarking of Computational Intelligence Algorithms in the Applied Soft Computing Journal (Editorial). Applied Soft Computing Journal (ASOC) 93:106502. August 2020.
- Robin Purshouse, Christine Zarges, Sylvain Cussat-Blanc, Michael G. Epitropakis, Marcus Gallagher, Thomas Jansen, Pascal Kerschke, Xiaodong LI, Fernando G. Lobo, Julian Francis Miller, Pietro Simone Oliveto, Mike Preuss, Giovanni Squillero, Alberto Tonda, Markus Wagner, Thomas Weise (汤卫思), Dennis G. Wilson, Borys Wróbel, and Aleš Zamuda: Workshops at PPSN 2018. 15th International Conference on Parallel Problem Solving from Nature (PPSN XV), September 8–12, 2018, Coimbra, Portugal, Part II, pages 490-497. Lecture Notes in Computer Science (LNTCS), volume 11102. Cham, Switzerland: Springer.
- Mohammad Ali Raayatpanah (محمدعلی رعایت پناه) and Thomas Weise (汤卫思): Virtual Network Function Placement for Service Function Chaining with Minimum Energy Consumption. IEEE International Conference on Computer and Communication Engineering Technology (CCET'2018), August 18-20, 2018, Beijing, China, IEEE.
- Thomas Weise (汤卫思), Raymond Chiong, and Ke TANG (唐珂): Evolutionary Optimization: Pitfalls and Booby Traps. Journal of Computer Science and Technology (JCST) 27(5):907-936. September 2012.
- Thomas Weise (汤卫思), Michael Zapf, Raymond Chiong, and Antonio Jesús Nebro Urbaneja: Why is optimization difficult? Nature-Inspired Algorithms for Optimisation, Studies in Computational Intelligence, volume 193/2009, chapter 1, pages 1-50. Berlin/Heidelberg: Springer-Verlag, April 2009.
- Thomas Weise (汤卫思): Global Optimization Algorithms — Theory and Application. self-published, free e-book. 2009.