Posts by Tag

metaheuristics

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the Randomized Local Search with Frequency Fitness Assignment (FRLS). We finally list some properties of FRLS as well as some related work.

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.
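To illustrate why exact solving is expensive, here is a small Python sketch of a brute-force exact TSP solver. The instance (four cities on the corners of a unit square) and all names are illustrative assumptions; the solver enumerates all (n-1)! tours starting from city 0, so its runtime grows even faster than 2^n and it is usable for tiny instances only.

```python
import math
from itertools import permutations

def tour_length(dist, tour):
    """Length of the round trip that visits the cities in `tour` and returns home."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def exact_tsp(dist):
    """Brute-force exact solver: tries all (n-1)! tours starting at city 0."""
    n = len(dist)
    best_tour, best_len = None, math.inf
    for perm in permutations(range(1, n)):
        tour = (0,) + perm
        length = tour_length(dist, tour)
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len

# Four cities at the corners of a unit square; the optimal tour has length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
tour, length = exact_tsp(dist)
print(tour, length)
```

Already at n = 20, this loop would have to check 19! ≈ 1.2 × 10^17 tours, which is why practical exact solvers rely on much cleverer techniques such as branch-and-cut.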

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other global search algorithms in general, statements like the following are made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem-specific knowledge).

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


optimization

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the Randomized Local Search with Frequency Fitness Assignment (FRLS). We finally list some properties of FRLS as well as some related work.

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other global search algorithms in general, statements like the following are made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem-specific knowledge).

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.


Evolutionary Algorithms (EAs)

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other global search algorithms in general, statements like the following are made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem-specific knowledge).

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Evolutionary Computation (EC)

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Traveling Salesperson Problem (TSP)

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.

Are Black-Box Global Search Methods, such as Metaheuristics like Evolutionary Algorithms, better than Local Search?

15 minute read

Sometimes, when discussing the benefits of Evolutionary Algorithms (EAs), their special variant Genetic Algorithms (GAs) with bit-string based search spaces, or other global search algorithms in general, statements like the following are made: “If you use a huge population size with a large computational budget for both global search metaheuristics (with enough diversity) and local search metaheuristics, the global search approach will no doubt outperform the local search.” I cannot really agree with this statement, in particular if it is applied to black-box metaheuristics (those that make no use of problem-specific knowledge).

What is optimization?

18 minute read

The center of my research is optimization. But what is optimization? Basically, optimization is the art of making good decisions. It provides us with a set of tools, mostly from the areas of computer science and mathematics, that are applicable in virtually all fields, ranging from business, industry, biology, physics, medicine, data mining, and engineering to even art.


Linux

Updating all Packages and Snaps

2 minute read

I use Linux for my work and try to keep my system up-to-date. This means that, whenever I shut down my computer, I run a little update script. Here I post this small script, update.sh, which updates all installed packages, both deb and snap packages. Sometimes, for whatever reason, there may be a problem with an inconsistent package state. My script basically applies all methods I know to fix such a state. The script is very heavy-handed and loops over the update process four times. So if, for whatever reason, one update would require another one first, and that could only be resolved by multiple update runs, then that is no problem. I usually run something like sudo update.sh && shutdown now, which updates everything and then shuts down my computer, so I do not really need to care how long the script takes anyway.

Repeating a Command until it Succeeds under Linux

2 minute read

Assume that we have a command that we want to execute in a terminal. We know that the command may sometimes fail, but that it should eventually succeed. So we would try the command again and again until this happens. One example is a download: if we know that the URL is reachable, then the command should succeed, but maybe the connection was lost or there was some other temporary disturbance. Another such command could be a build process that downloads and installs dependencies.
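The retry idea can be sketched as follows. The post itself concerns a shell loop; this Python function, its name, and its parameters are merely an illustrative equivalent of the same repeat-until-success logic.

```python
import subprocess
import time

def retry(cmd, max_attempts=10, delay=1.0):
    """Run a shell command repeatedly until it exits with status 0.

    Returns the number of the successful attempt, or raises RuntimeError
    after `max_attempts` failures. `delay` is the pause between tries,
    which avoids hammering a temporarily unreachable server.
    """
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(cmd, shell=True)
        if result.returncode == 0:
            return attempt          # success on this attempt
        time.sleep(delay)           # brief pause before retrying
    raise RuntimeError(f"command failed {max_attempts} times: {cmd}")
```

A typical (hypothetical) use would be something like `retry("wget https://example.com/big-file.tar.gz")`, which keeps re-running the download until it completes.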

Recursively Deleting Empty Files and Directories under Linux

1 minute read

Sometimes, we need to delete empty files and empty directories under Linux. This happens, for example, when we want to restart an experiment. Here I post a small script that you may store as the file deleteZeroSizedFiles.sh. It does exactly that: it searches the current directory and all subdirectories for empty files, i.e., files of size zero, and deletes all of them. Then it recursively looks for empty directories, i.e., directories that contain neither files nor other directories, and deletes them as well.
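The two-pass behavior described above can be sketched in Python as follows. This is not the shell script from the post (a shell version would typically build on `find` with its `-empty` and `-delete` tests); the function name is an illustrative assumption.

```python
import os

def delete_empty(root):
    """Delete all zero-sized files under `root`, then all directories left empty.

    Walking bottom-up ensures that a directory whose only contents were
    empty files (or empty subdirectories) is itself removed afterwards.
    """
    removed = []
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getsize(path) == 0:   # empty file: delete it
                os.remove(path)
                removed.append(path)
        # delete the directory itself if it is now empty (but never `root`)
        if dirpath != root and not os.listdir(dirpath):
            os.rmdir(dirpath)
            removed.append(dirpath)
    return removed
```

Because the directory check runs after the file pass within each level, a whole chain of directories that only contained zero-sized files disappears in a single call.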


Bash

Updating all Packages and Snaps

2 minute read

I use Linux for my work and try to keep my system up-to-date. This means that, whenever I shut down my computer, I run a little update script. Here I post this small script, update.sh, which updates all installed packages, both deb and snap packages. Sometimes, for whatever reason, there may be a problem with an inconsistent package state. My script basically applies all methods I know to fix such a state. The script is very heavy-handed and loops over the update process four times. So if, for whatever reason, one update would require another one first, and that could only be resolved by multiple update runs, then that is no problem. I usually run something like sudo update.sh && shutdown now, which updates everything and then shuts down my computer, so I do not really need to care how long the script takes anyway.

Repeating a Command until it Succeeds under Linux

2 minute read

Assume that we have a command that we want to execute in a terminal. We know that the command may sometimes fail, but that it should eventually succeed. So we would try the command again and again until this happens. One example is a download: if we know that the URL is reachable, then the command should succeed, but maybe the connection was lost or there was some other temporary disturbance. Another such command could be a build process that downloads and installs dependencies.

Recursively Deleting Empty Files and Directories under Linux

1 minute read

Sometimes, we need to delete empty files and empty directories under Linux. This happens, for example, when we want to restart an experiment. Here I post a small script that you may store as the file deleteZeroSizedFiles.sh. It does exactly that: it searches the current directory and all subdirectories for empty files, i.e., files of size zero, and deletes all of them. Then it recursively looks for empty directories, i.e., directories that contain neither files nor other directories, and deletes them as well.


Ant Colony Optimization (ACO)

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Randomized Local Search (RLS)

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the Randomized Local Search with Frequency Fitness Assignment (FRLS). We finally list some properties of FRLS as well as some related work.


Frequency Fitness Assignment (FFA)

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the Randomized Local Search with Frequency Fitness Assignment (FRLS). We finally list some properties of FRLS as well as some related work.


Computational Intelligence (CI)

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Genetic Algorithms (GAs)

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Particle Swarm Optimization (PSO)

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


Swarm Intelligence (SI)

Why research in Computational Intelligence should be less nature-inspired.

17 minute read

The inspiration gleaned from observing nature has led to several important advances in the field of optimization. Still, it seems to me that a lot of work is based mainly on such inspiration alone. This can divert attention away from practical and algorithmic concerns. As a result, there is a growing number of specialized terminologies in the fields of Evolutionary Computation (EC) and Swarm Intelligence (SI), which I consider a problem for clarity in research. With this article, I would like to formulate my thoughts in the hope of contributing to a fruitful debate.


benchmarking

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.


TSP Suite

Measuring the Runtime of (Optimization) Algorithms

22 minute read

My research area is metaheuristic optimization, i.e., algorithms that can find good approximate solutions for computationally hard problems in feasible time. The Traveling Salesperson Problem (TSP) is an example of such an optimization task. In a TSP instance, $n$ cities and the distances between them are given, and the goal is to find the shortest round-trip tour that visits each city exactly once and then returns to its starting point. The TSP is $\mathcal{NP}$-hard, meaning that every currently known algorithm that finds the exact (globally best) solution for all possible TSP instances needs a runtime that grows exponentially with $n$ on the worst-case instances, and this is unlikely to change. In other words, if a TSP instance has $n$ cities, then the worst-case runtime of any known exact TSP solver grows at least in the order of $2^n$. Still, today we have algorithms that can exactly solve a wide range of instances with tens of thousands of cities and approximate the solutions of million-city instances with an error of less than one part per thousand. This is pretty awesome, but the worst-case runtime needed to find the exact (optimal) solution is still exponential.


Randomized Local Search with Frequency Fitness Assignment (FRLS)

Plugging Frequency Fitness Assignment into Metaheuristics

12 minute read

Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the Randomized Local Search with Frequency Fitness Assignment (FRLS). We finally list some properties of FRLS as well as some related work.
