Randomized Local Search (RLS)
Randomized local search (RLS) is perhaps the simplest local search algorithm. It maintains a single best-so-far solution $x_c$ and, in each step, samples one new solution $x_n$ by applying a unary search operator to $x_c$. It accepts the new solution $x_n$ as the current solution $x_c$ if it is not worse. This simple search already works surprisingly well on a wide range of optimization problems. It can be viewed as the baseline that any new metaheuristic must significantly outperform to be worth its salt.
The Algorithm
Algorithm 1: Randomized Local Search (RLS)

1. Sample a solution $x_c$ uniformly at random from $\mathbb{X}$.
2. $y_c\gets f(x_c)$.
3. While $\lnot\mathit{shouldTerminate}$, repeat:
   a. $x_n\gets\mathit{op1}(x_c)$; $y_n\gets f(x_n)$.
   b. If $y_n \leq y_c$, then $x_c\gets x_n$; $y_c\gets y_n$.
4. Return $x_c$ and $y_c$.
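Algorithm 1 translates almost line by line into code. The sketch below is a minimal Python rendering under some assumptions not stated above: the caller supplies a `sample` function that draws a random solution from $\mathbb{X}$, the unary operator `op1`, and the objective function `f` to be minimized, and the termination criterion is a fixed evaluation budget `max_evals`. All names are illustrative.

```python
def rls(sample, op1, f, max_evals):
    """Randomized local search: minimize f using unary operator op1.

    sample()  -> a random solution from the solution space
    op1(x)    -> a randomly modified copy of solution x
    f(x)      -> objective value of x (smaller is better)
    max_evals -> total number of objective function evaluations
    """
    x_c = sample()                   # sample the initial solution at random
    y_c = f(x_c)                     # evaluate it
    for _ in range(max_evals - 1):   # simple budget-based termination
        x_n = op1(x_c)               # modified copy of the current solution
        y_n = f(x_n)                 # evaluate the new solution
        if y_n <= y_c:               # accept if not worse
            x_c, y_c = x_n, y_n
    return x_c, y_c                  # best solution found and its quality
```

For instance, with bit strings as solution space, `sample` could draw a uniformly random bit string, `op1` could flip one randomly chosen bit, and `f` could count the number of `True` bits.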
RLS begins by sampling one candidate solution $x_c$ from the solution space $\mathbb{X}$ uniformly at random. It computes the objective value $y_c$ of $x_c$ by evaluating the objective function $f:\mathbb{X}\to\mathbb{R}$. Let us assume that $f$ is subject to minimization, i.e., that it represents some sort of cost or amount of consumed resource. $y_c\gets f(x_c)$ is then the quality of the current solution $x_c$: the smaller $y_c$, the better $x_c$.
Now, in its main loop, RLS creates a slightly modified copy $x_n$ of $x_c$. It does so by applying a unary operator $\mathit{op1}$ to $x_c$. $\mathit{op1}$ is usually randomized: it first copies $x_c$ and then randomly changes one or multiple components of that copy. Either way, we get a new solution $x_n\gets\mathit{op1}(x_c)$. This solution is then also evaluated and we obtain its objective value $y_n\gets f(x_n)$.
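What $\mathit{op1}$ looks like depends on the solution representation. As a hypothetical example, assume solutions are bit strings stored as Python lists; a common unary operator then copies the string and flips exactly one randomly chosen bit:

```python
import random

def op1(x):
    """Return a copy of bit string x with one randomly chosen bit flipped."""
    x_n = list(x)                   # copy first: never modify x_c in place
    i = random.randrange(len(x_n))  # pick one position uniformly at random
    x_n[i] = not x_n[i]             # flip that bit
    return x_n
```

Copying before modifying matters: if $\mathit{op1}$ changed $x_c$ in place, the selection step could no longer fall back to the unmodified current solution.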
In its selection step, RLS retains $x_n$ if it is not worse than $x_c$: if $y_n\leq y_c$, then $x_c\gets x_n$ and $y_c\gets y_n$. Once the computational budget is exhausted, the algorithm returns the best solution that it has discovered (namely $x_c$) as well as its quality $y_c$. This process is illustrated as Algorithm 1.
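Note the $\leq$ in the acceptance condition: RLS also accepts solutions of equal quality, which lets it drift across plateaus of the objective function instead of stalling. The toy experiment below, made up purely for illustration, compares strict ($<$) and non-strict ($\leq$) acceptance on a needle-in-a-haystack function over 6-bit strings, where every solution except one unique target has the same cost:

```python
import random

def run(accept_equal, steps=5000, n=6):
    """RLS with single-bit-flip mutation on a plateau objective function.

    f(x) is 0 for the unique all-True target and 1 everywhere else, so
    every move between non-target solutions is quality-neutral.
    """
    f = lambda x: 0 if all(x) else 1
    x_c = [True] * (n - 3) + [False] * 3  # start 3 bit flips from the target
    y_c = f(x_c)
    for _ in range(steps):
        x_n = x_c[:]
        i = random.randrange(n)
        x_n[i] = not x_n[i]               # flip one random bit
        y_n = f(x_n)
        accept = (y_n <= y_c) if accept_equal else (y_n < y_c)
        if accept:
            x_c, y_c = x_n, y_n
    return y_c
```

With strict acceptance, the current solution can never move here: every neighbor looks equally bad, so the search is stuck. With $\leq$, the search performs a random walk over the plateau and eventually stumbles onto the target.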
If we plug Frequency Fitness Assignment (FFA) into this algorithm, we obtain the FRLS. But this is a story for another page.
Posts
Plugging Frequency Fitness Assignment into Metaheuristics
Frequency Fitness Assignment (FFA, 频率适应度分配) is a technique that fundamentally changes how (metaheuristic) optimization algorithms work. The goal of this post is to explore how this technique can be plugged into an existing algorithm. We first discuss optimization and the general pattern of metaheuristics. We then discuss the simplest local search algorithm, randomized local search, or RLS for short. We plug FFA into this algorithm and obtain the FRLS. We finally list some properties of FRLS as well as some related works.
Why you should use Evolutionary Algorithms to solve your optimization problems (and why not).
Some time ago, I discussed why global optimization with an Evolutionary Algorithm (EA) is not necessarily better than local search. Indeed, I have been asked "Why should I use an EA?" quite a few times. So today it is time to write down a few ideas about why you may, and why you may not, benefit from using an EA. I have tried to be objective, which is not entirely easy since I work in this domain.
Publications
- Rui ZHAO (赵睿), Zhize WU (吴志泽), Daan van den Berg, Matthias Thürer, Tianyu LIANG (梁天宇), Tan MING (檀明), and Thomas Weise (汤卫思): Randomized Local Search for Two-Dimensional Bin Packing and a Negative Result for Frequency Fitness Assignment. International Conference on Evolutionary Computation Theory and Applications (ECTA'2024), part of 16th International Joint Conference on Computational Intelligence (IJCCI'2024), November 20-22, 2024, Porto, Portugal, pages 15-26. Setúbal, Portugal: SciTePress.
- Jiayang CHEN (陈嘉阳), Zhize WU (吴志泽), Sarah Louise Thomson, and Thomas Weise (汤卫思): Frequency Fitness Assignment: Optimization Without Bias for Good Solution Outperforms Randomized Local Search on the Quadratic Assignment Problem. International Conference on Evolutionary Computation Theory and Applications (ECTA'2024), part of 16th International Joint Conference on Computational Intelligence (IJCCI'2024), November 20-22, 2024, Porto, Portugal, pages 27-37. Setúbal, Portugal: SciTePress.
- Tianyu LIANG (梁天宇), Zhize WU (吴志泽), Matthias Thürer, Markus Wagner, and Thomas Weise (汤卫思): Generating Small Instances with Interesting Features for the Traveling Salesperson Problem. International Conference on Evolutionary Computation Theory and Applications (ECTA'2024), part of 16th International Joint Conference on Computational Intelligence (IJCCI'2024), November 20-22, 2024, Porto, Portugal, pages 173-180. Setúbal, Portugal: SciTePress.
- Xiang CAO (曹翔), Zhize WU (吴志泽), Daan van den Berg, and Thomas Weise (汤卫思): Randomized Local Search vs. NSGA-II vs. Frequency Fitness Assignment on The Traveling Tournament Problem. International Conference on Evolutionary Computation Theory and Applications (ECTA'2024), part of 16th International Joint Conference on Computational Intelligence (IJCCI'2024), November 20-22, 2024, Porto, Portugal, pages 38-49. Setúbal, Portugal: SciTePress.
- Sarah Louise Thomson, Gabriela Ochoa, Daan van den Berg, Tianyu LIANG (梁天宇), and Thomas Weise (汤卫思): Entropy, Search Trajectories, and Explainability for Frequency Fitness Assignment. 18th International Conference on Parallel Problem Solving from Nature (PPSN XVIII), September 14-18, 2024, Hagenberg, Austria, pages 377-392. Lecture Notes in Computer Science (LNCS), volume 15148. Berlin/Heidelberg, Germany: Springer.
- Tianyu LIANG (梁天宇), Zhize WU (吴志泽), Jörg Lässig, Daan van den Berg, Sarah Louise Thomson, and Thomas Weise (汤卫思): Addressing the Traveling Salesperson Problem with Frequency Fitness Assignment and Hybrid Algorithms. Soft Computing 28(17-18):9495-9508. September 2024.
- Rui ZHAO (赵睿), Tianyu LIANG (梁天宇), Zhize WU (吴志泽), Daan van den Berg, Matthias Thürer, and Thomas Weise (汤卫思): Randomized Local Search on the 2D Rectangular Bin Packing Problem with Item Rotation. Genetic and Evolutionary Computation Conference (GECCO'2024) Companion, July 14-18, 2024, Melbourne, VIC, Australia, pages 235-238. New York, NY, USA: ACM.
- Tianyu LIANG (梁天宇), Zhize WU (吴志泽), Jörg Lässig, Daan van den Berg, and Thomas Weise (汤卫思): Solving the Traveling Salesperson Problem using Frequency Fitness Assignment. IEEE Symposium on Foundations of Computational Intelligence (IEEE FOCI'2022), part of the IEEE Symposium Series on Computational Intelligence (SSCI'2022), December 4-7, 2022, Singapore. IEEE.