Coverage for moptipy / algorithms / so / rls.py: 100%

23 statements  

coverage.py v7.12.0, created at 2025-11-24 08:49 +0000

"""
The implementation of the Randomized Local Search algorithm `rls`.

The algorithm starts by applying the nullary search operator, an
implementation of :meth:`~moptipy.api.operators.Op0.op0`, to sample
one fully random solution. This is the first best-so-far solution.
In each step, it applies the unary operator, an implementation of
:meth:`~moptipy.api.operators.Op1.op1`, to the best-so-far solution to
obtain a new, similar solution. If this new solution is not worse than
the current best-so-far solution, it replaces this solution. Otherwise,
it is discarded.

The `rls` algorithm is a simple local search that accepts all
non-deteriorating moves. It is thus similar to the simple hill climber
`hc` implemented in
:class:`~moptipy.algorithms.so.hill_climber.HillClimber`, which, however,
accepts only strictly improving moves. `rls` is also equivalent to a
`(mu+lambda)`-EA without crossover as implemented in
:class:`~moptipy.algorithms.so.ea.EA` if the same unary and nullary
operators are used and `mu=1`, `lambda=1`, and `br=0`. `rls`, however,
will be faster, as it does not represent a population of solutions as a
list of objects but can directly utilize local variables.

Strictly speaking, the name "Randomized Local Search" only partially
fits the algorithm we implement here. Take the discrete search
domain, where the search spaces are bit strings of a fixed length `n`,
as an example. The name "Randomized Local Search" and the abbreviation
`rls` have a fixed meaning on this domain: they denote the algorithm
that starts with a random solution and flips exactly one randomly
chosen bit in each step. This corresponds to our `rls` algorithm with
the operator
:class:`~moptipy.operators.bitstrings.op1_flip1.Op1Flip1`. However,
an algorithm that starts with a random solution and flips a number of
bits sampled from a Binomial distribution is called `(1+1) EA`. This
algorithm again corresponds to our `rls` algorithm, but this time with
the operator
:class:`~moptipy.operators.bitstrings.op1_m_over_n_flip.Op1MoverNflip`.
In other words, we can implement (at least) two algorithms with
well-known and fixed names by plugging different operators into our
`rls` approach. One of them is called `RLS`, the other one is called
`(1+1) EA`. This is somewhat confusing, but it results from the general
nature of our basic framework. Regardless of what we do, we will have
some form of name clash here. We advise the users of our algorithms to
be careful with respect to literature and scientific conventions when
using our framework.

1. Frank Neumann and Ingo Wegener. Randomized Local Search, Evolutionary
   Algorithms, and the Minimum Spanning Tree Problem. *Theoretical Computer
   Science.* 378(1):32-40, June 2007.
   https://doi.org/10.1016/j.tcs.2006.11.002,
   https://eldorado.tu-dortmund.de/bitstream/2003/5454/1/165.pdf
2. Holger H. Hoos and Thomas Stützle. *Stochastic Local Search: Foundations
   and Applications.* 2005. ISBN: 1493303732. In The Morgan Kaufmann Series
   in Artificial Intelligence. Amsterdam, The Netherlands: Elsevier.
3. Thomas Weise. *Optimization Algorithms.* 2021. Hefei, Anhui, China:
   Institute of Applied Optimization (IAO), School of Artificial Intelligence
   and Big Data, Hefei University. http://thomasweise.github.io/oa/
"""

from typing import Callable, Final

from numpy.random import Generator

from moptipy.api.algorithm import Algorithm1
from moptipy.api.operators import Op0, Op1
from moptipy.api.process import Process



# start book
class RLS(Algorithm1):
    """
    The RLS is a simple local search accepting all non-worsening moves.

    In each step, an RLS creates a modified copy `new_x` of the
    current best solution `best_x`. If `new_x` is not worse than `best_x`,
    it becomes the new `best_x`. Otherwise, it is discarded.
    """

    def solve(self, process: Process) -> None:
        """
        Apply the RLS to an optimization problem.

        :param process: the black-box process object
        """
        # Create records for old and new point in the search space.
        best_x = process.create()  # record for best-so-far solution
        new_x = process.create()  # record for new solution
        # Obtain the random number generator.
        random: Final[Generator] = process.get_random()

        # Put function references in variables to save time.
        evaluate: Final[Callable] = process.evaluate  # the objective
        op1: Final[Callable] = self.op1.op1  # the unary operator
        should_terminate: Final[Callable] = process.should_terminate

        # Start at a random point in the search space and evaluate it.
        self.op0.op0(random, best_x)  # Create 1 solution randomly and
        best_f: int | float = evaluate(best_x)  # evaluate it.

        while not should_terminate():  # Until we need to quit...
            op1(random, new_x, best_x)  # new_x = neighbor of best_x
            new_f: int | float = evaluate(new_x)
            if new_f <= best_f:  # new_x is not worse than best_x?
                best_f = new_f  # Store its objective value.
                best_x, new_x = new_x, best_x  # Swap best and new.
# end book

    def __init__(self, op0: Op0, op1: Op1) -> None:
        """
        Create the randomized local search (rls).

        :param op0: the nullary search operator
        :param op1: the unary search operator
        """
        super().__init__("rls", op0, op1)
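
To make the loop above concrete, here is a minimal, self-contained sketch of RLS on the classical OneMax problem over bit strings, using the flip-exactly-one-bit unary operator discussed in the module docstring (the `Op1Flip1` case). It does not use the moptipy API at all; the function name `rls_onemax`, the objective encoding (counting 0 bits, to be minimized), and the step budget are assumptions made purely for illustration.

```python
import random


def rls_onemax(n: int, max_steps: int, seed: int = 1) -> list[int]:
    """Sketch of RLS minimizing the number of 0 bits in a bit string."""
    rng = random.Random(seed)
    # nullary operator: sample one fully random bit string
    best_x = [rng.randint(0, 1) for _ in range(n)]
    best_f = n - sum(best_x)  # objective: number of 0 bits (minimize)
    for _ in range(max_steps):  # stands in for should_terminate()
        new_x = best_x.copy()
        new_x[rng.randrange(n)] ^= 1  # unary operator: flip one bit
        new_f = n - sum(new_x)
        if new_f <= best_f:  # accept all non-worsening moves
            best_x, best_f = new_x, new_f
    return best_x
```

Flipping a 1 bit to 0 worsens the objective and is rejected, so each bit becomes 1 the first time it is flipped while being 0; with a sufficient step budget the all-ones string is therefore reliably reached.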