Coverage for moptipy / algorithms / so / hill_climber.py: 100%

23 statements  

coverage.py v7.12.0, created at 2025-11-24 08:49 +0000

"""
The implementation of the basic hill climbing algorithm `hc`.

The algorithm starts by applying the nullary search operator, an
implementation of :meth:`~moptipy.api.operators.Op0.op0`, to sample one fully
random solution. This is the first best-so-far solution. In each step, it
applies the unary operator, an implementation of
:meth:`~moptipy.api.operators.Op1.op1`, to the best-so-far solution to obtain
a new, similar solution. If this new solution is strictly better than the
current best-so-far solution, it replaces this solution. Otherwise, it is
discarded.

The hill climbing algorithm is a simple local search that only accepts
strictly improving moves. It is thus similar to the randomized local search
(`rls`) implemented in :class:`~moptipy.algorithms.so.rls.RLS`, which,
however, also accepts non-deteriorating moves. We also provide `hcr`, a
variant of the hill climber that restarts automatically when a certain number
of moves has not been able to improve the current best-so-far solution, in
class :class:`~moptipy.\
algorithms.so.hill_climber_with_restarts.HillClimberWithRestarts`.

1. Stuart Jonathan Russell and Peter Norvig. *Artificial Intelligence: A
   Modern Approach (AIMA)*. 2nd edition. 2002. Upper Saddle River, NJ, USA:
   Prentice Hall International Inc. ISBN: 0-13-080302-2.
2. Steven S. Skiena. *The Algorithm Design Manual.* 2nd edition. 2008.
   London, UK: Springer-Verlag. ISBN: 978-1-84800-069-8.
   http://doi.org/10.1007/978-1-84800-070-4
3. David Stifler Johnson, Christos H. Papadimitriou, and Mihalis Yannakakis.
   How Easy Is Local Search? *Journal of Computer and System Sciences.*
   37(1):79-100. August 1988. http://doi.org/10.1016/0022-0000(88)90046-3
   http://www2.karlin.mff.cuni.cz/~krajicek/jpy2.pdf
4. James C. Spall. *Introduction to Stochastic Search and Optimization.*
   April 2003. Estimation, Simulation, and Control -- Wiley-Interscience
   Series in Discrete Mathematics and Optimization, volume 6. Chichester,
   West Sussex, UK: Wiley Interscience. ISBN: 0-471-33052-3.
   http://www.jhuapl.edu/ISSO/
5. Holger H. Hoos and Thomas Stützle. *Stochastic Local Search: Foundations
   and Applications.* 2005. The Morgan Kaufmann Series in Artificial
   Intelligence. Amsterdam, The Netherlands: Elsevier. ISBN: 1493303732.
6. Thomas Weise. *Optimization Algorithms.* 2021. Hefei, Anhui, China:
   Institute of Applied Optimization (IAO), School of Artificial Intelligence
   and Big Data, Hefei University. http://thomasweise.github.io/oa/
7. Thomas Weise. *Global Optimization Algorithms - Theory and Application.*
   2009. http://www.it-weise.de/projects/book.pdf
"""

from typing import Callable, Final

from numpy.random import Generator

from moptipy.api.algorithm import Algorithm1
from moptipy.api.operators import Op0, Op1
from moptipy.api.process import Process


# start book

class HillClimber(Algorithm1):
    """
    The stochastic hill climbing algorithm only accepts improving moves.

    In each step, a hill climber creates a modified copy `new_x` of the
    current best solution `best_x`. If `new_x` is better than `best_x`,
    it becomes the new `best_x`. Otherwise, it is discarded.
    """

    def solve(self, process: Process) -> None:
        """
        Apply the hill climber to an optimization problem.

        :param process: the black-box process object
        """
        # Create records for the old and new points in the search space.
        best_x = process.create()  # record for best-so-far solution
        new_x = process.create()  # record for new solution
        # Obtain the random number generator.
        random: Final[Generator] = process.get_random()

        # Put function references in variables to save time.
        evaluate: Final[Callable] = process.evaluate  # the objective
        op1: Final[Callable] = self.op1.op1  # the unary operator
        should_terminate: Final[Callable] = process.should_terminate

        # Start at a random point in the search space and evaluate it.
        self.op0.op0(random, best_x)  # Create 1 solution randomly and
        best_f: int | float = evaluate(best_x)  # evaluate it.

        while not should_terminate():  # Until we need to quit...
            op1(random, new_x, best_x)  # new_x = neighbor of best_x
            new_f: int | float = evaluate(new_x)
            if new_f < best_f:  # new_x is _better_ than best_x?
                best_f = new_f  # Store its objective value.
                best_x, new_x = new_x, best_x  # Swap best and new.
# end book

    def __init__(self, op0: Op0, op1: Op1) -> None:
        """
        Create the hill climber.

        :param op0: the nullary search operator
        :param op1: the unary search operator
        """
        super().__init__("hc", op0, op1)
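To illustrate the control flow of `solve` without pulling in the full moptipy API, here is a minimal standalone sketch of the same hill-climbing loop on a toy bit-string problem (minimize the number of 1-bits). The function name, problem, and budget-based termination are our own illustrative choices, not part of moptipy: the random bit string stands in for the nullary operator `op0`, the single-bit flip for the unary operator `op1`, and the evaluation budget for `should_terminate`.

```python
import random


def hill_climb(n: int = 32, max_evals: int = 10_000, seed: int = 42) -> int:
    """Minimize sum(x) over bit strings x of length n; return the best value found."""
    rng = random.Random(seed)

    # "op0": sample one fully random solution, the first best-so-far solution.
    best_x = [rng.randint(0, 1) for _ in range(n)]
    best_f = sum(best_x)  # objective: the number of 1-bits (0 is optimal)

    for _ in range(max_evals):  # the budget plays the role of should_terminate()
        # "op1": derive a new, similar solution by flipping one random bit.
        new_x = best_x.copy()
        i = rng.randrange(n)
        new_x[i] = 1 - new_x[i]
        new_f = sum(new_x)
        if new_f < best_f:  # accept only strictly improving moves
            best_x, best_f = new_x, new_f
    return best_f


print(hill_climb())
```

Because every solution with at least one 1-bit has a strictly better neighbor, this toy objective has no local optima that could trap the strict-improvement rule, and the loop reaches the optimum 0 well within the budget. On rugged landscapes, by contrast, a pure hill climber can stall, which is what the `hcr` restart variant referenced in the docstring is for.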