1"""
2The FFA-based version of the Hill Climber: The FHC.
4This algorithm is based on
5:class:`~moptipy.algorithms.so.hill_climber.HillClimber`, i.e., the simple
6Hill Climber, but uses Frequency Fitness Assignment (FFA) as fitness
7assignment process. FFA replaces all objective values with their encounter
8frequencies in the selection decisions. The more often an objective value is
9encountered, the higher gets its encounter frequency. Therefore, local
10optima are slowly receiving worse and worse fitness.
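
The following minimal sketch (plain Python, not this module's actual
implementation, which uses
:func:`~moptipy.algorithms.so.ffa.ffa_h.create_h`) illustrates the
frequency bookkeeping::

    from collections import Counter

    h = Counter()  # the H-table: objective value -> encounter frequency
    for f in (3, 7, 3, 3, 7, 1):  # a stream of observed objective values
        h[f] += 1  # each encounter increases the value's frequency
    print(h[3], h[7], h[1])  # prints "3 2 1"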

This algorithm is closely related to
:class:`~moptipy.algorithms.so.ffa.fea1plus1.FEA1plus1`.
The difference is that the FHC accepts new solutions only if their
frequency fitness is *lower* than the frequency fitness of the current
solution. :class:`~moptipy.algorithms.so.ffa.fea1plus1.FEA1plus1`, on the
other hand, accepts new solutions if their frequency fitness is *lower
than or equal to* that of the current solution.
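
In other words, only the acceptance condition differs. As a sketch, where
``h`` is the H-table and ``new_f`` and ``cur_f`` are the objective values
of the new and the current solution::

    if h[new_f] < h[cur_f]:   # FHC: accept only strictly less
        ...                   # frequent objective values
    if h[new_f] <= h[cur_f]:  # FEA1plus1: also accept ties
        ...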

1. Thomas Weise, Zhize Wu, Xinlu Li, and Yan Chen. Frequency Fitness
   Assignment: Making Optimization Algorithms Invariant under Bijective
   Transformations of the Objective Function Value. *IEEE Transactions on
   Evolutionary Computation* 25(2):307-319. April 2021. Preprint available
   at arXiv:2001.01416v5 [cs.NE] 15 Oct 2020.
   https://dx.doi.org/10.1109/TEVC.2020.3032090
2. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency
   Fitness Assignment: Optimization without Bias for Good Solutions can be
   Efficient. *IEEE Transactions on Evolutionary Computation (TEVC)*
   27(4):980-992. August 2023. https://doi.org/10.1109/TEVC.2022.3191698
3. Thomas Weise, Mingxu Wan, Ke Tang, Pu Wang, Alexandre Devert, and Xin
   Yao. Frequency Fitness Assignment. *IEEE Transactions on Evolutionary
   Computation (IEEE-EC)* 18(2):226-243. April 2014.
   https://dx.doi.org/10.1109/TEVC.2013.2251885
4. Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set
   of benchmark instances from a tunable model problem for black-box
   discrete optimization algorithms. *Applied Soft Computing Journal
   (ASOC)* 92:106269. June 2020.
   https://dx.doi.org/10.1016/j.asoc.2020.106269
5. Thomas Weise, Xinlu Li, Yan Chen, and Zhize Wu. Solving Job Shop
   Scheduling Problems Without Using a Bias for Good Solutions. In *Genetic
   and Evolutionary Computation Conference Companion (GECCO'21 Companion),*
   July 10-14, 2021, Lille, France. ACM, New York, NY, USA.
   ISBN 978-1-4503-8351-6. https://dx.doi.org/10.1145/3449726.3463124
6. Thomas Weise, Mingxu Wan, Ke Tang, and Xin Yao. Evolving Exact Integer
   Algorithms with Genetic Programming. In *Proceedings of the IEEE
   Congress on Evolutionary Computation (CEC'14), Proceedings of the 2014
   World Congress on Computational Intelligence (WCCI'14),* pages
   1816-1823, July 6-11, 2014, Beijing, China. Los Alamitos, CA, USA: IEEE
   Computer Society Press. ISBN: 978-1-4799-1488-3.
   https://dx.doi.org/10.1109/CEC.2014.6900292
7. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise
   Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem
   with Frequency Fitness Assignment and Hybrid Algorithms. *Soft
   Computing.* 2024. https://dx.doi.org/10.1007/s00500-024-09718-8
"""
from collections import Counter
from typing import Callable, Final

from numpy.random import Generator
from pycommons.types import type_error

from moptipy.algorithms.so.ffa.ffa_h import create_h, log_h
from moptipy.api.algorithm import Algorithm1
from moptipy.api.operators import Op0, Op1
from moptipy.api.process import Process


class FHC(Algorithm1):
    """
    The FFA-based version of the Hill Climber.

    This algorithm applies Frequency Fitness Assignment (FFA).
    This means that it does not select solutions based on whether they
    are better or worse. Instead, it selects the solution whose objective
    value has been encountered less often during the search. The word
    "best" therefore is not used in the traditional sense, i.e., that one
    solution is better than another one in terms of its objective value.
    Instead, the current best solution is always the one whose objective
    value we have seen the least often.

    This algorithm implementation works best if the objective values are
    integers with lower and upper bounds that are not too far away from
    each other. If there are too many different possible objective values,
    the algorithm degenerates into a random walk, because then almost all
    encounter frequencies are equal and the acceptance decision no longer
    depends on the solution quality.
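
    A usage sketch; the module paths outside this file are assumptions
    about moptipy's standard bit-string components, not part of this
    module::

        from moptipy.api.execution import Execution
        from moptipy.examples.bitstrings.onemax import OneMax
        from moptipy.operators.bitstrings.op0_random import Op0Random
        from moptipy.operators.bitstrings.op1_flip1 import Op1Flip1
        from moptipy.spaces.bitstrings import BitStrings

        n = 16  # the number of bits (an arbitrary choice for this sketch)
        with Execution().set_solution_space(BitStrings(n))\\
                .set_objective(OneMax(n))\\
                .set_algorithm(FHC(Op0Random(), Op1Flip1()))\\
                .set_max_fes(1000)\\
                .execute() as process:
            print(process.get_best_f())  # best objective value found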
85 """

    def __init__(self, op0: Op0, op1: Op1, log_h_tbl: bool = False) -> None:
        """
        Create the FHC.

        :param op0: the nullary search operator
        :param op1: the unary search operator
        :param log_h_tbl: should we log the H table?
        """
        super().__init__("fhc", op0, op1)
        if not isinstance(log_h_tbl, bool):
            raise type_error(log_h_tbl, "log_h_tbl", bool)
        #: True if we should log the H table, False otherwise
        self.__log_h_tbl: Final[bool] = log_h_tbl

    def solve(self, process: Process) -> None:
        """
        Apply the FHC to an optimization problem.

        :param process: the black-box process object
        """
        # Create records for the current and the new point in the
        # search space.
        cur_x = process.create()  # record for the current solution
        new_x = process.create()  # record for the new solution

        # Obtain the random number generator.
        random: Final[Generator] = process.get_random()

        # Put function references in variables to save time.
        evaluate: Final[Callable] = process.evaluate  # the objective
        should_terminate: Final[Callable] = process.should_terminate
        op0: Final[Callable] = self.op0.op0  # the nullary operator
        op1: Final[Callable] = self.op1.op1  # the unary operator
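
        # Allocate the H-table: h maps each (offset-shifted) objective
        # value to the number of times this value has been encountered.
        # ofs is an offset added to raw objective values so that they can
        # be used directly as keys or indices into h.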
        h, ofs = create_h(process)

        # Start at a random point in the search space and evaluate it.
        op0(random, cur_x)  # Create one solution randomly and
        cur_f: int | float = evaluate(cur_x) + ofs  # evaluate it.

        while not should_terminate():  # Until we need to quit...
            op1(random, new_x, cur_x)  # new_x = neighbor of cur_x
            new_f: int | float = evaluate(new_x) + ofs

            h[new_f] += 1  # type: ignore # Increase the frequency
            h[cur_f] += 1  # type: ignore # of new_f and cur_f.
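            # Because h[cur_f] also grows in every iteration, a solution
            # that the search stagnates on keeps losing fitness until a
            # neighbor with a less frequent objective value is accepted.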
            if h[new_f] < h[cur_f]:  # type: ignore
                cur_f = new_f  # Accept new_x: store its objective value
                cur_x, new_x = new_x, cur_x  # and swap current and new.

        if not self.__log_h_tbl:
            return  # we are done here

        # After we are done, we want to log the H-table.
        if h[cur_f] == 0:  # type: ignore # Fix the H-table for the case
            h = Counter()  # that only one FE was performed: In this case,
            h[cur_f] = 1  # make a Counter with only a single 1 inside.

        log_h(process, h, ofs)  # log the H-table