1"""
2The FFA-based version of the (1+1)-EA: the (1+1)-FEA.
4This algorithm is based on :class:`~moptipy.algorithms.so.rls.RLS`, i.e., the
5(1+1)-EA, but uses Frequency Fitness Assignment (FFA) as fitness assignment
6process. FFA replaces all objective values with their encounter frequencies in
7the selection decisions. The more often an objective value is encountered, the
8higher gets its encounter frequency. Therefore, local optima are slowly
9receiving worse and worse fitness.

Most existing metaheuristic algorithms have in common that they maintain a
set `Si` of one or multiple solutions and derive a set `Ni` of one or
multiple new solutions in each iteration `i`. From the joint set
`Pi = Si + Ni` of old and new solutions, they then select the set `Si+1` of
solutions to be propagated to the next iteration, and so on. This selection
decision is mainly based on the objective values `f(x)` of the solutions
`x in Pi`: solutions with better objective values tend to be preferred over
solutions with worse objective values.
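
As a minimal sketch (illustrative names only, not part of the moptipy API),
such a purely objective-based selection could look like this::

    def select(P, f, mu):
        # keep the mu solutions of P with the best objective values
        return sorted(P, key=f)[:mu]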

Frequency Fitness Assignment (FFA) completely breaks with this most
fundamental concept of optimization. FFA was first proposed by
Weise *et al.* as a "plug-in" for metaheuristics intended to prevent
premature convergence. To this end, it maintains a frequency table `H` for
objective values. Before the metaheuristic chooses the set `Si+1` from `Pi`,
it increments the encounter frequency of the objective value of each
solution in `Pi`, i.e., performs `H[yj] <- H[yj] + 1` for each `xj in Pi`,
where `yj = f(xj)`. In its selection decisions, the algorithm then uses the
frequency fitness `H[yj]` instead of the objective values `yj`.
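
Under the same illustrative assumptions as above, the FFA variant of the
selection step could be sketched as follows; the table `H` and the helper
`ffa_select` are hypothetical names::

    from collections import Counter

    H = Counter()  # the frequency table H over objective values

    def ffa_select(P, f, mu):
        # keep the mu solutions of P whose objective values are rarest
        ys = [f(x) for x in P]  # evaluate all solutions in P
        for y in ys:  # FFA update: H[yj] <- H[yj] + 1 for each xj in Pi
            H[y] += 1
        # sort by frequency fitness H[yj] instead of objective value yj
        return [x for x, y in sorted(zip(P, ys), key=lambda p: H[p[1]])[:mu]]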

Here we integrate FFA into the randomized local search algorithm
:class:`~moptipy.algorithms.so.rls.RLS`, which is also known as the
`(1+1) EA`. In its original form, RLS maintains a single solution and derives
a slightly modified copy from it in every iteration. If the modified copy is
not worse than the original solution, it replaces it. "Not worse" here means
that its objective value must be better than or equal to (`<=`) that of the
maintained current best solution. The RLS with FFA (here called `(1+1) FEA`)
now replaces the comparison of objective values with a comparison of the
frequencies of the objective values. Of course, following the definition of
FFA, the frequencies are first incremented (both of the current and of the
new solution) and then compared.
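
The resulting acceptance step, i.e., the core of the `(1+1) FEA`, then boils
down to the following sketch (again with the illustrative frequency table `H`
from above and hypothetical variable names)::

    H[new_f] += 1  # increment the frequency of the new objective value
    H[cur_f] += 1  # ... and of the current objective value
    if H[new_f] <= H[cur_f]:  # compare frequencies, not objective values
        cur_x, cur_f = new_x, new_f  # accept the new solution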

The algorithm is implemented here in two different ways: If the objective
function is always integer-valued and the difference between its upper and
lower bound is not too large, then we count the frequency fitness using a
numpy array. This means that updating and querying frequency values is very
fast. If the objective function is not always integer-valued or if the
difference between its maximum and minimum is too large, then we use a
:class:`collections.Counter` to back the frequency table instead. This will
be slower and probably require more memory, but it may be the only way to
accommodate the frequency table. Of course, this will still fail if there are
too many different objective values and the memory consumed is simply too
high.
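
The idea behind this choice of backing data structure can be sketched as
follows; this is an illustration only, not the actual code of
:func:`~moptipy.algorithms.so.ffa.ffa_h.create_h`, and the name `make_h` and
the threshold `max_size` are hypothetical::

    from collections import Counter

    import numpy as np

    def make_h(lb, ub, always_int, max_size=1 << 26):
        # pick a backing store for the H-table (illustrative sketch)
        if always_int and (ub - lb) < max_size:
            # dense table: index = objective value + offset, fast lookups
            return np.zeros(ub - lb + 1, dtype=np.uint64), -lb
        return Counter(), 0  # sparse fallback for float or wide ranges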

FFA is also implemented as a fitness assignment process
(:mod:`~moptipy.algorithms.so.fitness`) in module
:mod:`~moptipy.algorithms.so.ffa.ffa_fitness`.

1. Thomas Weise, Zhize Wu, Xinlu Li, and Yan Chen. Frequency Fitness
   Assignment: Making Optimization Algorithms Invariant under Bijective
   Transformations of the Objective Function Value. *IEEE Transactions on
   Evolutionary Computation* 25(2):307-319. April 2021. Preprint available at
   arXiv:2001.01416v5 [cs.NE] 15 Oct 2020.
   https://dx.doi.org/10.1109/TEVC.2020.3032090
2. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency
   Fitness Assignment: Optimization without Bias for Good Solutions can be
   Efficient. *IEEE Transactions on Evolutionary Computation (TEVC)*
   27(4):980-992. August 2023.
   https://doi.org/10.1109/TEVC.2022.3191698
3. Thomas Weise, Mingxu Wan, Ke Tang, Pu Wang, Alexandre Devert, and Xin
   Yao. Frequency Fitness Assignment. *IEEE Transactions on Evolutionary
   Computation (IEEE-EC)* 18(2):226-243. April 2014.
   https://dx.doi.org/10.1109/TEVC.2013.2251885
4. Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set of
   benchmark instances from a tunable model problem for black-box discrete
   optimization algorithms. *Applied Soft Computing Journal (ASOC)*
   92:106269. June 2020. https://dx.doi.org/10.1016/j.asoc.2020.106269
5. Thomas Weise, Xinlu Li, Yan Chen, and Zhize Wu. Solving Job Shop Scheduling
   Problems Without Using a Bias for Good Solutions. In *Genetic and
   Evolutionary Computation Conference Companion (GECCO'21 Companion)*,
   July 10-14, 2021, Lille, France. ACM, New York, NY, USA.
   ISBN 978-1-4503-8351-6. https://dx.doi.org/10.1145/3449726.3463124
6. Thomas Weise, Mingxu Wan, Ke Tang, and Xin Yao. Evolving Exact Integer
   Algorithms with Genetic Programming. In *Proceedings of the IEEE Congress
   on Evolutionary Computation (CEC'14), Proceedings of the 2014 World
   Congress on Computational Intelligence (WCCI'14)*, pages 1816-1823,
   July 6-11, 2014, Beijing, China. Los Alamitos, CA, USA: IEEE Computer
   Society Press. ISBN: 978-1-4799-1488-3.
   https://dx.doi.org/10.1109/CEC.2014.6900292
7. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise
   Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem
   with Frequency Fitness Assignment and Hybrid Algorithms. *Soft Computing.*
   2024. https://dx.doi.org/10.1007/s00500-024-09718-8
"""

from collections import Counter
from typing import Callable, Final

from numpy.random import Generator
from pycommons.types import type_error

from moptipy.algorithms.so.ffa.ffa_h import create_h, log_h
from moptipy.api.algorithm import Algorithm1
from moptipy.api.operators import Op0, Op1
from moptipy.api.process import Process


class FEA1plus1(Algorithm1):
    """
    The FFA-based version of the (1+1)-EA: the (1+1)-FEA.

    This algorithm applies Frequency Fitness Assignment (FFA).
    This means that it does not select solutions based on whether
    they are better or worse. Instead, it selects the solution whose
    objective value has been encountered less often during the search.
    The word "best" therefore is not used in the traditional sense, i.e.,
    that one solution is better than another one in terms of its objective
    value. Instead, the current best solution is always the one whose
    objective value we have seen the least often.

    In each step, the (1+1)-FEA creates a modified copy `new_x` of the
    current best solution `best_x`. It then increments the frequency fitness
    of both solutions by 1. If the frequency fitness of `new_x` is not bigger
    than that of `best_x`, it becomes the new `best_x`.
    Otherwise, it is discarded.

    This algorithm implementation works best if objective values are
    integers and have lower and upper bounds that are not too far
    away from each other. If there are too many possible different objective
    values, the algorithm degenerates to a random walk.
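
    For example, one could apply the (1+1)-FEA to a small OneMax instance
    roughly as follows. This is a hedged sketch: it assumes the bit-string
    space, the OneMax objective, and the operators from moptipy's examples
    and operators packages, and all parameter values are illustrative::

        from moptipy.api.execution import Execution
        from moptipy.examples.bitstrings.onemax import OneMax
        from moptipy.operators.bitstrings.op0_random import Op0Random
        from moptipy.operators.bitstrings.op1_flip1 import Op1Flip1
        from moptipy.spaces.bitstrings import BitStrings

        ex = Execution()
        ex.set_solution_space(BitStrings(32))
        ex.set_objective(OneMax(32))
        ex.set_algorithm(FEA1plus1(Op0Random(), Op1Flip1()))
        ex.set_max_fes(1000)
        with ex.execute() as process:
            print(process.get_best_f())  # best objective value found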
129 """

    def __init__(self, op0: Op0, op1: Op1, log_h_tbl: bool = False) -> None:
        """
        Create the (1+1)-FEA.

        :param op0: the nullary search operator
        :param op1: the unary search operator
        :param log_h_tbl: should we log the H table?
        """
        super().__init__("fea1p1", op0, op1)
        if not isinstance(log_h_tbl, bool):
            raise type_error(log_h_tbl, "log_h_tbl", bool)
        #: True if we should log the H table, False otherwise
        self.__log_h_tbl: Final[bool] = log_h_tbl

    def solve(self, process: Process) -> None:
        """
        Apply the (1+1)-FEA to an optimization problem.

        :param process: the black-box process object
        """
        # Create records for old and new point in the search space.
        cur_x = process.create()  # record for current solution
        new_x = process.create()  # record for new solution

        # Obtain the random number generator.
        random: Final[Generator] = process.get_random()

        # Put function references in variables to save time.
        evaluate: Final[Callable] = process.evaluate  # the objective
        should_terminate: Final[Callable] = process.should_terminate
        op0: Final[Callable] = self.op0.op0  # the nullary operator
        op1: Final[Callable] = self.op1.op1  # the unary operator

        h, ofs = create_h(process)  # Allocate the h-table.

        # Start at a random point in the search space and evaluate it.
        op0(random, cur_x)  # Create 1 solution randomly and
        cur_f: int | float = evaluate(cur_x) + ofs  # evaluate it.

        while not should_terminate():  # Until we need to quit...
            op1(random, new_x, cur_x)  # new_x = neighbor of cur_x
            new_f: int | float = evaluate(new_x) + ofs

            h[new_f] += 1  # type: ignore  # Increase the frequencies
            h[cur_f] += 1  # type: ignore  # of new_f and cur_f.
            if h[new_f] <= h[cur_f]:  # type: ignore
                cur_f = new_f  # Store the new objective value and
                cur_x, new_x = new_x, cur_x  # swap current and new.

        if not self.__log_h_tbl:
            return  # we are done here

        # After we are done, we want to print the H-table.
        if h[cur_f] == 0:  # type: ignore  # Fix the H-table for the case
            h = Counter()  # that only one FE was performed: In this case,
            h[cur_f] = 1  # make a Counter with only a single 1 value inside.

        log_h(process, h, ofs)  # log the H-table