Coverage for moptipy/algorithms/so/general_ea.py: 98% (122 statements)
coverage.py v7.12.0, created at 2025-11-24 08:49 +0000
1"""
2A fully configurable, general (mu+lambda) Evolutionary Algorithm.
4This evolutionary algorithm begins by sampling
5:attr:`~moptipy.algorithms.so.ea.EA.mu`
6solutions using the nullary search operation
7:attr:`~moptipy.api.algorithm.Algorithm0.op0`. In each iteration, it then uses
8:attr:`~moptipy.algorithms.so.ea.EA.mu` existing solutions as input for
9the search operations, where, for each solution to be sampled, the binary
10operation :attr:`~moptipy.api.algorithm.Algorithm2.op2` is used with
11probability :attr:`~moptipy.algorithms.so.ea.EA.br` and (otherwise), the unary
12operator :attr:`~moptipy.api.algorithm.Algorithm1` is used. The inputs of both
13operators are chosen from the :attr:`~moptipy.algorithms.so.ea.EA.mu`
14solutions using :attr:`~moptipy.algorithms.so.general_ea.GeneralEA.mating`
15selection. After :attr:`~moptipy.algorithms.so.ea.EA.lambda_` new solutions
16have been created this way (and have been evaluated as well), a fitness
17assignment process (:class:`~moptipy.algorithms.so.fitness.Fitness`) assigns
18fitness values to them based on their objective values
19(:attr:`~moptipy.algorithms.so.record.Record.f`), maybe also using the index
20of the iteration (:attr:`~moptipy.algorithms.so.record.Record.it`) in which
21they were created. The survival selection
22:attr:`~moptipy.algorithms.so.general_ea.GeneralEA.survival` then chooses,
23from the joint set of `mu+lambda` solutions, the `mu` solutions for the
24next iteration. Both mating and survival selection are instances of class
25:class:`~moptipy.algorithms.modules.selection.Selection`.
27This algorithm is equivalent to :class:`~moptipy.algorithms.so.ea.EA`, but
28allows for using a customized fitness assignment step
29(:class:`~moptipy.algorithms.so.fitness.Fitness`) as well as customizable
30survival and :attr:`~moptipy.algorithms.so.general_ea.GeneralEA.mating`
31selection (:class:`~moptipy.algorithms.modules.selection.Selection`).
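
A rough configuration sketch is shown below. The operator classes `MyOp0`,
`MyOp1`, and `MyOp2` are hypothetical stand-ins for concrete nullary, unary,
and binary operators of whatever search space is used; all other classes come
from this module and correspond to its default settings::

    from moptipy.algorithms.modules.selections.best import Best
    from moptipy.algorithms.modules.selections.random_without_repl import (
        RandomWithoutReplacement,
    )
    from moptipy.algorithms.so.fitnesses.rank_and_iteration import (
        RankAndIteration,
    )
    from moptipy.algorithms.so.general_ea import GeneralEA

    # a (16+64) EA: 16 survivors and 64 offspring per generation, where
    # each offspring stems from the binary operator with probability 0.3
    ea = GeneralEA(MyOp0(), MyOp1(), MyOp2(),  # hypothetical operators
                   mu=16, lambda_=64, br=0.3,
                   fitness=RankAndIteration(),  # the default fitness
                   survival=Best(),  # the default survival selection
                   mating=RandomWithoutReplacement())  # default mating

The resulting instance is applied to a :class:`~moptipy.api.process.Process`
via its `solve` method, exactly like the plain
:class:`~moptipy.algorithms.so.ea.EA`.
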
1. Thomas Bäck, David B. Fogel, and Zbigniew Michalewicz, eds., *Handbook of
   Evolutionary Computation.* 1997. Computational Intelligence Library.
   New York, NY, USA: Oxford University Press, Inc. ISBN: 0-7503-0392-1
2. James C. Spall. *Introduction to Stochastic Search and Optimization.*
   Estimation, Simulation, and Control - Wiley-Interscience Series in
   Discrete Mathematics and Optimization, volume 6. 2003. Chichester, West
   Sussex, UK: Wiley Interscience. ISBN: 0-471-33052-3.
   http://www.jhuapl.edu/ISSO/.
3. Frank Hoffmeister and Thomas Bäck. Genetic Algorithms and Evolution
   Strategies: Similarities and Differences. In Hans-Paul Schwefel and
   Reinhard Männer, *Proceedings of the International Conference on Parallel
   Problem Solving from Nature (PPSN I),* October 1-3, 1990, Dortmund,
   Germany, volume 496 of Lecture Notes in Computer Science, pages 455-469,
   Berlin/Heidelberg, Germany: Springer. ISBN: 978-3-540-54148-6.
   https://doi.org/10.1007/BFb0029787.
"""
from typing import Callable, Final, cast

from numpy.random import Generator

from moptipy.algorithms.modules.selection import (
    FitnessRecord,
    Selection,
    check_selection,
)
from moptipy.algorithms.modules.selections.best import Best
from moptipy.algorithms.modules.selections.random_without_repl import (
    RandomWithoutReplacement,
)
from moptipy.algorithms.so.ea import EA, _float_0
from moptipy.algorithms.so.fitness import Fitness, FRecord, check_fitness
from moptipy.algorithms.so.fitnesses.rank_and_iteration import RankAndIteration
from moptipy.api.operators import Op0, Op1, Op2
from moptipy.api.process import Process
from moptipy.utils.logger import KeyValueLogSection
from moptipy.utils.strings import PART_SEPARATOR


class _Record(FRecord):
    """Same as `FRecord`, but with a secret selection marker."""

    def __init__(self, x, f: int | float, selected: bool = False):
        """
        Create the record.

        :param x: the data structure for a point in the search space
        :param f: the corresponding objective value
        :param selected: is the record currently in use?
        """
        super().__init__(x, f)
        #: an internal flag - do NOT access!!
        self._selected: bool = selected


# start book
class GeneralEA(EA):
    """The fully customizable (mu+lambda) EA."""

    def solve(self, process: Process) -> None:
        """
        Apply the (mu+lambda) EA to an optimization problem.

        :param process: the black-box process object
        """
        # initialization of some variables omitted in book for brevity
# end book
        mu: Final[int] = self.mu  # mu: number of best solutions kept
        lambda_: Final[int] = self.lambda_  # number of new solutions/gen
        mu_plus_lambda: Final[int] = mu + lambda_  # size = mu + lambda
        random: Final[Generator] = process.get_random()  # random gen
        create: Final[Callable] = process.create  # create x container
        evaluate: Final[Callable] = process.evaluate  # the objective
        op0: Final[Callable] = self.op0.op0  # the nullary operator
        op1: Final[Callable] = self.op1.op1  # the unary operator
        op2: Final[Callable] = self.op2.op2  # the binary operator
        br: Final[float] = self.br  # the rate at which to use op2
        should_terminate: Final[Callable] = process.should_terminate
        r01: Final[Callable[[], float]] = cast(  # only if 0<br<1, we
            "Callable[[], float]",  # need random floats
            random.random if 0 < br < 1 else _float_0)
        assign_fitness: Final[Callable[[list[FRecord], Generator], None]] = \
            self.fitness.assign_fitness
        survival_selection: Final[Callable[
            [list[FRecord], Callable, int, Generator], None]] = \
            cast("Callable[[list[FRecord], Callable, "
                 "int, Generator], None]", self.survival.select)
        mating_selection: Final[Callable[
            [list[FRecord], Callable, int, Generator], None]] = \
            cast("Callable[[list[FRecord], Callable, "
                 "int, Generator], None]", self.mating.select)
        recs: Final[list] = [None] * mu_plus_lambda  # pre-allocate list
        parents: Final[list] = [None, None]  # mating pool: length 2
        population: Final[list] = [None] * mu_plus_lambda  # whole pop
        parents_clear: Final[Callable[[], None]] = parents.clear
        parents_append: Final[Callable[[FitnessRecord], None]] = \
            cast("Callable[[FitnessRecord], None]", parents.append)
        population_clear: Final[Callable[[], None]] = population.clear
        population_append: Final[Callable[[_Record], None]] = \
            cast("Callable[[_Record], None]", population.append)
# start book
        # create list of mu random records and lambda empty records
        f: int | float = 0  # variable to hold objective values
        for i in range(mu_plus_lambda):  # fill list of size mu+lambda
            x = create()  # by creating point in search space
            selected: bool = i < mu  # only fully create first mu recs
            if selected:  # only the first mu records are initialized by
                op0(random, x)  # applying nullary operator = randomize
                if should_terminate():  # should we quit? # -book
                    if process.has_log():  # -book
                        self.fitness.log_information_after_run(  # -book
                            process)  # -book
                    return  # computational budget exhausted -> quit # -book
                f = evaluate(x)  # continue? ok, evaluate new solution
            recs[i] = _Record(x, f, selected)  # create and store record

        mating_pool: Final[list] = recs[0:mu]  # the selection survivors
        assign_fitness(mating_pool, random)  # assign fitness first time
# end book
        mating_pool_clear: Final[Callable[[], None]] = mating_pool.clear
        mating_pool_append: Final[Callable[[FitnessRecord], None]] = \
            cast("Callable[[FitnessRecord], None]", mating_pool.append)
# start book
        it: int = 0  # set the iteration counter
        while True:  # lst: keep 0..mu-1, overwrite mu..mu+lambda-1
            it += 1  # step the iteration counter
            population_clear()  # clear population

            di = 0  # set index of next potential destination
            for _ in range(lambda_):  # for all lambda offspring
                if should_terminate():  # only continue if we still... # -book
                    if process.has_log():  # -book
                        self.fitness.log_information_after_run(  # -book
                            process)  # -book
                    return  # ...have sufficient budget # -book
                while True:  # get the next non-selected record
                    dest = recs[di]  # get the record
                    di += 1  # step counter
                    if dest._selected:  # if it was selected
                        dest._selected = False  # mark it as unselected
                        population_append(dest)  # store in population
                        continue  # try next record
                    break  # use the (unselected) record as destination

                x = dest.x  # the destination "x" value
                dest.it = it  # remember iteration of solution creation
                do_binary: bool = r01() < br  # will we do binary operation?
                parents_clear()  # clear mating pool: room for 2
                mating_selection(mating_pool, parents_append,
                                 2 if do_binary else 1, random)

                if do_binary:  # binary operation (with p == br)
                    op2(random, x, parents[0].x, parents[1].x)
                else:  # unary operation otherwise
                    op1(random, x, parents[0].x)  # apply unary op
                dest.f = evaluate(x)  # evaluate new point
                population_append(dest)  # store in population

            # add remaining selected solutions from recs to population
            # from index di to mu+lambda ... omitted for brevity in book
            # end book
            for di2 in range(di, mu_plus_lambda):
                other = recs[di2]
                if other._selected:  # only if solution was selected
                    other._selected = False  # set as unselected
                    population_append(other)  # put into population
            # start book
            assign_fitness(population, random)  # assign fitness
            mating_pool_clear()  # clear list of survived records
            survival_selection(population, mating_pool_append, mu, random)
            for rec in mating_pool:  # mark all selected solutions as
                rec._selected = True  # selected
# end book

    def __init__(self, op0: Op0,
                 op1: Op1 | None = None,
                 op2: Op2 | None = None,
                 mu: int = 1, lambda_: int = 1,
                 br: float | None = None,
                 fitness: Fitness | None = None,
                 survival: Selection | None = None,
                 mating: Selection | None = None,
                 name: str = "generalEa") -> None:
        """
        Create the customizable Evolutionary Algorithm (EA).

        :param op0: the nullary search operator
        :param op1: the unary search operator
        :param op2: the binary search operator
        :param mu: the number of best solutions to survive in each generation
        :param lambda_: the number of offspring in each generation
        :param br: the rate at which the binary operator is applied
        :param fitness: the fitness assignment process
        :param survival: the survival selection algorithm
        :param mating: the mating selection algorithm
        :param name: the base name of the algorithm
        """
        if fitness is None:
            fitness = RankAndIteration()
        if fitness.__class__ is not RankAndIteration:
            name = f"{name}{PART_SEPARATOR}{fitness}"
        if survival is None:
            survival = Best()
        if mating is None:
            mating = RandomWithoutReplacement()
        if (survival.__class__ is not Best) \
                or (mating.__class__ is not RandomWithoutReplacement):
            name = f"{name}{PART_SEPARATOR}{survival}{PART_SEPARATOR}{mating}"

        super().__init__(op0, op1, op2, mu, lambda_, br, name)
        #: the fitness assignment process
        self.fitness: Final[Fitness] = check_fitness(fitness)
        #: the survival selection algorithm
        self.survival: Final[Selection] = check_selection(survival)
        #: the mating selection algorithm
        self.mating: Final[Selection] = check_selection(mating)

    def log_parameters_to(self, logger: KeyValueLogSection) -> None:
        """
        Log the parameters of the algorithm to a logger.

        :param logger: the logger for the parameters
        """
        super().log_parameters_to(logger)
        with logger.scope("fitness") as v:
            self.fitness.log_parameters_to(v)
        with logger.scope("survival") as s:
            self.survival.log_parameters_to(s)
        with logger.scope("mating") as m:
            self.mating.log_parameters_to(m)

    def initialize(self) -> None:
        """Initialize the algorithm."""
        super().initialize()
        self.survival.initialize()
        self.mating.initialize()
        self.fitness.initialize()