Coverage for moptipy/algorithms/so/general_ma.py: 98% (127 statements)

1"""
2A fully configurable, general (mu+lambda) Memetic Algorithm.
4This Memetic Algorithm implementation compares to the one in
5:mod:`~moptipy.algorithms.so.ma` like the general Evolutionary Algorithm
6from :mod:`~moptipy.algorithms.so.general_ea` compares to the simple one
7in :mod:`~moptipy.algorithms.so.ea`: It adds survival and mating selection as
8well as a fitness assignment procedure.
It begins by sampling :attr:`~moptipy.algorithms.so.ma.MA.mu`
solutions using the nullary search operation
:attr:`~moptipy.api.algorithm.Algorithm0.op0`. Each of these solutions is
refined for :attr:`~moptipy.algorithms.so.ma.MA.ls_fes` objective function
evaluations using the local search :attr:`~moptipy.algorithms.so.ma.MA.ls`.
In each iteration, this algorithm then uses
:attr:`~moptipy.algorithms.so.ma.MA.mu` existing solutions as input for
the binary search operation :attr:`~moptipy.api.algorithm.Algorithm2.op2`.
The inputs of the operator are chosen from the
:attr:`~moptipy.algorithms.so.ma.MA.mu` solutions using
:attr:`~moptipy.algorithms.so.general_ma.GeneralMA.mating`
selection. Each of the :attr:`~moptipy.algorithms.so.ma.MA.lambda_` new
solutions created this way is again refined for
:attr:`~moptipy.algorithms.so.ma.MA.ls_fes` objective function evaluations
using the local search :attr:`~moptipy.algorithms.so.ma.MA.ls`. Then, a
fitness assignment process (:class:`~moptipy.algorithms.so.fitness.Fitness`)
assigns fitness values to them based on their objective values
(:attr:`~moptipy.algorithms.so.record.Record.f`), maybe also using the index
of the iteration (:attr:`~moptipy.algorithms.so.record.Record.it`) in which
they were created. The survival selection
:attr:`~moptipy.algorithms.so.general_ma.GeneralMA.survival` then chooses,
from the joint set of `mu+lambda` solutions, the `mu` solutions for the
next iteration. Both mating and survival selection are instances of class
:class:`~moptipy.algorithms.modules.selection.Selection`.
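
The following is a rough usage sketch only: the operator, space, objective,
and execution classes imported below live in other parts of *moptipy*; their
module paths and signatures are assumptions and may differ between versions.
It solves a small OneMax instance over bit strings, with an RLS local search
plugged into the MA::

    from moptipy.algorithms.so.general_ma import GeneralMA
    from moptipy.algorithms.so.rls import RLS
    from moptipy.api.execution import Execution
    from moptipy.examples.bitstrings.onemax import OneMax
    from moptipy.operators.bitstrings.op0_random import Op0Random
    from moptipy.operators.bitstrings.op1_flip1 import Op1Flip1
    from moptipy.operators.bitstrings.op2_uniform import Op2Uniform
    from moptipy.operators.op0_forward import Op0Forward
    from moptipy.spaces.bitstrings import BitStrings

    n = 32  # a 32-bit OneMax instance serves as toy problem
    # The local search must use Op0Forward as its nullary operator so
    # that the MA can forward its starting points into each local search
    # run (see the cast to Op0Forward in GeneralMA.solve below).
    rls = RLS(Op0Forward(), Op1Flip1())
    ma = GeneralMA(Op0Random(), Op2Uniform(), rls,
                   mu=4, lambda_=4, ls_fes=64)
    with Execution().set_solution_space(BitStrings(n))\
            .set_objective(OneMax(n)).set_algorithm(ma)\
            .set_max_fes(4096).execute() as process:
        print(process.get_best_f())  # objective value of the best solution
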

1. Pablo Moscato. *On Evolution, Search, Optimization, Genetic Algorithms and
   Martial Arts: Towards Memetic Algorithms.* Caltech Concurrent Computation
   Program Report C3P 826. 1989. Pasadena, CA, USA: California Institute of
   Technology (Caltech), Caltech Concurrent Computation Program (C3P).
   https://www.researchgate.net/publication/2354457
2. Carlos Cotta, Luke Mathieson, and Pablo Moscato. Memetic Algorithms. In
   Rafael Martí, Panos M. Pardalos, and Mauricio G. C. Resende, editors,
   *Handbook of Heuristics.* Part III: Metaheuristics, pages 607-638. 2018.
   Cham, Switzerland: Springer. ISBN: 978-3-319-07123-7.
   doi: https://doi.org/10.1007/978-3-319-07153-4_29-1
   https://www.researchgate.net/publication/315660932
3. William Eugene Hart, James E. Smith, and Natalio Krasnogor, editors.
   *Recent Advances in Memetic Algorithms.* Studies in Fuzziness and Soft
   Computing (STUDFUZZ), volume 166. 2005. Berlin/Heidelberg, Germany:
   Springer. ISBN: 978-3-540-22904-9.
   doi: https://doi.org/10.1007/3-540-32363-5
4. Ferrante Neri, Carlos Cotta, and Pablo Moscato. *Handbook of Memetic
   Algorithms.* Volume 379 of Studies in Computational Intelligence (SCI).
   2012. Berlin/Heidelberg, Germany: Springer. ISBN: 978-3-642-23246-6.
   doi: https://doi.org/10.1007/978-3-642-23247-3.
5. L. Darrell Whitley, V. Scott Gordon, and Keith E. Mathias. Lamarckian
   Evolution, The Baldwin Effect and Function Optimization. In Yuval Davidor,
   Hans-Paul Schwefel, and Reinhard Männer, editors, *Proceedings of the Third
   Conference on Parallel Problem Solving from Nature; International
   Conference on Evolutionary Computation (PPSN III),* October 9-14, 1994,
   Jerusalem, Israel, pages 5-15. Volume 866/1994 of Lecture Notes in Computer
   Science, Berlin, Germany: Springer-Verlag GmbH. ISBN: 0387584846.
   doi: https://doi.org/10.1007/3-540-58484-6_245.
   https://www.researchgate.net/publication/2521727
6. Thomas Bäck, David B. Fogel, and Zbigniew Michalewicz, eds., *Handbook of
   Evolutionary Computation.* 1997. Computational Intelligence Library.
   New York, NY, USA: Oxford University Press, Inc. ISBN: 0-7503-0392-1
7. James C. Spall. *Introduction to Stochastic Search and Optimization.*
   Estimation, Simulation, and Control - Wiley-Interscience Series in Discrete
   Mathematics and Optimization, volume 6. 2003. Chichester, West Sussex, UK:
   Wiley Interscience. ISBN: 0-471-33052-3. http://www.jhuapl.edu/ISSO/.
8. Frank Hoffmeister and Thomas Bäck. Genetic Algorithms and Evolution
   Strategies: Similarities and Differences. In Hans-Paul Schwefel and
   Reinhard Männer, *Proceedings of the International Conference on Parallel
   Problem Solving from Nature (PPSN I),* October 1-3, 1990, Dortmund,
   Germany, volume 496 of Lecture Notes in Computer Science, pages 455-469,
   Berlin/Heidelberg, Germany: Springer. ISBN: 978-3-540-54148-6.
   https://doi.org/10.1007/BFb0029787.
"""
from typing import Callable, Final, cast

from numpy.random import Generator

from moptipy.algorithms.modules.selection import (
    FitnessRecord,
    Selection,
    check_selection,
)
from moptipy.algorithms.modules.selections.best import Best
from moptipy.algorithms.modules.selections.random_without_repl import (
    RandomWithoutReplacement,
)
from moptipy.algorithms.so.fitness import Fitness, FRecord, check_fitness
from moptipy.algorithms.so.fitnesses.rank_and_iteration import RankAndIteration
from moptipy.algorithms.so.general_ea import _Record
from moptipy.algorithms.so.ma import MA
from moptipy.api.algorithm import Algorithm0
from moptipy.api.operators import Op0, Op2
from moptipy.api.process import Process
from moptipy.api.subprocesses import for_fes, from_starting_point
from moptipy.operators.op0_forward import Op0Forward  # pylint: disable=W0611
from moptipy.utils.logger import KeyValueLogSection
from moptipy.utils.strings import PART_SEPARATOR


class GeneralMA(MA):
    """The fully customizable (mu+lambda) MA."""

    def solve(self, process: Process) -> None:
        """
        Apply the (mu+lambda) MA to an optimization problem.

        :param process: the black-box process object
        """
        # initialization of some variables omitted in book for brevity
        mu: Final[int] = self.mu  # mu: number of best solutions kept
        lambda_: Final[int] = self.lambda_  # number of new solutions/gen
        mu_plus_lambda: Final[int] = mu + lambda_  # size = mu + lambda
        random: Final[Generator] = process.get_random()  # random gen
        create: Final[Callable] = process.create  # create x container
        evaluate: Final[Callable] = process.evaluate  # the objective
        op0: Final[Callable] = self.op0.op0  # the nullary operator
        op2: Final[Callable] = self.op2.op2  # the binary operator
        should_terminate: Final[Callable] = process.should_terminate
        ls_fes: Final[int] = self.ls_fes  # the number of FEs per ls run
        ls_solve: Final[Callable[[Process], None]] = self.ls.solve  # +book
        forward_ls_op0_to: Final[Callable] = cast(  # forward starting
            "Op0Forward", self.ls.op0).forward_to  # point of ls to...
        assign_fitness: Final[Callable[[list[FRecord], Generator], None]] = \
            self.fitness.assign_fitness
        survival_selection: Final[Callable[
            [list[FRecord], Callable, int, Generator], None]] = \
            cast("Callable[[list[FRecord], Callable, int, Generator],"
                 "None]", self.survival.select)
        mating_selection: Final[Callable[
            [list[FRecord], Callable, int, Generator], None]] = \
            cast("Callable[[list[FRecord], Callable, int, Generator],"
                 "None]", self.mating.select)
        recs: Final[list] = [None] * mu_plus_lambda  # pre-allocate list
        parents: Final[list] = [None, None]  # mating pool: length 2
        population: Final[list] = [None] * mu_plus_lambda  # whole pop
        parents_clear: Final[Callable[[], None]] = parents.clear
        parents_append: Final[Callable[[FitnessRecord], None]] = \
            cast("Callable[[FitnessRecord], None]", parents.append)
        population_clear: Final[Callable[[], None]] = population.clear
        population_append: Final[Callable[[_Record], None]] = \
            cast("Callable[[_Record], None]", population.append)

        # create list of mu random/ls records and lambda empty records
        f: int | float = 0  # variable to hold objective values
        for i in range(mu_plus_lambda):  # fill list of size mu+lambda
            x = create()  # by creating point in search space
            selected: bool = i < mu  # only fully create first mu recs
            if selected:  # only the first mu records are initialized by
                op0(random, x)  # applying nullary operator = randomize
                if should_terminate():  # should we stop now?
                    cast("Op0Forward", self.ls.op0).stop_forwarding()
                    if process.has_log():
                        self.fitness.log_information_after_run(process)
                    return  # computational budget exhausted -> quit
                with for_fes(process, ls_fes) as s1, \
                        from_starting_point(s1, x, evaluate(x)) as s2:
                    forward_ls_op0_to(s2.get_copy_of_best_x)
                    ls_solve(s2)  # apply local search modifying x
                    f = s2.get_best_f()  # get quality of x
            recs[i] = _Record(x, f, selected)  # create and store record

        mating_pool: Final[list] = recs[0:mu]  # the selection survivors
        assign_fitness(mating_pool, random)  # assign fitness first time

        mating_pool_clear: Final[Callable[[], None]] = mating_pool.clear
        mating_pool_append: Final[Callable[[FitnessRecord], None]] = \
            cast("Callable[[FitnessRecord], None]", mating_pool.append)

        it: int = 0  # set the iteration counter
        while True:  # lst: keep 0..mu-1, overwrite mu..mu+lambda-1
            it += 1  # step the iteration counter
            population_clear()  # clear population

            di = 0  # set index of next potential destination
            for _ in range(lambda_):  # for all lambda offspring
                if should_terminate():  # should we stop now?
                    cast("Op0Forward", self.ls.op0).stop_forwarding()
                    if process.has_log():
                        self.fitness.log_information_after_run(process)
                    return  # computational budget exhausted -> quit
                while True:  # get the next non-selected record
                    dest = recs[di]  # get the record
                    di += 1  # step counter
                    if dest._selected:  # if it was selected
                        dest._selected = False  # mark it as unselected
                        population_append(dest)  # store in population
                        continue  # try next record
                    break  # use the (unselected) record as destination

                x = dest.x  # the destination "x" value
                dest.it = it  # remember iteration of solution creation
                parents_clear()  # clear mating pool to make room for 2
                mating_selection(mating_pool, parents_append, 2, random)

                op2(random, x, parents[0].x, parents[1].x)
                with for_fes(process, ls_fes) as s1, \
                        from_starting_point(s1, x, evaluate(x)) as s2:
                    forward_ls_op0_to(s2.get_copy_of_best_x)
                    ls_solve(s2)  # apply local search modifying x
                    dest.f = s2.get_best_f()  # get quality of x

                population_append(dest)  # store in population

            # add remaining selected solutions from recs to population
            for di2 in range(di, mu_plus_lambda):
                other = recs[di2]
                if other._selected:  # only if solution was selected
                    other._selected = False  # set as unselected
                    population_append(other)  # put into population

            assign_fitness(population, random)  # assign fitness
            mating_pool_clear()  # clear list of survived records
            survival_selection(population, mating_pool_append, mu, random)
            for rec in mating_pool:  # mark all selected solutions as
                rec._selected = True  # selected

    def __init__(self, op0: Op0, op2: Op2,
                 ls: Algorithm0,
                 mu: int = 2, lambda_: int = 1,
                 ls_fes: int = 1000,
                 fitness: Fitness | None = None,
                 survival: Selection | None = None,
                 mating: Selection | None = None,
                 name: str = "generalMa") -> None:
        """
        Create the customizable Memetic Algorithm (MA).

        :param op0: the nullary search operator
        :param op2: the binary search operator
        :param ls: the local search to apply to each new solution
        :param mu: the number of best solutions to survive in each generation
        :param lambda_: the number of offspring in each generation
        :param ls_fes: the number of FEs (steps) per local search run
        :param fitness: the fitness assignment process
        :param survival: the survival selection algorithm
        :param mating: the mating selection algorithm
        :param name: the base name of the algorithm
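
        A hedged construction sketch follows; ``my_op0``, ``my_op2``, and
        ``my_rls`` are placeholders for operator and local-search instances
        that are not part of this module (the local search's nullary operator
        should be an :class:`~moptipy.operators.op0_forward.Op0Forward` so
        that the MA can forward starting points to it). The components passed
        here are exactly the defaults that would be chosen anyway::

            from moptipy.algorithms.modules.selections.best import Best
            from moptipy.algorithms.modules.selections.random_without_repl \
                import RandomWithoutReplacement
            from moptipy.algorithms.so.fitnesses.rank_and_iteration \
                import RankAndIteration

            ma = GeneralMA(my_op0, my_op2, my_rls, mu=16, lambda_=16,
                           ls_fes=128, fitness=RankAndIteration(),
                           survival=Best(),
                           mating=RandomWithoutReplacement())
            # since only default components are used, the name stays
            # "generalMa"; non-default choices would be appended to it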
        """
        if fitness is None:
            fitness = RankAndIteration()
        if fitness.__class__ is not RankAndIteration:
            name = f"{name}{PART_SEPARATOR}{fitness}"
        if survival is None:
            survival = Best()
        if mating is None:
            mating = RandomWithoutReplacement()
        if (survival.__class__ is not Best) \
                or (mating.__class__ is not RandomWithoutReplacement):
            name = f"{name}{PART_SEPARATOR}{survival}{PART_SEPARATOR}{mating}"

        super().__init__(op0, op2, ls, mu, lambda_, ls_fes, name)
        #: the fitness assignment process
        self.fitness: Final[Fitness] = check_fitness(fitness)
        #: the survival selection algorithm
        self.survival: Final[Selection] = check_selection(survival)
        #: the mating selection algorithm
        self.mating: Final[Selection] = check_selection(mating)

    def log_parameters_to(self, logger: KeyValueLogSection) -> None:
        """
        Log the parameters of the algorithm to a logger.

        :param logger: the logger for the parameters
        """
        super().log_parameters_to(logger)
        with logger.scope("fitness") as v:
            self.fitness.log_parameters_to(v)
        with logger.scope("survival") as s:
            self.survival.log_parameters_to(s)
        with logger.scope("mating") as m:
            self.mating.log_parameters_to(m)

    def initialize(self) -> None:
        """Initialize the algorithm."""
        super().initialize()
        self.survival.initialize()
        self.mating.initialize()
        self.fitness.initialize()