moptipy.examples.bitstrings package¶
Example for bit-string based objective functions.
The problems noted here are mainly well-known benchmark problems from the discrete optimization community. For these problems, we designed the dedicated base class BitStringProblem.
The following benchmark problems are provided:
- ising1d, the one-dimensional Ising model, where the goal is that all bits should have the same value as their neighbors.
- jump, a problem that is equivalent to onemax, but has a deceptive region right before the optimum.
- leadingones, where the goal is to find a bit string with the maximum number of leading ones.
- onemax, where the goal is to find a bit string with the maximum number of ones.
- trap, which is like OneMax, but with the optimum and the worst possible solution swapped. This problem is therefore highly deceptive.
- w_model, a benchmark problem with tunable epistasis, uniform neutrality, and ruggedness/deceptiveness.
- zeromax, where the goal is to find a bit string with the maximum number of zeros. This is the opposite of the OneMax problem.
Submodules¶
moptipy.examples.bitstrings.bitstring_problem module¶
A base class for bitstring-based problems.
- class moptipy.examples.bitstrings.bitstring_problem.BitStringProblem(n)[source]¶
Bases:
Objective
A base class for problems defined over bit strings.
This base class provides a set of default behaviors. It has an attribute n denoting the length of the accepted bit strings. Its lower_bound() returns 0 and its upper_bound() returns n. is_always_integer() returns True.
- is_always_integer()[source]¶
Return True if the evaluate function always returns an int.
This pre-defined function for bit-string based problems will always return True.
- Retval True:
always
- Return type:
- log_parameters_to(logger)[source]¶
Log all parameters of this component as key-value pairs.
- Parameters:
logger (
KeyValueLogSection
) – the logger for the parameters
>>> from moptipy.utils.logger import InMemoryLogger
>>> with InMemoryLogger() as l:
...     with l.key_values("C") as kv:
...         BitStringProblem(5).log_parameters_to(kv)
...     text = l.get_log()
>>> text[1]
'name: BitStringProblem'
>>> text[3]
'lowerBound: 0'
>>> text[4]
'upperBound: 5'
>>> text[5]
'n: 5'
>>> len(text)
7
- lower_bound()[source]¶
Get the lower bound of the bit string based problem.
- Return type:
- Returns:
0
>>> print(BitStringProblem(10).lower_bound())
0
- space()[source]¶
Create a proper search space for this problem.
- Return type:
- Returns:
an instance of BitStrings for bit strings of length n
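The default behaviors described above can be summarized in a short plain-Python sketch. The class name is illustrative only; this is not the actual moptipy implementation, just the contract it describes:

```python
class BitStringProblemSketch:
    """Illustrative stand-in for BitStringProblem's default behaviors."""

    def __init__(self, n: int) -> None:
        if (not isinstance(n, int)) or (n < 1):
            raise ValueError(f"n must be a positive int, got {n!r}.")
        self.n = n  # the length of the accepted bit strings

    def lower_bound(self) -> int:
        return 0  # the best possible objective value

    def upper_bound(self) -> int:
        return self.n  # the worst possible objective value

    def is_always_integer(self) -> bool:
        return True  # objective values of these problems are integers
```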
moptipy.examples.bitstrings.ising1d module¶
The one-dimensional Ising problem.
The one-dimensional Ising problem describes a ring. For each bit that differs from its (right-side) neighboring bit, a penalty of 1 is incurred. The optimum is a bit string of either all ones or all zeros.
Simon Fischer and Ingo Wegener. The one-dimensional Ising model: Mutation versus recombination. Theoretical Computer Science. 344(2-3):208-225. November 2005. doi: https://doi.org/10.1016/j.tcs.2005.04.002
Carola Doerr and Furong Ye and Naama Horesh and Hao Wang and Ofer M. Shir and Thomas Bäck. Benchmarking Discrete Optimization Heuristics with IOHprofiler. Applied Soft Computing 88:106027, March 2020, doi: https://doi.org/10.1016/j.asoc.2019.106027.
Clarissa Van Hoyweghen, David Edward Goldberg, and Bart Naudts. From Twomax To The Ising Model: Easy And Hard Symmetrical Problems. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO’02), July 9-13, 2002, New York, NY, USA, pages 626-633. Morgan Kaufmann. http://gpbib.cs.ucl.ac.uk/gecco2002/GA013.pdf
- class moptipy.examples.bitstrings.ising1d.Ising1d(n)[source]¶
Bases:
BitStringProblem
The one-dimensional Ising problem.
- moptipy.examples.bitstrings.ising1d.ising1d(x)[source]¶
Compute the objective value of the 1-dimensional Ising problem.
>>> print(ising1d(np.array([True, True, True, True, True])))
0
>>> print(ising1d(np.array([False, False, False, False, False])))
0
>>> print(ising1d(np.array([False, False, False, True, False])))
2
>>> print(ising1d(np.array([True, False, False, False, False])))
2
>>> print(ising1d(np.array([False, False, False, False, True])))
2
>>> print(ising1d(np.array([True, False, False, False, True])))
2
>>> print(ising1d(np.array([True, False, True, False, False])))
4
>>> print(ising1d(np.array([True, False, True, False, True, False])))
6
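The ring penalty described above can be written as a short plain-Python sketch (an illustration consistent with the doctests, not the vectorized moptipy implementation):

```python
def ising1d_sketch(x) -> int:
    """Count the bits that differ from their right-side neighbor on a ring."""
    n = len(x)
    # the modulo wraps the last bit around to compare with the first bit
    return sum(1 for i in range(n) if x[i] != x[(i + 1) % n])
```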
moptipy.examples.bitstrings.jump module¶
The Jump problem.
The jump problem is basically OneMax, but with a deceptive region of k bit flips before the optimum.
Stefan Droste, Thomas Jansen, and Ingo Wegener. On the Analysis of the (1+1) Evolutionary Algorithm. Theoretical Computer Science. 276(1-2):51-81. April 2002. doi: https://doi.org/10.1016/S0304-3975(01)00182-7
Tobias Friedrich, Francesco Quinzan, and Markus Wagner. Escaping Large Deceptive Basins of Attraction with Heavy-Tailed Mutation Operators. In Hernán E. Aguirre and Keiki Takadama, editors, Proceedings of the Genetic and Evolutionary Computation Conference (GECCO’18), July 15-19, 2018. Kyoto, Japan. ACM. doi: https://doi.org/10.1145/3205455.3205515
Francesco Quinzan, Andreas Göbel, Markus Wagner, and Tobias Friedrich. Evolutionary algorithms and submodular functions: benefits of heavy-tailed mutations. Natural Computing. 20(3):561-575. September 2021. doi: https://doi.org/10.1007/s11047-021-09841-7. Also: arXiv:1805.10902v2 [cs.DS] 21 Nov 2018. https://arxiv.org/abs/1805.10902
- class moptipy.examples.bitstrings.jump.Jump(n, k)[source]¶
Bases:
BitStringProblem
Compute the Jump problem.
- log_parameters_to(logger)[source]¶
Log all parameters of this component as key-value pairs.
- Parameters:
logger (
KeyValueLogSection
) – the logger for the parameters
>>> from moptipy.utils.logger import InMemoryLogger
>>> with InMemoryLogger() as l:
...     with l.key_values("C") as kv:
...         Jump(23, 7).log_parameters_to(kv)
...     text = l.get_log()
>>> text[1]
'name: jump_23_7'
>>> text[3]
'lowerBound: 0'
>>> text[4]
'upperBound: 29'
>>> text[5]
'n: 23'
>>> text[6]
'k: 7'
>>> len(text)
8
- moptipy.examples.bitstrings.jump.jump(x, k)[source]¶
Compute the jump value.
>>> print(jump(np.array([False, False, False, False, False, False]), 2))
6
>>> print(jump(np.array([False, False, False, False, True, False]), 2))
5
>>> print(jump(np.array([False, True, True, False, False, False]), 2))
4
>>> print(jump(np.array([True, False, True, False, True, False]), 2))
3
>>> print(jump(np.array([True, False, True, False, True, True]), 2))
2
>>> print(jump(np.array([True, True, True, True, True, False]), 2))
7
>>> print(jump(np.array([True, True, True, True, True, True]), 2))
0
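The values above follow a simple piecewise pattern, sketched below as plain Python (an illustration consistent with the doctests; the actual moptipy formula may be written differently): with m ones in a string of length n, the value is n - m in the OneMax-like region (m <= n - k) and at the optimum (m = n), but jumps up to m + k in the deceptive region just before the optimum:

```python
def jump_sketch(x, k: int) -> int:
    """Piecewise jump objective consistent with the doctests above."""
    n = len(x)
    m = int(sum(x))  # number of ones
    if m == n:
        return 0  # the global optimum: all ones
    if m <= n - k:
        return n - m  # OneMax-like region
    return m + k  # deceptive region right before the optimum
```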
moptipy.examples.bitstrings.leadingones module¶
An objective function counting the leading ones in a bit string.
- class moptipy.examples.bitstrings.leadingones.LeadingOnes(n)[source]¶
Bases:
BitStringProblem
Maximize the number of leading ones in a bit string.
- moptipy.examples.bitstrings.leadingones.leadingones(x)[source]¶
Get the length of the string minus the number of leading ones.
>>> print(leadingones(np.array([False, False, True, False, False])))
5
>>> print(leadingones(np.array([True, False, False, True, True])))
4
>>> print(leadingones(np.array([True, True, False, False, False])))
3
>>> print(leadingones(np.array([True, True, True, False, True])))
2
>>> print(leadingones(np.array([True, True, True, True, False])))
1
>>> print(leadingones(np.array([True, True, True, True, True])))
0
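The computation behind the doctests can be sketched in a few lines of plain Python (an illustration, not the actual moptipy implementation):

```python
def leadingones_sketch(x) -> int:
    """Length of the string minus the number of leading ones."""
    n = len(x)
    for i, bit in enumerate(x):
        if not bit:
            return n - i  # first zero at index i means i leading ones
    return 0  # all bits are ones: the optimum
```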
moptipy.examples.bitstrings.onemax module¶
An objective function counting the number of ones in a bit string.
Heinz Mühlenbein. How Genetic Algorithms Really Work: Mutation and Hillclimbing. In Reinhard Männer and Bernard Manderick, editors, Proceedings of Parallel Problem Solving from Nature 2 (PPSN-II), September 28-30, 1992, Brussels, Belgium, pages 15-26. Elsevier. https://www.researchgate.net/publication/220702092
Stefan Droste, Thomas Jansen, and Ingo Wegener. Upper and Lower Bounds for Randomized Search Heuristics in Black-Box Optimization. Theory of Computing Systems. 39(4):525-544. July 2006. doi: https://doi.org/10.1007/s00224-004-1177-z
- class moptipy.examples.bitstrings.onemax.OneMax(n)[source]¶
Bases:
BitStringProblem
Maximize the number of ones in a bit string.
- moptipy.examples.bitstrings.onemax.onemax(x)[source]¶
Get the length of a string minus the number of ones in it.
- Parameters:
x (
ndarray
) – the np array
- Return type:
- Returns:
the length of the string minus the number of ones, i.e., the number of zeros
>>> print(onemax(np.array([True, True, False, False, False])))
3
>>> print(onemax(np.array([True, False, True, False, False])))
3
>>> print(onemax(np.array([False, True, True, False, False])))
3
>>> print(onemax(np.array([True, True, True, True, True])))
0
>>> print(onemax(np.array([False, True, True, True, True])))
1
>>> print(onemax(np.array([False, False, False, False, False])))
5
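Since the function returns the string length minus the number of ones, a one-line plain-Python sketch suffices (an illustration, not the vectorized moptipy implementation):

```python
def onemax_sketch(x) -> int:
    """Length of the string minus the number of ones, i.e., the zeros."""
    return len(x) - int(sum(x))
```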
moptipy.examples.bitstrings.trap module¶
The well-known Trap problem.
- class moptipy.examples.bitstrings.trap.Trap(n)[source]¶
Bases:
BitStringProblem
The trap problem.
- moptipy.examples.bitstrings.trap.trap(x)[source]¶
Compute the trap objective value.
>>> print(trap(np.array([True, True, False, False, False])))
3
>>> print(trap(np.array([True, False, True, False, False])))
3
>>> print(trap(np.array([False, True, True, False, False])))
3
>>> print(trap(np.array([True, True, True, True, True])))
0
>>> print(trap(np.array([False, True, True, True, True])))
5
>>> print(trap(np.array([False, False, False, False, False])))
1
>>> print(trap(np.array([False, True, False, False, False])))
2
>>> print(trap(np.array([False, True, True, True, False])))
4
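The doctest values follow a simple pattern that can be sketched in plain Python (an illustration consistent with the doctests, not the actual moptipy implementation): the all-ones string evaluates to 0, while any string with m < n ones evaluates to m + 1, so adding ones makes the objective worse everywhere except at the optimum itself:

```python
def trap_sketch(x) -> int:
    """Trap objective consistent with the doctests above."""
    m = int(sum(x))  # number of ones
    # optimum at all ones; everywhere else, more ones means a worse value
    return 0 if m == len(x) else m + 1
```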
moptipy.examples.bitstrings.w_model module¶
A Python implementation of the W-Model benchmark problem.
This is a tunable benchmark problem for bit-string based search spaces. It exhibits ruggedness, epistasis, deceptiveness, and (uniform) neutrality in a tunable fashion. In [1], a set of 19 diverse instances of the W-Model are selected for algorithm benchmarking. These instances are provided via default_instances()
.
Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set of benchmark instances from a tunable model problem for black-box discrete optimization algorithms. Applied Soft Computing Journal (ASOC), 92:106269, June 2020. doi: https://doi.org/10.1016/j.asoc.2020.106269
Thomas Weise and Zijun Wu. Difficult Features of Combinatorial Optimization Problems and the Tunable W-Model Benchmark Problem for Simulating them. In Black Box Discrete Optimization Benchmarking (BB-DOB) Workshop of Companion Material Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2018), July 15th-19th 2018, Kyoto, Japan, pages 1769-1776, ISBN 978-1-4503-5764-7. ACM. doi: https://doi.org/10.1145/3205651.3208240
Thomas Weise, Stefan Niemczyk, Hendrik Skubch, Roland Reichle, and Kurt Geihs. A Tunable Model for Multi-Objective, Epistatic, Rugged, and Neutral Fitness Landscapes. In Maarten Keijzer, Giuliano Antoniol, Clare Bates Congdon, Kalyanmoy Deb, Benjamin Doerr, Nikolaus Hansen, John H. Holmes, Gregory S. Hornby, Daniel Howard, James Kennedy, Sanjeev P. Kumar, Fernando G. Lobo, Julian Francis Miller, Jason H. Moore, Frank Neumann, Martin Pelikan, Jordan B. Pollack, Kumara Sastry, Kenneth Owen Stanley, Adrian Stoica, El-Ghazali, and Ingo Wegener, editors, Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation (GECCO’08), pages 795-802, July 12-16, 2008, Atlanta, GA, USA. ISBN: 978-1-60558-130-9, New York, NY, USA: ACM Press. doi: https://doi.org/10.1145/1389095.1389252
Carola Doerr, Furong Ye, Naama Horesh, Hao Wang, Ofer M. Shir, Thomas Bäck. Benchmarking Discrete Optimization Heuristics with IOHprofiler. Applied Soft Computing 88:106027, March 2020, doi: https://doi.org/10.1016/j.asoc.2019.106027.
- class moptipy.examples.bitstrings.w_model.WModel(nopt, m=1, nu=2, gamma=0, name=None)[source]¶
Bases:
BitStringProblem
The tunable W-Model benchmark problem.
- static default_instances()[source]¶
Get the 19 default instances of the W-Model.
- Return type:
- Returns:
an Iterable that can provide callables constructing the 19 default instances of the W-Model
>>> len(list(WModel.default_instances()))
19
- log_parameters_to(logger)[source]¶
Log all parameters of this component as key-value pairs.
- Parameters:
logger (
KeyValueLogSection
) – the logger for the parameters
>>> from moptipy.utils.logger import InMemoryLogger
>>> with InMemoryLogger() as l:
...     with l.key_values("C") as kv:
...         WModel(6, 2, 4, 7).log_parameters_to(kv)
...     text = l.get_log()
>>> text[1]
'name: wmodel_6_2_4_7'
>>> text[3]
'lowerBound: 0'
>>> text[4]
'upperBound: 6'
>>> text[5]
'n: 12'
>>> text[6]
'nopt: 6'
>>> text[7]
'm: 2'
>>> text[8]
'nu: 4'
>>> text[9]
'nu1: 0.5'
>>> text[11]
'gamma: 7'
>>> text[12]
'gamma1: 0.4666666666666667'
>>> text[14]
'gammaPrime: 11'
>>> len(text)
16
- moptipy.examples.bitstrings.w_model.w_model_create_ruggedness_permutation(gamma, nopt, r)[source]¶
Create the raw ruggedness array.
This function creates the ruggedness permutation where groups of rugged transformations alternate with deceptive permutations for increasing gamma.
- Parameters:
>>> r = array('i', range(11))
>>> w_model_create_ruggedness_permutation(0, 10, r)
>>> "".join(str(xx) for xx in r)
'012345678910'
>>> r = array('i', range(7))
>>> w_model_create_ruggedness_permutation(9, 6, r)
>>> r[3]
3
>>> r[6]
5
- moptipy.examples.bitstrings.w_model.w_model_epistasis(x_in, nu, x_out)[source]¶
Perform the epistasis transformation.
- moptipy.examples.bitstrings.w_model.w_model_f(x)[source]¶
Compute the basic hamming distance of the W-Model to the optimal string.
The optimal string in the W-Model is 0101010101…. Here we compute the objective value of a candidate solution, i.e., the Hamming distance to the string 010101010101…. This is the basic problem objective function. It can be applied either directly or on top of transformations such as the neutrality or epistasis mappings. Its result can be transformed by a ruggedness permutation to introduce ruggedness into the problem.
- Parameters:
x (
ndarray
) – the bit string to evaluate
- Return type:
- Returns:
the Hamming distance to the string of alternating zeros and ones and starting with 0.
>>> w_model_f(np.array([False]))
0
>>> w_model_f(np.array([True]))
1
>>> w_model_f(np.array([False, True]))
0
>>> w_model_f(np.array([False, False]))
1
>>> w_model_f(np.array([True, True]))
1
>>> w_model_f(np.array([True, False]))
2
>>> w_model_f(np.array([False, True, False]))
0
>>> w_model_f(np.array([True, False, True]))
3
>>> w_model_f(np.array([True, True, True]))
2
>>> w_model_f(np.array([True, True, False]))
1
>>> w_model_f(np.array([False, True, True]))
1
>>> w_model_f(np.array([False, True, False, True, False, True, False]))
0
>>> w_model_f(np.array([False, True, False, True, False, True, True]))
1
>>> w_model_f(np.array([True, False, True, False, True, False, True]))
7
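The Hamming distance to the alternating optimum can be sketched in plain Python (an illustration consistent with the doctests, not the vectorized moptipy implementation):

```python
def w_model_f_sketch(x) -> int:
    """Hamming distance to the alternating string 010101..."""
    # the optimum has a 0 at every even index and a 1 at every odd index,
    # so we count the positions where the candidate deviates from that
    return sum(1 for i, bit in enumerate(x) if bool(bit) != (i % 2 == 1))
```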
- moptipy.examples.bitstrings.w_model.w_model_max_gamma(nopt)[source]¶
Compute the maximum gamma value for a given length nopt.
- moptipy.examples.bitstrings.w_model.w_model_neutrality(x_in, mu, x_out)[source]¶
Perform the neutrality transformation.
The neutrality transformation is the first layer of the W-Model. It introduces (or better, removes during the mapping) uniform redundancy by basically reducing the size of a bit string by factor mu.
Basically, the input array x_in is split into consecutive bit groups of length mu. Each such group is translated to one bit in the output string x_out. This bit becomes 1 if at least half of the bits in the input group are 1 (so a tie is resolved in favor of 1); otherwise, it becomes 0.
Notice that len(x_in) >= mu * len(x_out) must hold.
- Parameters:
>>> xin = np.array([0, 1, 0, 0, 0, 1, 1, 0, 0, 0, 0,
...                 0, 1, 1, 1, 0, 1, 0, 0, 0, 0], bool)
>>> xout = np.empty(len(xin) // 2, bool)
>>> w_model_neutrality(xin, 2, xout)
>>> print("".join('1' if xx else '0' for xx in xout))
1011001110
>>> xin = np.array([0, 1, 0, 1, 0, 1, 1, 0, 0, 0, 0,
...                 0, 1, 1, 1, 0, 1, 0, 0, 0, 0], bool)
>>> w_model_neutrality(xin, 2, xout)
>>> print("".join('1' if xx else '0' for xx in xout))
1111001110
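The majority mapping can be sketched as follows in plain Python (an illustration returning a new list rather than filling x_out in place like the moptipy function; as in the doctests, a tie within a group maps to 1):

```python
def w_model_neutrality_sketch(x_in, mu: int):
    """Map each group of mu input bits to one output bit by majority vote."""
    n_out = len(x_in) // mu  # trailing bits that do not fill a group are ignored
    out = []
    for i in range(n_out):
        ones = int(sum(x_in[i * mu:(i + 1) * mu]))  # ones in this group
        out.append(2 * ones >= mu)  # majority of ones; ties resolve to 1
    return out
```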
- moptipy.examples.bitstrings.w_model.w_model_ruggedness_translate(gamma, nopt)[source]¶
Transform the gamma values such that deceptive follows rugged.
If the ruggedness permutations are created with raw gamma values, then rugged and deceptive permutations will alternate with rising gamma. With this function here, the gamma parameters are translated such that first all the rugged permutations come (with rising degree of ruggedness) and then the deceptive ones come.
- Parameters:
- Return type:
- Returns:
the translated gamma values
>>> w_model_ruggedness_translate(12, 6)
9
>>> w_model_ruggedness_translate(34, 25)
57
>>> w_model_ruggedness_translate(0, 5)
0
>>> w_model_ruggedness_translate(1, 5)
1
>>> w_model_ruggedness_translate(2, 5)
2
>>> w_model_ruggedness_translate(3, 5)
3
>>> w_model_ruggedness_translate(4, 5)
4
>>> w_model_ruggedness_translate(5, 5)
8
>>> w_model_ruggedness_translate(6, 5)
9
>>> w_model_ruggedness_translate(7, 5)
10
>>> w_model_ruggedness_translate(8, 5)
7
>>> w_model_ruggedness_translate(9, 5)
6
>>> w_model_ruggedness_translate(10, 5)
5
moptipy.examples.bitstrings.zeromax module¶
An objective function counting the number of zeros in a bit string.
This problem exists mainly for testing purposes as counterpart of OneMax
.
- class moptipy.examples.bitstrings.zeromax.ZeroMax(n)[source]¶
Bases:
BitStringProblem
Maximize the number of zeros in a bit string.
- moptipy.examples.bitstrings.zeromax.zeromax(x)[source]¶
Get the length of a string minus the number of zeros in it.
- Parameters:
x (
ndarray
) – the np array- Return type:
- Returns:
the length of the string minus the number of zeros, i.e., the number of ones
>>> print(zeromax(np.array([True, True, False, False, False])))
2
>>> print(zeromax(np.array([True, False, True, False, False])))
2
>>> print(zeromax(np.array([False, True, True, False, False])))
2
>>> print(zeromax(np.array([True, True, True, True, True])))
5
>>> print(zeromax(np.array([False, True, True, True, True])))
4
>>> print(zeromax(np.array([False, False, False, False, False])))
0