moptipy.algorithms.so.ffa package

Algorithms and modules based on Frequency Fitness Assignment (FFA).

Frequency Fitness Assignment (FFA) is an algorithm plugin developed by the team behind moptipy. FFA is suitable only for problems where the number of different objective values is not too high. On such problems, which are often discrete or combinatorial, it can often improve the result quality that optimization algorithms reach, at the expense of a larger runtime.

  1. Thomas Weise, Zhize Wu, Xinlu Li, and Yan Chen. Frequency Fitness Assignment: Making Optimization Algorithms Invariant under Bijective Transformations of the Objective Function Value. IEEE Transactions on Evolutionary Computation 25(2):307-319. April 2021. Preprint available at arXiv:2001.01416v5 [cs.NE] 15 Oct 2020. https://dx.doi.org/10.1109/TEVC.2020.3032090

  2. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency Fitness Assignment: Optimization without Bias for Good Solutions can be Efficient. IEEE Transactions on Evolutionary Computation (TEVC). 27(4):980-992. August 2023. doi: https://doi.org/10.1109/TEVC.2022.3191698

  3. Thomas Weise, Mingxu Wan, Ke Tang, Pu Wang, Alexandre Devert, and Xin Yao. Frequency Fitness Assignment. IEEE Transactions on Evolutionary Computation (IEEE-EC), 18(2):226-243, April 2014. https://dx.doi.org/10.1109/TEVC.2013.2251885

  4. Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set of benchmark instances from a tunable model problem for black-box discrete optimization algorithms. Applied Soft Computing Journal (ASOC), 92:106269, June 2020. https://dx.doi.org/10.1016/j.asoc.2020.106269

  5. Thomas Weise, Xinlu Li, Yan Chen, and Zhize Wu. Solving Job Shop Scheduling Problems Without Using a Bias for Good Solutions. In Genetic and Evolutionary Computation Conference Companion (GECCO’21 Companion), July 10-14, 2021, Lille, France. ACM, New York, NY, USA. ISBN 978-1-4503-8351-6. https://dx.doi.org/10.1145/3449726.3463124

  6. Thomas Weise, Mingxu Wan, Ke Tang, and Xin Yao. Evolving Exact Integer Algorithms with Genetic Programming. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC’14), Proceedings of the 2014 World Congress on Computational Intelligence (WCCI’14), pages 1816-1823, July 6-11, 2014, Beijing, China. Los Alamitos, CA, USA: IEEE Computer Society Press. ISBN: 978-1-4799-1488-3. https://dx.doi.org/10.1109/CEC.2014.6900292

  7. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem with Frequency Fitness Assignment and Hybrid Algorithms. Soft Computing. 2024. https://dx.doi.org/10.1007/s00500-024-09718-8

Submodules

moptipy.algorithms.so.ffa.eafea module

The EAFEA is a hybrid of the (1+1)-FEA and the (1+1)-EA without Solution Transfer.

  1. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency Fitness Assignment: Optimization without Bias for Good Solutions can be Efficient. IEEE Transactions on Evolutionary Computation (TEVC). 27(4):980-992. August 2023. doi: https://doi.org/10.1109/TEVC.2022.3191698

class moptipy.algorithms.so.ffa.eafea.EAFEA(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

An implementation of the EAFEA.

solve(process)[source]

Apply the EAFEA to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.eafea_a module

The EAFEA-A is a hybrid of the (1+1)-FEA and the (1+1)-EA with Solution Transfer.

The solution is transferred from the FEA branch to the EA branch if its H-value is 1, i.e., if it represents a completely new objective value.

  1. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem with Frequency Fitness Assignment and Hybrid Algorithms. Soft Computing. 2024. https://dx.doi.org/10.1007/s00500-024-09718-8

class moptipy.algorithms.so.ffa.eafea_a.EAFEAA(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

An implementation of the EAFEA-A.

solve(process)[source]

Apply the EAFEA-A to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.eafea_b module

The EAFEA-B is a hybrid of the (1+1)-FEA and the (1+1)-EA with Solution Transfer.

The solution is transferred from the FEA branch to the EA branch if it is better than the current solution in that branch.

  1. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem with Frequency Fitness Assignment and Hybrid Algorithms. Soft Computing. 2024. https://dx.doi.org/10.1007/s00500-024-09718-8

class moptipy.algorithms.so.ffa.eafea_b.EAFEAB(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

An implementation of the EAFEA-B.

solve(process)[source]

Apply the EAFEA-B to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.eafea_c module

A Hybrid EA-FEA Algorithm: the EAFEA-C.

This hybrid algorithm has the following features:

  • The new solution of the FEA strand is copied to the EA strand if it has an H-value which is not worse than the H-value of the current solution.

  • The new solution of the EA strand is copied over to the FEA strand if it is better than the current solution of the EA strand.

  • The H-table is updated by both strands.

  • The FEA strand always toggles back to the EA strand.

  • The EA strand toggles to the FEA strand if it has not improved for a time limit that is incremented by one whenever a toggle is made.

class moptipy.algorithms.so.ffa.eafea_c.EAFEAC(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

An implementation of the EAFEA-C.

solve(process)[source]

Apply the EAFEA-C to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.eafea_n module

A Hybrid EA-FEA Algorithm: the EAFEA-N.

This hybrid algorithm has the following features:

  • The new solution of the FEA strand is always copied to the EA strand.

  • The new solution of the EA strand is copied over to the FEA strand if it is better than the current solution of the FEA strand.

  • The H-table is updated by both strands.

  • The FEA strand always toggles back to the EA strand.

  • The EA strand toggles to the FEA strand if it did not improve for a certain, increasing time limit.

  • Every time the EA strand toggles over to the FEA strand, time_limit is incremented by 1, as sketched below.
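
The following rough Python sketch illustrates only this toggling schedule; the helper names (should_terminate, do_one_ea_step, do_one_fea_step), the initial value of time_limit, and the reset of the stagnation counter are illustrative assumptions, not details of the moptipy implementation:

time_limit = 1                        # assumed initial EA budget without improvement
stagnation = 0                        # EA iterations since the last improvement
in_fea = False
while not should_terminate():         # hypothetical termination check
    if in_fea:
        do_one_fea_step()             # hypothetical: one step of the FEA strand
        in_fea = False                # the FEA strand always toggles back to the EA
    else:
        improved = do_one_ea_step()   # hypothetical: one step of the EA strand
        stagnation = 0 if improved else stagnation + 1
        if stagnation >= time_limit:  # no improvement for time_limit steps:
            in_fea = True             # toggle over to the FEA strand ...
            time_limit += 1           # ... and increment time_limit by 1
            stagnation = 0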

class moptipy.algorithms.so.ffa.eafea_n.EAFEAN(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

An implementation of the EAFEA-N.

solve(process)[source]

Apply the EAFEA-N to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.fea1plus1 module

The FFA-based version of the (1+1)-EA: the (1+1)-FEA.

This algorithm is based on RLS, i.e., the (1+1)-EA, but uses Frequency Fitness Assignment (FFA) as its fitness assignment process. FFA replaces all objective values with their encounter frequencies in the selection decisions. The more often an objective value is encountered, the higher its encounter frequency becomes. Therefore, local optima slowly receive worse and worse fitness.

Most of the existing metaheuristic algorithms have in common that they maintain a set S_i of one or multiple solutions and derive a set N_i of one or multiple new solutions in each iteration i. From the joint set P_i = S_i + N_i of old and new solutions, they then select the set S_{i+1} of solutions to be propagated to the next iteration, and so on. This selection decision is made mainly based on the objective values f(x) of the solutions x in P_i, and solutions with better objective values tend to be preferred over solutions with worse objective values.

Frequency Fitness Assignment (FFA) completely breaks with this most fundamental concept of optimization. FFA was first proposed by Weise et al. as a “plug-in” for metaheuristics intended to prevent premature convergence. It therefore maintains a frequency table H for objective values. Before the metaheuristic chooses the set S_{i+1} from P_i, it increments the encounter frequency of the objective value of each solution in P_i, i.e., performs H[y_j] <- H[y_j] + 1 for each x_j in P_i, where y_j = f(x_j). In its selection decisions, the algorithm then uses the frequency fitness H[y_j] instead of the objective values y_j.

Here we integrate FFA into the randomized local search algorithm RLS, which is also known as the (1+1)-EA. In its original form, RLS maintains a single solution and derives a slightly modified copy from it in every iteration. If the modified copy is not worse than the original solution, it replaces it. “Not worse” here means that its objective value needs to be better than or equal to (<=) that of the maintained current solution. The RLS with FFA (here called the (1+1)-FEA) now replaces the comparison of objective values with a comparison of the frequencies of the objective values. Of course, following the definition of FFA, the frequencies (of both the current and the new solution) are first incremented and then compared.
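
The following minimal sketch (not the moptipy implementation; sample_random_solution, mutate, and evaluate are hypothetical stand-ins for random initialization, mutation, and the objective function) shows the core loop of the (1+1)-FEA with a Counter-backed frequency table H:

from collections import Counter


def fea_1plus1_sketch(sample_random_solution, mutate, evaluate, steps):
    """Illustrative sketch of the (1+1)-FEA loop, not the moptipy code."""
    h = Counter()                      # frequency table: objective value -> count
    best_x = sample_random_solution()  # random initial solution
    best_f = evaluate(best_x)          # objective value of the current solution
    for _ in range(steps):
        new_x = mutate(best_x)         # slightly modified copy of the current solution
        new_f = evaluate(new_x)
        h[best_f] += 1                 # FFA: first increment both frequencies ...
        h[new_f] += 1
        if h[new_f] <= h[best_f]:      # ... then compare frequencies, not f-values
            best_x, best_f = new_x, new_f
    return best_x, best_f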

The algorithm is implemented here in two different ways: If the objective function is always integer-valued and the difference between its upper and lower bound is not too high, then we count the frequency fitness using a numpy array. This means that frequency updates and lookups are very fast. If the objective function is not always integer-valued or if the difference between its maximum and minimum is too large, then we use a collections.Counter to back the frequency table instead. This will be slower and probably require more memory, but it may be the only way to accommodate the frequency table. Of course, this will still fail if there are too many different objective values and the memory consumed is simply too high.

FFA is also implemented as a fitness assignment process (fitness) in module ffa_fitness.

  1. Thomas Weise, Zhize Wu, Xinlu Li, and Yan Chen. Frequency Fitness Assignment: Making Optimization Algorithms Invariant under Bijective Transformations of the Objective Function Value. IEEE Transactions on Evolutionary Computation 25(2):307-319. April 2021. Preprint available at arXiv:2001.01416v5 [cs.NE] 15 Oct 2020. https://dx.doi.org/10.1109/TEVC.2020.3032090

  2. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency Fitness Assignment: Optimization without Bias for Good Solutions can be Efficient. IEEE Transactions on Evolutionary Computation (TEVC). 27(4):980-992. August 2023. doi: https://doi.org/10.1109/TEVC.2022.3191698

  3. Thomas Weise, Mingxu Wan, Ke Tang, Pu Wang, Alexandre Devert, and Xin Yao. Frequency Fitness Assignment. IEEE Transactions on Evolutionary Computation (IEEE-EC), 18(2):226-243, April 2014. https://dx.doi.org/10.1109/TEVC.2013.2251885

  4. Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set of benchmark instances from a tunable model problem for black-box discrete optimization algorithms. Applied Soft Computing Journal (ASOC), 92:106269, June 2020. https://dx.doi.org/10.1016/j.asoc.2020.106269

  5. Thomas Weise, Xinlu Li, Yan Chen, and Zhize Wu. Solving Job Shop Scheduling Problems Without Using a Bias for Good Solutions. In Genetic and Evolutionary Computation Conference Companion (GECCO’21 Companion), July 10-14, 2021, Lille, France. ACM, New York, NY, USA. ISBN 978-1-4503-8351-6. https://dx.doi.org/10.1145/3449726.3463124

  6. Thomas Weise, Mingxu Wan, Ke Tang, and Xin Yao. Evolving Exact Integer Algorithms with Genetic Programming. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC’14), Proceedings of the 2014 World Congress on Computational Intelligence (WCCI’14), pages 1816-1823, July 6-11, 2014, Beijing, China. Los Alamitos, CA, USA: IEEE Computer Society Press. ISBN: 978-1-4799-1488-3. https://dx.doi.org/10.1109/CEC.2014.6900292

  7. Tianyu Liang, Zhize Wu, Jörg Lässig, Daan van den Berg, Sarah Louise Thomson, and Thomas Weise. Addressing the Traveling Salesperson Problem with Frequency Fitness Assignment and Hybrid Algorithms. Soft Computing. 2024. https://dx.doi.org/10.1007/s00500-024-09718-8

class moptipy.algorithms.so.ffa.fea1plus1.FEA1plus1(op0, op1, log_h_tbl=False)[source]

Bases: Algorithm1

The FFA-based version of the (1+1)-EA: the (1+1)-FEA.

This algorithm applies Frequency Fitness Assignment (FFA). This means that it does not select solutions based on whether they are better or worse. Instead, it selects the solution whose objective value has been encountered less often during the search. The word “best” is therefore not used in the traditional sense, i.e., that one solution is better than another one in terms of its objective value. Instead, the current best solution is always the one whose objective value we have seen the least often.

In each step, the (1+1)-FEA creates a modified copy new_x of the current best solution best_x. It then increments the frequency fitness of both solutions by 1. If the frequency fitness of new_x is not bigger than that of best_x, new_x becomes the new best_x. Otherwise, it is discarded.

This algorithm implementation requires that objective values are integers and have lower and upper bounds that are not too far away from each other. A more general version is available as a fitness assignment process (fitness) that can be plugged into a general EA (general_ea) in module ffa_fitness.
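
As a usage sketch, assuming the standard moptipy Execution builder together with the bit-string space, the OneMax objective, and the Op0Random/Op1Flip1 operators from other moptipy modules (none of which are documented on this page), the (1+1)-FEA could be applied like this:

from moptipy.algorithms.so.ffa.fea1plus1 import FEA1plus1
from moptipy.api.execution import Execution
from moptipy.examples.bitstrings.onemax import OneMax
from moptipy.operators.bitstrings.op0_random import Op0Random
from moptipy.operators.bitstrings.op1_flip1 import Op1Flip1
from moptipy.spaces.bitstrings import BitStrings

space = BitStrings(32)        # solution space: bit strings of length 32
problem = OneMax(32)          # integer-valued objective with a small range
algorithm = FEA1plus1(Op0Random(), Op1Flip1())  # nullary and unary operator

ex = Execution()
ex.set_solution_space(space)
ex.set_objective(problem)
ex.set_algorithm(algorithm)
ex.set_rand_seed(1)           # fix the seed to make the run reproducible
ex.set_max_fes(1000)          # stop after 1000 objective function evaluations
with ex.execute() as process:
    print(process.get_best_f())  # objective value of the best solution found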

solve(process)[source]

Apply the (1+1)-FEA to an optimization problem.

Parameters:

process (Process) – the black-box process object

Return type:

None

moptipy.algorithms.so.ffa.ffa_fitness module

The Frequency Fitness Assignment (FFA) Process.

Frequency Fitness Assignment (FFA) replaces all objective values with their encounter frequencies in the selection decisions. The more often an objective value is encountered, the higher its encounter frequency becomes. Therefore, local optima slowly receive worse and worse fitness.

Notice that this implementation of FFA has a slight twist to it: It will incorporate the iteration index (it) of the solutions into the fitness. This index is used to break ties, in which case newer solutions are preferred.

This can make the EA with FFA compatible with the moptipy.algorithms.so.ffa.fea1plus1.FEA1plus1 if “best” selection (moptipy.algorithms.modules.selections.best.Best) is used at mu=lambda=1. To facilitate this, there is one special case in the FFA fitness assignment: If the population consists of exactly 1 element at iteration index 0, then the frequency values are not updated.

  1. Thomas Weise, Zhize Wu, Xinlu Li, and Yan Chen. Frequency Fitness Assignment: Making Optimization Algorithms Invariant under Bijective Transformations of the Objective Function Value. IEEE Transactions on Evolutionary Computation 25(2):307-319. April 2021. Preprint available at arXiv:2001.01416v5 [cs.NE] 15 Oct 2020. https://dx.doi.org/10.1109/TEVC.2020.3032090

  2. Thomas Weise, Zhize Wu, Xinlu Li, Yan Chen, and Jörg Lässig. Frequency Fitness Assignment: Optimization without Bias for Good Solutions can be Efficient. IEEE Transactions on Evolutionary Computation (TEVC). 27(4):980-992. August 2023. doi: https://doi.org/10.1109/TEVC.2022.3191698

  3. Thomas Weise, Mingxu Wan, Ke Tang, Pu Wang, Alexandre Devert, and Xin Yao. Frequency Fitness Assignment. IEEE Transactions on Evolutionary Computation (IEEE-EC), 18(2):226-243, April 2014. https://dx.doi.org/10.1109/TEVC.2013.2251885

  4. Thomas Weise, Yan Chen, Xinlu Li, and Zhize Wu. Selecting a diverse set of benchmark instances from a tunable model problem for black-box discrete optimization algorithms. Applied Soft Computing Journal (ASOC), 92:106269, June 2020. https://dx.doi.org/10.1016/j.asoc.2020.106269

  5. Thomas Weise, Xinlu Li, Yan Chen, and Zhize Wu. Solving Job Shop Scheduling Problems Without Using a Bias for Good Solutions. In Genetic and Evolutionary Computation Conference Companion (GECCO’21 Companion), July 10-14, 2021, Lille, France. ACM, New York, NY, USA. ISBN 978-1-4503-8351-6. https://dx.doi.org/10.1145/3449726.3463124

  6. Thomas Weise, Mingxu Wan, Ke Tang, and Xin Yao. Evolving Exact Integer Algorithms with Genetic Programming. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC’14), Proceedings of the 2014 World Congress on Computational Intelligence (WCCI’14), pages 1816-1823, July 6-11, 2014, Beijing, China. Los Alamitos, CA, USA: IEEE Computer Society Press. ISBN: 978-1-4799-1488-3. https://dx.doi.org/10.1109/CEC.2014.6900292

class moptipy.algorithms.so.ffa.ffa_fitness.FFA(f, log_h_tbl=False)[source]

Bases: Fitness

The frequency fitness assignment (FFA) process.

assign_fitness(p, random)[source]

Assign the frequency fitness.

Parameters:

p (list[FRecord]) – the list of records to assign fitness values to

random (Generator) – the random number generator

Return type:

None

>>> from moptipy.examples.bitstrings.onemax import OneMax
>>> from moptipy.algorithms.so.fitness import FRecord
>>> fit = FFA(OneMax(200))
>>> a = FRecord(None, 1)
>>> b = FRecord(None, 2)
>>> c = FRecord(None, 2)
>>> d = FRecord(None, 3)
>>> from numpy.random import default_rng
>>> rand = default_rng()
>>> fit.assign_fitness([a, b, c, d], rand)
>>> assert a.fitness == 1
>>> assert b.fitness == 2
>>> assert c.fitness == 2
>>> assert d.fitness == 1
>>> fit.assign_fitness([a, b, c, d], rand)
>>> assert a.fitness == 2
>>> assert b.fitness == 4
>>> assert c.fitness == 4
>>> assert d.fitness == 2

initialize()[source]

Initialize the algorithm.

Return type:

None

log_information_after_run(process)[source]

Write the H table.

Return type:

None

log_parameters_to(logger)[source]

Log all parameters of this component as key-value pairs.

Parameters:

logger (KeyValueLogSection) – the logger for the parameters

Return type:

None

moptipy.algorithms.so.ffa.ffa_fitness.SWITCH_TO_OFFSET_LB: Final[int] = 8388608

The lower bound at which we switch to an offset-based backing array for the frequency table H.

moptipy.algorithms.so.ffa.ffa_h module

Here we provide some tools for implementing FFA.

moptipy.algorithms.so.ffa.ffa_h.H_LOG_SECTION: Final[str] = 'H'

The log section for the frequency table.

moptipy.algorithms.so.ffa.ffa_h.SWITCH_TO_MAP_RANGE: Final[int] = 67108864

The difference between the upper and lower bound at which we switch from using arrays to using maps as the backing store for the frequency table H.

moptipy.algorithms.so.ffa.ffa_h.clear_h(h)[source]

Clear an H-table.

Parameters:

h (ndarray | Counter[int | float]) – the H-table to clear.

Return type:

None

moptipy.algorithms.so.ffa.ffa_h.create_h(objective)[source]

Create a data structure for recording frequencies.

If our objective function always returns integer values and its range is not too large, then we can use a numpy.ndarray as the backing data structure for the H-map used in FFA. If the objective function may sometimes return float values or has a wide range, it will be better (or even necessary) to use a Counter instead.

This function makes the decision of which backing data structure to use.

Additionally, the lower or even the upper bound of an integer objective function may be negative. To avoid dealing with negative indices, we then add an offset to each objective value before using it as an index. This function returns an appropriate offset as well.

Parameters:

objective (Objective) – the objective for which the data structure should be created

Return type:

tuple[ndarray | Counter[int | float], int]

Returns:

a tuple of the H-data structure as well as an offset to be added to all objective values.
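
A small usage sketch (reusing the OneMax objective from the fea1plus1 doctest above; which backing store is actually chosen is decided by this function):

from moptipy.algorithms.so.ffa.ffa_h import create_h
from moptipy.examples.bitstrings.onemax import OneMax

h, offset = create_h(OneMax(200))  # integer objective with a small, known range
h[5 + offset] += 1                 # record one encounter of objective value 5
h[5 + offset] += 1                 # and another one: the count for 5 is now 2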

moptipy.algorithms.so.ffa.ffa_h.h_to_str(h, offset)[source]

Convert an H-table to a string.

Parameters:

h (ndarray | Counter[int | float]) – the H-table to convert

offset (int) – the offset that was added to the objective values

Return type:

str

>>> hl = np.array([0, 0, 1, 7, 4, 0, 0, 9, 0])
>>> h_to_str(hl, 0)
'2;;;7;;4;7;9'
>>> h_to_str(hl, -1)
'3;;;7;;4;8;9'
>>> hd = Counter((1, 1, 1, 1, 1, 4, 4, 4, 4, 4, 4, 4, 3, 3, 3, 3,
...               3, 3, 2, 2, 2, 2, 2, 2, 2, 2, 2))
>>> h_to_str(hd, 0)
'1;5;;9;;6;;7'
>>> try:
...     hd = {1: 0}
...     h_to_str(hd, 0)
... except ValueError as ve:
...     print(ve)
empty H table?
>>> hx = np.zeros(100, int)
>>> hx[10] = 4
>>> hx[12] = 234
>>> hx[89] = 111
>>> hx[90] = 1
>>> hx[45] = 2314
>>> h_to_str(hx, 0)
'10;4;12;234;45;2314;89;111;;'

moptipy.algorithms.so.ffa.ffa_h.log_h(process, h, offset)[source]

Convert a frequency table H to a string and log it to a process.

The frequency table is logged as a single line of text into a section H delimited by the lines BEGIN_H and END_H. The line consists of 2*n semicolon-separated values. Each such value pair consists of an objective value y and its observed frequency H[y]. The former is either an integer or a float and the latter is an integer. If two consecutive objective values take the form f and f+1, then the second one will be omitted, and so on.

Parameters:

process (Process) – the process in whose log the H section should be written

h (ndarray | Counter[int | float]) – the H-table

offset (int) – the offset that was added to the objective values

Return type:

None