Experimental Data of the Book "Optimization Algorithms"

Here we provide the data of the experiments conducted for the electronic book "Optimization Algorithms". These experiments were run using the Metaheuristic Optimization in Python library, moptipy for short. All datasets are provided in the tar.xz format, which can be unpacked by the default archive manager of several Linux distributions as well as by WinRAR.

Job Shop Scheduling Problem (JSSP)

The Job Shop Scheduling Problem (JSSP) is a classical NP-hard optimization problem where the goal is to find schedules (assignments of jobs to machines over time) that complete all jobs as quickly as possible, i.e., that minimize the makespan. Each of the n jobs of a JSSP instance must be processed by m machines in a job-specific sequence, requiring a specific runtime on each machine.
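To make the problem structure concrete, here is a minimal sketch (not taken from moptipy; the instance layout machines[j][k] / times[j][k] and the operation-based permutation encoding are assumptions made only for this example) that decodes a candidate solution into a schedule and computes its makespan, i.e., the time at which the last job finishes:

```python
# Illustrative sketch, not moptipy code: decode an operation-based
# permutation into a schedule and compute its makespan.
# Assumed instance layout: machines[j][k] and times[j][k] give the k-th
# machine and the corresponding processing time of job j.

def makespan(machines, times, perm):
    n = len(machines)      # number of jobs
    m = len(machines[0])   # number of machines
    job_next = [0] * n     # index of the next operation of each job
    job_ready = [0] * n    # time at which each job becomes available again
    mach_ready = [0] * m   # time at which each machine becomes free again
    for job in perm:       # perm lists each job id exactly m times
        k = job_next[job]
        job_next[job] = k + 1
        machine = machines[job][k]
        start = max(job_ready[job], mach_ready[machine])
        end = start + times[job][k]
        job_ready[job] = end
        mach_ready[machine] = end
    return max(job_ready)  # completion time of the last job


# Tiny 2-job, 2-machine example: the schedule has a makespan of 7.
machines = [[0, 1], [1, 0]]
times = [[3, 2], [2, 4]]
print(makespan(machines, times, [0, 1, 0, 1]))
```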

Random Sampling-Related Algorithms

These algorithms make random search moves without using information gained from the objective function to guide the search.
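As a minimal, hedged sketch of this idea (illustration only, not the moptipy API; the names create, f, and max_fes are assumptions for this example), pure random sampling simply generates random solutions and remembers the best one ever seen, without letting objective values influence how the next solution is created:

```python
import random

# Illustrative sketch, not the moptipy API: pure random sampling.
# "create" builds a random candidate solution, "f" is the objective function.
def random_sampling(create, f, max_fes):
    best_x, best_f = None, float("inf")
    for _ in range(max_fes):
        x = create(random)   # sample a solution independently of all others
        fx = f(x)            # evaluate it (used only for bookkeeping)
        if fx < best_f:      # remember the best solution seen so far
            best_x, best_f = x, fx
    return best_x, best_f


# Tiny usage example: minimize the number of ones in a bit string of length 16.
best_x, best_f = random_sampling(
    create=lambda rnd: [rnd.randint(0, 1) for _ in range(16)],
    f=sum, max_fes=100)
print(best_f)
```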

Hill Climbers (HC)

Hill climbers are simple local searches that accept only strictly improving moves.
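A minimal sketch of this behavior (illustration only, not the moptipy API; create, op1, f, and max_fes are assumed names) could look as follows:

```python
import random

# Illustrative sketch, not the moptipy API: a simple hill climber.
# "create" samples a random start, "op1" modifies a solution, "f" evaluates.
def hill_climber(create, op1, f, max_fes):
    x = create(random)
    fx = f(x)
    for _ in range(max_fes - 1):
        y = op1(random, x)   # derive a slightly modified copy of x
        fy = f(y)
        if fy < fx:          # accept only strictly improving moves
            x, fx = y, fy
    return x, fx
```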

Random Local Search (RLS)

Random Local Search (RLS)-style algorithms are simple local searches that accept all moves that are either improving or lead to the same objective value, i.e., all non-worsening moves. Depending on the field and the representation used, they are also known under different names, for example as the (1+1) EA if the search space is the set of bit strings and the search operator flips each bit with the same probability.
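In a sketch analogous to the hill climber above (again only an illustration, not the moptipy API), RLS differs solely in the acceptance criterion:

```python
import random

# Illustrative sketch, not the moptipy API: Random Local Search.
# Identical to the hill climber except that it also accepts equally good moves.
def rls(create, op1, f, max_fes):
    x = create(random)
    fx = f(x)
    for _ in range(max_fes - 1):
        y = op1(random, x)
        fy = f(y)
        if fy <= fx:         # accept all non-worsening moves
            x, fx = y, fy
    return x, fx
```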

Evolutionary Algorithms (EAs)

Evolutionary Algorithms (EAs) maintain a population of solutions. They start with a population of mu random solutions. In each iteration, they keep only the mu best solutions in the population. They then create lambda new solutions by applying search operators to these retained solutions and add them to the population.
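A hedged sketch of a (mu+lambda) EA without recombination (illustration only, not the moptipy API; using only a unary operator op1 is an assumption made for this example):

```python
import random

# Illustrative sketch, not the moptipy API: a (mu+lambda) EA using only a
# unary operator. "lambda_" avoids the clash with Python's lambda keyword.
def mu_plus_lambda_ea(create, op1, f, mu, lambda_, max_fes):
    pop = [(f(x), x) for x in (create(random) for _ in range(mu))]
    fes = mu
    while fes < max_fes:
        pop.sort(key=lambda rec: rec[0])      # best solutions first
        pop = pop[:mu]                        # keep the mu best
        for _ in range(min(lambda_, max_fes - fes)):
            parent = random.choice(pop)[1]    # pick one of the survivors
            child = op1(random, parent)       # apply the search operator
            pop.append((f(child), child))     # add the offspring
            fes += 1
    pop.sort(key=lambda rec: rec[0])
    return pop[0][1], pop[0][0]
```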

Simulated Annealing (SA)

Simulated Annealing (SA) is similar to RLS in that it is a local search that accepts any non-worsening move. Different from RLS, however, it also sometimes accepts a move to a solution worse than the current one. The probability of accepting such a move increases with the current "temperature" and decreases with the degree by which the move worsens the objective value. The "temperature" itself is controlled by a temperature schedule.
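A minimal sketch of SA (illustration only, not the moptipy API; the exponential schedule T(step) = T0 * (1 - eps)^step and the parameter names are assumptions made for this example, while the book discusses temperature schedules in detail):

```python
import math
import random

# Illustrative sketch, not the moptipy API: Simulated Annealing with an
# assumed exponential temperature schedule.
def simulated_annealing(create, op1, f, max_fes, t0=100.0, eps=1e-4):
    x = create(random)
    fx = f(x)
    best_x, best_f = x, fx
    for step in range(1, max_fes):
        y = op1(random, x)
        fy = f(y)
        delta = fy - fx                       # > 0 means y is worse than x
        temperature = t0 * (1.0 - eps) ** step
        if delta <= 0:                        # always accept non-worsening moves
            x, fx = y, fy
        elif temperature > 0 and random.random() < math.exp(-delta / temperature):
            x, fx = y, fy                     # sometimes accept worsening moves
        if fx < best_f:                       # remember the best solution found
            best_x, best_f = x, fx
    return best_x, best_f
```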

Memetic Algorithms (MA)

Memetic Algorithms (MAs) are basically Evolutionary Algorithms where each solution is refined by another algorithm, usually a local search, for a pre-defined number ls_fes of FEs. This concept can be implemented in several different ways, e.g., as an MA with hard-wired RLS or as an MA into which arbitrary algorithms can be plugged. Here we provide the results of the former (MA+RLS) and of the latter with Simulated Annealing plugged in (MA+Simulated Annealing).
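A hedged sketch of the MA idea (illustration only, not the moptipy API; the binary recombination operator op2, the generation-based loop, and the helper rls_refine are assumptions made for this example; plugging in Simulated Annealing instead of RLS as the refinement step would yield the MA+SA variant):

```python
import random

# Illustrative sketch, not the moptipy API: a Memetic Algorithm where every
# new solution is refined by RLS for ls_fes objective function evaluations.
def rls_refine(x, op1, f, ls_fes):
    fx = f(x)                                  # run RLS starting from x
    for _ in range(ls_fes - 1):
        y = op1(random, x)
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
    return fx, x

def memetic_algorithm(create, op1, op2, f, mu, lambda_, ls_fes, max_gens):
    pop = [rls_refine(create(random), op1, f, ls_fes) for _ in range(mu)]
    for _ in range(max_gens):
        pop.sort(key=lambda rec: rec[0])
        pop = pop[:mu]                         # keep the mu best solutions
        for _ in range(lambda_):
            pa = random.choice(pop)[1]         # pick two surviving parents
            pb = random.choice(pop)[1]
            child = op2(random, pa, pb)        # recombine them (global step)
            pop.append(rls_refine(child, op1, f, ls_fes))  # local refinement
    pop.sort(key=lambda rec: rec[0])
    return pop[0][1], pop[0][0]
```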

License

This dataset is released under the Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0), see http://creativecommons.org/licenses/by-nc-sa/4.0/ for a summary.

Contact

If you have any questions or suggestions, please contact Prof. Dr. Thomas Weise (汤卫思教授) of the Institute of Applied Optimization (应用优化研究所, IAO) of the School of Artificial Intelligence and Big Data (人工智能与大数据学院) at Hefei University (合肥学院) in Hefei, Anhui, China (中国安徽省合肥市) via email to tweise@hfuu.edu.cn with CC to tweise@ustc.edu.cn.