Hyper-heuristics Tutorial

John R. Woodward, Daniel R. Tauritz
Friday, February 14, 2020

Real-World Challenges
Researchers strive to make algorithms increasingly general-purpose, but practitioners have very specific needs. Designing custom algorithms tuned to particular problem instance distributions and/or computational architectures can be very time consuming.

Automated Design of Algorithms
Addresses the need for custom algorithms, but due to high computational complexity it is only feasible for repeated problem solving. Hyper-heuristics accomplish automated design of algorithms by searching program space.

Hyper-heuristics
Hyper-heuristics are a special type of meta-heuristic:
Step 1: Extract algorithmic primitives from existing algorithms
Step 2: Search the space of programs defined by the extracted primitives
While Genetic Programming (GP) is particularly well suited for executing Step 2, other meta-heuristics can be, and have been, employed. The type of GP employed matters [24].
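As a rough illustration of those two steps (and not the specific systems described later), the following sketch treats a candidate algorithm as a sequence of extracted primitives and searches that space with plain random search standing in for GP; the primitive names and the evaluate callable are hypothetical placeholders.

    import random

    # Hypothetical primitives extracted from existing algorithms (Step 1).
    PRIMITIVES = ["tournament_select", "truncation_select",
                  "bit_flip_mutation", "uniform_crossover", "restart_random"]

    def random_program(length=4):
        # A candidate algorithm is simply a sequence of primitives.
        return [random.choice(PRIMITIVES) for _ in range(length)]

    def hyper_heuristic_search(evaluate, budget=100):
        # Step 2: search the program space; here, random search stands in
        # for the GP-based search used in practice.
        best, best_score = None, float("-inf")
        for _ in range(budget):
            program = random_program()
            score = evaluate(program)  # e.g. run the program on training instances
            if score > best_score:
                best, best_score = program, score
        return best, best_score

    # Toy usage with a dummy evaluator that simply prefers programs containing mutation.
    best, score = hyper_heuristic_search(lambda p: p.count("bit_flip_mutation"))
    print(best, score)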

Type of GP Matters: Experiment Description
Implement five types of GP (tree GP, linear GP, canonical Cartesian GP, stack GP, and Grammatical Evolution) in hyper-heuristics for evolving black-box search algorithms for solving 3-SAT.
Base hyper-heuristic fitness on the fitness of the best search algorithm generated at solving the 3-SAT problem.
Compare the relative effectiveness of each GP type as a hyper-heuristic.

GP Individual Description
Search algorithms are represented as an iterative algorithm that passes one or more sets of variable assignments to the next iteration. The genetic program represents a single program iteration. The algorithm runs, starting with a random initial population of solutions, for 30 seconds.

3-SAT Problem
A subset of the Boolean Satisfiability Problem (SAT). The goal is to select values for Boolean variables such that a given Boolean equation evaluates as true (is satisfied). Boolean equations are in 3-conjunctive normal form, e.g. (A ∨ B ∨ C) ∧ (A ∨ C ∨ D) ∧ (B ∨ C ∨ D), satisfied by A, B, C, D.
Fitness is the number of clauses satisfied by the best solution in the final population.
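For concreteness, here is a minimal sketch of the clause-counting fitness measure described above; the encoding of clauses as signed variable indices is an illustrative assumption, not the representation used in the study.

    def satisfied_clauses(clauses, assignment):
        # clauses: list of 3-literal clauses; a literal is +i (variable i true)
        # or -i (variable i false), with variables numbered from 1.
        # assignment: list of booleans, assignment[i-1] is the value of variable i.
        count = 0
        for clause in clauses:
            if any((lit > 0) == assignment[abs(lit) - 1] for lit in clause):
                count += 1
        return count

    # (A or B or C) and (A or C or D) and (B or C or D), with A = B = C = D = True
    clauses = [(1, 2, 3), (1, 3, 4), (2, 3, 4)]
    print(satisfied_clauses(clauses, [True, True, True, True]))  # prints 3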

Genetic Programming Nodes Used
Last population, Random population
Tournament selection, Fitness proportional selection, Truncation selection, Random selection
Bitwise mutation, Greedy flip, Quick greedy flip, Stepwise adaptation of weights, Novelty
Union

Results [24]
[Bar chart: number of clauses satisfied (approx. 1800-2000) for Tree, Linear, Cartesian, Grammatical, and Stack GP]

Results
Generated algorithms mostly performed comparably well on training and test problems. Tree and stack GP perform similarly well on this problem, as do linear and Cartesian GP. Tree and stack GP perform significantly better on this problem than linear and Cartesian GP, which perform significantly better than grammatical evolution.

Conclusions
The choice of GP type makes a significant difference in the performance of the hyper-heuristic. The size of the search space appears to be a major factor in the performance of the hyper-heuristic.

Case Study 1: The Automated Design of Crossover Operators [20]

Motivation

Performance Sensitive to Crossover Selection
Identifying & Configuring Best Traditional Crossover is Time Consuming
Existing Operators May Be Suboptimal
Optimal Operator May Change During Evolution

Some Possible Solutions
Meta-EA: exceptionally time consuming
Self-Adaptive
Algorithm Selection: limited by the algorithms it can choose from

Self-Configuring Crossover (SCX)
Each individual encodes a crossover operator. Crossovers are encoded as a list of primitives (Swap, Merge), e.g.:
  Offspring Crossover
  Swap(3, 5, 2)
  Merge(1, r, 0.7)
  Swap(r, i, r)
Each primitive has three parameters, each of which is a Number, Random, or Inline construct.

Applying an SCX: Concatenate Genes
Parent 1 genes: 1.0 2.0 3.0 4.0
Parent 2 genes: 5.0 6.0 7.0 8.0
Concatenated genes: 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0

The Swap Primitive
Each primitive has a type; Swap represents crossovers that move genetic material. In Swap(3, 5, 2), the first two parameters are the start positions (start 1 and start 2); the third parameter is primitive dependent, and Swaps use it as the width.

Applying an SCX: Swap(3, 5, 2)
Before: 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0
After:  1.0 2.0 5.0 6.0 3.0 4.0 7.0 8.0
(Remaining primitives in the offspring crossover: Merge(1, r, 0.7), Swap(r, i, r))

The Merge Primitive
The third parameter is primitive dependent; Merges use it as the weight. In Merge(1, r, 0.7), r marks a parameter using the Random construct: all previous primitive parameters used the Number construct; the Random construct allows primitives to act stochastically.

Applying an SCX: Merge(1, r, 0.7)
With weight α = 0.7, merging gene values uses g(i) = α·g(i) + (1−α)·g(j). Merging the gene at position 1 (value 1.0) with the gene at the randomly chosen position (value 6.0):
g(1) = 1.0·(0.7) + 6.0·(1−0.7) = 2.5
g(j) = 6.0·(0.7) + 1.0·(1−0.7) = 4.5
Result: 2.5 2.0 5.0 4.5 3.0 4.0 7.0 8.0

The Inline Construct
Only usable by the first two parameters; denoted as i. It forces the primitive to act on the same loci in both parents, as in Swap(r, i, r).

Applying an SCX: Swap(r, i, r)
Result: 2.5 4.0 5.0 4.5 3.0 2.0 7.0 8.0

Applying an SCX: Remove Excess Genes
Offspring genes: 2.5 4.0 5.0 4.5
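To tie the walkthrough together, here is a minimal Python sketch of applying that primitive list (not the authors' implementation). Positions are 1-indexed as in the slides, and the Random and Inline constructs are shown already resolved to the concrete values implied by the example (the Merge's r resolving to position 4; the final Swap's parameters resolving to start 2, the matching locus in parent 2, and width 1), which is an assumption for illustration.

    def swap(genes, start1, start2, width):
        # Swap two segments of the given width (1-indexed start positions).
        g = genes[:]
        a, b = start1 - 1, start2 - 1
        g[a:a + width], g[b:b + width] = genes[b:b + width], genes[a:a + width]
        return g

    def merge(genes, pos1, pos2, weight):
        # Weighted average of two genes: g(i) = w*g(i) + (1-w)*g(j).
        g = genes[:]
        gi, gj = genes[pos1 - 1], genes[pos2 - 1]
        g[pos1 - 1] = weight * gi + (1 - weight) * gj
        g[pos2 - 1] = weight * gj + (1 - weight) * gi
        return g

    parent1 = [1.0, 2.0, 3.0, 4.0]
    parent2 = [5.0, 6.0, 7.0, 8.0]
    genes = parent1 + parent2                    # concatenate genes

    genes = swap(genes, 3, 5, 2)                 # Swap(3, 5, 2)
    genes = merge(genes, 1, 4, 0.7)              # Merge(1, r, 0.7), r resolved to 4
    genes = swap(genes, 2, 2 + len(parent1), 1)  # Swap(r, i, r): r=2, inline locus, width 1

    offspring = genes[:len(parent1)]             # remove excess genes
    print(offspring)                             # [2.5, 4.0, 5.0, 4.5]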

Evolving Crossovers
An offspring crossover is produced by recombining the primitive lists of Parent 1's and Parent 2's crossovers (primitives such as Swap(r, 7, 3), Merge(i, 8, r), Swap(3, 5, 2), Swap(4, 2, r), Merge(1, r, 0.7), Merge(r, r, r), Swap(r, i, r)).

Empirical Quality Assessment
Compared Against: Arithmetic Crossover, N-Point Crossover, Uniform Crossover
On Problems: Rosenbrock, Rastrigin, Offset Rastrigin, NK-Landscapes, DTrap

Problem            Comparison        SCX
Rosenbrock         -86.94 (54.54)    -26.47 (23.33)
Rastrigin          -59.2 (6.998)     -0.0088 (0.021)
Offset Rastrigin   -0.1175 (0.116)   -0.03 (0.028)
NK                 0.771 (0.011)     0.8016 (0.013)
DTrap              0.9782 (0.005)    0.9925 (0.021)

Adaptations: Rastrigin
[Figure]

Adaptations: DTrap
[Figure]

SCX Overhead
Requires No Additional Evaluation

Adds No Significant Increase in Run Time: all linear operations
Adds an Initial Crossover Length Parameter: testing showed results fairly insensitive to this parameter; even the worst settings tested achieved better results than the comparison operators

Conclusions
Removes the need to select a crossover algorithm
Better fitness without significant overhead
Benefits from a dynamically changing operator
Promising approach for evolving crossover operators for additional representations (e.g., permutations)

Case Study 2: The Automated Design of Black Box Search Algorithms [21, 23, 25]

Approach
Hyper-heuristic employing Genetic Programming
Post-ordered parse tree
Evolve the iterated function

Our Solution
[Diagram: Initialization → Iterated Function → Check for Termination → Terminate]
Hyper-heuristic employing Genetic Programming with high-level primitives; a post-ordered parse tree represents the evolved iterated function.

Parse Tree
The parse tree is the iterated function; it operates on sets of solutions, and the function returns a set of solutions accessible to the next iteration.

Primitive Types
Variation Primitives
Selection Primitives
Set Primitives
Evaluation Primitive
Terminal Primitives

Variation Primitives
Bit-flip Mutation (rate)
Uniform Recombination (count)
Diagonal Recombination (n)

Selection Primitives
Truncation Selection (count)
K-Tournament Selection (k, count)
Random Sub-set Selection (count)

Set-Operation Primitives
Make Set (name)
Persistent Sets (name)
Union

Evaluation Primitive
Evaluates the nodes passed in
Allows multiple operations and accurate selections within an iteration
Allows for deception

Terminal Primitives
Random Individuals (count)
Last Set
Persistent Sets (name)

Meta-Genetic Program
[Diagram: Create Valid Population, Generate Children, Evaluate Children, Select Survivors, Check Termination]

BBSA Evaluation
[Diagram: Create Valid Population, Select Survivors, Generate Children, Evaluate Children]

Termination Conditions
Evaluations
Iterations
Operations
Convergence
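To make the parse-tree idea concrete, here is a minimal hand-written sketch (not an actual evolved individual) of an iterated function composed from the primitives above; the function names and the particular composition are illustrative assumptions.

    import random

    def bit_flip(pop, rate):
        # Bit-flip Mutation (rate) applied to every individual in the set.
        return [[1 - b if random.random() < rate else b for b in ind] for ind in pop]

    def uniform_recombination(pop, count):
        # Uniform Recombination (count): build count children from random parent pairs.
        children = []
        for _ in range(count):
            p1, p2 = random.sample(pop, 2)
            children.append([random.choice(pair) for pair in zip(p1, p2)])
        return children

    def truncation_selection(pop, count, fitness):
        # Truncation Selection (count): keep the count best individuals.
        return sorted(pop, key=fitness, reverse=True)[:count]

    def random_individuals(count, length):
        # Random Individuals (count) terminal.
        return [[random.randint(0, 1) for _ in range(length)] for _ in range(count)]

    def union(a, b):
        return a + b

    def iterated_function(last_set, fitness):
        # Hypothetical evolved tree: Truncation(Union(BitFlip(Uniform(LastSet, 20), 0.01), LastSet), 10)
        children = bit_flip(uniform_recombination(last_set, 20), 0.01)
        return truncation_selection(union(children, last_set), 10, fitness)

    # Toy usage: iterate on a 30-bit OneMax-style fitness.
    population = random_individuals(10, 30)
    for _ in range(50):
        population = iterated_function(population, fitness=sum)
    print(max(sum(ind) for ind in population))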

Proof of Concept Testing
Deceptive Trap Problem
Example 5-bit traps: 0|0|1|1|0  0|1|0|1|0  1|1|1|1|0
[Plot: fitness vs. # of 1s per trap]

Proof of Concept Testing (cont.)
Evolved Problem Configuration: Bit-length = 100, Trap Size = 5
Verification Problem Configurations:
Bit-length = 100, Trap Size = 5
Bit-length = 200, Trap Size = 5
Bit-length = 105, Trap Size = 7
Bit-length = 210, Trap Size = 7
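A minimal sketch of one common deceptive trap formulation consistent with the plot above (assuming each all-ones block scores the trap size, and any other block scores trap_size − 1 − u, where u is its number of ones); the exact scaling used in the study may differ.

    def trap_fitness(bits, trap_size=5):
        # Deceptive trap: each block rewards moving away from the all-ones optimum
        # unless the block is fully set.
        total = 0
        for i in range(0, len(bits), trap_size):
            u = sum(bits[i:i + trap_size])
            total += trap_size if u == trap_size else trap_size - 1 - u
        return total

    # The three example 5-bit traps from the slide: 0|0|1|1|0, 0|1|0|1|0, 1|1|1|1|0
    print(trap_fitness([0,0,1,1,0, 0,1,0,1,0, 1,1,1,1,0]))  # prints 4 under this formulation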

Results
60% Success Rate
[Plots: Bit-Length = 100, Trap Size = 5; Bit-Length = 200, Trap Size = 5; Bit-Length = 105, Trap Size = 7; Bit-Length = 210, Trap Size = 7]
[Figures: evolved algorithms BBSA1, BBSA2, BBSA3]

Over-Specialization
[Plots: Trained Problem Configuration vs. Alternate Problem Configuration]

Robustness
Measures of Robustness: Applicability, Fallibility
Applicability: what area of the problem configuration space do I perform well on?
Fallibility: if a given BBSA doesn't perform well, how much worse will I perform?
[Figure: Robustness]

Multi-Sampling
Train on multiple problem configurations
Results in more robust BBSAs
Provides the benefit of selecting the region of interest on the problem configuration landscape
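A minimal sketch of what multi-sample evaluation might look like: the hyper-heuristic fitness of a candidate BBSA is aggregated over the training problem configurations at the chosen multi-sampling level. The run_bbsa callable and the bit-length normalization are hypothetical placeholders, not the authors' exact procedure.

    import random

    # (bit-length, trap size) training configurations listed below.
    TRAINING_CONFIGS = [(100, 5), (200, 5), (105, 7), (210, 7), (300, 5)]

    def multi_sample_fitness(run_bbsa, level):
        # run_bbsa(bit_length, trap_size) -> best fitness found; hypothetical callable.
        scores = []
        for bit_length, trap_size in TRAINING_CONFIGS[:level]:
            best = run_bbsa(bit_length, trap_size)
            scores.append(best / bit_length)  # normalization is an assumption, to make configs comparable
        return sum(scores) / len(scores)

    # Toy usage with a stand-in BBSA runner that returns a random score.
    print(multi_sample_fitness(lambda n, k: random.uniform(0, n), level=5))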

Multi-Sample Testing
Deceptive Trap Problem
Example 5-bit traps: 0|0|1|1|0  0|1|0|1|0  1|1|1|1|0
[Plot: fitness vs. # of 1s per trap]

Multi-Sample Testing (cont.)
Multi-Sampling Evolution: Levels 1-5
Training Problem Configurations:
1. Bit-length = 100, Trap Size = 5
2. Bit-length = 200, Trap Size = 5
3. Bit-length = 105, Trap Size = 7
4. Bit-length = 210, Trap Size = 7
5. Bit-length = 300, Trap Size = 5

Initial Test Problem Configurations:
1. Bit-length = 100, Trap Size = 5
2. Bit-length = 200, Trap Size = 5
3. Bit-length = 105, Trap Size = 7
4. Bit-length = 210, Trap Size = 7
5. Bit-length = 300, Trap Size = 5
6. Bit-length = 99, Trap Size = 9
7. Bit-length = 198, Trap Size = 9
8. Bit-length = 150, Trap Size = 5
9. Bit-length = 250, Trap Size = 5
10. Bit-length = 147, Trap Size = 7
11. Bit-length = 252, Trap Size = 7

Initial Results
[Figure]

Problem Configuration Landscape Analysis
Run evolved BBSAs on a wider set of problem configurations
Bit-length: ~75 to ~500
Trap Size: 4-20
[Figures: results for Multi-Sampling Levels 1-5 and an EA comparison]

Robustness: Fallibility
[Figures: Multi-Sample Level 5 vs. Standard EA; Multi-Sample Level 1 vs. Standard EA]

Robustness: Applicability
[Figure: Multi-Sample Level 1 vs. Multi-Sample Level 5]

Robustness: Fallibility
[Figure]

Drawbacks
Increased computational time:
More runs per evaluation (increased wall time)
More problem configurations to optimize for (increased evaluations)

Summary of Multi-Sample Improvements
Improved the hyper-heuristic to evolve more robust BBSAs
Evolved custom BBSAs which outperformed a standard EA and were robust to changes in problem configuration

Case Study 3: Evolving Random Graph Generators: A Case for Increased Algorithmic Primitive Granularity [27]

Random Graphs
Graphs are a powerful modeling tool: computer and social networks, transportation and power grids
Algorithms designed for graphs: community detection and graph partitioning, network routing and intrusion detection
Random graphs provide test data
Prediction using random graphs: spread of disease, deployment of wireless sensors

Traditional Random Graph Models
Erdős–Rényi
Barabási–Albert

Automated Random Graph Model Design
A random graph model needs to accurately reflect the intended concept
Model selection can be automated, but relies on having a good solution available
Developing an accurate model for a new application can be difficult
Can the model design process be automated to produce an accurate graph model given examples?

Hyper-heuristic Approach
Extract functionality from existing graph generation techniques
Use Genetic Programming (GP) to construct new random graph algorithms

Previous Attempts at Evolving Random Graph Generators
Assume a growth model, adding one node at a time
Do well at reproducing traditional models
Not demonstrated to do well at generating real complex networks
Limit the search space of possible solutions

Increased Algorithmic Primitive Granularity
Remove the assumed growth structure
More flexible lower-level primitive set
Benefit: can represent a larger variety of algorithms
Drawback: larger search space, increasing complexity

Methodology
NSGA-II evolves a population of random graph models
Strongly typed parse tree representation
Centrality distributions used to evaluate solution quality (degree, betweenness, PageRank)

Primitive Operations
Terminals
Graph elements: nodes, edges
Graph properties: average degree, size, order
Constants: integers, probabilities, Booleans, user inputs
No-op terminators
Functions
Basic programming constructs: for, while, if-else
Data structures: lists of values, nodes, or edges; list combining/selection/sorting
Math and logic operators: add, multiply, <, ==, AND, OR
Graph operators: add edges, add subgraph, rewire edges
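As an illustration of the evaluation step in the methodology above, here is a minimal sketch that compares the degree, betweenness, and PageRank centrality distributions of a candidate graph against a target graph using networkx; the specific distance measure and the Erdős–Rényi stand-in for an evolved generator's output are assumptions, not the authors' code.

    import networkx as nx
    import numpy as np

    def centrality_distance(generated, target):
        # One objective per centrality measure, e.g. for NSGA-II.
        distances = []
        for centrality in (nx.degree_centrality, nx.betweenness_centrality, nx.pagerank):
            a = np.sort(list(centrality(generated).values()))
            b = np.sort(list(centrality(target).values()))
            n = max(len(a), len(b))
            a = np.pad(a, (0, n - len(a)))
            b = np.pad(b, (0, n - len(b)))
            # Mean absolute difference of sorted, padded centrality values (illustrative choice).
            distances.append(float(np.mean(np.abs(a - b))))
        return distances

    target = nx.erdos_renyi_graph(100, 0.05, seed=1)
    candidate = nx.erdos_renyi_graph(100, 0.08, seed=2)  # stand-in for an evolved generator's output
    print(centrality_distance(candidate, target))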

Example Evolved Random Graph Generator
[Figure]

Reproducing Erdős–Rényi
Metric        Low-GP Mean      Comparison   High-GP Mean
Degree        0.101 (0.048)    =            0.108 (0.047)
Betweenness   0.104 (0.031)    =            0.105 (0.033)
PageRank      0.110 (0.032)    =            0.112 (0.029)

Reproducing Random Community Graphs
Metric        Low-GP Mean      Comparison   High-GP Mean
Degree        0.436 (0.075)    <            0.458 (0.055)
Betweenness   0.209 (0.105)    <            0.320 (0.126)
PageRank      0.127 (0.029)    <            0.150 (0.036)

Evolved Random Collaboration Network Generator
[Figures: Actual Graph, Low-GP, High-GP]

Conclusion
Traditional random graph models do not always produce appropriate representations of certain concepts
Accurate random graph model design can be automated using genetic programming
A more flexible set of low-level primitive operations increases resulting model accuracy
The increase in a priori evolution time is amortized over repeated use of the evolved solutions

Some Final Thoughts

Challenges in Hyper-heuristics
Hyper-heuristics are very computationally expensive (use Asynchronous Parallel GP [26,30])
What is the best primitive granularity? (see next slide)

How to automate decomposition and recomposition of primitives?
How to automate primitive extraction?
How does hyper-heuristic performance scale for increasing primitive space size? (see [25,27])

Primitive Granularity
Algorithm                      Primitives
Selective Hyper-heuristics     Full BBSAs, i.e., EA, SA, SAHC, etc.
Generative Hyper-heuristics    High-level BBSA operations, i.e., Truncation Selection, Bit-Flip Mutation, etc.
Our Hyper-heuristic            Low-level BBSA operations, i.e., If-Converged statements, For loops, etc.
Genetic Programming            Turing-complete set of primitives

References 1
1. John Woodward. Computable and Incomputable Search Algorithms and Functions. IEEE International Conference on Intelligent Computing and Intelligent Systems (IEEE ICIS 2009), pages 871-875, Shanghai, China, November 20-22, 2009.
2. John Woodward. The Necessity of Meta Bias in Search Algorithms. International Conference on Computational Intelligence and Software Engineering (CiSE), pages 1-4, Wuhan, China, December 10-12, 2010.
3. John Woodward & Ruibin Bai. Why Evolution is not a Good Paradigm for Program Induction: A Critique of Genetic Programming. In Proceedings of the First ACM/SIGEVO Summit on Genetic and Evolutionary Computation, pages 593-600, Shanghai, China, June 12-14, 2009.
4. Jerry Swan, John Woodward, Ender Ozcan, Graham Kendall, Edmund Burke. Searching the Hyper-heuristic Design Space. Cognitive Computation, 6:66-73, 2014.
5. Gisele L. Pappa, Gabriela Ochoa, Matthew R. Hyde, Alex A. Freitas, John Woodward, Jerry Swan. Contrasting meta-learning and hyper-heuristic research. Genetic Programming and Evolvable Machines, 15:3-35, 2014.
6. Edmund K. Burke, Matthew Hyde, Graham Kendall, and John Woodward. Automating the Packing Heuristic Design Process with Genetic Programming. Evolutionary Computation, 20(1):63-89, 2012.
7. Edmund K. Burke, Matthew R. Hyde, Graham Kendall, and John Woodward. A Genetic Programming Hyper-Heuristic Approach for Evolving Two Dimensional Strip Packing Heuristics. IEEE Transactions on Evolutionary Computation, 14(6):942-958, December 2010.

References 2
8. Edmund K. Burke, Matthew R. Hyde, Graham Kendall, Gabriela Ochoa, Ender Ozcan and John R. Woodward. Exploring Hyper-heuristic Methodologies with Genetic Programming, Computational Intelligence: Collaboration, Fusion and Emergence, In C. Mumford and L. Jain (eds.), Intelligent Systems Reference Library, Springer, pp. 177-201, 2009.
9. Edmund K. Burke, Matthew Hyde, Graham Kendall and John R. Woodward. The Scalability of Evolved On Line Bin Packing Heuristics. In Proceedings of the IEEE Congress on Evolutionary Computation, pages 2530-2537, September 25-28, 2007.
10. R. Poli, John R. Woodward, and Edmund K. Burke. A Histogram-matching Approach to the Evolution of Bin-packing Strategies. In Proceedings of the IEEE Congress on Evolutionary Computation, pages 3500-3507, September 25-28, 2007.
11. Edmund K. Burke, Matthew Hyde, Graham Kendall, and John Woodward. Automatic Heuristic Generation with Genetic Programming: Evolving a Jack-of-all-Trades or a Master of One, In Proceedings of the Genetic and Evolutionary Computation Conference, pages 1559-1565, London, UK, July 2007.
12. John R. Woodward and Jerry Swan. Template Method Hyper-heuristics, Metaheuristic Design Patterns (MetaDeeP) workshop, GECCO Comp '14, pages 1437-1438, Vancouver, Canada, July 12-16, 2014.
13. Saemundur O. Haraldsson and John R. Woodward. Automated Design of Algorithms and Genetic Improvement: Contrast and Commonalities, 4th Workshop on Automatic Design of Algorithms (ECADA), GECCO Comp '14, pages 1373-1380, Vancouver, Canada, July 12-16, 2014.

References 3
14. John R. Woodward, Simon P. Martin and Jerry Swan. Benchmarks That Matter For Genetic Programming, 4th Workshop on Evolutionary Computation for the Automated Design of Algorithms (ECADA), GECCO Comp '14, pages 1397-1404, Vancouver, Canada, July 12-16, 2014.
15. John R. Woodward and Jerry Swan. The Automatic Generation of Mutation Operators for Genetic Algorithms, 2nd Workshop on Evolutionary Computation for the Automated Design of Algorithms (ECADA), GECCO Comp '12, pages 67-74, Philadelphia, U.S.A., July 7-11, 2012.
16. John R. Woodward and Jerry Swan. Automatically Designing Selection Heuristics. 1st Workshop on Evolutionary Computation for Designing Generic Algorithms, pages 583-590, Dublin, Ireland, 2011.
17. Edmund K. Burke, Matthew Hyde, Graham Kendall, Gabriela Ochoa, Ender Ozcan, and John Woodward. A Classification of Hyper-heuristics Approaches, Handbook of Metaheuristics, pages 449-468, International Series in Operations Research & Management Science, M. Gendreau and J-Y Potvin (Eds.), Springer, 2010.
18. Libin Hong, John Woodward, Jingpeng Li and Ender Ozcan. Automated Design of Probability Distributions as Mutation Operators for Evolutionary Programming Using Genetic Programming. Proceedings of the 16th European Conference on Genetic Programming (EuroGP 2013), volume 7831, pages 85-96, Vienna, Austria, April 3-5, 2013.
19. Ekaterina A. Smorodkina and Daniel R. Tauritz. Toward Automating EA Configuration: the Parent Selection Stage. In Proceedings of CEC 2007 - IEEE Congress on Evolutionary Computation, pages 63-70, Singapore, September 25-28, 2007.

References 4
20. Brian W. Goldman and Daniel R. Tauritz. Self-Configuring Crossover. In Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '11), pages 575-582, Dublin, Ireland, July 12-16, 2011.
21. Matthew A. Martin and Daniel R. Tauritz. Evolving Black-Box Search Algorithms Employing Genetic Programming. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), pages 1497-1504, Amsterdam, The Netherlands, July 6-10, 2013.
22. Nathaniel R. Kamrath, Brian W. Goldman and Daniel R. Tauritz. Using Supportive Coevolution to Evolve Self-Configuring Crossover. In Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '13), pages 1489-1496, Amsterdam, The Netherlands, July 6-10, 2013.
23. Matthew A. Martin and Daniel R. Tauritz. A Problem Configuration Study of the Robustness of a Black-Box Search Algorithm Hyper-Heuristic. In Proceedings of the 16th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '14), pages 1389-1396, Vancouver, BC, Canada, July 12-16, 2014.
24. Sean Harris, Travis Bueter, and Daniel R. Tauritz. A Comparison of Genetic Programming Variants for Hyper-Heuristics. In Proceedings of the 17th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '15), pages 1043-1050, Madrid, Spain, July 11-15, 2015.
25. Matthew A. Martin and Daniel R. Tauritz. Hyper-Heuristics: A Study On Increasing Primitive-Space. In Proceedings of the 17th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '15), pages 1051-1058, Madrid, Spain, July 11-15, 2015.

References 5
26. Alex R. Bertels and Daniel R. Tauritz. Why Asynchronous Parallel Evolution is the Future of Hyper-heuristics: A CDCL SAT Solver Case Study. In Proceedings of the 18th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '16), pages 1359-1365, Denver, Colorado, USA, July 20-24, 2016.
27. Aaron S. Pope, Daniel R. Tauritz and Alexander D. Kent. Evolving Random Graph Generators: A Case for Increased Algorithmic Primitive Granularity. In Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2016), Athens, Greece, December 6-9, 2016.
28. Aaron S. Pope, Daniel R. Tauritz and Alexander D. Kent. Evolving Multi-level Graph Partitioning Algorithms. In Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2016), Athens, Greece, December 6-9, 2016.
29. Islam Elnabarawy, Daniel R. Tauritz, Donald C. Wunsch. Evolutionary Computation for the Automated Design of Category Functions for Fuzzy ART: An Initial Exploration. To appear in Proceedings of the 19th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '17), Berlin, Germany, July 15-19, 2017.
30. Adam Harter, Daniel R. Tauritz, William M. Siever. Asynchronous Parallel Cartesian Genetic Programming. To appear in Proceedings of the 19th Annual Conference Companion on Genetic and Evolutionary Computation (GECCO '17), Berlin, Germany, July 15-19, 2017.
