4th Workshop on Evolutionary Computation for the Automated Design of Algorithms

                                                                            July 12-16, 2014

                                                             Vancouver, Canada


Metaheuristics are stochastic search processes [1] and are typically applied directly to a search space of candidate solutions, in order to find high-quality, near-optimal solutions to the problem instance at hand. However, if metaheuristics are applied indirectly to the space of solutions, via another search space, we are operating at a higher level of abstraction, which offers some important benefits [2]. If the intermediate search space is a set of algorithms/heuristics, then this method can be employed to discover novel heuristics [3]. In other words, the research focus shifts from how to solve problem instances faster or better, to the more ambitious vision of how to automatically develop algorithms which, in turn, solve problem instances faster or better.

This type of approach is known as a hyper-heuristic, and in this workshop we are particularly interested in the use of hyper-heuristics to generate new and interesting heuristics. In the vast majority of cases, these semi-automatically generated heuristics outperform existing human-designed heuristics.

Another interpretation of this approach is that, instead of employing a method such as Genetic Programming (or some other automated process of algorithm design) to evolve algorithms from scratch, we give the process a head start: we take already-existing algorithms and allow the process of evolution to improve these initial seed programs. However, we typically do not give Genetic Programming full licence to alter any part of a program, but only pre-designated parts, so that Genetic Programming does not accidentally destroy existing functionality. Evolutionary activity can therefore be targeted at the sites in a program thought to yield the greatest benefit from alteration. Thus, while the ultimate goal of automatic programming is designing algorithms from scratch, we can avoid reinventing the wheel by supplying some known ingredients, by way of existing algorithmic components or an architectural component such as the template method design pattern. The resulting algorithm is therefore part man-made and part machine-made, reflecting the current state of the art.
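As a minimal illustrative sketch (all names here are hypothetical, not taken from any of the cited systems), consider a local-search skeleton written in the template-method style, in which only the acceptance predicate is designated as evolvable; an evolutionary process would operate on `accept` while the loop itself is protected:

```python
import random

# Hypothetical sketch: a fixed metaheuristic skeleton (template method) in
# which only accept() is a designated evolvable site; the loop is protected.

def local_search(initial, neighbour, fitness, accept, steps=500):
    """Fixed skeleton: an evolutionary process may replace `accept` only."""
    current = initial
    for _ in range(steps):
        candidate = neighbour(current)
        if accept(fitness(candidate), fitness(current)):
            current = candidate
    return current

# Hand-written "seed" acceptance predicates that evolution could start from:
accept_improving = lambda new, old: new > old   # hill climbing
accept_any = lambda new, old: True              # random walk

# Toy problem: maximise the number of 1s in a bit list (onemax).
def onemax(bits):
    return sum(bits)

def flip_one(bits):
    i = random.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

random.seed(0)
result = local_search([0] * 20, flip_one, onemax, accept_improving)
print(onemax(result))  # improving moves only: reaches the optimum of 20
```

Swapping in a different predicate (e.g. one evolved by GP) changes the search behaviour without any risk to the surrounding control flow.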

Genetic Programming can interact with programs in a number of different ways.

A benefit of working at a higher level of abstraction is that there is a clear division of responsibilities between the human designer and the computer. The human designer is free to provide high-level guidance in the form of initial seed programs and a definition of the space of programs (e.g. via a template method or a grammar), while Genetic Programming supplies the core of an algorithm (e.g. the body of a for-loop). While GP performs the laborious chore of producing many variations on a theme, the human designer is engaged in the more creative process of defining a family of algorithms. The human designer is thus able to propose not just a single algorithm (which is what manual design achieves, e.g. "in this paper we propose a single novel algorithm") but a whole family of algorithms (defined by the search space), delegating the mundane task of sifting through them to an automated generate-and-test procedure.

An additional benefit of working at a higher level of abstraction is the potential to build a bridge between optimization and machine learning [6]. By operating first on a space of heuristics and only then on the space of candidate solutions, this approach brings about a qualitative change rather than a quantitative one: at the base level we learn about individual problem instances, while at the hyper-level we learn about the probability distribution from which the problem instances were drawn, in the sense used in machine learning.
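This division of labour can be sketched in miniature (a toy example; all components and names are illustrative assumptions, not from the cited papers): the human specifies a small family of local-search algorithms as a cross-product of components, and an automated generate-and-test loop sifts through the family.

```python
import itertools
import random

# Illustrative sketch: the designer defines a *family* of algorithms as a
# cross-product of components; generate-and-test sifts through the family.

random.seed(1)

def onemax(bits):                      # toy problem: maximise number of 1s
    return sum(bits)

def make_algorithm(accept, flips):
    """One family member: local search with a given acceptance rule and
    perturbation strength (number of random bit flips per step)."""
    def run(n=20, steps=200):
        bits = [0] * n
        for _ in range(steps):
            cand = bits[:]
            for _ in range(flips):
                i = random.randrange(n)
                cand[i] = 1 - cand[i]
            if accept(onemax(cand), onemax(bits)):
                bits = cand
        return onemax(bits)
    return run

# The human-designed space: two acceptance rules x three strengths = 6 algorithms.
accept_rules = {
    "improving": lambda new, old: new > old,
    "non-worsening": lambda new, old: new >= old,
}
flip_counts = [1, 2, 3]

# Automated generate and test: score every member of the family.
scores = {}
for (name, rule), k in itertools.product(accept_rules.items(), flip_counts):
    scores[(name, k)] = sum(make_algorithm(rule, k)() for _ in range(5)) / 5

best = max(scores, key=scores.get)
print(best, scores[best])
```

The human contribution is the two dictionaries defining the design space; the machine's contribution is the exhaustive (or, at scale, evolutionary) sweep over it.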

The number of application areas exploiting this method is growing; they include data mining [4], job-shop scheduling [7, 8], and the evolution of evolutionary algorithms themselves [9, 10].

[1] S. Luke, Essentials of Metaheuristics, Lulu, 2013.

[2] J. Swan, J. Woodward, E. Özcan, G. Kendall, E. Burke, "Searching the Hyper-heuristic Design Space," Cognitive Computation, pp. 1-8.

[3] E. K. Burke, M. R. Hyde, G. Kendall, G. Ochoa, E. Özcan, J. R. Woodward, "Exploring Hyper-heuristic Methodologies with Genetic Programming," in Computational Intelligence: Collaboration, Fusion and Emergence, C. Mumford and L. Jain (eds.), Intelligent Systems Reference Library, Springer, 2009, pp. 177-201.

[4] G. L. Pappa and A. A. Freitas, Automating the Design of Data Mining Algorithms: An Evolutionary Computation Approach, Natural Computing Series, Springer, 2010.

[5] E. K. Burke, M. R. Hyde, G. Kendall, "Grammatical Evolution of Local Search Heuristics," IEEE Transactions on Evolutionary Computation, 16(3), pp. 406-417.

[6] G. L. Pappa, G. Ochoa, M. R. Hyde, A. A. Freitas, J. R. Woodward, J. Swan, "Contrasting Meta-learning and Hyper-heuristic Research: The Role of Evolutionary Algorithms," Genetic Programming and Evolvable Machines, pp. 1-33.

[7] J. C. Tay and N. B. Ho, "Evolving Dispatching Rules Using Genetic Programming for Solving Multi-objective Flexible Job-shop Problems," Computers and Industrial Engineering, 54(3), pp. 453-473, April 2008.

[8] S. Nguyen, M. Zhang, M. Johnston, K. C. Tan, "Learning Iterative Dispatching Rules for Job Shop Scheduling with Genetic Programming," The International Journal of Advanced Manufacturing Technology, 67(1-4), pp. 85-100, 2013.

[9] M. A. Martin and D. R. Tauritz, "Evolving Black-box Search Algorithms Employing Genetic Programming," in GECCO 2013, pp. 1497-1504.

[10] L. Hong, J. Woodward, J. Li, E. Özcan, "Automated Design of Probability Distributions as Mutation Operators for Evolutionary Programming Using Genetic Programming," in 16th European Conference on Genetic Programming (EuroGP 2013), 2013.


Important Dates


Paper Submission

Submitted papers should follow the ACM format and must not exceed 8 pages. Please see the GECCO 2014 information for authors for further details. Note, however, that the review process of the workshop is not double-blind; hence, authors' information should appear in the paper.

All accepted papers will be presented at the workshop and appear in the GECCO workshop volume. Proceedings of the workshop will be published on CD-ROM, and distributed at the conference.

Papers should be submitted in PostScript or PDF format to [jrw at cs dot stir dot ac dot uk] with the subject line "GECCO 2014 Workshop". A confirmation of receipt will be sent by return email.



This will be a half-day workshop. Each presentation is planned to last 20 minutes, followed by 10 minutes of discussion; the closing panel will last 45 minutes.



Workshop Chairs

     John Woodward - University of Stirling, United Kingdom

     Jerry Swan - University of Stirling, United Kingdom

     Earl Barr - University College London, United Kingdom






Call for Papers

Although most evolutionary computation techniques are designed to generate specific solutions to a given instance of a problem, some of these techniques can also be used to solve more generic problems. The main objective of this workshop is to discuss evolutionary computation methods for generating generic algorithms and/or heuristics. These methods have the advantage of producing solutions that are applicable to any instance of a problem domain, rather than a solution produced for a single instance of the problem. Application areas of these methods include data mining, machine learning, optimization, bioinformatics, image processing, and economics, among others.

The workshop welcomes original submissions on all aspects of Evolutionary Computation for Designing Generic Algorithms, which include (but are not limited to) the following topics and themes:

Accepted papers for 2014

  • "A Problem Configuration Study of the Robustness of a Black-Box Search Algorithm Hyper-Heuristic". Matthew A. Martin, Daniel R. Tauritz; Natural Computation Laboratory, Department of Computer Science, Missouri University of Science and Technology, Rolla, Missouri, U.S.A.
  • "A Step Size Based Self-Adaptive Mutation Operator for Evolutionary Programming". Libin Hong, Department of Computer Science, University of Nottingham, P.R.C.; Ender Özcan, Department of Computer Science, University of Nottingham, U.K.
  • "Automated Design of Algorithms and Genetic Improvement". Saemundur Haraldsson, John Woodward; Department of Maths and Computer Science, University of Stirling, Scotland
  • "Genetic Programming Needs Benchmarks That Matter". John R. Woodward, Simon P. Martin, Jerry Swan; Department of Maths and Computer Science, University of Stirling, Scotland


    Last updated on 07 Feb, 2014

    Previous Editions