
On Effective and Inexpensive Local Search Techniques in Genetic Programming Regression

Fergal Lane, R. Muhammad Atif Azad, and Conor Ryan

CSIS Department, University of Limerick, Ireland
Fergal.Lane@ul.ie
Atif.Azad@ul.ie
Conor.Ryan@ul.ie

Abstract. Local search (LS) methods can work harmoniously alongside global search methods such as Evolutionary Algorithms (EAs); however, particularly in Genetic Programming (GP), concerns remain about the additional cost of local search. One such successful system is Chameleon, which tunes internal GP nodes and addresses cost concerns by employing a number of strategies to make its LS both effective and inexpensive. Expense is reduced by an innovative approach to parsimony pressure, whereby smaller trees are rewarded with local search opportunities more often than bigger trees.
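To make the idea of size-biased local search concrete, here is a minimal, purely illustrative sketch of one way such a reward might look. The specific decay function, the `target_size` parameter, and the function names are assumptions for illustration only, not Chameleon's actual scheme.

```python
import random

def tuning_probability(tree_size, target_size=20.0):
    # Hypothetical reward function (an assumption, not the paper's):
    # probability 1 at or below target_size, decaying inversely for
    # larger trees, so smaller trees earn tuning opportunities more
    # often -- an implicit parsimony pressure that also keeps LS cheap.
    return min(1.0, target_size / max(tree_size, 1))

def gets_local_search(tree_size, rng=random):
    # Stochastically award a local-search (tuning) opportunity.
    return rng.random() < tuning_probability(tree_size)
```

A gentler or steeper decay changes how strongly parsimony is incentivized, which is exactly the kind of variation the extensions below explore.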

This paper investigates three extensions to Chameleon’s original, simple setup, seeking an even more effective local search: trying alternative, more cost-reflective parsimony measures, such as visitation length instead of tree size; varying the reward function to incentivize parsimony more gently than the original setup does; and concentrating more tuning early in runs, when smaller trees can be tuned more cheaply and effectively. These strategies were tested on a varied suite of 16 difficult artificial and real-world regression problems. All three techniques improved performance.
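The distinction between tree size and visitation length can be sketched briefly. Visitation length is the sum of subtree sizes over all nodes (equivalently, the sum of depth + 1 over all nodes), so it penalizes deep, unbalanced trees more than a plain node count does, making it a more cost-reflective parsimony measure. The tuple-based tree representation below is an assumption for illustration, not the paper's data structure.

```python
def size(tree):
    """Number of nodes: the plain tree-size parsimony measure."""
    node, *children = tree
    return 1 + sum(size(c) for c in children)

def visitation_length(tree):
    """Sum of subtree sizes over all nodes of the tree."""
    node, *children = tree
    return size(tree) + sum(visitation_length(c) for c in children)

# Example: (x + y) * z as nested tuples (node, *children).
expr = ("*", ("+", ("x",), ("y",)), ("z",))
# size(expr) -> 5; visitation_length(expr) -> 11
```

Two trees with the same node count can thus have different visitation lengths, with the deeper tree scoring higher.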

We show that these strategies combine to cumulatively improve average test RMSE by 31% over the original, already effective Chameleon tuning scheme. A minimum of 64 simulations was run for every problem/tuning-setup combination.

LNCS 8672, p. 444 ff.



© Springer International Publishing Switzerland 2014