DOI: http://dx.doi.org/10.7551/978-0-262-32621-6-ch073
Pages 447-454
First published 30 July 2014

POET: an evo-devo method to optimize the weights of large artificial neural networks

Alessandro Fontana, Andrea Soltoggio and Borys Wróbel

Abstract (Excerpt)

Large search spaces such as those of artificial neural networks are difficult to search with machine learning techniques. The large number of parameters is the main challenge for search techniques that do not exploit correlations expressed as patterns in the parameter space. Evolutionary computation with indirect genotype-phenotype mapping has been proposed as a possible solution, but current methods often fail when the space is fractured and irregular. This study employs an evolutionary indirect encoding inspired by developmental biology. Cellular proliferations and deletions of variable size allow for the definition of both large regular areas and small detailed areas in the parameter space. The method is tested on the search for the weights of a neural network for the classification of the MNIST dataset. The results demonstrate that even large networks, such as those required for image classification, can be effectively and automatically designed by the proposed evolutionary developmental method. The combination of real-world problems like vision and classification with evolution and development endows the proposed method with aspects of particular relevance to artificial life.
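The idea of an indirect encoding whose variable-size operations cover both large regular areas and small detailed areas of the parameter space can be sketched as follows. This is a minimal illustration, not the authors' implementation: the genome format, the function names, and the rectangular-region interpretation of "proliferations" are assumptions made for the example.

```python
import numpy as np

def decode(genome, shape):
    """Hypothetical genotype-to-phenotype mapping: each gene paints a
    variable-size rectangular region of the weight matrix, so a few
    genes can set large regular areas while later genes overwrite
    small regions with finer detail.

    genome: list of (row, col, height, width, value) tuples, applied
            in order.
    """
    weights = np.zeros(shape)
    for row, col, height, width, value in genome:
        weights[row:row + height, col:col + width] = value
    return weights

# A coarse gene covers a large uniform block; a fine gene layered on
# top refines a small area of the same matrix.
genome = [
    (0, 0, 8, 8, 0.1),    # large regular area
    (2, 2, 2, 2, -0.5),   # small detailed area
]
weights = decode(genome, shape=(8, 8))
```

Under such an encoding, a mutation to a single gene moves many correlated weights at once, which is what lets evolutionary search exploit pattern structure that a direct per-weight encoding would ignore.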