Sparsity through evolutionary pruning prevents neuronal networks from overfitting

Gerum R, Erpenbeck A, Krauß P, Schilling A (2020)


Publication Type: Journal article

Publication year: 2020

Journal: Neural Networks

Volume: 128

Page Range: 305-312

DOI: 10.1016/j.neunet.2020.05.007

Abstract

Modern machine learning techniques take advantage of the exponentially rising computational power of new-generation processing units. As a result, the number of parameters trained to solve complex tasks has increased enormously over the last decades. However, in contrast to our brain, these networks still fail to develop general intelligence in the sense of being able to solve several complex tasks with only one network architecture. This could be because the brain is not a randomly initialized neural network that has to be trained from scratch by simply investing large amounts of computational power, but instead possesses a fixed hierarchical structure from birth. To make progress in decoding the structural basis of biological neural networks, we here chose a bottom-up approach in which we evolutionarily trained small neural networks to perform a maze task. This simple maze task requires dynamic decision making with delayed rewards. We were able to show that, during the evolutionary optimization, random severance of connections leads to better generalization performance of the networks compared to fully connected networks. We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches.
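The abstract describes evolutionary optimization in which connections are randomly severed, and thus permanently pruned, while the networks are evolved. The following minimal sketch is not the authors' code: it only illustrates this kind of neuroevolution with a binary connectivity mask per layer. The maze task is replaced by a stand-in regression fitness, and all names, layer sizes, mutation rates, and the severance probability PRUNE_P are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 4, 8, 2   # assumed small network dimensions
POP, GENERATIONS = 50, 100     # assumed evolutionary parameters
PRUNE_P = 0.02                 # assumed per-connection severance probability

# Stand-in data set; the paper instead evaluates agents in a maze task.
X = rng.normal(0.0, 1.0, (64, N_IN))
Y = np.stack([np.sin(X[:, 0]), np.cos(X[:, 1])], axis=1)

def make_genome():
    # A genome holds the weights plus binary masks marking live connections.
    return {
        "w1": rng.normal(0.0, 1.0, (N_IN, N_HID)),  "m1": np.ones((N_IN, N_HID)),
        "w2": rng.normal(0.0, 1.0, (N_HID, N_OUT)), "m2": np.ones((N_HID, N_OUT)),
    }

def forward(genome, x):
    # Feed-forward pass; severed connections are zeroed out by the masks.
    h = np.tanh(x @ (genome["w1"] * genome["m1"]))
    return np.tanh(h @ (genome["w2"] * genome["m2"]))

def fitness(genome):
    # Negative mean squared error on the stand-in task (higher is better).
    return -np.mean((forward(genome, X) - Y) ** 2)

def mutate(genome):
    # Gaussian weight mutation plus random, permanent severance of connections.
    child = {key: value.copy() for key, value in genome.items()}
    for w, m in (("w1", "m1"), ("w2", "m2")):
        child[w] += rng.normal(0.0, 0.1, child[w].shape)
        child[m] *= rng.random(child[m].shape) > PRUNE_P  # prune a few links
    return child

population = [make_genome() for _ in range(POP)]
for generation in range(GENERATIONS):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[: POP // 5]  # truncation selection keeps the fittest fifth
    offspring = [mutate(parents[rng.integers(len(parents))]) for _ in range(POP - len(parents))]
    population = parents + offspring

best = max(population, key=fitness)
density = 0.5 * (best["m1"].mean() + best["m2"].mean())
print(f"surviving connection density of best network: {density:.2f}")

Because severed connections are never restored, the surviving genomes become progressively sparser over generations; this induced sparsity is the property the paper links to better generalization compared with fully connected networks.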

How to cite

APA:

Gerum, R., Erpenbeck, A., Krauß, P., & Schilling, A. (2020). Sparsity through evolutionary pruning prevents neuronal networks from overfitting. Neural Networks, 128, 305-312. https://dx.doi.org/10.1016/j.neunet.2020.05.007

MLA:

Gerum, Richard, et al. "Sparsity through evolutionary pruning prevents neuronal networks from overfitting." Neural Networks 128 (2020): 305-312.
