2009 IEEE International Conference on Systems, Man, and Cybernetics
Abstract
In this paper we test and compare Artificial Metaplasticity (AMP) results for Multilayer Perceptrons (MLPs). AMP is a novel Artificial Neural Network (ANN) training algorithm inspired by the biological metaplasticity property of neurons and by Shannon's information theory. During the training phase, the AMP algorithm assigns more relevance to less frequent patterns and less relevance to frequent ones, aiming to achieve much more efficient training while at least maintaining MLP performance. AMP is especially recommended when few patterns are available to train the network. We implement an Artificial Metaplasticity MLP (AMMLP) on standard, widely used Machine Learning databases. Experimental results show the superiority of AMMLPs when compared with recent results on the same databases.
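The core idea described above, that rare training patterns should drive larger weight updates than frequent ones, can be sketched as follows. This is a minimal illustration, not the paper's method: the relevance weight is derived here from a hypothetical Gaussian-kernel density estimate (`amp_relevance`), standing in for the information-theoretic weighting the abstract refers to, and the MLP is a plain one-hidden-layer network trained by backpropagation with each pattern's gradient step scaled by its relevance.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def amp_relevance(X, eps=1e-9):
    """Inverse-probability relevance per pattern (illustrative only).

    Estimates each pattern's relative frequency with a Gaussian kernel
    on standardized inputs, then weights patterns by 1 / probability,
    normalized so the average step size is unchanged. The actual AMP
    weighting from the paper may differ.
    """
    Z = (X - X.mean(0)) / (X.std(0) + eps)
    dist2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    p_hat = np.exp(-0.5 * dist2).mean(1)   # high for frequent patterns
    w = 1.0 / (p_hat + eps)                # rare patterns get large weights
    return w / w.mean()

def train_ammlp(X, y, n_hidden=8, lr=0.5, epochs=200, seed=0):
    """One-hidden-layer MLP with relevance-scaled backprop updates."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)
    relevance = amp_relevance(X)

    for _ in range(epochs):
        for i in rng.permutation(n):
            x, t, w = X[i], y[i], relevance[i]
            h = sigmoid(x @ W1 + b1)
            o = sigmoid(h @ W2 + b2)
            # Standard backprop deltas; the update itself is scaled by w,
            # so infrequent patterns move the weights more per presentation.
            delta_o = (o - t) * o * (1 - o)
            delta_h = (delta_o @ W2.T) * h * (1 - h)
            W2 -= lr * w * np.outer(h, delta_o); b2 -= lr * w * delta_o
            W1 -= lr * w * np.outer(x, delta_h); b1 -= lr * w * delta_h

    def predict(Xq):
        return (sigmoid(sigmoid(Xq @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel()
    return predict

# Imbalanced toy set: one rare pattern among 20 copies of a frequent one.
X = np.vstack([np.tile([0.0, 0.0], (20, 1)), [[1.0, 1.0]]])
y = np.vstack([np.zeros((20, 1)), [[1.0]]])
predict = train_ammlp(X, y)
```

On such an imbalanced set, `amp_relevance` assigns the single `[1, 1]` pattern a much larger weight than any of the repeated `[0, 0]` patterns, which is the mechanism the abstract credits for efficient training when few (or few distinct) patterns are available.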