TY - JOUR
T1 - Evolutionary Artificial Neural Network Design and Training for wood veneer classification
AU - Castellani, Marco
AU - Rowlands, Hefin
PY - 2009/6
Y1 - 2009/6
AB - This study addresses the design and training of a Multi-Layer Perceptron classifier for the identification of wood veneer defects from statistical features of wood sub-images. Previous research utilised a neural network structure manually optimised using the Taguchi method, with the connection weights trained using the Backpropagation rule. The proposed approach uses the evolutionary Artificial Neural Network Generation and Training (ANNGaT) algorithm to generate the neural network system. The algorithm simultaneously evolves the neural network topology and the connection weights. ANNGaT optimises the size of the hidden layer(s) of the neural network structure through genetic mutations of the individuals; the number of hidden layers is a system parameter. Experimental tests show that ANNGaT produces highly compact neural network structures capable of accurate and robust learning. The tests show no difference in accuracy between architectures using one and two hidden layers of processing units. Compared to the manual approach, the evolutionary algorithm generates equally accurate solutions using considerably smaller architectures. Moreover, the proposed algorithm requires less design effort, since the process is fully automated. (C) 2009 Elsevier Ltd. All rights reserved.
KW - Artificial Neural Networks
KW - Evolutionary Algorithms
KW - Artificial Neural Network Design
KW - Pattern classification
KW - Automated visual inspection
KW - ALGORITHMS
KW - BOARDS
KW - DEFECTS
U2 - 10.1016/j.engappai.2009.01.013
DO - 10.1016/j.engappai.2009.01.013
M3 - Article
SN - 0952-1976
VL - 22
SP - 732
EP - 741
JO - Engineering Applications of Artificial Intelligence
JF - Engineering Applications of Artificial Intelligence
IS - 4-5
ER -