A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation
2013
Journal article (Published version)
Abstract
A radial basis function (RBF) neural network is constructed from a number of RBF neurons; such networks are among the most widely used neural networks for modeling various nonlinear problems in engineering. The conventional RBF neuron is usually based on a Gaussian activation function with a single width per activation function. This feature restricts the neuron's performance when modeling complex nonlinear problems. To overcome the limitation of a single scale, this paper presents a neural network with a similar yet different activation function: the hyper basis function (HBF). The HBF allows different scaling of the input dimensions, which provides better generalization when dealing with complex nonlinear problems in engineering practice. The HBF is a generalization of the Gaussian neuron that applies a Mahalanobis-like distance as the distance metric between an input training sample and the prototype vector. Compared to the RBF, the HBF neuron has more parameters to optimize, but an HBF neural network needs fewer HBF neurons to memorize the relationship between the input and output sets and achieve good generalization. However, recent research on HBF neural network performance has shown that an optimal way of constructing this type of neural network is needed; this paper addresses that issue and modifies a sequential learning algorithm for the HBF neural network that exploits the concept of a neuron's significance and allows growing and pruning of HBF neurons during the learning process. An extensive experimental study shows that the HBF neural network, trained with the developed learning algorithm, achieves lower prediction error and a more compact neural network.
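The Mahalanobis-like distance mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes the diagonal (per-dimension width) case, and all names are illustrative:

```python
import numpy as np

def rbf_activation(x, center, sigma):
    """Conventional Gaussian RBF neuron: one width for all input dimensions."""
    d2 = np.sum((x - center) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hbf_activation(x, center, widths):
    """HBF neuron (diagonal case): a separate width per input dimension,
    i.e. phi(x) = exp(-(x - c)^T S^{-1} (x - c)) with S = diag(widths**2)."""
    d2 = np.sum(((x - center) / widths) ** 2)
    return np.exp(-d2)

x = np.array([1.0, 0.2])
c = np.array([0.0, 0.0])

# With equal widths the HBF behaves isotropically, like an RBF;
# unequal widths let each input dimension be scaled independently.
print(hbf_activation(x, c, np.array([1.0, 1.0])))  # isotropic response
print(hbf_activation(x, c, np.array([2.0, 0.1])))  # anisotropic response
```

The anisotropic case is what gives the HBF its extra flexibility: a dimension with a small width contributes strongly to the distance, while a dimension with a large width is effectively down-weighted.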
Keywords:
Sequential learning / Neuron's significance / Hyper basis function / Feedforward neural networks / Extended Kalman Filter
Source:
Neural Networks, 2013, 46, 210-226
Publisher:
Pergamon-Elsevier Science Ltd, Oxford
DOI: 10.1016/j.neunet.2013.06.004
ISSN: 0893-6080
PubMed: 23811384
WoS: 000325308900022
Scopus: 2-s2.0-84879753900
Collections
Institution/group
Mašinski fakultet

TY  - JOUR
AU  - Vuković, Najdan
AU  - Miljković, Zoran
PY  - 2013
T1  - A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation
T2  - Neural Networks
PB  - Pergamon-Elsevier Science Ltd, Oxford
VL  - 46
SP  - 210
EP  - 226
DO  - 10.1016/j.neunet.2013.06.004
UR  - https://machinery.mas.bg.ac.rs/handle/123456789/1723
ER  - 
@article{Vukovic2013,
  author    = "Vuković, Najdan and Miljković, Zoran",
  year      = "2013",
  title     = "A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation",
  journal   = "Neural Networks",
  publisher = "Pergamon-Elsevier Science Ltd, Oxford",
  volume    = "46",
  pages     = "210-226",
  doi       = "10.1016/j.neunet.2013.06.004"
}
Vuković, N., & Miljković, Z. (2013). A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation. Neural Networks, 46, 210-226. https://doi.org/10.1016/j.neunet.2013.06.004
Vuković N, Miljković Z. A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation. Neural Networks. 2013;46:210-226. doi:10.1016/j.neunet.2013.06.004
Vuković, Najdan, and Miljković, Zoran, "A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation," Neural Networks 46 (2013): 210-226, https://doi.org/10.1016/j.neunet.2013.06.004.