Show simple item record
A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression
dc.creator | Vuković, Najdan | |
dc.creator | Petrović, Milica | |
dc.creator | Miljković, Zoran | |
dc.date.accessioned | 2022-09-19T18:31:20Z | |
dc.date.available | 2022-09-19T18:31:20Z | |
dc.date.issued | 2018 | |
dc.identifier.issn | 1568-4946 | |
dc.identifier.uri | https://machinery.mas.bg.ac.rs/handle/123456789/2878 | |
dc.description.abstract | The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through random selection of the input weights, so that the learning procedure determines only the output weights. Unlike Extreme Learning Machines (ELM), the RVFLNN exploits direct connections between the input layer and the output layer, which makes RVFLNN a broader class of networks. Although the RVFLNN was proposed more than two decades ago (Pao, Park, Sobajic, 1994), the nonlinear expansion of the input vector into a set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) combines the advantages of input-vector expansion and random determination of the input weights. In a comprehensive experimental evaluation on 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypothesis testing confirms two major conclusions made by Zhang and Suganthan for classification (Zhang and Suganthan, 2015) and by Ren et al. for time series prediction (Ren, Suganthan, Srikanth, Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output vectors are essential for improved network performance, and ridge regression generates significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement in network performance when the tansig activation function and the Chebyshev orthogonal polynomial are used for regression problems. Conclusions drawn from this study may serve as guidelines for OPE-RVFLNN development and implementation for regression problems. | en |
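The abstract describes the OPE-RVFLNN pipeline: expand the input vector with orthogonal polynomials (e.g. Chebyshev), pass the expansion through a random, fixed hidden layer with a tansig activation, keep direct links from the expanded input to the output, and solve for the output weights with ridge regression rather than the Moore-Penrose pseudoinverse. The following is a minimal sketch of that pipeline, not the authors' implementation; the function names, the uniform weight range, the polynomial order, and the ridge parameter `lam` are illustrative assumptions.

```python
import numpy as np

def chebyshev_expand(X, order=3):
    """Expand each feature into Chebyshev polynomials T_1..T_order via the
    recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x). Assumes X is scaled to [-1, 1]."""
    feats = [X]                                  # T_1(x) = x
    T_prev, T_curr = np.ones_like(X), X          # T_0, T_1
    for _ in range(order - 1):
        T_next = 2 * X * T_curr - T_prev
        feats.append(T_next)
        T_prev, T_curr = T_curr, T_next
    return np.hstack(feats)

def ope_rvfl_fit(X, y, n_hidden=50, lam=1e-3, order=3, seed=0):
    """Sketch of OPE-RVFLNN training: random fixed hidden weights on the
    expanded input, direct links, ridge-regression output weights."""
    rng = np.random.default_rng(seed)
    Z = chebyshev_expand(X, order)                        # orthogonal polynomial expansion
    W = rng.uniform(-1, 1, (Z.shape[1], n_hidden))        # random input weights (never trained)
    b = rng.uniform(-1, 1, n_hidden)
    H = np.tanh(Z @ W + b)                                # tansig hidden activations
    D = np.hstack([H, Z, np.ones((Z.shape[0], 1))])       # hidden units + direct links + bias
    # Ridge regression for the output weights (instead of Moore-Penrose pseudoinversion)
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta, order

def ope_rvfl_predict(X, model):
    W, b, beta, order = model
    Z = chebyshev_expand(X, order)
    D = np.hstack([np.tanh(Z @ W + b), Z, np.ones((Z.shape[0], 1))])
    return D @ beta
```

Because the hidden weights are random and fixed, training reduces to one regularized linear solve, which is what makes RVFLNN learning fast; the `Z` block inside `D` is the direct input-output link the abstract identifies as essential.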
dc.publisher | Elsevier, Amsterdam | |
dc.relation | info:eu-repo/grantAgreement/MESTD/Technological Development (TD or TR)/35004/RS// | |
dc.rights | restrictedAccess | |
dc.source | Applied Soft Computing | |
dc.subject | Ridge regression | en |
dc.subject | Random vector functional link neural networks | en |
dc.subject | Orthogonal polynomial | en |
dc.subject | Nonparametric statistical hypotheses testing | en |
dc.title | A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression | en |
dc.type | article | |
dc.rights.license | ARR | |
dc.citation.epage | 1096 | |
dc.citation.other | 70: 1083-1096 | |
dc.citation.rank | M21 | |
dc.citation.spage | 1083 | |
dc.citation.volume | 70 | |
dc.identifier.doi | 10.1016/j.asoc.2017.10.010 | |
dc.identifier.scopus | 2-s2.0-85031671021 | |
dc.identifier.wos | 000443296000075 | |
dc.type.version | publishedVersion |