Show simple item record

dc.creator: Vuković, Najdan
dc.creator: Petrović, Milica
dc.creator: Miljković, Zoran
dc.date.accessioned: 2022-09-19T18:31:20Z
dc.date.available: 2022-09-19T18:31:20Z
dc.date.issued: 2018
dc.identifier.issn: 1568-4946
dc.identifier.uri: https://machinery.mas.bg.ac.rs/handle/123456789/2878
dc.description.abstract: The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through a random selection of input weights, while the learning procedure determines only the output weights. Unlike Extreme Learning Machines (ELM), RVFLNN exploits the connection between the input layer and the output layer, which means that RVFLNNs are a higher class of networks. Although RVFLNN was proposed more than two decades ago (Pao, Park, Sobajic, 1994), the nonlinear expansion of the input vector into a set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) combines the advantages of expanding the input vector and randomly determining the input weights. Through a comprehensive experimental evaluation using 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypotheses testing confirms two major conclusions made by Zhang and Suganthan for classification (Zhang and Suganthan, 2015) and Ren et al. for time series prediction (Ren, Suganthan, Srikanth, Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output vectors are essential for improved network performance, and ridge regression generates significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement of network performance when one uses the tansig activation function and the Chebyshev orthogonal polynomial for regression problems. Conclusions drawn from this study may be used as guidelines for OPE-RVFLNN development and implementation for regression problems.
dc.publisher: Elsevier, Amsterdam
dc.relation: info:eu-repo/grantAgreement/MESTD/Technological Development (TD or TR)/35004/RS//
dc.rights: restrictedAccess
dc.source: Applied Soft Computing
dc.subject: Ridge regression
dc.subject: Random vector functional link neural networks
dc.subject: Orthogonal polynomial
dc.subject: Nonparametric statistical hypotheses testing
dc.title: A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression
dc.type: article
dc.rights.license: ARR
dc.citation.epage: 1096
dc.citation.other: 70: 1083-1096
dc.citation.rank: M21
dc.citation.spage: 1083
dc.citation.volume: 70
dc.identifier.doi: 10.1016/j.asoc.2017.10.010
dc.identifier.scopus: 2-s2.0-85031671021
dc.identifier.wos: 000443296000075
dc.type.version: publishedVersion
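
The abstract describes the OPE-RVFLNN pipeline: the input vector is expanded with an orthogonal polynomial (e.g., Chebyshev), the expanded input feeds a hidden layer whose weights are drawn at random and kept fixed, a direct link carries the expanded input straight to the output, and only the output weights are learned, by Moore-Penrose pseudoinversion or ridge regression. The sketch below is a minimal illustration of that idea under stated assumptions; the function names, polynomial degree, weight ranges and the regularization constant lam are illustrative choices, not the authors' implementation.

# Minimal, illustrative sketch of an RVFL-style regressor with Chebyshev input
# expansion, a direct input-output link, and ridge-regression output weights.
# Names, degree and lam are assumptions for illustration, not the paper's code.
import numpy as np

def chebyshev_expand(X, degree=3):
    """Expand each feature (assumed scaled to [-1, 1]) with Chebyshev
    polynomials T_1..T_degree via the recurrence T_{n+1} = 2x*T_n - T_{n-1}."""
    feats = [X]                        # T_1(x) = x
    prev, curr = np.ones_like(X), X    # T_0, T_1
    for _ in range(2, degree + 1):
        prev, curr = curr, 2.0 * X * curr - prev
        feats.append(curr)
    return np.hstack(feats)

def fit_ope_rvfl(X, y, n_hidden=100, degree=3, lam=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    Z = chebyshev_expand(X, degree)                       # nonlinear input expansion
    W = rng.uniform(-1.0, 1.0, (Z.shape[1], n_hidden))    # random input weights (fixed)
    b = rng.uniform(-1.0, 1.0, n_hidden)                  # random hidden biases (fixed)
    H = np.tanh(Z @ W + b)                                # tansig-like hidden layer
    D = np.hstack([H, Z, np.ones((Z.shape[0], 1))])       # hidden part + direct link + bias
    # Ridge regression for the only trainable parameters (the output weights):
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta, degree

def predict_ope_rvfl(X, model):
    W, b, beta, degree = model
    Z = chebyshev_expand(X, degree)
    D = np.hstack([np.tanh(Z @ W + b), Z, np.ones((Z.shape[0], 1))])
    return D @ beta

With each feature of X scaled to [-1, 1] and y a one-dimensional target, model = fit_ope_rvfl(X_train, y_train) followed by predict_ope_rvfl(X_test, model) reproduces the basic workflow; replacing the ridge solve with np.linalg.pinv(D) @ y gives the Moore-Penrose variant the abstract compares against.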

