A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression
Registered users only
2018
Journal article (Published version)
Metadata
Show full item record
Abstract
The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through a random selection of input weights, while the learning procedure determines only the output weights. Unlike Extreme Learning Machines (ELM), the RVFLNN exploits a direct connection between the input layer and the output layer, which makes RVFLNNs a higher class of networks. Although the RVFLNN was proposed more than two decades ago (Pao, Park, Sobajic, 1994), the nonlinear expansion of the input vector into a set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) combines the advantages of expanding the input vector and randomly determining the input weights. Through a comprehensive experimental evaluation on 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre, and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypothesis testing confirms two major conclusions made by Zhang and Suganthan for classification (Zhang and Suganthan, 2015) and Ren et al. for time-series prediction (Ren, Suganthan, Srikanth, Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output vectors are essential for improved network performance, and ridge regression generates significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement in network performance when one uses the tansig activation function and the Chebyshev orthogonal polynomial for regression problems. Conclusions drawn from this study may be used as guidelines for OPE-RVFLNN development and implementation for regression problems.
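The network described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the authors' code: the function names (`chebyshev_expand`, `train_rvfl`, `predict_rvfl`), the polynomial order, and the regularization strength are assumptions chosen for the example. It shows the three ingredients the abstract highlights: Chebyshev expansion of the input, random fixed input weights with a tansig (tanh) hidden layer, direct input-to-output links, and output weights solved by ridge regression rather than a Moore-Penrose pseudoinverse.

```python
import numpy as np

def chebyshev_expand(X, order=3):
    """Expand each feature into Chebyshev polynomials T_1..T_order.
    Assumes X is already scaled to [-1, 1]."""
    feats = [X]                                  # T_1(x) = x
    T_prev, T_curr = np.ones_like(X), X
    for _ in range(2, order + 1):
        T_next = 2 * X * T_curr - T_prev        # recurrence T_{n+1} = 2x T_n - T_{n-1}
        feats.append(T_next)
        T_prev, T_curr = T_curr, T_next
    return np.hstack(feats)

def train_rvfl(X, y, n_hidden=50, lam=1e-3, rng=None):
    rng = np.random.default_rng(rng)
    Z = chebyshev_expand(X)                      # orthogonal polynomial expansion
    W = rng.uniform(-1, 1, (Z.shape[1], n_hidden))   # random input weights, never trained
    b = rng.uniform(-1, 1, n_hidden)
    H = np.tanh(Z @ W + b)                       # tansig hidden activations
    D = np.hstack([H, Z, np.ones((Z.shape[0], 1))])  # direct input->output links + bias
    # Ridge regression for the output weights (instead of a pseudoinverse solution)
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    Z = chebyshev_expand(X)
    D = np.hstack([np.tanh(Z @ W + b), Z, np.ones((Z.shape[0], 1))])
    return D @ beta
```

Dropping `Z` from the concatenated design matrix `D` removes the direct links and recovers an ELM-style network, which is the comparison the paper's statistical tests address.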
Keywords:
Ridge regression / Random vector functional link neural networks / Orthogonal polynomial / Nonparametric statistical hypothesis testing
Source:
Applied Soft Computing, 2018, 70, 1083-1096
Publisher:
- Elsevier, Amsterdam
Funding / projects:
- An innovative approach to the application of intelligent technological systems for the production of sheet-metal parts based on ecological principles (RS-MESTD-Technological Development (TD or TR)-35004)
DOI: 10.1016/j.asoc.2017.10.010
ISSN: 1568-4946
WoS: 000443296000075
Scopus: 2-s2.0-85031671021
Collections
Institution/group
Mašinski fakultet

TY  - JOUR
AU  - Vuković, Najdan
AU  - Petrović, Milica
AU  - Miljković, Zoran
PY  - 2018
UR  - https://machinery.mas.bg.ac.rs/handle/123456789/2878
AB  - The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through a random selection of input weights while learning procedure determines only output weights. Unlike Extreme Learning Machines (ELM), RVFLNN exploits connection between the input layer and the output layer which means that RVFLNN are higher class of networks. Although RVFLNN has been proposed more than two decades ago (Pao, Park, Sobajic, 1994), the nonlinear expansion of the input vector into set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) utilizes advantages from expansion of the input vector and random determination of the input weights. Through comprehensive experimental evaluation by using 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypotheses testing confirms two major conclusions made by Zhang and Suganthan for classification (Zhang and Suganthan, 2015) and Ren et al. for timeseries prediction (Ren, Suganthan, Srikanth, Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output vectors are essential for improved network performance, and ridge regression generates significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement of network performance when one uses tansig activation function and Chebyshev orthogonal polynomial for regression problems. Conclusions drawn from this study may be used as guidelines for OPE-RVFLNN development and implementation for regression problems.
PB  - Elsevier, Amsterdam
T2  - Applied Soft Computing
T1  - A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression
EP  - 1096
SP  - 1083
VL  - 70
DO  - 10.1016/j.asoc.2017.10.010
ER  -
@article{Vukovic2018,
    author = "Vuković, Najdan and Petrović, Milica and Miljković, Zoran",
    year = "2018",
    abstract = "The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through a random selection of input weights while learning procedure determines only output weights. Unlike Extreme Learning Machines (ELM), RVFLNN exploits connection between the input layer and the output layer which means that RVFLNN are higher class of networks. Although RVFLNN has been proposed more than two decades ago (Pao, Park, Sobajic, 1994), the nonlinear expansion of the input vector into set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) utilizes advantages from expansion of the input vector and random determination of the input weights. Through comprehensive experimental evaluation by using 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypotheses testing confirms two major conclusions made by Zhang and Suganthan for classification (Zhang and Suganthan, 2015) and Ren et al. for timeseries prediction (Ren, Suganthan, Srikanth, Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output vectors are essential for improved network performance, and ridge regression generates significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement of network performance when one uses tansig activation function and Chebyshev orthogonal polynomial for regression problems. Conclusions drawn from this study may be used as guidelines for OPE-RVFLNN development and implementation for regression problems.",
    publisher = "Elsevier, Amsterdam",
    journal = "Applied Soft Computing",
    title = "A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression",
    pages = "1083-1096",
    volume = "70",
    doi = "10.1016/j.asoc.2017.10.010"
}
Vuković, N., Petrović, M., & Miljković, Z. (2018). A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression. Applied Soft Computing, 70, 1083-1096. https://doi.org/10.1016/j.asoc.2017.10.010
Vuković N, Petrović M, Miljković Z. A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression. Applied Soft Computing. 2018;70:1083-1096. doi:10.1016/j.asoc.2017.10.010
Vuković, Najdan, Petrović, Milica, and Miljković, Zoran. "A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression." Applied Soft Computing 70 (2018): 1083-1096. https://doi.org/10.1016/j.asoc.2017.10.010