• Asep Andri Fauzi Department of Statistics, IPB University, Indonesia
  • Agus M. Soleh Department of Statistics, IPB University, Indonesia
  • Anik Djuraidah Department of Statistics, IPB University, Indonesia
Keywords: highly correlated predictors, random forest regression, partial least squares regression, support vector regression


Highly correlated predictors and nonlinear relationships between the response and the predictors can degrade the performance of predictive models, especially those fitted by ordinary least squares (OLS). A simple way to address this problem is to use an alternative method such as partial least squares regression (PLSR), support vector regression with a radial basis function kernel (SVR-RBF), or random forest regression (RFR). The purpose of this study is to compare OLS, PLSR, SVR-RBF, and RFR using simulated data. The methods were evaluated by the root mean square error of prediction (RMSEP). The results showed that for linear data, SVR-RBF and RFR had large RMSEP values; OLS and PLSR outperformed them, and PLSR gave much more stable predictions than OLS when the predictors were highly correlated and the sample size was small. For nonlinear data with highly correlated predictors, RFR produced the smallest RMSEP.





How to Cite
Fauzi, A., Soleh, A. M., & Djuraidah, A. (2020). KAJIAN SIMULASI PERBANDINGAN METODE REGRESI KUADRAT TERKECIL PARSIAL, SUPPORT VECTOR MACHINE, DAN RANDOM FOREST. Indonesian Journal of Statistics and Its Applications, 4(1), 203-215. https://doi.org/10.29244/ijsa.v4i1.610