TY - JOUR
AU - Fauzi, Asep Andri
AU - Soleh, Agus M.
AU - Djuraidah, Anik
PY - 2020/02/28
Y2 - 2024/03/28
TI - KAJIAN SIMULASI PERBANDINGAN METODE REGRESI KUADRAT TERKECIL PARSIAL, SUPPORT VECTOR MACHINE, DAN RANDOM FOREST [A Simulation Study Comparing Partial Least Squares Regression, Support Vector Machine, and Random Forest Methods]
JF - Indonesian Journal of Statistics and Its Applications
JA - IJSA
VL - 4
IS - 1
SE - Articles
DO - 10.29244/ijsa.v4i1.610
UR - https://journal.stats.id/index.php/ijsa/article/view/610
SP - 203
EP - 215
AB - Highly correlated predictors and nonlinear relationships between the response and the predictors can degrade the performance of predictive modeling, especially when the ordinary least squares (OLS) method is used. A simple way to address this problem is to use an alternative method such as Partial Least Squares Regression (PLSR), Support Vector Regression with the Radial Basis Function kernel (SVR-RBF), or Random Forest Regression (RFR). The purpose of this study is to compare OLS, PLSR, SVR-RBF, and RFR using simulated data. The methods were evaluated by the root mean square error of prediction (RMSEP). The results showed that for linear models, SVR-RBF and RFR have large RMSEP; OLS and PLSR perform better than SVR-RBF and RFR, and PLSR provides much more stable predictions than OLS when the predictors are highly correlated and the sample size is small. For nonlinear data, RFR produced the smallest RMSEP when the data contained highly correlated predictors.
ER -