ISSN: 1304-7191 | E-ISSN: 1304-7205
Comparative performance analysis of epsilon-insensitive and pruning-based algorithms for sparse least squares support vector regression
1Department of Electrical and Electronics Engineering, Ankara Yildirim Beyazit University, Ankara, 06760, Türkiye
Sigma J Eng Nat Sci 2024; 42(2): 578-589 DOI: 10.14744/sigma.2024.00045


Least Squares Support Vector Regression (LSSVR), the least squares version of Support Vector Regression (SVR), is defined with a regularized squared loss without epsilon-insensitivity. LSSVR is formulated in the dual space as a linear equality constrained quadratic minimization, which can be transformed into the solution of a linear algebraic equation system. Because the number of Lagrange multipliers in this system is half that of classical SVR, LSSVR requires much less computation time than classical SVR. Despite this computationally attractive feature, it lacks the sparsity that SVR gains from epsilon-insensitivity: in LSSVR, every training input is treated as a support vector, yielding extremely poor generalization performance. To overcome these drawbacks, this paper derives an epsilon-insensitive LSSVR with epsilon-insensitivity in the quadratic loss, in which sparsity is directly controlled by the epsilon parameter. Since the quadratic loss is sensitive to outliers, a weighted version (epsilon-insensitive WLSSVR) has also been developed. Finally, the performances of epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are quantitatively compared in detail with two methods commonly used in the literature, pruning-based LSSVR and weighted pruning-based LSSVR. Experimental results on simulated data and 8 different real-life data sets show that epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are superior in terms of computation time, generalization ability, and sparsity.
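To make the abstract's point concrete, below is a minimal sketch of the standard (non-sparse) LSSVR dual solve in Python/NumPy. It assembles the linear equality constrained problem as the usual linear algebraic system and solves for the Lagrange multipliers and bias in one step. The RBF kernel choice, the hyperparameter values, and the function names (`lssvr_fit`, `lssvr_predict`) are illustrative assumptions, not taken from the paper; the sketch shows the plain LSSVR baseline whose lack of sparsity the paper addresses.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) Gram matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=100.0, sigma=0.2):
    """Solve the LSSVR dual, which reduces to one linear system:

        [ 0   1^T           ] [b]     [0]
        [ 1   K + I/gamma   ] [alpha] [y]

    gamma is the regularization parameter; K is the kernel Gram matrix.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # Note: alpha is generically dense -- every training point acts as a
    # support vector, which is exactly the non-sparsity the paper targets.
    return alpha, b

def lssvr_predict(Xq, X, alpha, b, sigma=0.2):
    # f(x) = sum_i alpha_i * K(x, x_i) + b
    return rbf_kernel(Xq, X, sigma) @ alpha + b
```

Because the fit is a single dense solve rather than a quadratic program, its cost is that of factorizing an (n+1) x (n+1) matrix; this is the speed advantage over classical SVR that the abstract refers to, obtained at the price of a non-sparse multiplier vector.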