IDEAS home Printed from https://ideas.repec.org/a/eee/apmaco/v305y2017icp32-52.html

Robust Lp-norm least squares support vector regression with feature selection

Author

Listed:
  • Ye, Ya-Fen
  • Shao, Yuan-Hai
  • Deng, Nai-Yang
  • Li, Chun-Na
  • Hua, Xiang-Yu

Abstract

In this paper, we propose a novel algorithm called robust Lp-norm least squares support vector regression (Lp-LSSVR), which is more robust than the traditional least squares support vector regression (LS-SVR). By using an absolute constraint together with the Lp-norm regularization term, Lp-LSSVR is robust against outliers. Moreover, although the optimization problem is non-convex, the sparsity of the Lp-norm solution and a lower-bound technique for its nonzero components ensure that Lp-LSSVR selects useful features and help locate a local optimum. Experimental results show that Lp-LSSVR is more robust than LS-SVR and, owing to its equality constraint, much faster than Lp-norm support vector regression (Lp-SVR) and SVR, though slower than LS-SVR and L1-norm support vector regression (L1-SVR); it is as effective as Lp-SVR, L1-SVR, LS-SVR, and SVR in both feature selection and regression.
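The abstract combines two ingredients: an Lp-norm (0 < p < 1) regularizer, whose non-convex penalty yields sparse weights, and a lower bound on nonzero components that prunes uninformative features. A common way to handle such a penalty is iteratively reweighted ridge regression. The sketch below is illustrative only — the function name, solver, and all parameter choices are this note's assumptions, not the authors' algorithm or code:

```python
import numpy as np

def lp_lssvr_sketch(X, y, C=10.0, p=0.5, n_iter=30, eps=1e-6, lb=1e-3):
    """Illustrative Lp-norm regularized least-squares regressor.

    Approximately solves  min_w  ||w||_p^p + C * ||y - X w - b||_2^2
    by iteratively reweighted ridge regression: at each step the
    non-convex Lp penalty is majorized by a quadratic with weights
    d_j = p * (|w_j| + eps)**(p - 2), and at the end components below
    the lower bound `lb` are set to zero (feature selection).
    """
    n, m = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])       # absorb the bias term b
    w = np.linalg.lstsq(Xb, y, rcond=None)[0]  # warm start: ordinary LS
    for _ in range(n_iter):
        d = p * (np.abs(w[:m]) + eps) ** (p - 2)  # Lp majorization weights
        D = np.diag(np.append(d, 0.0))            # no penalty on the bias
        w = np.linalg.solve(D + 2 * C * Xb.T @ Xb, 2 * C * Xb.T @ y)
    w[:m][np.abs(w[:m]) < lb] = 0.0            # lower-bound pruning
    return w[:m], w[m]                          # (weights, bias)
```

On synthetic data with a few informative features and several irrelevant ones, the reweighting drives the irrelevant coefficients toward zero and the lower-bound step removes them entirely, which is the feature-selection behavior the abstract describes; the paper's actual method additionally uses an absolute equality constraint on the residuals, which this sketch does not model.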

Suggested Citation

  • Ye, Ya-Fen & Shao, Yuan-Hai & Deng, Nai-Yang & Li, Chun-Na & Hua, Xiang-Yu, 2017. "Robust Lp-norm least squares support vector regression with feature selection," Applied Mathematics and Computation, Elsevier, vol. 305(C), pages 32-52.
  • Handle: RePEc:eee:apmaco:v:305:y:2017:i:c:p:32-52
    DOI: 10.1016/j.amc.2017.01.062

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0096300317300887
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.amc.2017.01.062?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    2. James H. Stock & Mark W. Watson, 2003. "Forecasting Output and Inflation: The Role of Asset Prices," Journal of Economic Literature, American Economic Association, vol. 41(3), pages 788-829, September.
    3. P. S. Bradley & O. L. Mangasarian & W. N. Street, 1998. "Feature Selection via Mathematical Programming," INFORMS Journal on Computing, INFORMS, vol. 10(2), pages 209-217, May.
    4. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yu, Haijing & Hu, Chenpei & Xu, Bing, 2022. "Re-examining the existence of a “resource curse”: A spatial heterogeneity perspective," Journal of Business Research, Elsevier, vol. 139(C), pages 1004-1011.
    2. Kuen-Suan Chen & Kuo-Ping Lin & Jun-Xiang Yan & Wan-Lin Hsieh, 2019. "Renewable Power Output Forecasting Using Least-Squares Support Vector Regression and Google Data," Sustainability, MDPI, vol. 11(11), pages 1-13, May.
    3. Yao Dong & He Jiang, 2018. "A Two-Stage Regularization Method for Variable Selection and Forecasting in High-Order Interaction Model," Complexity, Hindawi, vol. 2018, pages 1-12, November.
    4. Kar Hoou Hui & Ching Sheng Ooi & Meng Hee Lim & Mohd Salman Leong & Salah Mahdi Al-Obaidi, 2017. "An improved wrapper-based feature selection method for machinery fault diagnosis," PLOS ONE, Public Library of Science, vol. 12(12), pages 1-10, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
    3. Olga Klopp & Marianna Pensky, 2013. "Sparse High-dimensional Varying Coefficient Model : Non-asymptotic Minimax Study," Working Papers 2013-30, Center for Research in Economics and Statistics.
    4. Osamu Komori & Shinto Eguchi & John B. Copas, 2015. "Generalized t-statistic for two-group classification," Biometrics, The International Biometric Society, vol. 71(2), pages 404-416, June.
    5. Luu, Tung Duy & Fadili, Jalal & Chesneau, Christophe, 2019. "PAC-Bayesian risk bounds for group-analysis sparse regression by exponential weighting," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 209-233.
    6. Lee, In Gyu & Yoon, Sang Won & Won, Daehan, 2022. "A Mixed Integer Linear Programming Support Vector Machine for Cost-Effective Group Feature Selection: Branch-Cut-and-Price Approach," European Journal of Operational Research, Elsevier, vol. 299(3), pages 1055-1068.
    7. Ruidi Chen & Ioannis Ch. Paschalidis, 2022. "Robust Grouped Variable Selection Using Distributionally Robust Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1042-1071, September.
    8. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    9. Jon Ellingsen & Vegard H. Larsen & Leif Anders Thorsrud, 2020. "News Media vs. FRED-MD for Macroeconomic Forecasting," CESifo Working Paper Series 8639, CESifo.
    10. A. Karagrigoriou & C. Koukouvinos & K. Mylona, 2010. "On the advantages of the non-concave penalized likelihood model selection method with minimum prediction errors in large-scale medical studies," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(1), pages 13-24.
    11. Liu, Jianyu & Yu, Guan & Liu, Yufeng, 2019. "Graph-based sparse linear discriminant analysis for high-dimensional classification," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 250-269.
    12. Shota Yamanaka & Nobuo Yamashita, 2018. "Duality of nonconvex optimization with positively homogeneous functions," Computational Optimization and Applications, Springer, vol. 71(2), pages 435-456, November.
    13. Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.
    14. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    15. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    16. Canhong Wen & Zhenduo Li & Ruipeng Dong & Yijin Ni & Wenliang Pan, 2023. "Simultaneous Dimension Reduction and Variable Selection for Multinomial Logistic Regression," INFORMS Journal on Computing, INFORMS, vol. 35(5), pages 1044-1060, September.
    17. Groll, Andreas & Hambuckers, Julien & Kneib, Thomas & Umlauf, Nikolaus, 2019. "LASSO-type penalization in the framework of generalized additive models for location, scale and shape," Computational Statistics & Data Analysis, Elsevier, vol. 140(C), pages 59-73.
    18. Xianwen Ding & Zhihuang Yang, 2024. "Adaptive Bi-Level Variable Selection for Quantile Regression Models with a Diverging Number of Covariates," Mathematics, MDPI, vol. 12(20), pages 1-23, October.
    19. Faisal Zahid & Gerhard Tutz, 2013. "Multinomial logit models with implicit variable selection," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 7(4), pages 393-416, December.
    20. Yanfang Zhang & Chuanhua Wei & Xiaolin Liu, 2022. "Group Logistic Regression Models with l p,q Regularization," Mathematics, MDPI, vol. 10(13), pages 1-15, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:apmaco:v:305:y:2017:i:c:p:32-52. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: https://www.journals.elsevier.com/applied-mathematics-and-computation .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.