Printed from https://ideas.repec.org/a/spr/annopr/v174y2010i1p83-10110.1007-s10479-008-0495-y.html

Adjusted support vector machines based on a new loss function

Author

Listed:
  • Shuchun Wang
  • Wei Jiang
  • Kwok-Leung Tsui

Abstract

The support vector machine (SVM) has attracted considerable attention recently due to its successful applications in various domains. However, by maximizing the margin of separation between the two classes in a binary classification problem, SVM solutions often suffer from two serious drawbacks. First, the SVM separating hyperplane is usually very sensitive to the training samples, since it depends strongly on the support vectors, which are only a few points located on the wrong side of the corresponding margin boundaries. Second, the separating hyperplane is equidistant from the two classes, which are treated as equally important when optimizing the hyperplane's location, regardless of the number of training samples in each class and their dispersions. In this paper, we propose a new SVM solution, the adjusted support vector machine (ASVM), based on a new loss function that adjusts the SVM solution to take into account the sample sizes and dispersions of the two classes. Numerical experiments show that the ASVM outperforms the conventional SVM, especially when the two classes differ greatly in sample size and dispersion. Copyright Springer Science+Business Media, LLC 2010
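The class-imbalance issue the abstract describes can be illustrated with a cost-sensitive hinge loss, where the rarer class receives a larger misclassification weight so the hyperplane shifts away from it. This is only a generic sketch of the underlying idea, not the paper's actual ASVM loss function; the data, weights, and function names below are illustrative.

```python
import numpy as np

def weighted_hinge_loss(w, b, X, y, c_pos, c_neg):
    """Regularized hinge loss with per-class cost weights.

    The standard soft-margin SVM objective corresponds to c_pos == c_neg.
    Up-weighting the smaller (or more dispersed) class pushes the separating
    hyperplane toward the larger/tighter class -- the general kind of
    adjustment the paper motivates (its exact loss differs).
    """
    margins = y * (X @ w + b)               # signed functional margins
    hinge = np.maximum(0.0, 1.0 - margins)  # per-sample margin violations
    costs = np.where(y > 0, c_pos, c_neg)   # class-dependent cost
    return 0.5 * (w @ w) + np.sum(costs * hinge)

# Toy data: 3 positives, 2 negatives; one point of each class near the boundary.
X = np.array([[1.5, 1.0], [2.0, 0.5], [0.2, 0.1],
              [-0.1, 0.2], [-1.0, -1.5]])
y = np.array([+1, +1, +1, -1, -1])

w, b = np.array([1.0, 1.0]), 0.0
plain = weighted_hinge_loss(w, b, X, y, 1.0, 1.0)     # symmetric costs
adjusted = weighted_hinge_loss(w, b, X, y, 1.0, 4.0)  # up-weight minority class
```

Because the hinge penalty is non-negative, raising the minority-class weight can only increase the loss contribution of its margin violations, so a minimizer of the adjusted objective is pulled toward fitting that class more tightly.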

Suggested Citation

  • Shuchun Wang & Wei Jiang & Kwok-Leung Tsui, 2010. "Adjusted support vector machines based on a new loss function," Annals of Operations Research, Springer, vol. 174(1), pages 83-101, February.
  • Handle: RePEc:spr:annopr:v:174:y:2010:i:1:p:83-101:10.1007/s10479-008-0495-y
    DOI: 10.1007/s10479-008-0495-y

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10479-008-0495-y
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10479-008-0495-y?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Peter Hall & J. S. Marron & Amnon Neeman, 2005. "Geometric representation of high dimension, low sample size data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(3), pages 427-444, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yazan F. Roumani & Yaman Roumani & Joseph K. Nwankpa & Mohan Tanniru, 2018. "Classifying readmissions to a cardiac intensive care unit," Annals of Operations Research, Springer, vol. 263(1), pages 429-451, April.
    2. Shuguang He & Wei Jiang & Houtao Deng, 2018. "A distance-based control chart for monitoring multivariate processes using support vector machines," Annals of Operations Research, Springer, vol. 263(1), pages 191-207, April.
    3. Kyungsik Lee & Norman Kim & Myong Jeong, 2014. "The sparse signomial classification and regression model," Annals of Operations Research, Springer, vol. 216(1), pages 257-286, May.
    4. Ayşegül Aşkan & Serpil Sayın, 2014. "SVM classification for imbalanced data sets using a multiobjective optimization framework," Annals of Operations Research, Springer, vol. 216(1), pages 191-203, May.
    5. Pablo Aparicio-Ruiz & Elena Barbadilla-Martín & José Guadix & Pablo Cortés, 2021. "KNN and adaptive comfort applied in decision making for HVAC systems," Annals of Operations Research, Springer, vol. 303(1), pages 217-231, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yata, Kazuyoshi & Aoshima, Makoto, 2013. "PCA consistency for the power spiked model in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 334-354.
    2. Jung, Sungkyu & Sen, Arusharka & Marron, J.S., 2012. "Boundary behavior in High Dimension, Low Sample Size asymptotics of PCA," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 190-203.
    3. Wang, Shao-Hsuan & Huang, Su-Yun, 2022. "Perturbation theory for cross data matrix-based PCA," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    4. Saha, Enakshi & Sarkar, Soham & Ghosh, Anil K., 2017. "Some high-dimensional one-sample tests based on functions of interpoint distances," Journal of Multivariate Analysis, Elsevier, vol. 161(C), pages 83-95.
    5. Yugo Nakayama & Kazuyoshi Yata & Makoto Aoshima, 2020. "Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(5), pages 1257-1286, October.
    6. Mao, Guangyu, 2018. "Testing independence in high dimensions using Kendall’s tau," Computational Statistics & Data Analysis, Elsevier, vol. 117(C), pages 128-137.
    7. Shin-ichi Tsukada, 2019. "High dimensional two-sample test based on the inter-point distance," Computational Statistics, Springer, vol. 34(2), pages 599-615, June.
    8. Jianqing Fan & Jinchi Lv, 2008. "Sure independence screening for ultrahigh dimensional feature space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(5), pages 849-911, November.
    9. Fionn Murtagh, 2009. "The Remarkable Simplicity of Very High Dimensional Data: Application of Model-Based Clustering," Journal of Classification, Springer;The Classification Society, vol. 26(3), pages 249-277, December.
    10. Mao, Guangyu, 2015. "A note on testing complete independence for high dimensional data," Statistics & Probability Letters, Elsevier, vol. 106(C), pages 82-85.
    11. Chung, Hee Cheol & Ahn, Jeongyoun, 2021. "Subspace rotations for high-dimensional outlier detection," Journal of Multivariate Analysis, Elsevier, vol. 183(C).
    12. repec:jss:jstsof:47:i05 is not listed on IDEAS
    13. Mondal, Pronoy K. & Biswas, Munmun & Ghosh, Anil K., 2015. "On high dimensional two-sample tests based on nearest neighbors," Journal of Multivariate Analysis, Elsevier, vol. 141(C), pages 168-178.
    14. Jack Jewson & David Rossell, 2022. "General Bayesian loss function selection and the use of improper models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(5), pages 1640-1665, November.
    15. Jun Li, 2018. "Asymptotic normality of interpoint distances for high-dimensional data with applications to the two-sample problem," Biometrika, Biometrika Trust, vol. 105(3), pages 529-546.
    16. Ursula Laa & Dianne Cook & Stuart Lee, 2020. "Burning Sage: Reversing the Curse of Dimensionality in the Visualization of High-Dimensional Data," Monash Econometrics and Business Statistics Working Papers 36/20, Monash University, Department of Econometrics and Business Statistics.
    17. Jianqing Fan & Yuan Liao & Martina Mincheva, 2013. "Large covariance estimation by thresholding principal orthogonal complements," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 75(4), pages 603-680, September.
    18. Anil K. Ghosh & Munmun Biswas, 2016. "Distribution-free high-dimensional two-sample tests based on discriminating hyperplanes," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 25(3), pages 525-547, September.
    19. Jung, Sungkyu, 2018. "Continuum directions for supervised dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 125(C), pages 27-43.
    20. Bolivar-Cime, A. & Marron, J.S., 2013. "Comparison of binary discrimination methods for high dimension low sample size data," Journal of Multivariate Analysis, Elsevier, vol. 115(C), pages 108-121.
    21. Makoto Aoshima & Kazuyoshi Yata, 2019. "High-Dimensional Quadratic Classifiers in Non-sparse Settings," Methodology and Computing in Applied Probability, Springer, vol. 21(3), pages 663-682, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:annopr:v:174:y:2010:i:1:p:83-101:10.1007/s10479-008-0495-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.