
Simultaneous Predictive Gaussian Classifiers

Author

Listed:
  • Yaqiong Cui

    (University of Helsinki)

  • Jukka Sirén

    (University of Helsinki)

  • Timo Koski

    (KTH Royal Institute of Technology)

  • Jukka Corander

    (University of Helsinki
    Åbo Akademi University)

Abstract

The Gaussian distribution has for several decades been ubiquitous in the theory and practice of statistical classification. Despite early proposals motivating the use of predictive inference to design a classifier, this approach has gained relatively little attention apart from certain specific applications, such as speech recognition, where its optimality has been widely acknowledged. Here we examine the statistical properties of different inductive classification rules under a generic Gaussian model and demonstrate the optimality of considering simultaneous classification of multiple samples under an attractive loss function. It is shown that the simpler independent classification of samples leads asymptotically to the same optimal rule as the simultaneous classifier as the amount of training data increases, provided that the dimensionality of the feature space is bounded in an appropriate manner. Numerical investigations suggest that the simultaneous predictive classifier can lead to higher classification accuracy than the independent rule in the low-dimensional case, whereas the simultaneous approach suffers more from noise when the dimensionality increases.
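The abstract contrasts an independent predictive rule, which labels each test sample separately, with a simultaneous rule that assigns labels to a whole batch of test samples jointly. The Python sketch below is a minimal illustration of that contrast, not the authors' exact model or notation: it assumes a conjugate Normal-inverse-Wishart prior with illustrative hyperparameters, equal class prior probabilities, and the numpy/scipy stack. The independent rule scores each point by its class-wise posterior predictive (multivariate Student-t) density; the simultaneous rule scores every joint labelling of a small test batch by the marginal likelihood of the training data augmented with the points assigned to each class.

# Minimal sketch (illustrative assumptions, not the paper's specification) of an
# independent versus a simultaneous predictive Gaussian classifier under a
# conjugate Normal-inverse-Wishart (NIW) prior.
import itertools
import numpy as np
from scipy.special import multigammaln
from scipy.stats import multivariate_t


def niw_posterior(X, mu0, kappa0, nu0, Psi0):
    """Posterior NIW parameters after observing the rows of X."""
    n, d = X.shape
    xbar = X.mean(axis=0)
    S = (X - xbar).T @ (X - xbar)
    kappa_n = kappa0 + n
    nu_n = nu0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    diff = (xbar - mu0).reshape(-1, 1)
    Psi_n = Psi0 + S + (kappa0 * n / kappa_n) * (diff @ diff.T)
    return mu_n, kappa_n, nu_n, Psi_n


def log_marginal(X, mu0, kappa0, nu0, Psi0):
    """Log marginal likelihood of X under the Gaussian/NIW conjugate model."""
    n, d = X.shape
    _, kappa_n, nu_n, Psi_n = niw_posterior(X, mu0, kappa0, nu0, Psi0)
    return (-0.5 * n * d * np.log(np.pi)
            + multigammaln(nu_n / 2.0, d) - multigammaln(nu0 / 2.0, d)
            + 0.5 * nu0 * np.linalg.slogdet(Psi0)[1]
            - 0.5 * nu_n * np.linalg.slogdet(Psi_n)[1]
            + 0.5 * d * (np.log(kappa0) - np.log(kappa_n)))


def independent_rule(train, test, prior):
    """Label each test point separately by its posterior predictive Student-t density
    (equal class prior probabilities assumed)."""
    preds = {}
    for c, Xc in train.items():
        mu_n, kappa_n, nu_n, Psi_n = niw_posterior(Xc, *prior)
        d = Xc.shape[1]
        df = nu_n - d + 1
        shape = Psi_n * (kappa_n + 1) / (kappa_n * df)
        preds[c] = multivariate_t(loc=mu_n, shape=shape, df=df).logpdf(test)
    classes = sorted(preds)
    scores = np.column_stack([preds[c] for c in classes])
    return [classes[i] for i in scores.argmax(axis=1)]


def simultaneous_rule(train, test, prior):
    """Score every joint labelling of the test batch and keep the best one.
    Comparing the augmented marginal likelihoods is equivalent to comparing the
    joint predictive probabilities, since the training-only terms are constant."""
    classes = sorted(train)
    best, best_score = None, -np.inf
    for labelling in itertools.product(classes, repeat=len(test)):
        score = 0.0
        for c in classes:
            assigned = test[[i for i, lab in enumerate(labelling) if lab == c]]
            Xc = np.vstack([train[c], assigned]) if len(assigned) else train[c]
            score += log_marginal(Xc, *prior)
        if score > best_score:
            best, best_score = labelling, score
    return list(best)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 2
    prior = (np.zeros(d), 1.0, d + 2.0, np.eye(d))   # mu0, kappa0, nu0, Psi0 (illustrative)
    train = {0: rng.normal(0.0, 1.0, (20, d)), 1: rng.normal(2.0, 1.0, (20, d))}
    test = rng.normal(2.0, 1.0, (4, d))              # a small batch drawn near class 1
    print("independent :", independent_rule(train, test, prior))
    print("simultaneous:", simultaneous_rule(train, test, prior))

Because both rules share the same predictive model, any difference on such toy data reflects only the coupling introduced by joint assignment of the batch; the asymptotic equivalence and the high-dimensional noise sensitivity described in the abstract are properties established in the paper, not by this sketch.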

Suggested Citation

  • Yaqiong Cui & Jukka Sirén & Timo Koski & Jukka Corander, 2016. "Simultaneous Predictive Gaussian Classifiers," Journal of Classification, Springer;The Classification Society, vol. 33(1), pages 73-102, April.
  • Handle: RePEc:spr:jclass:v:33:y:2016:i:1:d:10.1007_s00357-016-9197-3
    DOI: 10.1007/s00357-016-9197-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00357-016-9197-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00357-016-9197-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jukka Corander & Mats Gyllenberg & Timo Koski, 2009. "Bayesian unsupervised classification framework based on stochastic partitions of data and a parallel search strategy," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 3(1), pages 3-24, June.
    2. Guido Consonni & Luca La Rocca, 2012. "Objective Bayes Factors for Gaussian Directed Acyclic Graphical Models," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 39(4), pages 743-756, December.
    3. Jianqing Fan & Yang Feng & Xin Tong, 2012. "A road to classification in high dimensional space: the regularized optimal affine discriminant," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 74(4), pages 745-771, September.
    4. Olvi L. Mangasarian & W. Nick Street & William H. Wolberg, 1995. "Breast Cancer Diagnosis and Prognosis Via Linear Programming," Operations Research, INFORMS, vol. 43(4), pages 570-577, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pedro Duarte Silva, A., 2017. "Optimization approaches to Supervised Classification," European Journal of Operational Research, Elsevier, vol. 261(2), pages 772-788.
    2. Oda, Ryoya & Suzuki, Yuya & Yanagihara, Hirokazu & Fujikoshi, Yasunori, 2020. "A consistent variable selection method in high-dimensional canonical discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    3. Sexton, Randall S. & Dorsey, Robert E. & Johnson, John D., 1999. "Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing," European Journal of Operational Research, Elsevier, vol. 114(3), pages 589-601, May.
    4. Jianqing Fan & Yang Feng & Jiancheng Jiang & Xin Tong, 2016. "Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(513), pages 275-287, March.
    5. Zeyu Wu & Cheng Wang & Weidong Liu, 2023. "A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 75(4), pages 619-648, August.
    6. Qiang Sun & Hongtu Zhu & Yufeng Liu & Joseph G. Ibrahim, 2015. "SPReM: Sparse Projection Regression Model For High-Dimensional Linear Regression," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 289-302, March.
    7. Brandner, Hubertus & Lessmann, Stefan & Voß, Stefan, 2013. "A memetic approach to construct transductive discrete support vector machines," European Journal of Operational Research, Elsevier, vol. 230(3), pages 581-595.
    8. Yixin Fang & Yang Feng & Ming Yuan, 2014. "Regularized principal components of heritability," Computational Statistics, Springer, vol. 29(3), pages 455-465, June.
    9. W. Art Chaovalitwongse & Ya-Ju Fan & Rajesh C. Sachdeo, 2008. "Novel Optimization Models for Abnormal Brain Activity Classification," Operations Research, INFORMS, vol. 56(6), pages 1450-1460, December.
    10. Liu, Jianyu & Yu, Guan & Liu, Yufeng, 2019. "Graph-based sparse linear discriminant analysis for high-dimensional classification," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 250-269.
    11. Tamilselvan, Prasanna & Wang, Pingfeng, 2013. "Failure diagnosis using deep belief learning based health state classification," Reliability Engineering and System Safety, Elsevier, vol. 115(C), pages 124-135.
    12. Shen, Yanfeng & Lin, Zhengyan, 2015. "An adaptive test for the mean vector in large-p-small-n problems," Computational Statistics & Data Analysis, Elsevier, vol. 89(C), pages 25-38.
    13. Hongtu Zhu & Dan Shen & Xuewei Peng & Leo Yufeng Liu, 2017. "MWPCR: Multiscale Weighted Principal Component Regression for High-Dimensional Prediction," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 1009-1021, July.
    14. Ramazan Ünlü & Petros Xanthopoulos, 2019. "A weighted framework for unsupervised ensemble learning based on internal quality measures," Annals of Operations Research, Springer, vol. 276(1), pages 229-247, May.
    15. Nickolay T. Trendafilov & Tsegay Gebrehiwot Gebru, 2016. "Recipes for sparse LDA of horizontal data," METRON, Springer;Sapienza Università di Roma, vol. 74(2), pages 207-221, August.
    16. Davide Altomare & Guido Consonni & Luca La Rocca, 2013. "Objective Bayesian Search of Gaussian Directed Acyclic Graphical Models for Ordered Variables with Non-Local Priors," Biometrics, The International Biometric Society, vol. 69(2), pages 478-487, June.
    17. Irina Gaynanova & James G. Booth & Martin T. Wells, 2016. "Simultaneous Sparse Estimation of Canonical Vectors in the p ≫ N Setting," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 696-706, April.
    18. Bak, Britta Anker & Jensen, Jens Ledet, 2016. "High dimensional classifiers in the imbalanced case," Computational Statistics & Data Analysis, Elsevier, vol. 98(C), pages 46-59.
    19. Morris, Katherine & McNicholas, Paul D., 2016. "Clustering, classification, discriminant analysis, and dimension reduction via generalized hyperbolic mixtures," Computational Statistics & Data Analysis, Elsevier, vol. 97(C), pages 133-150.
    20. Ryu, Young U. & Chandrasekaran, R. & Jacob, Varghese S., 2007. "Breast cancer prediction using the isotonic separation technique," European Journal of Operational Research, Elsevier, vol. 181(2), pages 842-854, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jclass:v:33:y:2016:i:1:d:10.1007_s00357-016-9197-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.