
Quadratic discriminant analysis by projection

Author

Listed:
  • Wu, Ruiyang
  • Hao, Ning

Abstract

Discriminant analysis, including linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA), is a popular approach to classification problems. It is well known that LDA is suboptimal for analyzing heteroscedastic data, for which QDA would be an ideal tool. However, QDA is less helpful when the number of features in a data set is moderate or high, and LDA and its variants often perform better due to their robustness against dimensionality. In this work, we introduce a new dimension reduction and classification method based on QDA. In particular, we define and estimate the optimal one-dimensional (1D) subspace for QDA, which is a novel hybrid approach to discriminant analysis. The new method can handle data heteroscedasticity with the number of parameters equal to that of LDA. Therefore, it is more stable than the standard QDA and works well for data in moderate dimensions. We show an estimation consistency property of our method, and compare it with LDA, QDA, regularized discriminant analysis (RDA) and a few other competitors on simulated and real data examples.
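The project-then-classify idea described above can be sketched in a small toy example. This is an illustrative sketch, not the paper's estimator of the optimal 1D subspace: here the projection direction is taken from a standard generalized-eigenvector heuristic for heteroscedastic data (the leading eigenvector of the ratio of the two sample covariance matrices), and the final rule is ordinary one-dimensional QDA with equal priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated heteroscedastic two-class data: identical means, different
# covariances, so an LDA rule (which uses mean differences only) is useless.
n, p = 500, 5
Sigma0 = np.eye(p)
Sigma1 = np.diag([4.0, 1.0, 1.0, 1.0, 1.0])  # variance differs along axis 0
X0 = rng.multivariate_normal(np.zeros(p), Sigma0, n)
X1 = rng.multivariate_normal(np.zeros(p), Sigma1, n)

# Illustrative 1D direction: eigenvector of S0^{-1} S1 whose eigenvalue is
# farthest from 1 on the log scale, i.e. the direction with the largest
# variance discrepancy between classes (NOT the paper's estimator).
S0 = np.cov(X0.T)
S1 = np.cov(X1.T)
evals, evecs = np.linalg.eig(np.linalg.solve(S0, S1))
w = np.real(evecs[:, np.argmax(np.abs(np.log(np.abs(evals))))])

def qda_1d_fit(z0, z1):
    """Fit a one-dimensional Gaussian (mean, variance) to each projected class."""
    return (z0.mean(), z0.var()), (z1.mean(), z1.var())

def qda_1d_predict(z, params):
    """Assign the class with the larger 1D Gaussian log-density (equal priors)."""
    (m0, v0), (m1, v1) = params
    d0 = -0.5 * np.log(v0) - (z - m0) ** 2 / (2 * v0)
    d1 = -0.5 * np.log(v1) - (z - m1) ** 2 / (2 * v1)
    return (d1 > d0).astype(int)

params = qda_1d_fit(X0 @ w, X1 @ w)
pred = qda_1d_predict(np.concatenate([X0 @ w, X1 @ w]), params)
truth = np.concatenate([np.zeros(n), np.ones(n)])
acc = (pred == truth).mean()
print(f"1D-projected QDA accuracy: {acc:.2f}")
```

On this example the projected rule recovers the variance signal that LDA would miss entirely; the classifier fits only one projection direction plus two univariate Gaussians, matching the abstract's point that the parameter count stays at the LDA level.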

Suggested Citation

  • Wu, Ruiyang & Hao, Ning, 2022. "Quadratic discriminant analysis by projection," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
  • Handle: RePEc:eee:jmvana:v:190:y:2022:i:c:s0047259x22000276
    DOI: 10.1016/j.jmva.2022.104987

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X22000276
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2022.104987?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Gaynanova, Irina & Wang, Tianying, 2019. "Sparse quadratic classification rules via linear dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 169(C), pages 278-299.
    2. Robert T. Krafty, 2016. "Discriminant Analysis of Time Series in the Presence of Within-Group Spectral Variability," Journal of Time Series Analysis, Wiley Blackwell, vol. 37(4), pages 435-450, July.
    3. Oliveira, Victor De, 2000. "Bayesian prediction of clipped Gaussian random fields," Computational Statistics & Data Analysis, Elsevier, vol. 34(3), pages 299-314, September.
    4. Timothy I. Cannings & Richard J. Samworth, 2017. "Random-projection ensemble classification," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(4), pages 959-1035, September.
    5. Qing Mai & Hui Zou & Ming Yuan, 2012. "A direct approach to sparse discriminant analysis in ultra-high dimensions," Biometrika, Biometrika Trust, vol. 99(1), pages 29-42.
    6. Jianan Zhu & Yang Feng, 2021. "Super RaSE: Super Random Subspace Ensemble Classification," JRFM, MDPI, vol. 14(12), pages 1-18, December.
    7. Tony Cai & Weidong Liu & Yin Xia, 2013. "Two-Sample Covariance Matrix Testing and Support Recovery in High-Dimensional and Sparse Settings," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(501), pages 265-277, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Deepak Nag Ayyala & Santu Ghosh & Daniel F. Linder, 2022. "Covariance matrix testing in high dimension using random projections," Computational Statistics, Springer, vol. 37(3), pages 1111-1141, July.
    2. Zhang, Yangchun & Zhou, Yirui & Liu, Xiaowei, 2023. "Applications on linear spectral statistics of high-dimensional sample covariance matrix with divergent spectrum," Computational Statistics & Data Analysis, Elsevier, vol. 178(C).
    3. Bergsma, Wicher P, 2020. "Regression with I-priors," Econometrics and Statistics, Elsevier, vol. 14(C), pages 89-111.
    4. Oda, Ryoya & Suzuki, Yuya & Yanagihara, Hirokazu & Fujikoshi, Yasunori, 2020. "A consistent variable selection method in high-dimensional canonical discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 175(C).
    5. Jianqing Fan & Yang Feng & Jiancheng Jiang & Xin Tong, 2016. "Feature Augmentation via Nonparametrics and Selection (FANS) in High-Dimensional Classification," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(513), pages 275-287, March.
    6. Chen, Song Xi & Guo, Bin & Qiu, Yumou, 2023. "Testing and signal identification for two-sample high-dimensional covariances via multi-level thresholding," Journal of Econometrics, Elsevier, vol. 235(2), pages 1337-1354.
    7. Zeyu Wu & Cheng Wang & Weidong Liu, 2023. "A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 75(4), pages 619-648, August.
    8. Luo, Shan & Chen, Zehua, 2020. "A procedure of linear discrimination analysis with detected sparsity structure for high-dimensional multi-class classification," Journal of Multivariate Analysis, Elsevier, vol. 179(C).
    9. Bar, Haim & Wells, Martin T., 2023. "On graphical models and convex geometry," Computational Statistics & Data Analysis, Elsevier, vol. 187(C).
    10. Nicolas Städler & Sach Mukherjee, 2017. "Two-sample testing in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(1), pages 225-246, January.
    11. Fuli Zhang & Kung‐Sik Chan, 2023. "Random projection ensemble classification with high‐dimensional time series," Biometrics, The International Biometric Society, vol. 79(2), pages 964-974, June.
    12. Liu, Jianyu & Yu, Guan & Liu, Yufeng, 2019. "Graph-based sparse linear discriminant analysis for high-dimensional classification," Journal of Multivariate Analysis, Elsevier, vol. 171(C), pages 250-269.
    13. Solaiman Afroughi & Soghrat Faghihzadeh & Majid Jafari Khaledi & Mehdi Ghandehari Motlagh & Ebrahim Hajizadeh, 2011. "Analysis of clustered spatially correlated binary data using autologistic model and Bayesian method with an application to dental caries of 3--5-year-old children," Journal of Applied Statistics, Taylor & Francis Journals, vol. 38(12), pages 2763-2774, February.
    14. Mai, Qing & Zou, Hui, 2015. "Sparse semiparametric discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 135(C), pages 175-188.
    15. Dawit G. Tadesse & Mark Carpenter, 2019. "A method for selecting the relevant dimensions for high-dimensional classification in singular vector spaces," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 13(2), pages 405-426, June.
    16. L. A. Stefanski & Yichao Wu & Kyle White, 2014. "Variable Selection in Nonparametric Classification Via Measurement Error Model Selection Likelihoods," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(506), pages 574-589, June.
    17. Feng, Long & Dicker, Lee H., 2018. "Approximate nonparametric maximum likelihood for mixture models: A convex optimization approach to fitting arbitrary multivariate mixing distributions," Computational Statistics & Data Analysis, Elsevier, vol. 122(C), pages 80-91.
    18. Zhidong Bai & Jiang Hu & Chen Wang & Chao Zhang, 2021. "Test on the linear combinations of covariance matrices in high-dimensional data," Statistical Papers, Springer, vol. 62(2), pages 701-719, April.
    19. Bergsma, Wicher, 2020. "Regression with I-priors," LSE Research Online Documents on Economics 102136, London School of Economics and Political Science, LSE Library.
    20. Yin, Yanqing, 2021. "Test for high-dimensional mean vector under missing observations," Journal of Multivariate Analysis, Elsevier, vol. 186(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:190:y:2022:i:c:s0047259x22000276. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.