
Modified linear discriminant analysis approaches for classification of high-dimensional microarray data

Author

Listed:
  • Xu, Ping
  • Brock, Guy N.
  • Parrish, Rudolph S.

Abstract

Linear discriminant analysis (LDA) is one of the most popular classification methods. For high-dimensional microarray data, where the number of samples is small relative to the number of features, classical LDA performs sub-optimally because the within-group covariance matrix is singular and unstable. Two modified LDA approaches (MLDA and NLDA) were applied to microarray classification, and their performance was compared with that of other popular classification algorithms across a range of feature set sizes (numbers of genes) using both simulated and real datasets. The results showed that the two modified LDA approaches were overall as competitive as support vector machines and other regularized LDA approaches, and better than diagonal linear discriminant analysis, k-nearest neighbor, and classical LDA. It was concluded that the modified LDA approaches can serve as effective classification tools for high-dimensional microarray problems with limited sample sizes.
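
To make the small-sample, high-dimensional issue concrete, below is a minimal Python sketch (using scikit-learn; this is not the authors' MLDA or NLDA code, and the simulated data, sample sizes, and effect sizes are illustrative assumptions). It fits an LDA classifier whose within-group covariance estimate is stabilized by Ledoit-Wolf-type shrinkage, in the spirit of the regularized LDA variants compared in the paper.

    # Minimal sketch (assumption: scikit-learn's shrinkage LDA as a stand-in for
    # the regularized LDA variants discussed in the paper; not the authors' code).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_genes = 60, 2000           # far fewer samples than features
    X = rng.normal(size=(n_samples, n_genes))
    y = rng.integers(0, 2, size=n_samples)  # two simulated tumor classes
    X[y == 1, :20] += 1.0                   # 20 weakly informative "genes"

    # With p >> n the pooled within-group covariance matrix is singular, so
    # classical LDA is unstable. The 'lsqr' solver with shrinkage='auto'
    # replaces it with a Ledoit-Wolf shrinkage estimate, which remains
    # well conditioned regardless of the number of genes.
    clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    scores = cross_val_score(clf, X, y, cv=5)
    print("5-fold CV accuracy:", scores.mean().round(3))

Setting shrinkage=None in the same call reverts to the raw pooled covariance estimate, which is singular whenever the number of genes exceeds the number of samples; that contrast is the instability the modified LDA approaches are designed to avoid.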

Suggested Citation

  • Xu, Ping & Brock, Guy N. & Parrish, Rudolph S., 2009. "Modified linear discriminant analysis approaches for classification of high-dimensional microarray data," Computational Statistics & Data Analysis, Elsevier, vol. 53(5), pages 1674-1687, March.
  • Handle: RePEc:eee:csdana:v:53:y:2009:i:5:p:1674-1687

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167-9473(08)00055-8
    Download Restriction: Full text for ScienceDirect subscribers only.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Duintjer Tebbens, Jurjen & Schlesinger, Pavel, 2007. "Improving implementation of linear discriminant analysis for the high dimension/small sample size problem," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 423-437, September.
    3. Smyth, Gordon K., 2004. "Linear Models and Empirical Bayes Methods for Assessing Differential Expression in Microarray Experiments," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 3(1), pages 1-28, February.
    4. Parrish, Rudolph S. & Spencer III, Horace J. & Xu, Ping, 2009. "Distribution modeling and simulation of gene expression data," Computational Statistics & Data Analysis, Elsevier, vol. 53(5), pages 1650-1660, March.
    5. Schäfer, Juliane & Strimmer, Korbinian, 2005. "A Shrinkage Approach to Large-Scale Covariance Matrix Estimation and Implications for Functional Genomics," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 4(1), pages 1-32, November.
    6. Dudoit, S. & Fridlyand, J. & Speed, T. P., 2002. "Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data," Journal of the American Statistical Association, American Statistical Association, vol. 97, pages 77-87, March.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Frénay, Benoît & Doquire, Gauthier & Verleysen, Michel, 2014. "Estimating mutual information for feature selection in the presence of label noise," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 832-848.
    2. Sung, Bongjung & Lee, Jaeyong, 2023. "Covariance structure estimation with Laplace approximation," Journal of Multivariate Analysis, Elsevier, vol. 198(C).
    3. Parrish, Rudolph S. & Spencer III, Horace J. & Xu, Ping, 2009. "Distribution modeling and simulation of gene expression data," Computational Statistics & Data Analysis, Elsevier, vol. 53(5), pages 1650-1660, March.
    4. Kubokawa, Tatsuya & Hyodo, Masashi & Srivastava, Muni S., 2013. "Asymptotic expansion and estimation of EPMC for linear classification rules in high dimension," Journal of Multivariate Analysis, Elsevier, vol. 115(C), pages 496-515.
    5. Irina Gaynanova & James G. Booth & Martin T. Wells, 2016. "Simultaneous Sparse Estimation of Canonical Vectors in the p ≫ N Setting," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 696-706, April.
    6. Michael Fop & Pierre-Alexandre Mattei & Charles Bouveyron & Thomas Brendan Murphy, 2022. "Unobserved classes and extra variables in high-dimensional discriminant analysis," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(1), pages 55-92, March.
    7. A. Poterie & J.-F. Dupuy & V. Monbet & L. Rouvière, 2019. "Classification tree algorithm for grouped variables," Computational Statistics, Springer, vol. 34(4), pages 1613-1648, December.
    8. Brendan P. W. Ames & Mingyi Hong, 2016. "Alternating direction method of multipliers for penalized zero-variance discriminant analysis," Computational Optimization and Applications, Springer, vol. 64(3), pages 725-754, July.
    9. Pires, Ana M. & Branco, João A., 2010. "Projection-pursuit approach to robust linear discriminant analysis," Journal of Multivariate Analysis, Elsevier, vol. 101(10), pages 2464-2485, November.
    10. Pedro Duarte Silva, A., 2011. "Two-group classification with high-dimensional correlated data: A factor model approach," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2975-2990, November.
    11. Ruiyan Luo & Xin Qi, 2017. "Asymptotic Optimality of Sparse Linear Discriminant Analysis with Arbitrary Number of Classes," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(3), pages 598-616, September.
    12. Shen, Yanfeng & Lin, Zhengyan & Zhu, Jun, 2011. "Shrinkage-based regularization tests for high-dimensional data with application to gene set analysis," Computational Statistics & Data Analysis, Elsevier, vol. 55(7), pages 2221-2233, July.
    13. Ivana Krtolica & Dragan Savić & Bojana Bajić & Snežana Radulović, 2022. "Machine Learning for Water Quality Assessment Based on Macrophyte Presence," Sustainability, MDPI, vol. 15(1), pages 1-13, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pedro Duarte Silva, A., 2011. "Two-group classification with high-dimensional correlated data: A factor model approach," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 2975-2990, November.
    2. Fisher, Thomas J. & Sun, Xiaoqian, 2011. "Improved Stein-type shrinkage estimators for the high-dimensional multivariate normal covariance matrix," Computational Statistics & Data Analysis, Elsevier, vol. 55(5), pages 1909-1918, May.
    3. Touloumis, Anestis, 2015. "Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings," Computational Statistics & Data Analysis, Elsevier, vol. 83(C), pages 251-261.
    4. Hannart, Alexis & Naveau, Philippe, 2014. "Estimating high dimensional covariance matrices: A new look at the Gaussian conjugate framework," Journal of Multivariate Analysis, Elsevier, vol. 131(C), pages 149-162.
    5. Hossain, Ahmed & Beyene, Joseph & Willan, Andrew R. & Hu, Pingzhao, 2009. "A flexible approximate likelihood ratio test for detecting differential expression in microarray data," Computational Statistics & Data Analysis, Elsevier, vol. 53(10), pages 3685-3695, August.
    6. Christian Bongiorno, 2020. "Bootstraps Regularize Singular Correlation Matrices," Working Papers hal-02536278, HAL.
    7. van Wieringen, Wessel N. & Stam, Koen A. & Peeters, Carel F.W. & van de Wiel, Mark A., 2020. "Updating of the Gaussian graphical model through targeted penalized estimation," Journal of Multivariate Analysis, Elsevier, vol. 178(C).
    8. Ledoit, Olivier & Wolf, Michael, 2017. "Numerical implementation of the QuEST function," Computational Statistics & Data Analysis, Elsevier, vol. 115(C), pages 199-223.
    9. Sumanjay Dutta & Shashi Jain, 2023. "Precision versus Shrinkage: A Comparative Analysis of Covariance Estimation Methods for Portfolio Allocation," Papers 2305.11298, arXiv.org.
    10. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.
    11. Sahra Uygun & Cheng Peng & Melissa D Lehti-Shiu & Robert L Last & Shin-Han Shiu, 2016. "Utility and Limitations of Using Gene Expression Data to Identify Functional Associations," PLOS Computational Biology, Public Library of Science, vol. 12(12), pages 1-27, December.
    12. Shen, Yanfeng & Lin, Zhengyan, 2015. "An adaptive test for the mean vector in large-p-small-n problems," Computational Statistics & Data Analysis, Elsevier, vol. 89(C), pages 25-38.
    13. Brett Naul & Bala Rajaratnam & Dario Vincenzi, 2016. "The role of the isotonizing algorithm in Stein’s covariance matrix estimator," Computational Statistics, Springer, vol. 31(4), pages 1453-1476, December.
    14. Gautam Sabnis & Debdeep Pati & Anirban Bhattacharya, 2019. "Compressed Covariance Estimation with Automated Dimension Learning," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 81(2), pages 466-481, December.
    15. Brendan P. W. Ames & Mingyi Hong, 2016. "Alternating direction method of multipliers for penalized zero-variance discriminant analysis," Computational Optimization and Applications, Springer, vol. 64(3), pages 725-754, July.
    16. Daniele Girolimetto & George Athanasopoulos & Tommaso Di Fonzo & Rob J Hyndman, 2023. "Cross-temporal Probabilistic Forecast Reconciliation," Monash Econometrics and Business Statistics Working Papers 6/23, Monash University, Department of Econometrics and Business Statistics.
    17. Arthur Tenenhaus & Michel Tenenhaus, 2011. "Regularized Generalized Canonical Correlation Analysis," Psychometrika, Springer;The Psychometric Society, vol. 76(2), pages 257-284, April.
    18. Jan Kalina & Jan Tichavský, 2022. "The minimum weighted covariance determinant estimator for high-dimensional data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(4), pages 977-999, December.
    19. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    20. Ikeda, Yuki & Kubokawa, Tatsuya & Srivastava, Muni S., 2016. "Comparison of linear shrinkage estimators of a large covariance matrix in normal and non-normal distributions," Computational Statistics & Data Analysis, Elsevier, vol. 95(C), pages 95-108.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.