A U-classifier for high-dimensional data under non-normality
DOI: 10.1016/j.jmva.2018.05.008
References listed on IDEAS
- Zhong, Ping-Shou & Chen, Song Xi, 2011. "Tests for High-Dimensional Regression Coefficients With Factorial Designs," Journal of the American Statistical Association, American Statistical Association, vol. 106(493), pages 260-274.
- Tae Kim & Zhi-Ming Luo & Chiho Kim, 2011. "The central limit theorem for degenerate variable U-statistics under dependence," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 23(3), pages 683-699.
- Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
- Dudoit, S. & Fridlyand, J. & Speed, T. P., 2002. "Comparison of Discrimination Methods for the Classification of Tumors Using Gene Expression Data," Journal of the American Statistical Association, American Statistical Association, vol. 97, pages 77-87, March.
- Ruiyan Luo & Xin Qi, 2017. "Asymptotic Optimality of Sparse Linear Discriminant Analysis with Arbitrary Number of Classes," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(3), pages 598-616, September.
- M. Rauf Ahmad, 2017. "Location-invariant Multi-sample U-tests for Covariance Matrices with Large Dimension," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(2), pages 500-523, June.
- Song Huang & Tiejun Tong & Hongyu Zhao, 2010. "Bias-Corrected Diagonal Discriminant Rules for High-Dimensional Classification," Biometrics, The International Biometric Society, vol. 66(4), pages 1096-1106, December.
- Ning Hao & Bin Dong & Jianqing Fan, 2015. "Sparsifying the Fisher linear discriminant by rotation," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 77(4), pages 827-851, September.
- Peter Hall & Yvonne Pittelkow & Malay Ghosh, 2008. "Theoretical measures of relative performance of classifiers for high dimensional data with small sample sizes," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 159-173, February.
- Pinheiro, Aluísio & Sen, Pranab Kumar & Pinheiro, Hildete Prisco, 2009. "Decomposability of high-dimensional diversity measures: Quasi-U-statistics, martingales and nonstandard asymptotics," Journal of Multivariate Analysis, Elsevier, vol. 100(8), pages 1645-1656, September.
- Mikosch, T., 1993. "A Weak Invariance Principle for Weighted U-Statistics with Varying Kernels," Journal of Multivariate Analysis, Elsevier, vol. 47(1), pages 82-102, October.
- M. Ahmad, 2014. "A U-statistic approach for a high-dimensional two-sample mean testing problem under non-normality and Behrens–Fisher setting," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(1), pages 33-61, February.
- Yao-Ban Chan & Peter Hall, 2009. "Scale adjustments for classifiers in high-dimensional, low sample size settings," Biometrika, Biometrika Trust, vol. 96(2), pages 469-478.
Most related items
These are the items that most often cite the same works as this one and are cited by the same works as this one.
- Makoto Aoshima & Kazuyoshi Yata, 2014. "A distance-based, misclassification rate adjusted classifier for multiclass, high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 66(5), pages 983-1010, October.
- Makoto Aoshima & Kazuyoshi Yata, 2019. "High-Dimensional Quadratic Classifiers in Non-sparse Settings," Methodology and Computing in Applied Probability, Springer, vol. 21(3), pages 663-682, September.
- Makoto Aoshima & Kazuyoshi Yata, 2019. "Distance-based classifier by data transformation for high-dimension, strongly spiked eigenvalue models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(3), pages 473-503, June.
- Yugo Nakayama & Kazuyoshi Yata & Makoto Aoshima, 2020. "Bias-corrected support vector machine with Gaussian kernel in high-dimension, low-sample-size settings," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(5), pages 1257-1286, October.
- Ishii, Aki & Yata, Kazuyoshi & Aoshima, Makoto, 2022. "Geometric classifiers for high-dimensional noisy data," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
- Rauf Ahmad, M., 2019. "A significance test of the RV coefficient in high dimensions," Computational Statistics & Data Analysis, Elsevier, vol. 131(C), pages 116-130.
- Zongliang Hu & Tiejun Tong & Marc G. Genton, 2019. "Diagonal likelihood ratio test for equality of mean vectors in high‐dimensional data," Biometrics, The International Biometric Society, vol. 75(1), pages 256-267, March.
- Nakagawa, Tomoyuki & Watanabe, Hiroki & Hyodo, Masashi, 2021. "Kick-one-out-based variable selection method for Euclidean distance-based classifier in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 184(C).
- Xu, Kai & Hao, Xinxin, 2019. "A nonparametric test for block-diagonal covariance structure in high dimension and small samples," Journal of Multivariate Analysis, Elsevier, vol. 173(C), pages 551-567.
- Long Feng & Changliang Zou & Zhaojun Wang, 2016. "Multivariate-Sign-Based High-Dimensional Tests for the Two-Sample Location Problem," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 721-735, April.
- Ahmad, Rauf, 2022. "Tests for proportionality of matrices with large dimension," Journal of Multivariate Analysis, Elsevier, vol. 189(C).
- Kubokawa, Tatsuya & Srivastava, Muni S., 2008. "Estimation of the precision matrix of a singular Wishart distribution and its application in high-dimensional data," Journal of Multivariate Analysis, Elsevier, vol. 99(9), pages 1906-1928, October.
- Egashira, Kento & Yata, Kazuyoshi & Aoshima, Makoto, 2024. "Asymptotic properties of hierarchical clustering in high-dimensional settings," Journal of Multivariate Analysis, Elsevier, vol. 199(C).
- Hossain, Ahmed & Beyene, Joseph & Willan, Andrew R. & Hu, Pingzhao, 2009. "A flexible approximate likelihood ratio test for detecting differential expression in microarray data," Computational Statistics & Data Analysis, Elsevier, vol. 53(10), pages 3685-3695, August.
- Luca Scrucca, 2014. "Graphical tools for model-based mixture discriminant analysis," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 8(2), pages 147-165, June.
- Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
- Watanabe, Hiroki & Hyodo, Masashi & Seo, Takashi & Pavlenko, Tatjana, 2015. "Asymptotic properties of the misclassification rates for Euclidean Distance Discriminant rule in high-dimensional data," Journal of Multivariate Analysis, Elsevier, vol. 140(C), pages 234-244.
- J. Burez & D. Van Den Poel, 2005. "CRM at a Pay-TV Company: Using Analytical Models to Reduce Customer Attrition by Targeted Marketing for Subscription Services," Working Papers of Faculty of Economics and Business Administration, Ghent University, Belgium 05/348, Ghent University, Faculty of Economics and Business Administration.
- Won, Joong-Ho & Lim, Johan & Yu, Donghyeon & Kim, Byung Soo & Kim, Kyunga, 2014. "Monotone false discovery rate," Statistics & Probability Letters, Elsevier, vol. 87(C), pages 86-93.
- Budczies, Jan & Kosztyla, Daniel & von Törne, Christian & Stenzinger, Albrecht & Darb-Esfahani, Silvia & Dietel, Manfred & Denkert, Carsten, 2014. "cancerclass: An R Package for Development and Validation of Diagnostic Tests from High-Dimensional Molecular Data," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 59(i01).
More about this item
Keywords
Bias-adjusted classifier; High-dimensional classification; U-statistics
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:167:y:2018:i:c:p:269-283. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.
Please note that corrections may take a couple of weeks to filter through the various RePEc services.