
Convex Banding of the Covariance Matrix

Author

Listed:
  • Jacob Bien
  • Florentina Bunea
  • Luo Xiao

Abstract

We introduce a new sparse estimator of the covariance matrix for high-dimensional models in which the variables have a known ordering. Our estimator, which is the solution to a convex optimization problem, is equivalently expressed as an estimator that tapers the sample covariance matrix by a Toeplitz, sparsely banded, data-adaptive matrix. As a result of this adaptivity, the convex banding estimator enjoys theoretical optimality properties not attained by previous banding or tapered estimators. In particular, our convex banding estimator is minimax rate adaptive in Frobenius and operator norms, up to log factors, over commonly studied classes of covariance matrices, and over more general classes. Furthermore, it correctly recovers the bandwidth when the true covariance is exactly banded. Our convex formulation admits a simple and efficient algorithm. Empirical studies demonstrate its practical effectiveness and illustrate that our exactly banded estimator works well even when the true covariance matrix is only close to a banded matrix, confirming our theoretical results. Our method compares favorably with all existing methods, in terms of accuracy and speed. We illustrate the practical merits of the convex banding estimator by showing that it can be used to improve the performance of discriminant analysis for classifying sound recordings. Supplementary materials for this article are available online.
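The sketch below is a minimal, hypothetical Python illustration (not the authors' code and not their convex estimator) of what it means to band or taper a sample covariance matrix when the variables have a known ordering. The function names (sample_covariance, band, taper), the bandwidth k=5, and the linearly decaying weights are illustrative assumptions only; the article's estimator instead chooses its Toeplitz, sparsely banded taper adaptively from the data by solving a convex program.

import numpy as np

def sample_covariance(X):
    """Sample covariance of an n x p data matrix X (rows are observations)."""
    Xc = X - X.mean(axis=0, keepdims=True)
    return Xc.T @ Xc / X.shape[0]

def band(S, k):
    """Hard banding: keep entries within k subdiagonals of the main diagonal, zero the rest."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    return np.where(np.abs(i - j) <= k, S, 0.0)

def taper(S, weights):
    """Tapering: multiply the l-th subdiagonal by weights[l], i.e. elementwise
    multiplication by a Toeplitz, banded weight matrix."""
    p = S.shape[0]
    i, j = np.indices((p, p))
    W = np.asarray(weights)[np.abs(i - j)]
    return W * S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p, n = 30, 200
    # An AR(1)-type covariance: not exactly banded, but entries decay quickly off the diagonal.
    Sigma = np.array([[0.6 ** abs(i - j) for j in range(p)] for i in range(p)])
    X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
    S = sample_covariance(X)
    w = np.clip(1.0 - np.arange(p) / 6.0, 0.0, 1.0)  # linearly decaying weights, zero beyond lag 6
    for name, est in [("sample", S), ("banded (k=5)", band(S, 5)), ("tapered", taper(S, w))]:
        err = np.linalg.norm(est - Sigma, "fro")
        print(f"{name:>12s}  Frobenius error to true Sigma: {err:.3f}")

On data whose covariance decays quickly away from the diagonal, both fixed operators will typically reduce the Frobenius error relative to the raw sample covariance; the abstract's point is that the convex banding estimator attains this kind of improvement adaptively, without fixing the bandwidth or weights in advance.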

Suggested Citation

  • Jacob Bien & Florentina Bunea & Luo Xiao, 2016. "Convex Banding of the Covariance Matrix," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(514), pages 834-845, April.
  • Handle: RePEc:taf:jnlasa:v:111:y:2016:i:514:p:834-845
    DOI: 10.1080/01621459.2015.1058265

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1080/01621459.2015.1058265
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1080/01621459.2015.1058265?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    2. Radchenko, Peter & James, Gareth M., 2010. "Variable Selection Using Adaptive Nonlinear Interaction Structures in High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 105(492), pages 1541-1553.
    3. Adam J. Rothman & Elizaveta Levina & Ji Zhu, 2010. "A new approach to Cholesky-based covariance regularization in high dimensions," Biometrika, Biometrika Trust, vol. 97(3), pages 539-550.
    4. Cheng, Yu, 2004. "Asymptotic probabilities of misclassification of two discriminant functions in cases of high dimensional data," Statistics & Probability Letters, Elsevier, vol. 67(1), pages 9-17, March.
    5. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Fang, Qian & Yu, Chen & Weiping, Zhang, 2020. "Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data," Journal of Multivariate Analysis, Elsevier, vol. 176(C).
    2. Zhu, Xiaonan & Chen, Yu & Hu, Jie, 2024. "Estimation of banded time-varying precision matrix based on SCAD and group lasso," Computational Statistics & Data Analysis, Elsevier, vol. 189(C).
    3. Leprince, Julien & Madsen, Henrik & Møller, Jan Kloppenborg & Zeiler, Wim, 2023. "Hierarchical learning, forecasting coherent spatio-temporal individual and aggregated building loads," Applied Energy, Elsevier, vol. 348(C).
    4. Lam, Clifford, 2020. "High-dimensional covariance matrix estimation," LSE Research Online Documents on Economics 101667, London School of Economics and Political Science, LSE Library.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dewei Zhang & Yin Liu & Sam Davanloo Tajbakhsh, 2022. "A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure," INFORMS Journal on Computing, INFORMS, vol. 34(2), pages 1126-1140, March.
    2. Fang, Qian & Yu, Chen & Weiping, Zhang, 2020. "Regularized estimation of precision matrix for high-dimensional multivariate longitudinal data," Journal of Multivariate Analysis, Elsevier, vol. 176(C).
    3. Jun Yan & Jian Huang, 2012. "Model Selection for Cox Models with Time-Varying Coefficients," Biometrics, The International Biometric Society, vol. 68(2), pages 419-428, June.
    4. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    5. Nicholson, William B. & Matteson, David S. & Bien, Jacob, 2017. "VARX-L: Structured regularization for large vector autoregressions with exogenous variables," International Journal of Forecasting, Elsevier, vol. 33(3), pages 627-651.
    6. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    7. Li Yun & O’Connor George T. & Dupuis Josée & Kolaczyk Eric, 2015. "Modeling gene-covariate interactions in sparse regression with group structure for genome-wide association studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 14(3), pages 265-277, June.
    8. Jonathan Boss & Alexander Rix & Yin‐Hsiu Chen & Naveen N. Narisetty & Zhenke Wu & Kelly K. Ferguson & Thomas F. McElrath & John D. Meeker & Bhramar Mukherjee, 2021. "A hierarchical integrative group least absolute shrinkage and selection operator for analyzing environmental mixtures," Environmetrics, John Wiley & Sons, Ltd., vol. 32(8), December.
    9. Yanming Li & Bin Nan & Ji Zhu, 2015. "Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure," Biometrics, The International Biometric Society, vol. 71(2), pages 354-363, June.
    10. Murat Genç, 2022. "A new double-regularized regression using Liu and lasso regularization," Computational Statistics, Springer, vol. 37(1), pages 159-227, March.
    11. Jin Liu & Jian Huang & Yawei Zhang & Qing Lan & Nathaniel Rothman & Tongzhang Zheng & Shuangge Ma, 2014. "Integrative analysis of prognosis data on multiple cancer subtypes," Biometrics, The International Biometric Society, vol. 70(3), pages 480-488, September.
    12. Fabian Scheipl & Thomas Kneib & Ludwig Fahrmeir, 2013. "Penalized likelihood and Bayesian function selection in regression models," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 97(4), pages 349-385, October.
    13. Pan, Yuqing & Mai, Qing, 2020. "Efficient computation for differential network analysis with applications to quadratic discriminant analysis," Computational Statistics & Data Analysis, Elsevier, vol. 144(C).
    14. Wang, Cheng & Chen, Haozhe & Jiang, Binyan, 2024. "HiQR: An efficient algorithm for high-dimensional quadratic regression with penalties," Computational Statistics & Data Analysis, Elsevier, vol. 192(C).
    15. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    16. Yawei He & Zehua Chen, 2016. "The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 68(1), pages 155-180, February.
    17. Yen, Yu-Min & Yen, Tso-Jung, 2014. "Solving norm constrained portfolio optimization via coordinate-wise descent algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 76(C), pages 737-759.
    18. Garcia-Magariños Manuel & Antoniadis Anestis & Cao Ricardo & González-Manteiga Wenceslao, 2010. "Lasso Logistic Regression, GSoft and the Cyclic Coordinate Descent Algorithm: Application to Gene Expression Data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 9(1), pages 1-30, August.
    19. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    20. Benjamin G. Stokell & Rajen D. Shah & Ryan J. Tibshirani, 2021. "Modelling high‐dimensional categorical data using nonconvex fusion penalties," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 579-611, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:taf:jnlasa:v:111:y:2016:i:514:p:834-845. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Longhurst (email available below). General contact details of provider: http://www.tandfonline.com/UASA20.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.