
The generalized equivalence of regularization and min–max robustification in linear mixed models

Author

Listed:
  • Jan Pablo Burgard

    (Trier University)

  • Joscha Krause

    (Trier University)

  • Dennis Kreber

    (Trier University)

  • Domingo Morales

    (University Miguel Hernández de Elche)

Abstract

The connection between regularization and min–max robustification in the presence of unobservable covariate measurement errors in linear mixed models is addressed. We prove that regularized model parameter estimation is equivalent to robust loss minimization under a min–max approach. Using the LASSO, Ridge regression, and the Elastic Net as examples, we derive uncertainty sets that characterize the feasible noise that can be added to a given estimation problem. These sets allow us to determine measurement error bounds without distributional assumptions. A conservative Jackknife estimator of the mean squared error in this setting is proposed. We further derive conditions under which min–max robust estimation of model parameters is consistent. The theoretical findings are supported by a Monte Carlo simulation study under multiple measurement error scenarios.
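To illustrate the kind of equivalence the abstract refers to, the following is a minimal sketch of the prototype result for ordinary linear regression (without random effects), in the spirit of the robustification–regularization characterization of Bertsimas and Copenhaver (2018) cited in the references below. The notation (response y, design matrix X, coefficient vector beta, budget lambda, uncertainty set U_lambda) is introduced here for illustration only and need not match the article's:

$$
\min_{\beta}\; \max_{\Delta \in \mathcal{U}_{\lambda}} \bigl\| y - (X + \Delta)\beta \bigr\|_{2}
\;=\;
\min_{\beta}\; \Bigl( \| y - X\beta \|_{2} + \lambda \| \beta \|_{2} \Bigr),
\qquad
\mathcal{U}_{\lambda} = \{ \Delta : \| \Delta \|_{2} \le \lambda \},
$$

where the norm on the perturbation matrix Delta is the spectral norm. The worst-case perturbation of the design matrix thus acts like a Ridge-type penalty on the (unsquared) residual norm; bounding the columns of Delta individually instead yields an l1 (LASSO-type) penalty, and mixed bounds yield Elastic-Net-type penalties. The article extends this correspondence to linear mixed models with covariate measurement error.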

Suggested Citation

  • Jan Pablo Burgard & Joscha Krause & Dennis Kreber & Domingo Morales, 2021. "The generalized equivalence of regularization and min–max robustification in linear mixed models," Statistical Papers, Springer, vol. 62(6), pages 2857-2883, December.
  • Handle: RePEc:spr:stpapr:v:62:y:2021:i:6:d:10.1007_s00362-020-01214-z
    DOI: 10.1007/s00362-020-01214-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00362-020-01214-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00362-020-01214-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bertsimas, Dimitris & Copenhaver, Martin S., 2018. "Characterization of the equivalence of robustification and regularization in linear and matrix regression," European Journal of Operational Research, Elsevier, vol. 270(3), pages 931-942.
    2. Sergio Davalos, 2017. "Big Data has a Big Role in Biostatistics with Big Challenges and Big Expectations," Biostatistics and Biometrics Open Access Journal, Juniper Publishers Inc., vol. 1(3), pages 63-64, May.
    3. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(i01).
    4. M. Norouzirad & M. Arashi, 2019. "Preliminary test and Stein-type shrinkage ridge estimators in robust regression," Statistical Papers, Springer, vol. 60(6), pages 1849-1882, December.
    5. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    6. N. Neykov & P. Filzmoser & P. Neytchev, 2014. "Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator," Statistical Papers, Springer, vol. 55(1), pages 187-207, February.
    7. Jiantao Li & Min Zheng, 2009. "Robust estimation of multivariate regression model," Statistical Papers, Springer, vol. 50(1), pages 81-100, January.
    8. P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
    9. Jan Pablo Burgard & Joscha Krause & Dennis Kreber, 2019. "Regularized Area-level Modelling for Robust Small Area Estimation in the Presence of Unknown Covariate Measurement Errors," Research Papers in Economics 2019-04, University of Trier, Department of Economics.
    10. Abhik Ghosh & Magne Thoresen, 2018. "Non-concave penalization in linear mixed-effect models and regularized selection of fixed effects," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 102(2), pages 179-210, April.
    11. Andreas Alfons & Matthias Templ & Peter Filzmoser, 2013. "Robust estimation of economic indicators from survey samples based on Pareto tail modelling," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 62(2), pages 271-286, March.
    12. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    13. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    14. Timo Schmid & Ralf Münnich, 2014. "Spatial robust small area estimation," Statistical Papers, Springer, vol. 55(3), pages 653-670, August.
    15. N. Neykov & P. Filzmoser & P. Neytchev, 2014. "Erratum to: Ultrahigh dimensional variable selection through the penalized maximum trimmed likelihood estimator," Statistical Papers, Springer, vol. 55(3), pages 917-918, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jan Pablo Burgard & Joscha Krause & Ralf Münnich, 2019. "Penalized Small Area Models for the Combination of Unit- and Area-level Data," Research Papers in Economics 2019-05, University of Trier, Department of Economics.
    2. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    3. Ning Li & Hu Yang, 2021. "Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models," Statistical Papers, Springer, vol. 62(2), pages 661-680, April.
    4. G. S. Monti & P. Filzmoser, 2022. "Robust logistic zero-sum regression for microbiome compositional data," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 16(2), pages 301-324, June.
    5. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    6. Camila Epprecht & Dominique Guegan & Álvaro Veiga & Joel Correa da Rosa, 2017. "Variable selection and forecasting via automated methods for linear models: LASSO/adaLASSO and Autometrics," Post-Print halshs-00917797, HAL.
    7. Zichen Zhang & Ye Eun Bae & Jonathan R. Bradley & Lang Wu & Chong Wu, 2022. "SUMMIT: An integrative approach for better transcriptomic data imputation improves causal gene identification," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    8. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    9. Peter Martey Addo & Dominique Guegan & Bertrand Hassani, 2018. "Credit Risk Analysis Using Machine and Deep Learning Models," Risks, MDPI, vol. 6(2), pages 1-20, April.
    10. Capanu, Marinela & Giurcanu, Mihai & Begg, Colin B. & Gönen, Mithat, 2023. "Subsampling based variable selection for generalized linear models," Computational Statistics & Data Analysis, Elsevier, vol. 184(C).
    11. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    12. Zeyu Bian & Erica E. M. Moodie & Susan M. Shortreed & Sahir Bhatnagar, 2023. "Variable selection in regression‐based estimation of dynamic treatment regimes," Biometrics, The International Biometric Society, vol. 79(2), pages 988-999, June.
    13. Simona Buscemi & Antonella Plaia, 2020. "Model selection in linear mixed-effect models," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 104(4), pages 529-575, December.
    14. Jingxuan Luo & Lili Yue & Gaorong Li, 2023. "Overview of High-Dimensional Measurement Error Regression Models," Mathematics, MDPI, vol. 11(14), pages 1-22, July.
    15. Shutes, Karl & Adcock, Chris, 2013. "Regularized Extended Skew-Normal Regression," MPRA Paper 58445, University Library of Munich, Germany, revised 09 Sep 2014.
    16. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    17. Cui, Hailong & Rajagopalan, Sampath & Ward, Amy R., 2020. "Predicting product return volume using machine learning methods," European Journal of Operational Research, Elsevier, vol. 281(3), pages 612-627.
    18. Feng Hong & Lu Tian & Viswanath Devanarayan, 2023. "Improving the Robustness of Variable Selection and Predictive Performance of Regularized Generalized Linear Models and Cox Proportional Hazard Models," Mathematics, MDPI, vol. 11(3), pages 1-13, January.
    19. Dumitrescu, Elena & Hué, Sullivan & Hurlin, Christophe & Tokpavi, Sessi, 2022. "Machine learning for credit scoring: Improving logistic regression with non-linear decision-tree effects," European Journal of Operational Research, Elsevier, vol. 297(3), pages 1178-1192.
    20. Peter Bühlmann & Domagoj Ćevid, 2020. "Deconfounding and Causal Regularisation for Stability and External Validity," International Statistical Review, International Statistical Institute, vol. 88(S1), pages 114-134, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:stpapr:v:62:y:2021:i:6:d:10.1007_s00362-020-01214-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.