
Optimal and maximin procedures for multiple testing problems

Author

Listed:
  • Saharon Rosset
  • Ruth Heller
  • Amichai Painsky
  • Ehud Aharoni

Abstract

Multiple testing problems (MTPs) are a staple of modern statistical analysis. The fundamental objective of an MTP is to reject as many false null hypotheses as possible (that is, to maximize some notion of power) while controlling an overall measure of false discovery, such as the family‐wise error rate (FWER) or the false discovery rate (FDR). In this paper we provide generalizations to MTPs of the optimal Neyman‐Pearson test for a single hypothesis. We show that for simple hypotheses, for both FWER and FDR and relevant notions of power, finding the optimal multiple testing procedure can be formulated as an infinite-dimensional binary program that can, in principle, be solved for any number of hypotheses. We also characterize maximin rules for complex alternatives and demonstrate that such rules can be found in practice, leading to improved practical procedures compared with existing alternatives that guarantee strong error control on the entire parameter space. We illustrate the usefulness of these novel rules for identifying which studies contain signal, both in numerical experiments and in an application to clinical trials with multiple studies. In various settings, the increase in power from using optimal and maximin procedures can range from 15% to more than 100%.
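
Editor's note: for reference, the error criteria named in the abstract have the following standard textbook definitions, stated here in notation that is not taken from the paper itself (V denotes the number of falsely rejected null hypotheses, R the total number of rejections, D a multiple testing procedure, \Theta the full parameter space and \Theta_1 the set of alternatives under consideration):

    \mathrm{FWER}_\theta(D) = \Pr_\theta(V \ge 1), \qquad
    \mathrm{FDR}_\theta(D) = \mathbb{E}_\theta\!\left[\frac{V}{\max(R,\,1)}\right].

A maximin rule of the kind described in the abstract can be sketched schematically as the solution of

    \max_{D}\ \min_{\theta \in \Theta_1} \mathrm{Power}_\theta(D)
    \quad \text{subject to} \quad
    \sup_{\theta \in \Theta} \mathrm{FWER}_\theta(D) \le \alpha
    \ \ \text{(or } \sup_{\theta \in \Theta} \mathrm{FDR}_\theta(D) \le \alpha\text{)},

that is, the worst-case power over the alternatives is maximized while strong error control is guaranteed over the entire parameter space; the paper's precise notions of power and its infinite-dimensional binary-program formulation are given in the full text.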

Suggested Citation

  • Saharon Rosset & Ruth Heller & Amichai Painsky & Ehud Aharoni, 2022. "Optimal and maximin procedures for multiple testing problems," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(4), pages 1105-1128, September.
  • Handle: RePEc:bla:jorssb:v:84:y:2022:i:4:p:1105-1128
    DOI: 10.1111/rssb.12507

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssb.12507
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssb.12507?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where your library subscription provides access to this item.

    References listed on IDEAS

    1. Edgar Dobriban & Kristen Fortney & Stuart K. Kim & Art B. Owen, 2015. "Optimal multiple testing under a Gaussian prior on the effect sizes," Biometrika, Biometrika Trust, vol. 102(4), pages 753-766.
    2. Wenguang Sun & T. Tony Cai, 2007. "Oracle and Adaptive Compound Decision Rules for False Discovery Rate Control," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 901-912, September.
    3. Richard M. Bittman & Joseph P. Romano & Carlos Vallarino & Michael Wolf, 2009. "Optimal testing of multiple hypotheses with common effect direction," Biometrika, Biometrika Trust, vol. 96(2), pages 399-410.
    4. Michael Rosenblum & Han Liu & En-Hsu Yen, 2014. "Optimal Tests of Treatment Effects for the Overall Population and Two Subpopulations in Randomized Trials, Using Sparse Linear Programming," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1216-1228, September.
    5. John D. Storey, 2007. "The optimal discovery procedure: a new approach to simultaneous significance testing," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(3), pages 347-368, June.
    6. Christopher Genovese & Larry Wasserman, 2002. "Operating characteristics and extensions of the false discovery rate procedure," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(3), pages 499-517, August.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Yoav Benjamini & Ruth Heller & Abba Krieger & Saharon Rosset, 2023. "Discussion on “Optimal test procedures for multiple hypotheses controlling the familywise expected loss” by Willi Maurer, Frank Bretz, and Xiaolei Xun," Biometrics, The International Biometric Society, vol. 79(4), pages 2794-2797, December.
    2. Ruth Heller & Abba Krieger & Saharon Rosset, 2023. "Optimal multiple testing and design in clinical trials," Biometrics, The International Biometric Society, vol. 79(3), pages 1908-1919, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ruth Heller & Saharon Rosset, 2021. "Optimal control of false discovery criteria in the two‐group model," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(1), pages 133-155, February.
    2. Wenguang Sun & T. Tony Cai, 2009. "Large‐scale multiple testing under dependence," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(2), pages 393-424, April.
    3. Ruth Heller & Abba Krieger & Saharon Rosset, 2023. "Optimal multiple testing and design in clinical trials," Biometrics, The International Biometric Society, vol. 79(3), pages 1908-1919, September.
    4. Shiyun Chen & Ery Arias-Castro, 2021. "On the power of some sequential multiple testing procedures," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(2), pages 311-336, April.
    5. Shonosuke Sugasawa & Hisashi Noma, 2021. "Efficient screening of predictive biomarkers for individual treatment selection," Biometrics, The International Biometric Society, vol. 77(1), pages 249-257, March.
    6. Daniel Yekutieli, 2015. "Bayesian tests for composite alternative hypotheses in cross-tabulated data," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 24(2), pages 287-301, June.
    7. Miguel A. Gómez-Villegas & Isabel Salazar & Luis Sanz, 2014. "A Bayesian decision procedure for testing multiple hypotheses in DNA microarray experiments," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 13(1), pages 49-65, February.
    8. Edsel Peña & Joshua Habiger & Wensong Wu, 2015. "Classes of multiple decision functions strongly controlling FWER and FDR," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 78(5), pages 563-595, July.
    9. Xiaoquan Wen, 2017. "Robust Bayesian FDR Control Using Bayes Factors, with Applications to Multi-tissue eQTL Discovery," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 9(1), pages 28-49, June.
    10. Joshua Habiger & David Watts & Michael Anderson, 2017. "Multiple testing with heterogeneous multinomial distributions," Biometrics, The International Biometric Society, vol. 73(2), pages 562-570, June.
    11. Long Qu & Dan Nettleton & Jack C. M. Dekkers, 2012. "Improved Estimation of the Noncentrality Parameter Distribution from a Large Number of t-Statistics, with Applications to False Discovery Rate Estimation in Microarray Data Analysis," Biometrics, The International Biometric Society, vol. 68(4), pages 1178-1187, December.
    12. Zhigen Zhao, 2022. "Where to find needles in a haystack?," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 148-174, March.
    13. Grant Izmirlian, 2020. "Strong consistency and asymptotic normality for quantities related to the Benjamini–Hochberg false discovery rate procedure," Statistics & Probability Letters, Elsevier, vol. 160(C).
    14. William Cipolli III & Timothy Hanson & Alexander C. McLain, 2016. "Bayesian nonparametric multiple testing," Computational Statistics & Data Analysis, Elsevier, vol. 101(C), pages 64-79.
    15. Jean-Eudes Dazard & J. Sunil Rao, 2012. "Joint adaptive mean–variance regularization and variance stabilization of high dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 56(7), pages 2317-2333.
    16. T. Tony Cai & Wenguang Sun, 2017. "Optimal screening and discovery of sparse signals with applications to multistage high throughput studies," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 79(1), pages 197-223, January.
    17. Jiaying Gu & Roger Koenker, 2020. "Invidious Comparisons: Ranking and Selection as Compound Decisions," Papers 2012.12550, arXiv.org, revised Sep 2021.
    18. Hai Shu & Bin Nan & Robert Koeppe, 2015. "Multiple testing for neuroimaging via hidden Markov random field," Biometrics, The International Biometric Society, vol. 71(3), pages 741-750, September.
    19. Nikolaos Ignatiadis & Wolfgang Huber, 2021. "Covariate powered cross‐weighted multiple testing," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(4), pages 720-751, September.
    20. T. Tony Cai & Wenguang Sun & Weinan Wang, 2019. "Covariate‐assisted ranking and screening for large‐scale two‐sample inference," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 81(2), pages 187-234, April.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.