
Minimax rates of convergence for sliced inverse regression with differential privacy

Author

Listed:
  • Zhao, Wenbiao
  • Zhu, Xuehu
  • Zhu, Lixing

Abstract

Sliced inverse regression (SIR) is a highly efficient paradigm for dimension reduction that replaces high-dimensional covariates with a limited number of linear combinations. This paper focuses on integrating the classical SIR approach with a Gaussian differential privacy mechanism to estimate the central space while preserving privacy. We illustrate the tradeoff between statistical accuracy and privacy in sufficient dimension reduction problems under both the classical low-dimensional and modern high-dimensional settings. Additionally, we show that the proposed estimator achieves the minimax rate under the Gaussian differential privacy constraint and that this rate is also optimal for multiple index models with bounded dimension of the central space. Extensive numerical studies on synthetic data sets assess the effectiveness of the proposed technique in finite sample scenarios, and a real data analysis showcases its practical application.
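To make the abstract's setup concrete, the following is a minimal sketch of classical SIR with an additive Gaussian perturbation of the SIR kernel matrix. It is not the authors' estimator: the function name `private_sir`, the slicing scheme, and the noise scale `sigma` are illustrative placeholders, and the noise calibration required for Gaussian differential privacy is derived in the paper, not reproduced here.

```python
# Illustrative sketch: SIR with a Gaussian-perturbed kernel matrix.
# Assumes the classical low-dimensional setting (n > p, full-rank covariance).
import numpy as np

def private_sir(X, y, d=1, n_slices=10, sigma=0.0, rng=None):
    """Estimate a d-dimensional basis of the central subspace via SIR,
    optionally adding symmetric Gaussian noise to the kernel matrix.
    `sigma` is a placeholder; a DP guarantee requires calibrating it
    to the sensitivity of the kernel, as analyzed in the paper."""
    rng = np.random.default_rng(rng)
    n, p = X.shape

    # Standardize the covariates: Z = (X - mean) Sigma^{-1/2}.
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    eigval, eigvec = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response and form M = sum_h p_h m_h m_h^T, where m_h is
    # the mean of Z within slice h and p_h the slice proportion.
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)

    # Gaussian mechanism (illustrative): perturb the kernel with a
    # symmetric Gaussian noise matrix before the eigendecomposition.
    if sigma > 0:
        W = rng.normal(scale=sigma, size=(p, p))
        M += (W + W.T) / 2

    # Top-d eigenvectors of M, mapped back through Sigma^{-1/2},
    # span the estimated central subspace.
    _, vecs = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ vecs[:, -d:]
    return B / np.linalg.norm(B, axis=0)
```

The tradeoff discussed in the abstract shows up directly in such a construction: larger `sigma` gives a stronger privacy guarantee but inflates the error of the estimated subspace, which is what the minimax analysis quantifies.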

Suggested Citation

  • Zhao, Wenbiao & Zhu, Xuehu & Zhu, Lixing, 2025. "Minimax rates of convergence for sliced inverse regression with differential privacy," Computational Statistics & Data Analysis, Elsevier, vol. 201(C).
  • Handle: RePEc:eee:csdana:v:201:y:2025:i:c:s0167947324001257
    DOI: 10.1016/j.csda.2024.108041

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947324001257
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2024.108041?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zhu, Lixing & Miao, Baiqi & Peng, Heng, 2006. "On Sliced Inverse Regression With High-Dimensional Covariates," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 630-643, June.
    2. Wasserman, Larry & Zhou, Shuheng, 2010. "A Statistical Framework for Differential Privacy," Journal of the American Statistical Association, American Statistical Association, vol. 105(489), pages 375-389.
    3. Zhu, Xuehu & Guo, Xu & Wang, Tao & Zhu, Lixing, 2020. "Dimensionality determination: A thresholding double ridge ratio approach," Computational Statistics & Data Analysis, Elsevier, vol. 146(C).
    4. Jinshuo Dong & Aaron Roth & Weijie J. Su, 2022. "Gaussian differential privacy," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(1), pages 3-37, February.
    5. Qian Lin & Zhigen Zhao & Jun S. Liu, 2019. "Sparse Sliced Inverse Regression via Lasso," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(528), pages 1726-1739, October.
    6. Li, Bing & Wang, Shaoli, 2007. "On Directional Regression for Dimension Reduction," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 997-1008, September.
    7. Yanyuan Ma & Liping Zhu, 2013. "A Review on Dimension Reduction," International Statistical Review, International Statistical Institute, vol. 81(1), pages 134-150, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pircalabelu, Eugen & Artemiou, Andreas, 2021. "Graph informed sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 164(C).
    2. Fang, Fang & Yu, Zhou, 2020. "Model averaging assisted sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 152(C).
    3. Xiao, Zhen & Zhang, Qi, 2022. "Dimension reduction for block-missing data based on sparse sliced inverse regression," Computational Statistics & Data Analysis, Elsevier, vol. 167(C).
    4. Shih‐Hao Huang & Kerby Shedden & Hsin‐wen Chang, 2023. "Inference for the dimension of a regression relationship using pseudo‐covariates," Biometrics, The International Biometric Society, vol. 79(3), pages 2394-2403, September.
    5. Wei Luo, 2022. "On efficient dimension reduction with respect to the interaction between two response variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 269-294, April.
    6. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    7. Scrucca, Luca, 2011. "Model-based SIR for dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 55(11), pages 3010-3026, November.
    8. Weng, Jiaying, 2022. "Fourier transform sparse inverse regression estimators for sufficient variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    9. Hung Hung & Su‐Yun Huang, 2019. "Sufficient dimension reduction via random‐partitions for the large‐p‐small‐n problem," Biometrics, The International Biometric Society, vol. 75(1), pages 245-255, March.
    10. Feng, Zhenghui & Zhu, Lixing, 2012. "An alternating determination–optimization approach for an additive multi-index model," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1981-1993.
    11. Ming-Yueh Huang & Chin-Tsang Chiang, 2017. "An Effective Semiparametric Estimation Approach for the Sufficient Dimension Reduction Model," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 112(519), pages 1296-1310, July.
    12. Kapla, Daniel & Fertl, Lukas & Bura, Efstathia, 2022. "Fusing sufficient dimension reduction with neural networks," Computational Statistics & Data Analysis, Elsevier, vol. 168(C).
    13. Zhenghui Feng & Lu Lin & Ruoqing Zhu & Lixing Zhu, 2020. "Nonparametric variable selection and its application to additive models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(3), pages 827-854, June.
    14. Wang, Qin & Xue, Yuan, 2021. "An ensemble of inverse moment estimators for sufficient dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    15. Kyongwon Kim, 2024. "A note on sufficient dimension reduction with post dimension reduction statistical inference," AStA Advances in Statistical Analysis, Springer;German Statistical Society, vol. 108(4), pages 733-753, December.
    16. Baek, Seungchul & Hoyoung, Park & Park, Junyong, 2024. "Variable selection using data splitting and projection for principal fitted component models in high dimension," Computational Statistics & Data Analysis, Elsevier, vol. 196(C).
    17. Lu Li & Kai Tan & Xuerong Meggie Wen & Zhou Yu, 2023. "Variable-dependent partial dimension reduction," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 521-541, June.
    18. Barbarino, Alessandro & Bura, Efstathia, 2024. "Forecasting Near-equivalence of Linear Dimension Reduction Methods in Large Panels of Macro-variables," Econometrics and Statistics, Elsevier, vol. 31(C), pages 1-18.
    19. Dong, Yuexiao & Li, Zeda, 2024. "A note on marginal coordinate test in sufficient dimension reduction," Statistics & Probability Letters, Elsevier, vol. 204(C).
    20. Xiaobing Zhao & Xian Zhou, 2020. "Partial sufficient dimension reduction on additive rates model for recurrent event data with high-dimensional covariates," Statistical Papers, Springer, vol. 61(2), pages 523-541, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:201:y:2025:i:c:s0167947324001257. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.