
Sequential one‐step estimator by sub‐sampling for customer churn analysis with massive data sets

Author

Listed:
  • Feifei Wang
  • Danyang Huang
  • Tianchen Gao
  • Shuyuan Wu
  • Hansheng Wang

Abstract

Customer churn is one of the most important concerns for large companies. Massive data sets are now common in customer churn analysis, and they bring new computational challenges for model estimation. To cope with these challenges, sub‐sampling methods are often used to accomplish large‐scale data analysis tasks. To cover more informative samples in one sampling round, classic sub‐sampling methods need to compute non‐uniform sampling probabilities for all data points. However, this computation creates a huge burden for large data sets and is therefore impractical. In this study, we propose a sequential one‐step (SOS) estimation method based on repeated sub‐sampling. In the SOS method, data points need to be sampled only with uniform probabilities, and the sampling step is conducted repeatedly. In each sampling step, a new estimate is computed via one‐step updating based on the newly sampled data points. This leads to a sequence of estimates, whose average is the final SOS estimate. We show theoretically that both the bias and the standard error of the SOS estimator decrease as the sub‐sampling size or the number of sub‐sampling rounds increases. The finite‐sample performance of the SOS estimator is assessed through simulations. Finally, we apply the SOS method to a real large‐scale customer churn data set from a securities company. The results show that the SOS method has good interpretability and predictive power in this real application.
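The SOS procedure described in the abstract (uniform sub‐sampling, one‐step updating per round, averaging the resulting sequence of estimates) can be sketched in code. The following is a minimal illustration for logistic regression, not the authors' implementation: the function name `sos_logistic`, the choice of starting value, and the specific subsample sizes are assumptions made for the example.

```python
import numpy as np

def sos_logistic(X, y, n_sub=500, n_rounds=20, seed=0):
    """Illustrative sequential one-step (SOS) estimator for logistic regression.

    Each round draws a uniform subsample, refines the current estimate with
    a single Newton-Raphson step on that subsample, and records the result.
    The final SOS estimate is the average of the per-round estimates.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)                                   # simple starting value
    estimates = []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n_sub, replace=False)   # uniform subsample
        Xs, ys = X[idx], y[idx]
        mu = 1.0 / (1.0 + np.exp(-Xs @ beta))            # fitted probabilities
        grad = Xs.T @ (ys - mu)                          # score on the subsample
        w = mu * (1.0 - mu)
        hess = (Xs * w[:, None]).T @ Xs                  # observed information
        beta = beta + np.linalg.solve(hess, grad)        # one-step update
        estimates.append(beta.copy())
    return np.mean(estimates, axis=0)                    # average of the sequence
```

Each round touches only `n_sub` observations, so the per-round cost is independent of the full sample size; as the abstract states, accuracy then improves with either a larger subsample or more rounds.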

Suggested Citation

  • Feifei Wang & Danyang Huang & Tianchen Gao & Shuyuan Wu & Hansheng Wang, 2022. "Sequential one‐step estimator by sub‐sampling for customer churn analysis with massive data sets," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 71(5), pages 1753-1786, November.
  • Handle: RePEc:bla:jorssc:v:71:y:2022:i:5:p:1753-1786
    DOI: 10.1111/rssc.12597

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssc.12597
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssc.12597?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Matias Quiroz & Robert Kohn & Mattias Villani & Minh-Ngoc Tran, 2019. "Speeding Up MCMC by Efficient Data Subsampling," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(526), pages 831-843, April.
    2. Michael I. Jordan & Jason D. Lee & Yun Yang, 2019. "Communication-Efficient Distributed Statistical Inference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(526), pages 668-681, April.
3. Jianqing Fan & Runze Li, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    4. Maldonado, Sebastián & Domínguez, Gonzalo & Olaya, Diego & Verbeke, Wouter, 2021. "Profit-driven churn prediction for the mutual fund industry: A multisegment approach," Omega, Elsevier, vol. 100(C).
    5. HaiYing Wang & Min Yang & John Stufken, 2019. "Information-Based Optimal Subdata Selection for Big Data Linear Regression," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(525), pages 393-405, January.
    6. HaiYing Wang & Rong Zhu & Ping Ma, 2018. "Optimal Subsampling for Large Sample Logistic Regression," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(522), pages 829-844, April.
    7. Eva Ascarza & Scott A. Neslin & Oded Netzer & Zachery Anderson & Peter S. Fader & Sunil Gupta & Bruce G. S. Hardie & Aurélie Lemmens & Barak Libai & David Neal & Foster Provost & Rom Schrift, 2018. "In Pursuit of Enhanced Customer Retention Management: Review, Key Issues, and Future Directions," Customer Needs and Solutions, Springer;Institute for Sustainable Innovation and Growth (iSIG), vol. 5(1), pages 65-81, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jun Yu & HaiYing Wang, 2022. "Subdata selection algorithm for linear model discrimination," Statistical Papers, Springer, vol. 63(6), pages 1883-1906, December.
    2. Zhang, Haixiang & Wang, HaiYing, 2021. "Distributed subdata selection for big data via sampling-based approach," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    3. Lulu Zuo & Haixiang Zhang & HaiYing Wang & Liuquan Sun, 2021. "Optimal subsample selection for massive logistic regression with distributed data," Computational Statistics, Springer, vol. 36(4), pages 2535-2562, December.
    4. Guangbao Guo & Guoqi Qian & Lu Lin & Wei Shao, 2021. "Parallel inference for big data with the group Bayesian method," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 84(2), pages 225-243, February.
    5. Su, Miaomiao & Wang, Ruoyu & Wang, Qihua, 2022. "A two-stage optimal subsampling estimation for missing data problems with large-scale data," Computational Statistics & Data Analysis, Elsevier, vol. 173(C).
    6. Jun Yu & Jiaqi Liu & HaiYing Wang, 2023. "Information-based optimal subdata selection for non-linear models," Statistical Papers, Springer, vol. 64(4), pages 1069-1093, August.
    7. Wang, Kangning & Li, Shaomin, 2021. "Robust distributed modal regression for massive data," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    8. Xingcai Zhou & Yu Xiang, 2022. "ADMM-Based Differential Privacy Learning for Penalized Quantile Regression on Distributed Functional Data," Mathematics, MDPI, vol. 10(16), pages 1-28, August.
    9. Duarte, Belmiro P.M. & Atkinson, Anthony C. & Oliveira, Nuno M.C., 2024. "Using hierarchical information-theoretic criteria to optimize subsampling of extensive datasets," LSE Research Online Documents on Economics 121641, London School of Economics and Political Science, LSE Library.
    10. Lu Lin & Feng Li, 2023. "Global debiased DC estimations for biased estimators via pro forma regression," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(2), pages 726-758, June.
    11. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    12. Tianzhen Wang & Haixiang Zhang, 2022. "Optimal subsampling for multiplicative regression with massive data," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 76(4), pages 418-449, November.
    13. J. Lars Kirkby & Dang H. Nguyen & Duy Nguyen & Nhu N. Nguyen, 2022. "Inversion-free subsampling Newton’s method for large sample logistic regression," Statistical Papers, Springer, vol. 63(3), pages 943-963, June.
    14. Ziyang Wang & HaiYing Wang & Nalini Ravishanker, 2023. "Subsampling in Longitudinal Models," Methodology and Computing in Applied Probability, Springer, vol. 25(1), pages 1-29, March.
    15. Amalan Mahendran & Helen Thompson & James M. McGree, 2023. "A model robust subsampling approach for Generalised Linear Models in big data settings," Statistical Papers, Springer, vol. 64(4), pages 1137-1157, August.
    16. Boris Beranger & Huan Lin & Scott Sisson, 2023. "New models for symbolic data analysis," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 17(3), pages 659-699, September.
    17. Deng, Jiayi & Huang, Danyang & Ding, Yi & Zhu, Yingqiu & Jing, Bingyi & Zhang, Bo, 2024. "Subsampling spectral clustering for stochastic block models in large-scale networks," Computational Statistics & Data Analysis, Elsevier, vol. 189(C).
    18. Xingcai Zhou & Hao Shen, 2022. "Communication-Efficient Distributed Learning for High-Dimensional Support Vector Machines," Mathematics, MDPI, vol. 10(7), pages 1-21, March.
    19. Hector, Emily C. & Luo, Lan & Song, Peter X.-K., 2023. "Parallel-and-stream accelerator for computationally fast supervised learning," Computational Statistics & Data Analysis, Elsevier, vol. 177(C).
    20. Sokbae Lee & Serena Ng, 2020. "An Econometric Perspective on Algorithmic Subsampling," Annual Review of Economics, Annual Reviews, vol. 12(1), pages 45-80, August.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssc:v:71:y:2022:i:5:p:1753-1786. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.