
On stepwise pattern recovery of the fused Lasso

Author

Listed:
  • Qian, Junyang
  • Jia, Jinzhu

Abstract

We study the pattern-recovery properties of the Fused Lasso Signal Approximator (FLSA) for estimating a blocky signal sequence observed with additive noise. We transform the FLSA into an ordinary Lasso problem and find that, in general, the resulting design matrix does not satisfy the irrepresentable condition, which is known to be an almost necessary and sufficient condition for exact pattern recovery. We give necessary and sufficient conditions on the expected signal pattern under which the irrepresentable condition holds in the transformed Lasso problem; these conditions, however, turn out to be very restrictive. We therefore apply the recently developed preconditioning method, the Puffer Transformation (Jia and Rohe, 2015), to the transformed Lasso and call the new procedure the preconditioned fused Lasso. We give non-asymptotic results for this method, showing that as long as the signal-to-noise ratio is not too small, the preconditioned fused Lasso estimator recovers the correct pattern with high probability. The theoretical results give insight into what controls the ability to recover the pattern: it is the noise level rather than the length of the signal sequence. Simulations further confirm our theorems and visualize the significant improvement of the preconditioned fused Lasso estimator over the vanilla FLSA in exact pattern recovery.
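
To make the two steps described in the abstract concrete, the sketch below first rewrites the FLSA as an ordinary Lasso whose coefficients are the successive differences of the signal, and then applies the Puffer Transformation, i.e. left-multiplies the response and the design by F = U D^{-1} U^T obtained from the SVD X = U D V^T, before refitting the Lasso. This is a minimal illustration, not the authors' code: the use of scikit-learn's Lasso as the solver, the simulated two-jump signal, the regularization level alpha, and the handling of the overall signal level through an intercept are all simplifying assumptions.

# Minimal sketch of the preconditioned fused Lasso idea (not the authors' code).
# Assumptions: scikit-learn's Lasso as the solver, an intercept for the overall
# signal level, and an arbitrary choice of the regularization parameter alpha.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Blocky signal with two jumps, observed with additive Gaussian noise.
n = 200
beta_true = np.concatenate([np.zeros(80), 2.0 * np.ones(70), -1.0 * np.ones(50)])
y = beta_true + 0.3 * rng.standard_normal(n)

# FLSA -> ordinary Lasso: column j of X is a step that turns on at position j+1,
# so the Lasso coefficients theta_j estimate the differences beta_{j+1} - beta_j
# and their nonzero pattern is the set of estimated change points.
X = np.tril(np.ones((n, n)), k=-1)[:, :-1]          # shape (n, n-1)

# Plain transformed Lasso (corresponds to the vanilla FLSA).
plain = Lasso(alpha=0.05, fit_intercept=True, max_iter=50_000).fit(X, y)

# Puffer Transformation: with X = U diag(d) V^T, set F = U diag(1/d) U^T,
# so that F X = U V^T has orthonormal columns; the Lasso is then run on (F X, F y).
U, d, Vt = np.linalg.svd(X, full_matrices=False)
F = U @ np.diag(1.0 / d) @ U.T
precond = Lasso(alpha=0.05, fit_intercept=True, max_iter=50_000).fit(F @ X, F @ y)

def jumps(coef, tol=1e-8):
    """Indices where the fitted difference coefficients are (numerically) nonzero."""
    return np.flatnonzero(np.abs(coef) > tol) + 1   # +1: column j is the jump at j+1

print("true change points:        ", [80, 150])
print("vanilla fused Lasso jumps: ", jumps(plain.coef_))
print("preconditioned Lasso jumps:", jumps(precond.coef_))

Loosely speaking, after preconditioning the design F X = U V^T has orthonormal columns, so the Lasso essentially reduces to coordinate-wise soft-thresholding of (F X)^T F y; this is in line with the abstract's observation that exact pattern recovery is governed by the noise level rather than by the length of the sequence.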

Suggested Citation

  • Qian, Junyang & Jia, Jinzhu, 2016. "On stepwise pattern recovery of the fused Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 221-237.
  • Handle: RePEc:eee:csdana:v:94:y:2016:i:c:p:221-237
    DOI: 10.1016/j.csda.2015.08.013

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947315002017
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2015.08.013?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Harchaoui, Z. & Lévy-Leduc, C., 2010. "Multiple Change-Point Estimation With a Total Variation Penalty," Journal of the American Statistical Association, American Statistical Association, vol. 105(492), pages 1480-1493.
    2. Fan, J. & Li, R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Won Son & Johan Lim & Donghyeon Yu, 2023. "Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea," International Statistical Review, International Statistical Institute, vol. 91(2), pages 218-242, August.
    2. Karsten Schweikert, 2020. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Papers 2001.07949, arXiv.org, revised Apr 2021.
    3. Karsten Schweikert, 2022. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Journal of Time Series Analysis, Wiley Blackwell, vol. 43(1), pages 83-104, January.
    4. E. Ollier & V. Viallon, 2017. "Regression modelling on stratified data with the lasso," Biometrika, Biometrika Trust, vol. 104(1), pages 83-96.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sokbae Lee & Myung Hwan Seo & Youngki Shin, 2016. "The lasso for high dimensional regression with a possible change point," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 193-210, January.
    2. Zheng Tracy Ke & Jianqing Fan & Yichao Wu, 2015. "Homogeneity Pursuit," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 175-194, March.
    3. Degui Li & Junhui Qian & Liangjun Su, 2016. "Panel Data Models With Interactive Fixed Effects and Multiple Structural Breaks," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1804-1819, October.
    4. Behrendt, Simon & Schweikert, Karsten, 2021. "A Note on Adaptive Group Lasso for Structural Break Time Series," Econometrics and Statistics, Elsevier, vol. 17(C), pages 156-172.
    5. Jie Shen & Colin M. Gallagher & QiQi Lu, 2014. "Detection of multiple undocumented change-points using adaptive Lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(6), pages 1161-1173, June.
    6. Shohoudi, Azadeh & Khalili, Abbas & Wolfson, David B. & Asgharian, Masoud, 2016. "Simultaneous variable selection and de-coarsening in multi-path change-point models," Journal of Multivariate Analysis, Elsevier, vol. 147(C), pages 202-217.
    7. Qiang Li & Liming Wang, 2020. "Robust change point detection method via adaptive LAD-LASSO," Statistical Papers, Springer, vol. 61(1), pages 109-121, February.
    8. Karsten Schweikert, 2022. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Journal of Time Series Analysis, Wiley Blackwell, vol. 43(1), pages 83-104, January.
    9. Karsten Schweikert, 2020. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Papers 2001.07949, arXiv.org, revised Apr 2021.
    10. Gabriela Ciuperca, 2014. "Model selection by LASSO methods in a change-point model," Statistical Papers, Springer, vol. 55(2), pages 349-374, May.
    11. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    12. Xu, Yang & Zhao, Shishun & Hu, Tao & Sun, Jianguo, 2021. "Variable selection for generalized odds rate mixture cure models with interval-censored failure time data," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    13. Emmanouil Androulakis & Christos Koukouvinos & Kalliopi Mylona & Filia Vonta, 2010. "A real survival analysis application via variable selection methods for Cox's proportional hazards model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(8), pages 1399-1406.
    14. Singh, Rakhi & Stufken, John, 2024. "Factor selection in screening experiments by aggregation over random models," Computational Statistics & Data Analysis, Elsevier, vol. 194(C).
    15. Koki Momoki & Takuma Yoshida, 2024. "Hypothesis testing for varying coefficient models in tail index regression," Statistical Papers, Springer, vol. 65(6), pages 3821-3852, August.
    16. Lili Pan & Ziyan Luo & Naihua Xiu, 2017. "Restricted Robinson Constraint Qualification and Optimality for Cardinality-Constrained Cone Programming," Journal of Optimization Theory and Applications, Springer, vol. 175(1), pages 104-118, October.
    17. Gerhard Tutz & Moritz Berger, 2018. "Tree-structured modelling of categorical predictors in generalized additive regression," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(3), pages 737-758, September.
    18. Chenchuan (Mark) Li & Ulrich K. Müller, 2021. "Linear regression with many controls of limited explanatory power," Quantitative Economics, Econometric Society, vol. 12(2), pages 405-442, May.
    19. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.
    20. Hang Yu & Yuanjia Wang & Donglin Zeng, 2023. "A general framework of nonparametric feature selection in high‐dimensional data," Biometrics, The International Biometric Society, vol. 79(2), pages 951-963, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:94:y:2016:i:c:p:221-237. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.