Printed from https://ideas.repec.org/a/eee/csdana/v94y2016icp221-237.html

On stepwise pattern recovery of the fused Lasso

Author

Listed:
  • Qian, Junyang
  • Jia, Jinzhu

Abstract

We study the properties of the Fused Lasso Signal Approximator (FLSA) for estimating a blocky signal sequence with additive noise. We transform the FLSA into an ordinary Lasso problem and find that, in general, the resulting design matrix does not satisfy the irrepresentable condition, which is known to be an almost necessary and sufficient condition for exact pattern recovery. We give necessary and sufficient conditions on the expected signal pattern under which the irrepresentable condition holds in the transformed Lasso problem; however, these conditions turn out to be very restrictive. We therefore apply a recently developed preconditioning method, the Puffer Transformation (Jia and Rohe, 2015), to the transformed Lasso and call the new procedure the preconditioned fused Lasso. We give non-asymptotic results for this method, showing that as long as the signal-to-noise ratio is not too small, the preconditioned fused Lasso estimator recovers the correct pattern with high probability. The theoretical results give insight into what controls the ability to recover the pattern: it is the noise level, not the length of the signal sequence. Simulations confirm our theorems and illustrate the significant improvement of the preconditioned fused Lasso estimator over the vanilla FLSA in exact pattern recovery.
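The pipeline the abstract describes can be sketched in a few lines. This is an illustrative NumPy sketch under stated assumptions, not the authors' code: a blocky mean vector mu can be written mu = X @ theta, where X is the lower-triangular all-ones matrix and theta holds first differences. With the SVD X = U D V^T, the Puffer transform is F = U D^{-1} U^T, so the preconditioned design F X = U V^T is orthogonal here, and the preconditioned Lasso has the closed-form solution soft((F X)^T F y, lam) = soft(X^{-1} y, lam); for this particular X, X^{-1} y is simply the sequence of first differences of y. The toy signal, noise level, and tuning parameter `lam` below are assumed choices for illustration.

```python
import numpy as np

def soft_threshold(z, lam):
    """Elementwise soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

def preconditioned_fused_lasso_pattern(y, lam):
    """Estimate jump locations of a blocky signal by soft-thresholding
    the Puffer-preconditioned Lasso coefficients, which for the FLSA
    design reduce to the first differences of y."""
    diffs = np.diff(y)                    # equals X^{-1} y with the unpenalized first entry dropped
    theta_hat = soft_threshold(diffs, lam)
    return np.flatnonzero(theta_hat) + 1  # indices where a jump is detected

# Sanity check of the algebra on a small case: F X = U V^T.
X = np.tril(np.ones((5, 5)))
U, d, Vt = np.linalg.svd(X)
F = U @ np.diag(1.0 / d) @ U.T
assert np.allclose(F @ X, U @ Vt)

# Toy example: a blocky signal with jumps at positions 40 and 70 plus small noise.
rng = np.random.default_rng(0)
mu = np.concatenate([np.zeros(40), np.ones(30), -0.5 * np.ones(30)])
y = mu + 0.05 * rng.standard_normal(mu.size)
jumps = preconditioned_fused_lasso_pattern(y, lam=0.5)
print(jumps)  # expected jump locations: 40 and 70
```

The sketch also reflects the abstract's main theoretical message: each difference carries noise of constant variance regardless of the sequence length, so pattern recovery is governed by the noise level, not by n.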

Suggested Citation

  • Qian, Junyang & Jia, Jinzhu, 2016. "On stepwise pattern recovery of the fused Lasso," Computational Statistics & Data Analysis, Elsevier, vol. 94(C), pages 221-237.
  • Handle: RePEc:eee:csdana:v:94:y:2016:i:c:p:221-237
    DOI: 10.1016/j.csda.2015.08.013

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947315002017
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2015.08.013?utm_source=ideas
    LibKey link: if access is restricted and your library subscribes to this service, LibKey will redirect you to a copy you can access through your library subscription.
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Harchaoui, Z. & Lévy-Leduc, C., 2010. "Multiple Change-Point Estimation With a Total Variation Penalty," Journal of the American Statistical Association, American Statistical Association, vol. 105(492), pages 1480-1493.
    2. Fan, J. & Li, R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project, subscribe to its RSS feed for this item.
    Cited by:

    1. Won Son & Johan Lim & Donghyeon Yu, 2023. "Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea," International Statistical Review, International Statistical Institute, vol. 91(2), pages 218-242, August.
    2. Karsten Schweikert, 2020. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Papers 2001.07949, arXiv.org, revised Apr 2021.
    3. Karsten Schweikert, 2022. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Journal of Time Series Analysis, Wiley Blackwell, vol. 43(1), pages 83-104, January.
    4. E. Ollier & V. Viallon, 2017. "Regression modelling on stratified data with the lasso," Biometrika, Biometrika Trust, vol. 104(1), pages 83-96.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sokbae Lee & Myung Hwan Seo & Youngki Shin, 2016. "The lasso for high dimensional regression with a possible change point," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 193-210, January.
    2. Jie Shen & Colin M. Gallagher & QiQi Lu, 2014. "Detection of multiple undocumented change-points using adaptive Lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(6), pages 1161-1173, June.
    3. Shohoudi, Azadeh & Khalili, Abbas & Wolfson, David B. & Asgharian, Masoud, 2016. "Simultaneous variable selection and de-coarsening in multi-path change-point models," Journal of Multivariate Analysis, Elsevier, vol. 147(C), pages 202-217.
    4. Zheng Tracy Ke & Jianqing Fan & Yichao Wu, 2015. "Homogeneity Pursuit," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 110(509), pages 175-194, March.
    5. Degui Li & Junhui Qian & Liangjun Su, 2016. "Panel Data Models With Interactive Fixed Effects and Multiple Structural Breaks," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1804-1819, October.
    6. Qiang Li & Liming Wang, 2020. "Robust change point detection method via adaptive LAD-LASSO," Statistical Papers, Springer, vol. 61(1), pages 109-121, February.
    7. Karsten Schweikert, 2022. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Journal of Time Series Analysis, Wiley Blackwell, vol. 43(1), pages 83-104, January.
    8. Behrendt, Simon & Schweikert, Karsten, 2021. "A Note on Adaptive Group Lasso for Structural Break Time Series," Econometrics and Statistics, Elsevier, vol. 17(C), pages 156-172.
    9. Karsten Schweikert, 2020. "Oracle Efficient Estimation of Structural Breaks in Cointegrating Regressions," Papers 2001.07949, arXiv.org, revised Apr 2021.
    10. Gabriela Ciuperca, 2014. "Model selection by LASSO methods in a change-point model," Statistical Papers, Springer, vol. 55(2), pages 349-374, May.
    11. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    12. Guan, Wei & Gray, Alexander, 2013. "Sparse high-dimensional fractional-norm support vector machine via DC programming," Computational Statistics & Data Analysis, Elsevier, vol. 67(C), pages 136-148.
    13. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    14. Chang, Jinyuan & Chen, Song Xi & Chen, Xiaohong, 2015. "High dimensional generalized empirical likelihood for moment restrictions with dependent data," Journal of Econometrics, Elsevier, vol. 185(1), pages 283-304.
    15. Xu, Yang & Zhao, Shishun & Hu, Tao & Sun, Jianguo, 2021. "Variable selection for generalized odds rate mixture cure models with interval-censored failure time data," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    16. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    17. Emmanouil Androulakis & Christos Koukouvinos & Kalliopi Mylona & Filia Vonta, 2010. "A real survival analysis application via variable selection methods for Cox's proportional hazards model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(8), pages 1399-1406.
    18. Meng An & Haixiang Zhang, 2023. "High-Dimensional Mediation Analysis for Time-to-Event Outcomes with Additive Hazards Model," Mathematics, MDPI, vol. 11(24), pages 1-11, December.
    19. Singh, Rakhi & Stufken, John, 2024. "Factor selection in screening experiments by aggregation over random models," Computational Statistics & Data Analysis, Elsevier, vol. 194(C).
    20. Hao Wang & Hao Zeng & Jiashan Wang, 2022. "An extrapolated iteratively reweighted $\ell_1$ method with complexity analysis," Computational Optimization and Applications, Springer, vol. 83(3), pages 967-997, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:94:y:2016:i:c:p:221-237. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.