
Adaptive basis expansion via $\ell_1$ trend filtering

Author

Listed:
  • Daeju Kim
  • Shuichi Kawano
  • Yoshiyuki Ninomiya

Abstract

We propose a new approach to nonlinear regression modeling via basis expansion for the case where the underlying regression function has inhomogeneous smoothness. In this case, conventional nonlinear regression models tend to overfit in regions where the function is smoother and to underfit where it is less smooth. First, the underlying regression function is roughly approximated by a locally linear function using an $\ell_1$ penalized method; this step is carried out by extending an algorithm for the fused lasso signal approximator. Next, the residuals between the locally linear function and the data are used to adaptively prepare the basis functions. Finally, we construct a nonlinear regression model with these basis functions together with a regularization method. To select the optimal values of the tuning parameters for the regularization method, we provide an explicit form of the generalized information criterion. The validity of the proposed method is demonstrated through several numerical examples. Copyright Springer-Verlag Berlin Heidelberg 2014
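The abstract describes a two-stage pipeline: an $\ell_1$ trend-filtering fit that captures the rough, locally linear shape of the regression function, followed by an adaptive basis expansion fitted to the stage-one residuals with a regularization method. The sketch below illustrates that pipeline in Python with NumPy and CVXPY; the function names, the Gaussian basis choice, the residual-based placement rule, and the ridge-type second-stage penalty are illustrative assumptions rather than the authors' exact algorithm, and the GIC-based tuning-parameter selection is omitted.

# A minimal sketch of the two-stage procedure described in the abstract,
# written with NumPy and CVXPY. The function names, the Gaussian basis
# choice, the residual-based placement rule, and the ridge-type second
# stage are illustrative assumptions, not the authors' exact algorithm.
import numpy as np
import cvxpy as cp

def l1_trend_filter(y, lam):
    """Stage 1: piecewise-linear fit by solving
    minimize 0.5 * ||y - beta||_2^2 + lam * ||D beta||_1,
    where D is the second-order difference operator."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)  # (n-2) x n second-difference matrix
    beta = cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.sum_squares(y - beta) + lam * cp.norm1(D @ beta))
    cp.Problem(objective).solve()
    return beta.value

def adaptive_basis_fit(x, y, lam_trend, lam_ridge, n_basis=10):
    """Stage 2: place Gaussian basis functions where the stage-1 residuals
    are large, then estimate their coefficients with a ridge-type penalty."""
    trend = l1_trend_filter(y, lam_trend)
    resid = y - trend
    # Centre the bases at the points with the largest absolute residuals
    # (a simple stand-in for the paper's adaptive placement rule).
    centres = x[np.argsort(np.abs(resid))[-n_basis:]]
    width = (x.max() - x.min()) / n_basis
    Phi = np.exp(-0.5 * ((x[:, None] - centres[None, :]) / width) ** 2)
    coef = np.linalg.solve(Phi.T @ Phi + lam_ridge * np.eye(n_basis), Phi.T @ resid)
    return trend + Phi @ coef

# Toy data with inhomogeneous smoothness: nearly flat on the left, wiggly on the right.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.where(x < 0.5, 0.2 * x, np.sin(20.0 * x)) + rng.normal(0.0, 0.1, x.size)
fit = adaptive_basis_fit(x, y, lam_trend=5.0, lam_ridge=1.0)

In the paper the tuning parameters (here lam_trend and lam_ridge) are selected by minimizing the generalized information criterion; the sketch leaves them fixed for brevity.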

Suggested Citation

  • Daeju Kim & Shuichi Kawano & Yoshiyuki Ninomiya, 2014. "Adaptive basis expansion via $\ell_1$ trend filtering," Computational Statistics, Springer, vol. 29(5), pages 1005-1023, October.
  • Handle: RePEc:spr:compst:v:29:y:2014:i:5:p:1005-1023
    DOI: 10.1007/s00180-013-0477-7

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s00180-013-0477-7
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s00180-013-0477-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where your library subscription provides access to this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Robert Tibshirani & Michael Saunders & Saharon Rosset & Ji Zhu & Keith Knight, 2005. "Sparsity and smoothness via the fused lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(1), pages 91-108, February.
    2. D. G. T. Denison & B. K. Mallick & A. F. M. Smith, 1998. "Automatic Bayesian curve fitting," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 60(2), pages 333-350.
    3. Hui Zou & Trevor Hastie, 2005. "Addendum: Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(5), pages 768-768, November.
    4. Hui Zou & Trevor Hastie, 2005. "Regularization and variable selection via the elastic net," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(2), pages 301-320, April.
    5. Jianqing Fan & Runze Li, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    3. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    4. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers CWP56/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    5. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    6. Feng Hong & Lu Tian & Viswanath Devanarayan, 2023. "Improving the Robustness of Variable Selection and Predictive Performance of Regularized Generalized Linear Models and Cox Proportional Hazard Models," Mathematics, MDPI, vol. 11(3), pages 1-13, January.
    7. Massimiliano Caporin & Francesco Poli, 2017. "Building News Measures from Textual Data and an Application to Volatility Forecasting," Econometrics, MDPI, vol. 5(3), pages 1-46, August.
    8. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    9. Justin B. Post & Howard D. Bondell, 2013. "Factor Selection and Structural Identification in the Interaction ANOVA Model," Biometrics, The International Biometric Society, vol. 69(1), pages 70-79, March.
    10. Jiang, Liewen & Bondell, Howard D. & Wang, Huixia Judy, 2014. "Interquantile shrinkage and variable selection in quantile regression," Computational Statistics & Data Analysis, Elsevier, vol. 69(C), pages 208-219.
    11. Matthias Weber & Martin Schumacher & Harald Binder, 2014. "Regularized Regression Incorporating Network Information: Simultaneous Estimation of Covariate Coefficients and Connection Signs," Tinbergen Institute Discussion Papers 14-089/I, Tinbergen Institute.
    12. Zhiyong Huang & Ziyan Luo & Naihua Xiu, 2019. "High-Dimensional Least-Squares with Perfect Positive Correlation," Asia-Pacific Journal of Operational Research (APJOR), World Scientific Publishing Co. Pte. Ltd., vol. 36(04), pages 1-16, August.
    13. Xiaofei Wu & Rongmei Liang & Hu Yang, 2022. "Penalized and constrained LAD estimation in fixed and high dimension," Statistical Papers, Springer, vol. 63(1), pages 53-95, February.
    14. Minh Pham & Xiaodong Lin & Andrzej Ruszczyński & Yu Du, 2021. "An outer–inner linearization method for non-convex and nondifferentiable composite regularization problems," Journal of Global Optimization, Springer, vol. 81(1), pages 179-202, September.
    15. Siwei Xia & Yuehan Yang & Hu Yang, 2022. "Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 255-277, March.
    16. Kwon, Sunghoon & Oh, Seungyoung & Lee, Youngjo, 2016. "The use of random-effect models for high-dimensional variable selection problems," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 401-412.
    17. Ismail Shah & Hina Naz & Sajid Ali & Amani Almohaimeed & Showkat Ahmad Lone, 2023. "A New Quantile-Based Approach for LASSO Estimation," Mathematics, MDPI, vol. 11(6), pages 1-13, March.
    18. Hu, Qinqin & Zeng, Peng & Lin, Lu, 2015. "The dual and degrees of freedom of linearly constrained generalized lasso," Computational Statistics & Data Analysis, Elsevier, vol. 86(C), pages 13-26.
    19. Peng Zeng & Qinqin Hu & Xiaoyu Li, 2017. "Geometry and Degrees of Freedom of Linearly Constrained Generalized Lasso," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 44(4), pages 989-1008, December.
    20. Sunkyung Kim & Wei Pan & Xiaotong Shen, 2013. "Network-Based Penalized Regression With Application to Genomic Data," Biometrics, The International Biometric Society, vol. 69(3), pages 582-593, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:compst:v:29:y:2014:i:5:p:1005-1023. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.
