Abstract
This article introduces a Factor Augmented Sparse Throughput (FAST) model that uses both latent factors and sparse idiosyncratic components for nonparametric regression. The FAST model nests many popular statistical models, bridging factor models on one end and sparse nonparametric models on the other. It encompasses structured nonparametric models such as factor augmented additive models and sparse low-dimensional nonparametric interaction models, and it covers the cases where the covariates do not admit factor structures. This model allows us to conduct high-dimensional nonparametric model selection for both strongly dependent and weakly dependent covariates, and hence contributes to interpretable machine learning, particularly to feature selection for neural networks. Using diversified projections to estimate the latent factor space, we apply truncated deep ReLU networks to nonparametric factor regression without regularization and to the more general FAST model with nonconvex regularization, resulting in the factor augmented regression using neural networks (FAR-NN) and FAST-NN estimators, respectively. We show that the FAR-NN and FAST-NN estimators adapt to the unknown low-dimensional structure of hierarchical composition models and achieve nonasymptotic minimax rates. We also study statistical learning for the factor augmented sparse additive model using a more specific neural network architecture. Our results are applicable to weakly dependent cases without factor structures. In proving the main technical result for FAST-NN, we establish a new deep ReLU network approximation result that contributes to the foundations of neural network theory. Numerical studies further support our theory and methods. Supplementary materials for this article are available online.
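To make the "diversified projection" step concrete, here is a minimal numpy sketch of the idea as described in the abstract (this is an illustration under assumed simulation settings, not the authors' code): covariates follow an approximate factor model, and a fixed, data-independent weight matrix projects them onto an estimate of the latent factor space without any eigen-decomposition.

```python
import numpy as np

# Illustrative sketch (assumed settings, not the authors' implementation):
# approximate factor model x_i = B f_i + u_i with r latent factors.
rng = np.random.default_rng(0)
n, p, r = 500, 400, 3

F = rng.normal(size=(n, r))        # latent factors (unobserved)
B = rng.normal(size=(p, r))        # factor loadings
U = 0.2 * rng.normal(size=(n, p))  # idiosyncratic noise
X = F @ B.T + U                    # observed high-dimensional covariates

# A diversified projection uses a fixed p x r weight matrix W chosen
# independently of the data (here: random signs).  F_hat = X W / p then
# estimates the factor space up to an invertible rotation.
W = rng.choice([-1.0, 1.0], size=(p, r))
F_hat = X @ W / p

# Sanity check: F_hat should span roughly the same space as F, so
# regressing the true factors on F_hat explains most of their variance.
_, res, *_ = np.linalg.lstsq(F_hat, F, rcond=None)
r2 = 1.0 - res.sum() / ((F - F.mean(axis=0)) ** 2).sum()
print(f"R^2 of true factors on estimated factors: {r2:.3f}")
```

In the FAR-NN/FAST-NN procedures, the estimated factors `F_hat` (and, for FAST, the residual covariates) would then be fed into a truncated deep ReLU network; the sketch above only demonstrates the factor-space estimation step.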
Suggested Citation
Jianqing Fan & Yihong Gu, 2024.
"Factor Augmented Sparse Throughput Deep ReLU Neural Networks for High Dimensional Regression,"
Journal of the American Statistical Association, Taylor & Francis Journals, vol. 119(548), pages 2680-2694, October.
Handle:
RePEc:taf:jnlasa:v:119:y:2024:i:548:p:2680-2694
DOI: 10.1080/01621459.2023.2271605
Download full text from publisher
As access to this document is restricted, you may want to search for a different version of it.
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:taf:jnlasa:v:119:y:2024:i:548:p:2680-2694. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help add them by using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Longhurst (email available below). General contact details of provider: http://www.tandfonline.com/UASA20 .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.