Printed from https://ideas.repec.org/a/inm/orijoc/v32y2020i1p182-198.html

Convex Optimization for Group Feature Selection in Networked Data

Author

Listed:
  • Daehan Won

    (Systems Science and Industrial Engineering Department, Binghamton University, The State University of New York, Binghamton, New York 13902)

  • Hasan Manzour

    (Department of Industrial and Systems Engineering, University of Washington, Seattle, Washington 98195)

  • Wanpracha Chaovalitwongse

    (Institute for Advanced Data Analytics, Department of Industrial Engineering, University of Arkansas, Fayetteville, Arkansas 72701)

Abstract

Feature selection is at the heart of machine learning: it improves data interpretability and prediction performance by mitigating the curse of dimensionality. Group feature selection is often used to reveal relationships in structured data and can provide better predictive power than standard feature selection methods that ignore the grouped structure. We study a group feature selection problem in networked data in which edge weights are treated as features and each node in the network is regarded as a feature group. This problem is particularly relevant to feature selection for neuroimaging data, where the data are high dimensional and the intrinsic network structure among the features (i.e., connectivities between brain regions) must be captured properly. We propose a mathematical model based on the support vector machine (SVM) that uses ℓ0-norm regularization to restrict the number of selected nodes (i.e., groups). To cope with the computational challenge of the ℓ0-norm regularization, we develop a convex relaxation that reformulates the proposed model as a convex semi-infinite program (SIP). We then introduce a new iterative algorithm that attains an optimal solution of this convex SIP. Experimental results on synthetic and real brain network data sets show that our approach gives better predictive performance than state-of-the-art group feature selection methods and standard feature selection methods. Our technique additionally yields a sparse subnetwork solution that is easier to interpret than those obtained by other methods.
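
To make the problem setup concrete, the sketch below illustrates the data layout described in the abstract: edge weights are features and each node is a group consisting of its incident edges. It is not the authors' convex SIP algorithm; as a stand-in it imposes group sparsity with a group-lasso (ℓ2,1) penalty on a squared-hinge SVM solved by proximal gradient, and all function names (node_groups, group_sparse_svm) and parameter values are illustrative assumptions.

```python
import numpy as np

def node_groups(edges, n_nodes):
    """Map each node to the indices of the edge features incident to it."""
    groups = [[] for _ in range(n_nodes)]
    for j, (u, v) in enumerate(edges):
        groups[u].append(j)
        groups[v].append(j)
    return [np.array(g, dtype=int) for g in groups]

def group_sparse_svm(X, y, groups, lam=0.1, lr=0.01, n_iter=500):
    """Squared-hinge-loss linear SVM with a group (l2,1) penalty over node groups.

    X: (n_samples, n_edges) matrix of edge-weight features; y: labels in {-1, +1}.
    Note: node groups overlap (each edge touches two nodes), so applying the
    proximal step group-by-group is a simple heuristic, not an exact prox.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        # gradient of (1/n) * sum_i max(0, 1 - y_i * x_i.w)^2
        margin = 1.0 - y * (X @ w)
        active = margin > 0
        grad = -2.0 * (X[active].T @ (y[active] * margin[active])) / n
        w = w - lr * grad
        # group soft-thresholding: shrinks whole node groups toward zero
        for g in groups:
            norm = np.linalg.norm(w[g])
            if norm <= lr * lam:
                w[g] = 0.0
            else:
                w[g] *= 1.0 - lr * lam / norm
    return w

# Toy usage on a 4-node network: nodes whose edge groups retain nonzero weight
# play the role of the "selected" groups (a sparse subnetwork).
rng = np.random.default_rng(0)
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
groups = node_groups(edges, n_nodes=4)
X = rng.normal(size=(40, len(edges)))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
w = group_sparse_svm(X, y, groups, lam=0.05)
selected_nodes = [i for i, g in enumerate(groups) if np.linalg.norm(w[g]) > 1e-8]
print("edge weights:", np.round(w, 3))
print("selected nodes:", selected_nodes)
```

The paper itself restricts the number of groups directly via ℓ0-norm regularization and solves a convex SIP reformulation with a dedicated iterative algorithm; the ℓ2,1 penalty above is only a familiar convex surrogate used for illustration.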

Suggested Citation

  • Daehan Won & Hasan Manzour & Wanpracha Chaovalitwongse, 2020. "Convex Optimization for Group Feature Selection in Networked Data," INFORMS Journal on Computing, INFORMS, vol. 32(1), pages 182-198, January.
  • Handle: RePEc:inm:orijoc:v:32:y:2020:i:1:p:182-198
    DOI: 10.1287/ijoc.2018.0868

    Download full text from publisher

    File URL: https://doi.org/10.1287/ijoc.2018.0868
    Download Restriction: no

    File URL: https://libkey.io/10.1287/ijoc.2018.0868?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Wang, Hansheng & Leng, Chenlei, 2008. "A note on adaptive group lasso," Computational Statistics & Data Analysis, Elsevier, vol. 52(12), pages 5277-5286, August.
    2. Borgwardt, S. & Schmiedl, F., 2014. "Threshold-based preprocessing for approximating the weighted dense k-subgraph problem," European Journal of Operational Research, Elsevier, vol. 234(3), pages 631-640.
    3. Bertolazzi, P. & Felici, G. & Festa, P. & Fiscon, G. & Weitschek, E., 2016. "Integer programming models for feature selection: New extensions and a randomized solution algorithm," European Journal of Operational Research, Elsevier, vol. 250(2), pages 389-399.
    4. E. Y. Pee & J. O. Royset, 2011. "On Solving Large-Scale Finite Minimax Problems Using Exponential Smoothing," Journal of Optimization Theory and Applications, Springer, vol. 148(2), pages 390-421, February.
    5. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    6. Macambira, Elder Magalhaes & de Souza, Cid Carvalho, 2000. "The edge-weighted clique problem: Valid inequalities, facets and polyhedral computations," European Journal of Operational Research, Elsevier, vol. 123(2), pages 346-371, June.
    7. Robert Tibshirani & Michael Saunders & Saharon Rosset & Ji Zhu & Keith Knight, 2005. "Sparsity and smoothness via the fused lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 67(1), pages 91-108, February.
    8. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. He Jiang, 2023. "Robust forecasting in spatial autoregressive model with total variation regularization," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 42(2), pages 195-211, March.
    2. Zhang, Yishi & Zhu, Ruilin & Chen, Zhijun & Gao, Jie & Xia, De, 2021. "Evaluating and selecting features via information theoretic lower bounds of feature inner correlations for high-dimensional data," European Journal of Operational Research, Elsevier, vol. 290(1), pages 235-247.
    3. Zemin Zheng & Jie Zhang & Yang Li, 2022. "L0-Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
    4. Yanhang Zhang & Junxian Zhu & Jin Zhu & Xueqin Wang, 2023. "A Splicing Approach to Best Subset of Groups Selection," INFORMS Journal on Computing, INFORMS, vol. 35(1), pages 104-119, January.
    5. Decui Liang & Fangshun Li & Xinyi Chen, 2024. "Failure mode and effect analysis by exploiting text mining and multi-view group consensus for the defect detection of electric vehicles in social media data," Annals of Operations Research, Springer, vol. 340(1), pages 289-324, September.
    6. Canhong Wen & Zhenduo Li & Ruipeng Dong & Yijin Ni & Wenliang Pan, 2023. "Simultaneous Dimension Reduction and Variable Selection for Multinomial Logistic Regression," INFORMS Journal on Computing, INFORMS, vol. 35(5), pages 1044-1060, September.
    7. Ali Hamzenejad & Saeid Jafarzadeh Ghoushchi & Vahid Baradaran & Abbas Mardani, 2020. "A Robust Algorithm for Classification and Diagnosis of Brain Disease Using Local Linear Approximation and Generalized Autoregressive Conditional Heteroscedasticity Model," Mathematics, MDPI, vol. 8(8), pages 1-19, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    2. Bang, Sungwan & Jhun, Myoungshic, 2012. "Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization," Computational Statistics & Data Analysis, Elsevier, vol. 56(4), pages 813-826.
    3. Qian, Junhui & Su, Liangjun, 2016. "Shrinkage estimation of common breaks in panel data models via adaptive group fused Lasso," Journal of Econometrics, Elsevier, vol. 191(1), pages 86-109.
    4. Gerhard Tutz & Margret-Ruth Oelker, 2017. "Modelling Clustered Heterogeneity: Fixed Effects, Random Effects and Mixtures," International Statistical Review, International Statistical Institute, vol. 85(2), pages 204-227, August.
    5. Justin B. Post & Howard D. Bondell, 2013. "Factor Selection and Structural Identification in the Interaction ANOVA Model," Biometrics, The International Biometric Society, vol. 69(1), pages 70-79, March.
    6. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    7. Bastien Marquis & Maarten Jansen, 2022. "Information criteria bias correction for group selection," Statistical Papers, Springer, vol. 63(5), pages 1387-1414, October.
    8. Dong, C. & Li, S., 2021. "Specification Lasso and an Application in Financial Markets," Cambridge Working Papers in Economics 2139, Faculty of Economics, University of Cambridge.
    9. Tomáš Plíhal, 2021. "Scheduled macroeconomic news announcements and Forex volatility forecasting," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 40(8), pages 1379-1397, December.
    10. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    11. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    12. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    13. Yuanyuan Shen & Katherine P. Liao & Tianxi Cai, 2015. "Sparse kernel machine regression for ordinal outcomes," Biometrics, The International Biometric Society, vol. 71(1), pages 63-70, March.
    14. Pei Wang & Shunjie Chen & Sijia Yang, 2022. "Recent Advances on Penalized Regression Models for Biological Data," Mathematics, MDPI, vol. 10(19), pages 1-24, October.
    15. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.
    16. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    17. Korobilis, Dimitris, 2013. "Hierarchical shrinkage priors for dynamic regressions with many predictors," International Journal of Forecasting, Elsevier, vol. 29(1), pages 43-59.
    18. Hu, Jianhua & Liu, Xiaoqian & Liu, Xu & Xia, Ningning, 2022. "Some aspects of response variable selection and estimation in multivariate linear regression," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    19. Xiaoping Liu & Xiao-Bai Li & Sumit Sarkar, 2023. "Cost-Restricted Feature Selection for Data Acquisition," Management Science, INFORMS, vol. 69(7), pages 3976-3992, July.
    20. Degui Li & Junhui Qian & Liangjun Su, 2016. "Panel Data Models With Interactive Fixed Effects and Multiple Structural Breaks," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1804-1819, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orijoc:v:32:y:2020:i:1:p:182-198. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.