
Sparse Support Tensor Machine with Scaled Kernel Functions

Authors
  • Shuangyue Wang

    (School of Mathematics and Statistics, Beijing Jiaotong University, Beijing 100044, China)

  • Ziyan Luo

    (School of Mathematics and Statistics, Beijing Jiaotong University, Beijing 100044, China)

Abstract

As one of the supervised tensor learning methods, the support tensor machine (STM) for tensorial data classification is receiving increasing attention in machine learning and related applications, including remote sensing imaging, video processing, fault diagnosis, etc. Existing STM approaches do not consider controlling the number of support tensors, which limits data reduction. To address this deficiency, we build a novel sparse STM model that controls the number of support tensors in the binary classification of tensorial data. The sparsity is imposed on the dual variables in the feature space, which facilitates nonlinear classification with kernel tricks, such as the widely used Gaussian RBF kernel. To alleviate the local risk associated with the constant width of the tensor Gaussian RBF kernel, we propose a two-stage classification approach: in the second stage, the kernel function is rescaled in a data-dependent way, using the support tensors obtained in the first stage. The essential optimization models in both stages are of the same type, non-convex and discontinuous due to the sparsity constraint. To resolve this computational challenge, a subspace Newton method is tailored to the sparsity-constrained optimization, offering efficient computation with local convergence. Numerical experiments on real datasets demonstrate the effectiveness of the proposed two-stage sparse STM approach in terms of classification accuracy, compared with state-of-the-art binary classification approaches.
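The kernel-scaling idea in the abstract can be sketched as follows. This is an illustrative reading only, not the paper's exact formulation: `rbf_kernel` is the plain tensor Gaussian RBF kernel with a constant width, and `scaled_rbf_kernel` replaces that constant width with a data-dependent scale computed from the stage-one support tensors. The function names and the particular scaling rule (mean squared Frobenius distance to the support tensors) are assumptions made for this sketch.

```python
import numpy as np


def rbf_kernel(X, Y, sigma=1.0):
    """Plain Gaussian RBF kernel on tensors:
    k(X, Y) = exp(-||X - Y||_F^2 / (2 * sigma^2)),
    with ||.||_F the Frobenius norm and a constant width sigma."""
    d2 = np.sum((X - Y) ** 2)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def scaled_rbf_kernel(X, Y, support_tensors, sigma=1.0):
    """Hypothetical stage-two kernel: the width is rescaled per input
    by its proximity to the support tensors found in stage one.
    The paper's actual scaling strategy may differ."""

    def local_scale(Z):
        # Mean squared Frobenius distance from Z to the stage-one
        # support tensors; inputs far from all support tensors get
        # a wider effective kernel.
        d2 = [np.sum((Z - S) ** 2) for S in support_tensors]
        return np.sqrt(np.mean(d2))

    s_x, s_y = local_scale(X), local_scale(Y)
    d2 = np.sum((X - Y) ** 2)
    # Small constant guards against a zero denominator when an input
    # coincides with every support tensor.
    return np.exp(-d2 / (sigma * s_x * s_y + 1e-12))
```

Both functions are symmetric in their two tensor arguments and return values in (0, 1], with k(X, X) = 1, so either can serve as a drop-in similarity for a dual SVM/STM solver.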

Suggested Citation

  • Shuangyue Wang & Ziyan Luo, 2023. "Sparse Support Tensor Machine with Scaled Kernel Functions," Mathematics, MDPI, vol. 11(13), pages 1-20, June.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:13:p:2829-:d:1178056

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/13/2829/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/13/2829/
    Download Restriction: no
    ---><---


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Henrik Seckler & Ralf Metzler, 2022. "Bayesian deep learning for error estimation in the analysis of anomalous diffusion," Nature Communications, Nature, vol. 13(1), pages 1-13, December.
    2. Iqbal H. Sarker, 2023. "Machine Learning for Intelligent Data Analysis and Automation in Cybersecurity: Current and Future Prospects," Annals of Data Science, Springer, vol. 10(6), pages 1473-1498, December.
    3. Loutfi, Ahmad Amine & Sun, Mengtao & Loutfi, Ijlal & Solibakke, Per Bjarte, 2022. "Empirical study of day-ahead electricity spot-price forecasting: Insights into a novel loss function for training neural networks," Applied Energy, Elsevier, vol. 319(C).
    4. Emma King-Smith & Felix A. Faber & Usa Reilly & Anton V. Sinitskiy & Qingyi Yang & Bo Liu & Dennis Hyek & Alpha A. Lee, 2024. "Predictive Minisci late stage functionalization with transfer learning," Nature Communications, Nature, vol. 15(1), pages 1-13, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:13:p:2829-:d:1178056. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.