Printed from https://ideas.repec.org/a/gam/jmathe/v10y2022i21p3959-d952233.html

Relaxation Subgradient Algorithms with Machine Learning Procedures

Authors

Listed:
  • Vladimir Krutikov

    (Department of Applied Mathematics, Kemerovo State University, Krasnaya Street 6, Kemerovo 650043, Russia)

  • Svetlana Gutova

    (Department of Applied Mathematics, Kemerovo State University, Krasnaya Street 6, Kemerovo 650043, Russia)

  • Elena Tovbis

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, Prosp. Krasnoyarskiy Rabochiy 31, Krasnoyarsk 660031, Russia)

  • Lev Kazakovtsev

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, Prosp. Krasnoyarskiy Rabochiy 31, Krasnoyarsk 660031, Russia)

  • Eugene Semenkin

    (Institute of Informatics and Telecommunications, Reshetnev Siberian State University of Science and Technology, Prosp. Krasnoyarskiy Rabochiy 31, Krasnoyarsk 660031, Russia)

Abstract

In the modern digital economy, optimal decision support systems and machine learning systems are becoming integral to production processes. Artificial neural network training, like many other engineering problems, gives rise to high-dimensional optimization problems that are difficult to solve with traditional gradient or conjugate gradient methods. Relaxation subgradient minimization methods (RSMMs) construct a descent direction that forms an obtuse angle with all subgradients in a neighborhood of the current minimum, which reduces the task to solving a system of inequalities. Having formalized the model and accounted for the specific features of subgradient sets, we reduce the problem of solving a system of inequalities to an approximation problem and obtain an efficient, rapidly converging iterative learning algorithm for finding the descent direction, conceptually similar to the iterative least squares method. The new algorithm is theoretically substantiated, and an estimate of its convergence rate is obtained as a function of the parameters of the subgradient set. On this basis, we develop and substantiate a new RSMM that has the properties of the conjugate gradient method on quadratic functions. We also develop a practically realizable version of the minimization algorithm that uses a coarse one-dimensional search. A computational experiment on complex high-dimensional functions confirms the effectiveness of the proposed algorithm. In neural network training problems, where insignificant variables or neurons must be removed using methods such as the Tibshirani LASSO, the new algorithm outperforms known methods.
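The core idea in the abstract — collect subgradients from the current neighborhood and pick a descent direction that forms an obtuse angle with each of them — can be illustrated with a minimal sketch. This is not the paper's algorithm: the test function, the averaged-subgradient direction, the memory size, and the diminishing step rule (a stand-in for the coarse one-dimensional search) are all simplifying assumptions.

```python
import numpy as np

def subgradient(x):
    # Subgradient of the simple nonsmooth test function f(x) = |x1| + 2*|x2|;
    # the paper treats general convex functions.
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def descent_direction(subgrads):
    # Illustrative choice: the negated average of the collected subgradients.
    # When the collected subgradients lie in a common half-space, this
    # direction forms an obtuse angle with each of them, so it is a descent
    # direction for the nonsmooth function in that neighborhood.
    return -np.mean(subgrads, axis=0)

def rsmm_sketch(x0, steps=500, step=0.1, memory=5):
    x = x0.astype(float)
    collected = []
    for k in range(steps):
        collected.append(subgradient(x))
        collected = collected[-memory:]          # subgradients from the current neighborhood
        d = descent_direction(collected)
        x = x + (step / np.sqrt(k + 1)) * d      # diminishing step (stand-in for a 1-D search)
    return x

x_star = rsmm_sketch(np.array([3.0, -2.0]))      # minimizer of f is the origin
```

Near the minimizer the recent subgradients alternate in sign, their average nearly cancels, and the iterate stalls close to the origin — a toy version of why aggregating neighborhood subgradients stabilizes nonsmooth descent.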

Suggested Citation

  • Vladimir Krutikov & Svetlana Gutova & Elena Tovbis & Lev Kazakovtsev & Eugene Semenkin, 2022. "Relaxation Subgradient Algorithms with Machine Learning Procedures," Mathematics, MDPI, vol. 10(21), pages 1-33, October.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:21:p:3959-:d:952233

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/21/3959/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/21/3959/
    Download Restriction: no

    References listed on IDEAS

    1. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(1).
    2. Pavel Dvurechensky & Alexander Gasnikov, 2016. "Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 171(1), pages 121-145, October.
    3. Nesterov, Yurii, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Citations



    Cited by:

    1. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    2. Liliya A. Demidova, 2023. "Applied and Computational Mathematics for Digital Environments," Mathematics, MDPI, vol. 11(7), pages 1-5, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    2. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
    3. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    4. Ernesto Carrella & Richard M. Bailey & Jens Koed Madsen, 2018. "Indirect inference through prediction," Papers 1807.01579, arXiv.org.
    5. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    6. Mkhadri, Abdallah & Ouhourane, Mohamed, 2013. "An extended variable inclusion and shrinkage algorithm for correlated variables," Computational Statistics & Data Analysis, Elsevier, vol. 57(1), pages 631-644.
    7. Masakazu Higuchi & Mitsuteru Nakamura & Shuji Shinohara & Yasuhiro Omiya & Takeshi Takano & Daisuke Mizuguchi & Noriaki Sonota & Hiroyuki Toda & Taku Saito & Mirai So & Eiji Takayama & Hiroo Terashi &, 2022. "Detection of Major Depressive Disorder Based on a Combination of Voice Features: An Exploratory Approach," IJERPH, MDPI, vol. 19(18), pages 1-13, September.
    8. Susan Athey & Guido W. Imbens & Stefan Wager, 2018. "Approximate residual balancing: debiased inference of average treatment effects in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 80(4), pages 597-623, September.
    9. Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
    10. Chen, Le-Yu & Lee, Sokbae, 2018. "Best subset binary prediction," Journal of Econometrics, Elsevier, vol. 206(1), pages 39-56.
    11. Perrot-Dockès Marie & Lévy-Leduc Céline & Chiquet Julien & Sansonnet Laure & Brégère Margaux & Étienne Marie-Pierre & Robin Stéphane & Genta-Jouve Grégory, 2018. "A variable selection approach in the multivariate linear model: an application to LC-MS metabolomics data," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 17(5), pages 1-14, October.
    12. Fan, Jianqing & Jiang, Bai & Sun, Qiang, 2022. "Bayesian factor-adjusted sparse regression," Journal of Econometrics, Elsevier, vol. 230(1), pages 3-19.
    13. Chuliá, Helena & Garrón, Ignacio & Uribe, Jorge M., 2024. "Daily growth at risk: Financial or real drivers? The answer is not always the same," International Journal of Forecasting, Elsevier, vol. 40(2), pages 762-776.
    14. Jun Li & Serguei Netessine & Sergei Koulayev, 2018. "Price to Compete … with Many: How to Identify Price Competition in High-Dimensional Space," Management Science, INFORMS, vol. 64(9), pages 4118-4136, September.
    15. Sung Jae Jun & Sokbae Lee, 2024. "Causal Inference Under Outcome-Based Sampling with Monotonicity Assumptions," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 42(3), pages 998-1009, July.
    16. Rina Friedberg & Julie Tibshirani & Susan Athey & Stefan Wager, 2018. "Local Linear Forests," Papers 1807.11408, arXiv.org, revised Sep 2020.
    17. Xiangwei Li & Thomas Delerue & Ben Schöttker & Bernd Holleczek & Eva Grill & Annette Peters & Melanie Waldenberger & Barbara Thorand & Hermann Brenner, 2022. "Derivation and validation of an epigenetic frailty risk score in population-based cohorts of older adults," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    18. Hewamalage, Hansika & Bergmeir, Christoph & Bandara, Kasun, 2021. "Recurrent Neural Networks for Time Series Forecasting: Current status and future directions," International Journal of Forecasting, Elsevier, vol. 37(1), pages 388-427.
    19. Hui Xiao & Yiguo Sun, 2020. "Forecasting the Returns of Cryptocurrency: A Model Averaging Approach," JRFM, MDPI, vol. 13(11), pages 1-15, November.
    20. Christopher J Greenwood & George J Youssef & Primrose Letcher & Jacqui A Macdonald & Lauryn J Hagg & Ann Sanson & Jenn Mcintosh & Delyse M Hutchinson & John W Toumbourou & Matthew Fuller-Tyszkiewicz &, 2020. "A comparison of penalised regression methods for informing the selection of predictive markers," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-14, November.
