Printed from https://ideas.repec.org/a/spr/jglopt/v73y2019i2d10.1007_s10898-018-0701-7.html

Global optimization issues in deep network regression: an overview

Author

Listed:
  • Laura Palagi

    (Sapienza - University of Rome)

Abstract

The paper presents an overview of global issues in optimization methods for training feedforward neural networks (FNNs) in a regression setting. We first recall the learning optimization paradigm for FNNs and briefly discuss global schemes for the joint choice of the network topology and the network parameters. The main part of the paper focuses on the core subproblem, namely the continuous unconstrained (regularized) weights optimization problem, with the aim of reviewing global methods arising both in multilayer perceptron/deep networks and in radial basis networks. We review some recent results on the existence of non-global stationary points of the unconstrained nonlinear problem and on the role of determining a global solution in a supervised learning paradigm. Local algorithms that are widely used to solve the continuous unconstrained problems are addressed, with a focus on possible improvements that exploit the global properties. Hybrid global methods specifically devised for FNN training optimization problems, which embed local algorithms, are also discussed.
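The core subproblem the abstract refers to can be sketched as follows. This is an illustrative toy example, not the paper's formulation: a one-hidden-layer feedforward network for regression, whose regularized mean-squared-error objective is minimized by plain gradient descent (a typical local algorithm of the kind the paper reviews). All sizes, data, and hyperparameters are assumptions chosen for the sketch.

```python
import numpy as np

# Continuous unconstrained regularized weights optimization (illustrative):
#   min_w  (1/2P) * sum_p (y_p - f(x_p; w))^2 + (rho/2) * ||w||^2
# for a one-hidden-layer tanh network f, solved by gradient descent.

rng = np.random.default_rng(0)
P, n_in, n_hid = 64, 3, 8           # samples, inputs, hidden units (assumed sizes)
X = rng.normal(size=(P, n_in))
y = np.sin(X.sum(axis=1))           # synthetic regression target

W1 = rng.normal(scale=0.5, size=(n_in, n_hid))   # input-to-hidden weights
w2 = rng.normal(scale=0.5, size=n_hid)           # hidden-to-output weights
rho, lr = 1e-4, 0.05                # regularization weight and step size

def loss(W1, w2):
    H = np.tanh(X @ W1)             # hidden-layer activations
    r = H @ w2 - y                  # residuals
    reg = 0.5 * rho * (np.sum(W1**2) + np.sum(w2**2))
    return 0.5 * np.mean(r**2) + reg

loss0 = loss(W1, w2)
for _ in range(500):
    H = np.tanh(X @ W1)
    r = H @ w2 - y
    # gradients of the regularized mean-squared error
    g2 = H.T @ r / P + rho * w2
    G1 = X.T @ ((r[:, None] * w2) * (1 - H**2)) / P + rho * W1
    W1, w2 = W1 - lr * G1, w2 - lr * g2

print(f"regularized loss: {loss0:.4f} -> {loss(W1, w2):.4f}")
```

Such a local method only guarantees convergence to a stationary point; the non-global stationary points discussed in the paper are exactly what the hybrid global schemes it surveys try to escape.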

Suggested Citation

  • Laura Palagi, 2019. "Global optimization issues in deep network regression: an overview," Journal of Global Optimization, Springer, vol. 73(2), pages 239-277, February.
  • Handle: RePEc:spr:jglopt:v:73:y:2019:i:2:d:10.1007_s10898-018-0701-7
    DOI: 10.1007/s10898-018-0701-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10898-018-0701-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10898-018-0701-7?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hamm, Lonnie & Brorsen, B. Wade, 2002. "Global Optimization Methods," 2002 Annual Meeting, July 28-31, 2002, Long Beach, California 36631, Western Agricultural Economics Association.
    2. A. Bagirov & A. Rubinov & N. Soukhoroukova & J. Yearwood, 2003. "Unsupervised and supervised data classification via nonsmooth and global optimization," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 11(1), pages 1-75, June.
    3. Sexton, Randall S. & Dorsey, Robert E. & Johnson, John D., 1999. "Optimization of neural networks: A comparative analysis of the genetic algorithm and simulated annealing," European Journal of Operational Research, Elsevier, vol. 114(3), pages 589-601, May.
    4. Veronica Piccialli & Marco Sciandrone, 2018. "Nonlinear optimization and support vector machines," 4OR, Springer, vol. 16(2), pages 111-149, June.
    5. Dimitris Bertsimas & Romy Shioda, 2007. "Classification and Regression via Integer Optimization," Operations Research, INFORMS, vol. 55(2), pages 252-271, April.
    6. Pedro Duarte Silva, A., 2017. "Optimization approaches to Supervised Classification," European Journal of Operational Research, Elsevier, vol. 261(2), pages 772-788.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Liu, Longlong & Zhou, Suyu & Jie, Qian & Du, Pei & Xu, Yan & Wang, Jianzhou, 2024. "A robust time-varying weight combined model for crude oil price forecasting," Energy, Elsevier, vol. 299(C).
    2. Emilio Carrizosa & Cristina Molero-Río & Dolores Romero Morales, 2021. "Mathematical optimization in classification and regression trees," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 29(1), pages 5-33, April.
    3. Tommaso Colombo & Massimiliano Mangone & Andrea Bernetti & Marco Paoloni & Valter Santilli & Laura Palagi, 2019. "Supervised and unsupervised learning to classify scoliosis and healthy subjects based on non-invasive rasterstereography analysis," DIAG Technical Reports 2019-08, Department of Computer, Control and Management Engineering, Universita' degli Studi di Roma "La Sapienza".
    4. Corrado Coppola & Lorenzo Papa & Marco Boresta & Irene Amerini & Laura Palagi, 2024. "Tuning parameters of deep neural network training algorithms pays off: a computational study," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(3), pages 579-620, October.
    5. Laura Palagi & Ruggiero Seccia, 2020. "Block layer decomposition schemes for training deep neural networks," Journal of Global Optimization, Springer, vol. 77(1), pages 97-124, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Laura Palagi, 2017. "Global Optimization issues in Supervised Learning. An overview," DIAG Technical Reports 2017-11, Department of Computer, Control and Management Engineering, Universita' degli Studi di Roma "La Sapienza".
    2. Emilio Carrizosa & Cristina Molero-Río & Dolores Romero Morales, 2021. "Mathematical optimization in classification and regression trees," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 29(1), pages 5-33, April.
    3. Rubinov, A.M. & Soukhorokova, N.V. & Ugon, J., 2006. "Classes and clusters in data analysis," European Journal of Operational Research, Elsevier, vol. 173(3), pages 849-865, September.
    4. Yves Crama & Michel Grabisch & Silvano Martello, 2022. "Preface," Annals of Operations Research, Springer, vol. 314(1), pages 1-3, July.
    5. Christopher Boyer & B. Brorsen, 2014. "Implications of a Reserve Price in an Agent-Based Common-Value Auction," Computational Economics, Springer;Society for Computational Economics, vol. 43(1), pages 33-51, January.
    6. Karmitsa, Napsu & Bagirov, Adil M. & Taheri, Sona, 2017. "New diagonal bundle method for clustering problems in large data sets," European Journal of Operational Research, Elsevier, vol. 263(2), pages 367-379.
    7. Young Woong Park & Yan Jiang & Diego Klabjan & Loren Williams, 2017. "Algorithms for Generalized Clusterwise Linear Regression," INFORMS Journal on Computing, INFORMS, vol. 29(2), pages 301-317, May.
    8. Geraint Johnes, 2000. "Up Around the Bend: Linear and nonlinear models of the UK economy compared," International Review of Applied Economics, Taylor & Francis Journals, vol. 14(4), pages 485-493.
    9. Pendharkar, Parag C., 2002. "A computational study on the performance of artificial neural networks under changing structural design and data distribution," European Journal of Operational Research, Elsevier, vol. 138(1), pages 155-177, April.
    10. Joo, Rocío & Bertrand, Sophie & Chaigneau, Alexis & Ñiquen, Miguel, 2011. "Optimization of an artificial neural network for identifying fishing set positions from VMS data: An example from the Peruvian anchovy purse seine fishery," Ecological Modelling, Elsevier, vol. 222(4), pages 1048-1059.
    11. Brandner, Hubertus & Lessmann, Stefan & Voß, Stefan, 2013. "A memetic approach to construct transductive discrete support vector machines," European Journal of Operational Research, Elsevier, vol. 230(3), pages 581-595.
    12. Araújo, Paulo H. M. & Campêlo, Manoel & Corrêa, Ricardo C. & Labbé, Martine, 2024. "Integer programming models and polyhedral study for the geodesic classification problem on graphs," European Journal of Operational Research, Elsevier, vol. 314(3), pages 894-911.
    13. Gambella, Claudio & Ghaddar, Bissan & Naoum-Sawaya, Joe, 2021. "Optimization problems for machine learning: A survey," European Journal of Operational Research, Elsevier, vol. 290(3), pages 807-828.
    14. Blanquero, Rafael & Carrizosa, Emilio & Molero-Río, Cristina & Romero Morales, Dolores, 2020. "Sparsity in optimal randomized classification trees," European Journal of Operational Research, Elsevier, vol. 284(1), pages 255-272.
    15. Astorino, Annabella & Avolio, Matteo & Fuduli, Antonio, 2022. "A maximum-margin multisphere approach for binary Multiple Instance Learning," European Journal of Operational Research, Elsevier, vol. 299(2), pages 642-652.
    16. Baldomero-Naranjo, Marta & Martínez-Merino, Luisa I. & Rodríguez-Chía, Antonio M., 2020. "Tightening big Ms in integer programming formulations for support vector machines with ramp loss," European Journal of Operational Research, Elsevier, vol. 286(1), pages 84-100.
    17. Sandra Benítez-Peña & Rafael Blanquero & Emilio Carrizosa & Pepa Ramírez-Cobo, 2019. "On support vector machines under a multiple-cost scenario," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 13(3), pages 663-682, September.
    18. Miriyala, Srinivas Soumitri & Subramanian, Venkat & Mitra, Kishalay, 2018. "TRANSFORM-ANN for online optimization of complex industrial processes: Casting process as case study," European Journal of Operational Research, Elsevier, vol. 264(1), pages 294-309.
    19. Gupta, Jatinder N. D. & Sexton, Randall S., 1999. "Comparing backpropagation with a genetic algorithm for neural network training," Omega, Elsevier, vol. 27(6), pages 679-684, December.
    20. Ruslan Abdulkadirov & Pavel Lyakhov & Nikolay Nagornov, 2023. "Survey of Optimization Algorithms in Modern Neural Networks," Mathematics, MDPI, vol. 11(11), pages 1-37, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jglopt:v:73:y:2019:i:2:d:10.1007_s10898-018-0701-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.