
Optimization problems in statistical learning: Duality and optimality conditions

Author

Listed:
  • Bot, Radu Ioan
  • Lorenz, Nicole

Abstract

Regularization methods are techniques for learning functions from given data. We consider regularization problems whose objective function consists of a cost function and a regularization term, with the aim of selecting a prediction function f with a finite representation that minimizes the prediction error. Here the role of the regularizer is to avoid overfitting. In general these are convex optimization problems with not necessarily differentiable objective functions. Thus, in order to provide optimality conditions for this class of problems, one needs to appeal to specific techniques from convex analysis. In this paper we provide a general approach for deriving necessary and sufficient optimality conditions for the regularized problem via the so-called conjugate duality theory. Afterwards we apply the obtained results to the Support Vector Machines problem and the Support Vector Regression problem formulated for different cost functions.
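As a rough illustration of the class of problems treated (a generic sketch in notation of our own choosing, not the authors' exact formulation): given training data $$(x_i, y_i)$$, $$i = 1, \dots, n$$, a convex cost function $$c$$ and a convex regularizer $$\Omega$$, the regularized problem reads

$$\min_{f} \; \sum_{i=1}^{n} c\bigl(y_i, f(x_i)\bigr) + \lambda\,\Omega(f), \qquad \lambda > 0.$$

The conjugate duality theory mentioned in the abstract attaches to a composite problem $$\min_{x} \{ h(x) + g(Ax) \}$$, with $$h, g$$ proper convex functions and $$A$$ a linear operator, the Fenchel-type dual

$$\sup_{y^*} \; \bigl\{ -h^*(-A^* y^*) - g^*(y^*) \bigr\},$$

where $$h^*$$ and $$g^*$$ denote the convex conjugates. Weak duality always holds; strong duality, and with it the necessary and sufficient optimality conditions, requires a suitable constraint qualification.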

Suggested Citation

  • Bot, Radu Ioan & Lorenz, Nicole, 2011. "Optimization problems in statistical learning: Duality and optimality conditions," European Journal of Operational Research, Elsevier, vol. 213(2), pages 395-404, September.
  • Handle: RePEc:eee:ejores:v:213:y:2011:i:2:p:395-404

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0377221711002323
    Download Restriction: Full text for ScienceDirect subscribers only

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nesterov, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. R. I. Boţ & S. M. Grad & G. Wanka, 2007. "New Constraint Qualification and Conjugate Duality for Composed Convex Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 135(2), pages 241-255, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Toriello, Alejandro & Vielma, Juan Pablo, 2012. "Fitting piecewise linear continuous functions," European Journal of Operational Research, Elsevier, vol. 219(1), pages 86-95.
    2. Brandner, Hubertus & Lessmann, Stefan & Voß, Stefan, 2013. "A memetic approach to construct transductive discrete support vector machines," European Journal of Operational Research, Elsevier, vol. 230(3), pages 581-595.
    3. Radu Boţ & Christopher Hendrich, 2015. "A variable smoothing algorithm for solving convex optimization problems," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 23(1), pages 124-150, April.
    4. Corne, David & Dhaenens, Clarisse & Jourdan, Laetitia, 2012. "Synergies between operations research and data mining: The emerging use of multi-objective approaches," European Journal of Operational Research, Elsevier, vol. 221(3), pages 469-479.
    5. Gambella, Claudio & Ghaddar, Bissan & Naoum-Sawaya, Joe, 2021. "Optimization problems for machine learning: A survey," European Journal of Operational Research, Elsevier, vol. 290(3), pages 807-828.
    6. Radu Boţ & André Heinrich, 2014. "Regression tasks in machine learning via Fenchel duality," Annals of Operations Research, Springer, vol. 222(1), pages 197-211, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Vincenzo Bonifaci, 2021. "A Laplacian approach to $$\ell_1$$-norm minimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 441-469, June.
    2. Chao, Shih-Kang & Härdle, Wolfgang K. & Yuan, Ming, 2021. "Factorisable Multitask Quantile Regression," Econometric Theory, Cambridge University Press, vol. 37(4), pages 794-816, August.
    3. Xiubo Liang & Guoqiang Wang & Bo Yu, 2022. "A reduced proximal-point homotopy method for large-scale non-convex BQP," Computational Optimization and Applications, Springer, vol. 81(2), pages 539-567, March.
    4. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order o(1/k^2)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    5. Shipra Agrawal & Nikhil R. Devanur, 2019. "Bandits with Global Convex Constraints and Objective," Operations Research, INFORMS, vol. 67(5), pages 1486-1502, September.
    6. Samid Hoda & Andrew Gilpin & Javier Peña & Tuomas Sandholm, 2010. "Smoothing Techniques for Computing Nash Equilibria of Sequential Games," Mathematics of Operations Research, INFORMS, vol. 35(2), pages 494-512, May.
    7. Raphael Hauser & Sergey Shahverdyan, 2015. "A New Approach to Model Free Option Pricing," Papers 1501.03701, arXiv.org.
    8. Manlio Gaudioso & Giovanni Giallombardo & Giovanna Miglionico, 2022. "Essentials of numerical nonsmooth optimization," Annals of Operations Research, Springer, vol. 314(1), pages 213-253, July.
    9. Daniel Dadush & László A. Végh & Giacomo Zambelli, 2020. "Rescaling Algorithms for Linear Conic Feasibility," Mathematics of Operations Research, INFORMS, vol. 45(2), pages 732-754, May.
    10. Nishanth Dikkala & Greg Lewis & Lester Mackey & Vasilis Syrgkanis, 2020. "Minimax Estimation of Conditional Moment Models," Papers 2006.07201, arXiv.org.
    11. Dirk Lorenz & Marc Pfetsch & Andreas Tillmann, 2014. "An infeasible-point subgradient method using adaptive approximate projections," Computational Optimization and Applications, Springer, vol. 57(2), pages 271-306, March.
    12. Chunming Tang & Bo He & Zhenzhen Wang, 2020. "Modified Accelerated Bundle-Level Methods and Their Application in Two-Stage Stochastic Programming," Mathematics, MDPI, vol. 8(2), pages 1-26, February.
    13. Maurya, Ashwini, 2014. "A joint convex penalty for inverse covariance matrix estimation," Computational Statistics & Data Analysis, Elsevier, vol. 75(C), pages 15-27.
    14. Jueyou Li & Zhiyou Wu & Changzhi Wu & Qiang Long & Xiangyu Wang, 2016. "An Inexact Dual Fast Gradient-Projection Method for Separable Convex Optimization with Linear Coupled Constraints," Journal of Optimization Theory and Applications, Springer, vol. 168(1), pages 153-171, January.
    15. Nguyen Thai An & Daniel Giles & Nguyen Mau Nam & R. Blake Rector, 2016. "The Log-Exponential Smoothing Technique and Nesterov’s Accelerated Gradient Method for Generalized Sylvester Problems," Journal of Optimization Theory and Applications, Springer, vol. 168(2), pages 559-583, February.
    16. Guoyin Li & Alfred Ma & Ting Pong, 2014. "Robust least square semidefinite programming with applications," Computational Optimization and Applications, Springer, vol. 58(2), pages 347-379, June.
    17. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    18. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    19. Chen, Xuerong & Li, Haoqi & Liang, Hua & Lin, Huazhen, 2019. "Functional response regression analysis," Journal of Multivariate Analysis, Elsevier, vol. 169(C), pages 218-233.
    20. Nguyen Dinh & Dang Hai Long, 2022. "A Perturbation Approach to Vector Optimization Problems: Lagrange and Fenchel–Lagrange Duality," Journal of Optimization Theory and Applications, Springer, vol. 194(2), pages 713-748, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ejores:v:213:y:2011:i:2:p:395-404. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/eor .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.