
Optimization problems in statistical learning: Duality and optimality conditions

Author

Listed:
  • Bot, Radu Ioan
  • Lorenz, Nicole

Abstract

Regularization methods are techniques for learning functions from given data. We consider regularization problems whose objective function consists of a cost function and a regularization term, with the aim of selecting a prediction function f with a finite representation that minimizes the prediction error. Here the role of the regularizer is to avoid overfitting. In general these are convex optimization problems with not necessarily differentiable objective functions. Thus, in order to provide optimality conditions for this class of problems, one needs to appeal to specific techniques from convex analysis. In this paper we provide a general approach for deriving necessary and sufficient optimality conditions for the regularized problem via the so-called conjugate duality theory. Afterwards we apply the obtained results to the Support Vector Machines problem and the Support Vector Regression problem, formulated for different cost functions.
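
To make the setup concrete: the problems above fit the template min over f of cost(data, f) + regularizer(f), and conjugate (Fenchel) duality attaches to each such primal problem a dual problem whose optimal value matches the primal one under a suitable constraint qualification. The sketch below is a minimal numerical illustration, not taken from the paper: it instantiates the template as ridge regression (squared-error cost, squared-norm regularizer), where both the primal problem and its conjugate dual admit closed-form solutions, and checks strong duality and the resulting optimality condition; the instance and all variable names are illustrative.

    import numpy as np

    # Illustrative instance (an assumption, not the paper's setting): ridge regression
    #   primal:  p* = min_w  0.5*||X w - y||^2 + 0.5*lam*||w||^2
    # Writing z = X w and dualizing the constraint z = X w via conjugate functions gives
    #   dual:    d* = max_u  -0.5*||u||^2 - y^T u - ||X^T u||^2 / (2*lam)
    # with strong duality p* = d* and the optimality condition w* = -X^T u* / lam.

    rng = np.random.default_rng(0)
    n, d, lam = 50, 5, 0.1
    X = rng.standard_normal((n, d))
    y = rng.standard_normal(n)

    # Primal optimum solves the normal equations (X^T X + lam I) w = X^T y.
    w = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
    p_star = 0.5 * np.sum((X @ w - y) ** 2) + 0.5 * lam * np.sum(w ** 2)

    # Dual optimum solves (I + X X^T / lam) u = -y.
    u = np.linalg.solve(np.eye(n) + (X @ X.T) / lam, -y)
    d_star = -0.5 * np.sum(u ** 2) - y @ u - np.sum((X.T @ u) ** 2) / (2 * lam)

    print("duality gap p* - d*:", p_star - d_star)                    # ~ 0 (strong duality)
    print("||w* + X^T u*/lam||:", np.max(np.abs(w + X.T @ u / lam)))  # ~ 0 (optimality condition)

For the SVM and SVR problems treated in the paper, the cost function (hinge or epsilon-insensitive loss) is nondifferentiable, so closed-form solves like the ones above are unavailable; this is precisely where the conjugate-duality optimality conditions derived in the paper replace the smooth calculus used in this illustration.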

Suggested Citation

  • Bot, Radu Ioan & Lorenz, Nicole, 2011. "Optimization problems in statistical learning: Duality and optimality conditions," European Journal of Operational Research, Elsevier, vol. 213(2), pages 395-404, September.
  • Handle: RePEc:eee:ejores:v:213:y:2011:i:2:p:395-404

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0377221711002323
    Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. R. I. Boţ & S. M. Grad & G. Wanka, 2007. "New Constraint Qualification and Conjugate Duality for Composed Convex Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 135(2), pages 241-255, November.
2. Nesterov, Yu., 2005. "Smooth minimization of non-smooth functions," LIDAM Reprints CORE 1819, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

1. Radu Boţ & Christopher Hendrich, 2015. "A variable smoothing algorithm for solving convex optimization problems," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer; Sociedad de Estadística e Investigación Operativa, vol. 23(1), pages 124-150, April.
    2. Toriello, Alejandro & Vielma, Juan Pablo, 2012. "Fitting piecewise linear continuous functions," European Journal of Operational Research, Elsevier, vol. 219(1), pages 86-95.
    3. Radu Boţ & André Heinrich, 2014. "Regression tasks in machine learning via Fenchel duality," Annals of Operations Research, Springer, vol. 222(1), pages 197-211, November.
    4. Corne, David & Dhaenens, Clarisse & Jourdan, Laetitia, 2012. "Synergies between operations research and data mining: The emerging use of multi-objective approaches," European Journal of Operational Research, Elsevier, vol. 221(3), pages 469-479.
    5. Brandner, Hubertus & Lessmann, Stefan & Voß, Stefan, 2013. "A memetic approach to construct transductive discrete support vector machines," European Journal of Operational Research, Elsevier, vol. 230(3), pages 581-595.
    6. Gambella, Claudio & Ghaddar, Bissan & Naoum-Sawaya, Joe, 2021. "Optimization problems for machine learning: A survey," European Journal of Operational Research, Elsevier, vol. 290(3), pages 807-828.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
2. Taylor, Adrien B. & Hendrickx, Julien M. & Glineur, François, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Nguyen Dinh & Dang Hai Long, 2022. "A Perturbation Approach to Vector Optimization Problems: Lagrange and Fenchel–Lagrange Duality," Journal of Optimization Theory and Applications, Springer, vol. 194(2), pages 713-748, August.
    4. Dimitris Bertsimas & Nishanth Mundru, 2021. "Sparse Convex Regression," INFORMS Journal on Computing, INFORMS, vol. 33(1), pages 262-279, January.
    5. Alexandre Belloni & Victor Chernozhukov & Lie Wang, 2013. "Pivotal estimation via square-root lasso in nonparametric regression," CeMMAP working papers CWP62/13, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
6. Devolder, Olivier & Glineur, François & Nesterov, Yurii, 2013. "First-order methods with inexact oracle: the strongly convex case," LIDAM Discussion Papers CORE 2013016, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    7. Chao, Shih-Kang & Härdle, Wolfgang K. & Yuan, Ming, 2021. "Factorisable Multitask Quantile Regression," Econometric Theory, Cambridge University Press, vol. 37(4), pages 794-816, August.
8. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer; German Classification Society - Gesellschaft für Klassifikation (GfKl); Japanese Classification Society (JCS); Classification and Data Analysis Group of the Italian Statistical Society (CLADAG); International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    9. Yunmei Chen & Xiaojing Ye & Wei Zhang, 2020. "Acceleration techniques for level bundle methods in weakly smooth convex constrained optimization," Computational Optimization and Applications, Springer, vol. 77(2), pages 411-432, November.
    10. Silvia Villa & Lorenzo Rosasco & Sofia Mosci & Alessandro Verri, 2014. "Proximal methods for the latent group lasso penalty," Computational Optimization and Applications, Springer, vol. 58(2), pages 381-407, June.
    11. Wenjie Huang & Xun Zhang, 2021. "Randomized Smoothing Variance Reduction Method for Large-Scale Non-smooth Convex Optimization," SN Operations Research Forum, Springer, vol. 2(2), pages 1-28, June.
    12. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
13. Devolder, Olivier, 2011. "Stochastic first order methods in smooth convex optimization," LIDAM Discussion Papers CORE 2011070, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    14. Boriss Siliverstovs, 2017. "Short-term forecasting with mixed-frequency data: a MIDASSO approach," Applied Economics, Taylor & Francis Journals, vol. 49(13), pages 1326-1343, March.
    15. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    16. David Müller & Vladimir Shikhman, 2022. "Network manipulation algorithm based on inexact alternating minimization," Computational Management Science, Springer, vol. 19(4), pages 627-664, October.
    17. Gondzio, Jacek, 2012. "Interior point methods 25 years later," European Journal of Operational Research, Elsevier, vol. 218(3), pages 587-601.
    18. A. Chambolle & Ch. Dossal, 2015. "On the Convergence of the Iterates of the “Fast Iterative Shrinkage/Thresholding Algorithm”," Journal of Optimization Theory and Applications, Springer, vol. 166(3), pages 968-982, September.
    19. Nima Rabiei & Jose Muñoz, 2015. "AAR-based decomposition algorithm for non-linear convex optimisation," Computational Optimization and Applications, Springer, vol. 62(3), pages 761-786, December.
20. Aravena, Ignacio & Papavasiliou, Anthony, 2016. "An Asynchronous Distributed Algorithm for solving Stochastic Unit Commitment," LIDAM Discussion Papers CORE 2016038, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:ejores:v:213:y:2011:i:2:p:395-404. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/eor.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.