Printed from https://ideas.repec.org/a/eee/csdana/v56y2012i12p3909-3920.html

EM vs MM: A case study

Author

Listed:
  • Zhou, Hua
  • Zhang, Yiwen

Abstract

The celebrated expectation–maximization (EM) algorithm is one of the most widely used optimization methods in statistics. In recent years it has been realized that the EM algorithm is a special case of the more general minorization–maximization (MM) principle. Both algorithms create a surrogate function in the first (E or minorization) step that is maximized in the second M step. This two-step process always drives the objective function uphill and is iterated until the parameters converge. The two algorithms differ in how the surrogate function is constructed. The expectation step of the EM algorithm relies on calculating conditional expectations, while the minorization step of the MM algorithm builds on crafty use of inequalities. For many problems, EM and MM derivations yield the same algorithm. This expository note walks through the construction of both algorithms for estimating the parameters of the Dirichlet-Multinomial distribution. This particular case is of interest because the EM and MM derivations lead to two different algorithms with completely distinct operating characteristics. The EM algorithm converges quickly but involves solving a nontrivial maximization problem in the M step. In contrast, the MM updates are extremely simple but converge slowly. An EM–MM hybrid algorithm is derived that converges faster than the MM algorithm in certain parameter regimes. The local convergence rates of the three algorithms are studied theoretically from the unifying MM point of view and also compared on numerical examples.
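The "extremely simple" MM updates mentioned in the abstract can be illustrated with a short sketch. This is not the authors' code; it is a hedged reconstruction of the standard multiplicative MM update for the Dirichlet-Multinomial maximum likelihood problem, obtained from the identity ln Γ(α + x) − ln Γ(α) = Σ_{k=0}^{x−1} ln(α + k) and the usual minorizing inequalities. All function names and the toy data are illustrative.

```python
# Sketch of an MM iteration for Dirichlet-Multinomial MLE (illustrative, not
# the paper's implementation). X holds count vectors; alpha holds parameters.
from math import lgamma

def loglik(alpha, X):
    """Dirichlet-Multinomial log-likelihood, up to the multinomial constant."""
    s = sum(alpha)
    ll = 0.0
    for x in X:
        m = sum(x)
        ll += lgamma(s) - lgamma(s + m)
        ll += sum(lgamma(a + xi) - lgamma(a) for a, xi in zip(alpha, x))
    return ll

def mm_update(alpha, X):
    """One MM step: each alpha_j gets a simple multiplicative update
    (ratio of partial harmonic sums); no inner maximization is needed."""
    s = sum(alpha)
    denom = sum(sum(1.0 / (s + k) for k in range(sum(x))) for x in X)
    return [a * sum(sum(1.0 / (a + k) for k in range(x[j])) for x in X) / denom
            for j, a in enumerate(alpha)]

# Toy data: four observations of counts over three categories.
X = [[3, 1, 2], [0, 4, 1], [2, 2, 2], [5, 0, 1]]
alpha = [1.0, 1.0, 1.0]
for _ in range(50):
    alpha = mm_update(alpha, X)   # log-likelihood increases monotonically
```

Each MM step is a one-line multiplicative update per parameter, which is what makes the iteration cheap; the trade-off, as the abstract notes, is slow (linear) convergence compared with EM.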

Suggested Citation

  • Zhou, Hua & Zhang, Yiwen, 2012. "EM vs MM: A case study," Computational Statistics & Data Analysis, Elsevier, vol. 56(12), pages 3909-3920.
  • Handle: RePEc:eee:csdana:v:56:y:2012:i:12:p:3909-3920
    DOI: 10.1016/j.csda.2012.05.018

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947312002174
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2012.05.018?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hunter D.R. & Lange K., 2004. "A Tutorial on MM Algorithms," The American Statistician, American Statistical Association, vol. 58, pages 30-37, February.
    2. Ionita-Laza Iuliana & Laird Nan M, 2010. "On the Optimal Design of Genetic Variant Discovery Studies," Statistical Applications in Genetics and Molecular Biology, De Gruyter, vol. 9(1), pages 1-17, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Kenneth Lange & Eric C. Chi & Hua Zhou, 2014. "A Brief Survey of Modern Optimization for Statisticians," International Statistical Review, International Statistical Institute, vol. 82(1), pages 46-70, April.
    2. Nguyen, Hien D. & McLachlan, Geoffrey J., 2016. "Maximum likelihood estimation of triangular and polygonal distributions," Computational Statistics & Data Analysis, Elsevier, vol. 102(C), pages 23-36.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sakyajit Bhattacharya & Paul McNicholas, 2014. "A LASSO-penalized BIC for mixture model selection," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 8(1), pages 45-61, March.
    2. Utkarsh J. Dang & Michael P.B. Gallaugher & Ryan P. Browne & Paul D. McNicholas, 2023. "Model-Based Clustering and Classification Using Mixtures of Multivariate Skewed Power Exponential Distributions," Journal of Classification, Springer;The Classification Society, vol. 40(1), pages 145-167, April.
    3. V. Maume-Deschamps & D. Rullière & A. Usseglio-Carleve, 2018. "Spatial Expectile Predictions for Elliptical Random Fields," Methodology and Computing in Applied Probability, Springer, vol. 20(2), pages 643-671, June.
    4. Sanjeena Subedi & Drew Neish & Stephen Bak & Zeny Feng, 2020. "Cluster analysis of microbiome data by using mixtures of Dirichlet–multinomial regression models," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 69(5), pages 1163-1187, November.
    5. Lian, Heng & Meng, Jie & Fan, Zengyan, 2015. "Simultaneous estimation of linear conditional quantiles with penalized splines," Journal of Multivariate Analysis, Elsevier, vol. 141(C), pages 1-21.
    6. Tian, Guo-Liang & Tang, Man-Lai & Liu, Chunling, 2012. "Accelerating the quadratic lower-bound algorithm via optimizing the shrinkage parameter," Computational Statistics & Data Analysis, Elsevier, vol. 56(2), pages 255-265.
    7. Vu, Duy & Aitkin, Murray, 2015. "Variational algorithms for biclustering models," Computational Statistics & Data Analysis, Elsevier, vol. 89(C), pages 12-24.
    8. Gunter Maris & Han Maas, 2012. "Speed-Accuracy Response Models: Scoring Rules based on Response Time and Accuracy," Psychometrika, Springer;The Psychometric Society, vol. 77(4), pages 615-633, October.
    9. Yen, Tso-Jung & Yen, Yu-Min, 2016. "Structured variable selection via prior-induced hierarchical penalty functions," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 87-103.
    10. Ouindllassida Jean-Etienne Ouédraogo & Edoh Katchekpele & Simplice Dossou-Gbété, 2021. "Marginalized Maximum Likelihood for Parameters Estimation of the Three Parameter Weibull Distribution," International Journal of Statistics and Probability, Canadian Center of Science and Education, vol. 10(4), pages 1-62, July.
    11. Rasmus Lentz & Suphanit Piyapromdee & Jean-Marc Robin, 2022. "The Anatomy of Sorting - Evidence from Danish Data," Working Papers hal-03869383, HAL.
    12. Groenen, P.J.F. & Bioch, J.C. & Nalbantov, G.I., 2006. "Nonlinear support vector machines through iterative majorization and I-splines," Econometric Institute Research Papers EI 2006-25, Erasmus University Rotterdam, Erasmus School of Economics (ESE), Econometric Institute.
    13. Sharon M. McNicholas & Paul D. McNicholas & Daniel A. Ashlock, 2021. "An Evolutionary Algorithm with Crossover and Mutation for Model-Based Clustering," Journal of Classification, Springer;The Classification Society, vol. 38(2), pages 264-279, July.
    14. Yoshihiro Kanno, 2018. "Robust truss topology optimization via semidefinite programming with complementarity constraints: a difference-of-convex programming approach," Computational Optimization and Applications, Springer, vol. 71(2), pages 403-433, November.
    15. Hirose, Kei & Fujisawa, Hironori & Sese, Jun, 2017. "Robust sparse Gaussian graphical modeling," Journal of Multivariate Analysis, Elsevier, vol. 161(C), pages 172-190.
    16. Deng, Lifeng & Ding, Jieli & Liu, Yanyan & Wei, Chengdong, 2018. "Regression analysis for the proportional hazards model with parameter constraints under case-cohort design," Computational Statistics & Data Analysis, Elsevier, vol. 117(C), pages 194-206.
    17. Matthew Pietrosanu & Jueyu Gao & Linglong Kong & Bei Jiang & Di Niu, 2021. "Advanced algorithms for penalized quantile and composite quantile regression," Computational Statistics, Springer, vol. 36(1), pages 333-346, March.
    18. Nguyen Thai An & Daniel Giles & Nguyen Mau Nam & R. Blake Rector, 2016. "The Log-Exponential Smoothing Technique and Nesterov’s Accelerated Gradient Method for Generalized Sylvester Problems," Journal of Optimization Theory and Applications, Springer, vol. 168(2), pages 559-583, February.
    19. Groenen, P.J.F. & Nalbantov, G.I. & Bioch, J.C., 2007. "SVM-Maj: a majorization approach to linear support vector machines with different hinge errors," Econometric Institute Research Papers EI 2007-49, Erasmus University Rotterdam, Erasmus School of Economics (ESE), Econometric Institute.
    20. Ziping Zhao & Daniel P. Palomar, 2017. "Robust Maximum Likelihood Estimation of Sparse Vector Error Correction Model," Papers 1710.05513, arXiv.org.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:56:y:2012:i:12:p:3909-3920. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.