
Bayesian Methods for Calibrating Health Policy Models: A Tutorial

Author

Listed:
  • Nicolas A. Menzies (Harvard T.H. Chan School of Public Health)
  • Djøra I. Soeteman (Harvard T.H. Chan School of Public Health)
  • Ankur Pandya (Harvard T.H. Chan School of Public Health)
  • Jane J. Kim (Harvard T.H. Chan School of Public Health)
Abstract

Mathematical simulation models are commonly used to inform health policy decisions. These health policy models represent the social and biological mechanisms that determine health and economic outcomes, combine multiple sources of evidence about how policy alternatives will impact those outcomes, and synthesize outcomes into summary measures salient for the policy decision. Calibrating these health policy models to fit empirical data can provide face validity and improve the quality of model predictions. Bayesian methods provide powerful tools for model calibration. These methods summarize information relevant to a particular policy decision into (1) prior distributions for model parameters, (2) structural assumptions of the model, and (3) a likelihood function created from the calibration data, combining these different sources of evidence via Bayes’ theorem. This article provides a tutorial on Bayesian approaches for model calibration, describing the theoretical basis for Bayesian calibration approaches as well as pragmatic considerations that arise in the tasks of creating calibration targets, estimating the posterior distribution, and obtaining results to inform the policy decision. These considerations, as well as the specific steps for implementing the calibration, are described in the context of an extended worked example about the policy choice to provide (or not provide) treatment for a hypothetical infectious disease. Given the many simplifications and subjective decisions required to create prior distributions, model structure, and likelihood, calibration should be considered an exercise in creating a reasonable model that produces valid evidence for policy, rather than as a technique for identifying a unique theoretically optimal summary of the evidence.
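
A minimal sketch of the workflow the abstract describes (draw parameters from their priors, run the model, weight each draw by its likelihood against the calibration targets, and resample to approximate the posterior via Bayes' theorem) is given below. It is illustrative only and not taken from the article: the two-parameter toy disease model, the uniform prior ranges, and the binomial calibration target of 30 cases in 300 surveyed are all assumptions.

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)

# (1) Prior distributions for two hypothetical model parameters
n_draws = 10_000
beta = rng.uniform(0.01, 0.50, n_draws)   # transmission rate (assumed prior)
gamma = rng.uniform(0.05, 0.50, n_draws)  # recovery rate (assumed prior)

# (2) Structural assumption: a toy model mapping parameters to an outcome.
# Here, the equilibrium prevalence of a simple SIS-type model, 1 - gamma/beta,
# floored at zero when recovery outpaces transmission.
def modelled_prevalence(beta, gamma):
    return np.clip(1.0 - gamma / beta, 0.0, 1.0)

prev = modelled_prevalence(beta, gamma)

# (3) Likelihood from a calibration target: an assumed survey observing
# 30 infected individuals out of 300 sampled (binomial likelihood).
log_lik = binom.logpmf(30, 300, np.clip(prev, 1e-6, 1.0 - 1e-6))

# Combine via Bayes' theorem using sampling importance resampling (SIR):
# prior draws weighted by their likelihood approximate the posterior.
weights = np.exp(log_lik - log_lik.max())
weights /= weights.sum()
idx = rng.choice(n_draws, size=1_000, replace=True, p=weights)
posterior = np.column_stack([beta[idx], gamma[idx]])

print("Posterior means (beta, gamma):", posterior.mean(axis=0))
```

The resulting posterior draws can then be propagated through the policy model (for example, with and without treatment) to produce outcome distributions that reflect calibrated parameter uncertainty.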

Suggested Citation

  • Nicolas A. Menzies & Djøra I. Soeteman & Ankur Pandya & Jane J. Kim, 2017. "Bayesian Methods for Calibrating Health Policy Models: A Tutorial," PharmacoEconomics, Springer, vol. 35(6), pages 613-624, June.
  • Handle: RePEc:spr:pharme:v:35:y:2017:i:6:d:10.1007_s40273-017-0494-4
    DOI: 10.1007/s40273-017-0494-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40273-017-0494-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s40273-017-0494-4?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Vahab Vahdat & Oguzhan Alagoz & Jing Voon Chen & Leila Saoud & Bijan J. Borah & Paul J. Limburg, 2023. "Calibration and Validation of the Colorectal Cancer and Adenoma Incidence and Mortality (CRC-AIM) Microsimulation Model Using Deep Neural Networks," Medical Decision Making, , vol. 43(6), pages 719-736, August.
    2. C Marijn Hazelbag & Jonathan Dushoff & Emanuel M Dominic & Zinhle E Mthombothi & Wim Delva, 2020. "Calibration of individual-based models to epidemiological data: A systematic review," PLOS Computational Biology, Public Library of Science, vol. 16(5), pages 1-17, May.
    3. Penny R. Breeze & Hazel Squires & Kate Ennis & Petra Meier & Kate Hayes & Nik Lomax & Alan Shiell & Frank Kee & Frank de Vocht & Martin O’Flaherty & Nigel Gilbert & Robin Purshouse & Stewart Robinson , 2023. "Guidance on the use of complex systems models for economic evaluations of public health interventions," Health Economics, John Wiley & Sons, Ltd., vol. 32(7), pages 1603-1625, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. O'Neill, Donal, 2009. "A Cost-Benefit Analysis of Early Childhood Intervention: Evidence from a Randomised Evaluation of a Parenting Programme," IZA Discussion Papers 4518, Institute of Labor Economics (IZA).
    2. Saha, Sanjib & Gerdtham, Ulf-G. & Toresson, Håkan & Minthon, Lennart & Jarl, Johan, 2018. "Economic Evaluation of Interventions for Screening of Dementia," Working Papers 2018:20, Lund University, Department of Economics.
    3. A. E. Ades & Karl Claxton & Mark Sculpher, 2006. "Evidence synthesis, parameter correlation and probabilistic sensitivity analysis," Health Economics, John Wiley & Sons, Ltd., vol. 15(4), pages 373-381, April.
    4. Lakdawalla, Darius N. & Seabury, Seth A., 2012. "The welfare effects of medical malpractice liability," International Review of Law and Economics, Elsevier, vol. 32(4), pages 356-369.
    5. Pedram Sendi & Huldrych F Günthard & Mathew Simcock & Bruno Ledergerber & Jörg Schüpbach & Manuel Battegay & for the Swiss HIV Cohort Study, 2007. "Cost-Effectiveness of Genotypic Antiretroviral Resistance Testing in HIV-Infected Patients with Treatment Failure," PLOS ONE, Public Library of Science, vol. 2(1), pages 1-8, January.
    6. Karl Claxton & Elisabeth Fenwick & Mark J. Sculpher, 2012. "Decision-making with Uncertainty: The Value of Information," Chapters, in: Andrew M. Jones (ed.), The Elgar Companion to Health Economics, Second Edition, chapter 51, Edward Elgar Publishing.
    7. Saha, Sanjib & Gerdtham, Ulf-G. & Toresson, Håkan & Minthon, Lennart & Jarl, Johan, 2018. "Economic Evaluation of Nonpharmacological Interventions for Dementia Patients and their Caregivers - A Systematic Literature Review," Working Papers 2018:10, Lund University, Department of Economics.
    8. Salah Ghabri & Françoise F. Hamers & Jean Michel Josselin, 2016. "Exploring Uncertainty in Economic Evaluations of Drugs and Medical Devices: Lessons from the First Review of Manufacturers’ Submissions to the French National Authority for Health," PharmacoEconomics, Springer, vol. 34(6), pages 617-624, June.
    9. Sood Neeraj & Philipson Tomas J. & Huckfeldt Peter, 2013. "Quantifying the Value of Personalized Medicines: Evidence from COX-2 Inhibitors," Forum for Health Economics & Policy, De Gruyter, vol. 16(1), pages 101-122, April.
    10. Anthony O'Hagan & John W. Stevens, 2003. "Assessing and comparing costs: how robust are the bootstrap and methods based on asymptotic normality?," Health Economics, John Wiley & Sons, Ltd., vol. 12(1), pages 33-49, January.
    11. Jonathan Karnon, 2003. "Alternative decision modelling techniques for the evaluation of health care technologies: Markov processes versus discrete event simulation," Health Economics, John Wiley & Sons, Ltd., vol. 12(10), pages 837-848, October.
    12. Richard M. Nixon & David Wonderling & Richard D. Grieve, 2010. "Non‐parametric methods for cost‐effectiveness analysis: the central limit theorem and the bootstrap compared," Health Economics, John Wiley & Sons, Ltd., vol. 19(3), pages 316-333, March.
    13. Michal Jakubczyk, 2016. "Choosing from multiple alternatives in cost-effectiveness analysis with fuzzy willingness-to-pay/accept and uncertainty," KAE Working Papers 2016-006, Warsaw School of Economics, Collegium of Economic Analysis.
    14. S. Boyer & M. L. Nishimwe & L. Sagaon-Teyssier & L. March & S. Koulla-Shiro & M.-Q. Bousmah & R. Toby & M. P. Mpoudi-Etame & N. F. Ngom Gueye & A. Sawadogo & C. Kouanfack & L. Ciaffi & B. Spire & E. D, 2020. "Cost-Effectiveness of Three Alternative Boosted Protease Inhibitor-Based Second-Line Regimens in HIV-Infected Patients in West and Central Africa," PharmacoEconomics - Open, Springer, vol. 4(1), pages 45-60, March.
    15. K. Claxton & P. J. Neumann & S. S. Araki & M. C. Weinstein, "undated". "Bayesian Value-of-Information Analysis: An Application to a Policy Model of Alzheimer's Disease," Discussion Papers 00/39, Department of Economics, University of York.
    16. Andrew R. Willan & Matthew E. Kowgier, 2008. "Cost‐effectiveness analysis of a multinational RCT with a binary measure of effectiveness and an interacting covariate," Health Economics, John Wiley & Sons, Ltd., vol. 17(7), pages 777-791, July.
    17. Andrea Manca & Nigel Rice & Mark J. Sculpher & Andrew H. Briggs, 2005. "Assessing generalisability by location in trial‐based cost‐effectiveness analysis: the use of multilevel models," Health Economics, John Wiley & Sons, Ltd., vol. 14(5), pages 471-485, May.
    18. Mbathio Dieng & Nikita Khanna & Nadine A. Kasparian & Daniel S. J. Costa & Phyllis N. Butow & Scott W. Menzies & Graham J. Mann & Anne E Cust & Rachael L. Morton, 2019. "Cost-Effectiveness of a Psycho-Educational Intervention Targeting Fear of Cancer Recurrence in People Treated for Early-Stage Melanoma," Applied Health Economics and Health Policy, Springer, vol. 17(5), pages 669-681, October.
    19. Martina Lundqvist & Jenny Alwin & Lars-Åke Levin, 2019. "Certified service dogs – A cost-effectiveness analysis appraisal," PLOS ONE, Public Library of Science, vol. 14(9), pages 1-13, September.
    20. John Mullahy, 2017. "Individual Results May Vary: Elementary Analytics of Inequality-Probability Bounds, with Applications to Health-Outcome Treatment Effects," NBER Working Papers 23603, National Bureau of Economic Research, Inc.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:pharme:v:35:y:2017:i:6:d:10.1007_s40273-017-0494-4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.