
Mathematically aggregating experts’ predictions of possible futures

Author

Listed:
  • A M Hanea
  • D P Wilkinson
  • M McBride
  • A Lyon
  • D van Ravenzwaaij
  • F Singleton Thorn
  • C Gray
  • D R Mandel
  • A Willcox
  • E Gould
  • E T Smith
  • F Mody
  • M Bush
  • F Fidler
  • H Fraser
  • B C Wintle

Abstract

Structured protocols offer a transparent and systematic way to elicit and aggregate probabilistic predictions from multiple experts. These judgements can be combined behaviourally or mathematically to derive a final group prediction. Mathematical rules (e.g., weighted linear combinations of judgements) provide an objective approach to aggregation. The quality of an aggregation can be defined in terms of accuracy, calibration and informativeness. These measures can be used to compare different aggregation approaches and to help decide which aggregation produces the “best” final prediction. When experts’ performance can be scored on similar questions ahead of time, these scores can be translated into performance-based weights, and a performance-weighted aggregation can then be used. When this is not possible, however, several other aggregation methods, informed by measurable proxies for good performance, can be formulated and compared. Here, we develop a suite of aggregation methods, informed by previous experience and the available literature. We differentially weight our experts’ estimates by measures of reasoning, engagement, openness to changing their mind, informativeness, prior knowledge, and the extremity, asymmetry or granularity of their estimates. We then investigate the relative performance of these aggregation methods on three datasets. The main goal of this research is to explore how measures of individuals’ knowledge and behaviour can be leveraged to produce a better-performing combined group judgement. Although the accuracy, calibration, and informativeness of most methods are very similar, a few aggregation methods consistently distinguish themselves as among the best or worst. Moreover, most methods outperform the usual benchmarks provided by the simple average or the median of estimates.
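The abstract mentions two ideas that are easy to make concrete: scoring experts on seed questions with known answers, and using those scores as weights in a linear combination of their probability estimates. The sketch below illustrates this with an inverse-Brier-score weighting scheme; the function names and the particular weighting rule are illustrative assumptions, not the paper's exact method.

```python
def brier_score(probs, outcomes):
    """Mean squared error between probabilistic forecasts and 0/1 outcomes.

    Lower is better; a perfectly calibrated, perfectly accurate expert
    scores 0.
    """
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def performance_weights(calibration):
    """Turn each expert's Brier score on seed questions into a weight.

    `calibration` maps expert -> (list of forecasts, list of outcomes).
    Better (lower) Brier scores receive larger weights; the weights are
    normalised to sum to 1. The inverse-score rule is one simple choice
    among many.
    """
    scores = {e: brier_score(p, o) for e, (p, o) in calibration.items()}
    inv = {e: 1.0 / (s + 1e-9) for e, s in scores.items()}  # avoid /0
    total = sum(inv.values())
    return {e: w / total for e, w in inv.items()}

def linear_pool(estimates, weights):
    """Weighted linear combination of the experts' probability estimates."""
    return sum(weights[e] * p for e, p in estimates.items())

# Hypothetical example: expert A was well calibrated on two seed
# questions, expert B less so, so A dominates the pooled forecast.
calibration = {
    "A": ([0.9, 0.1], [1, 0]),
    "B": ([0.6, 0.4], [1, 0]),
}
weights = performance_weights(calibration)
pooled = linear_pool({"A": 0.8, "B": 0.5}, weights)
```

Setting all weights equal to 1/n recovers the simple-average benchmark that the paper compares against, which is why the linear pool is a convenient common frame for the different weighting schemes.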

Suggested Citation

  • A M Hanea & D P Wilkinson & M McBride & A Lyon & D van Ravenzwaaij & F Singleton Thorn & C Gray & D R Mandel & A Willcox & E Gould & E T Smith & F Mody & M Bush & F Fidler & H Fraser & B C Wintle, 2021. "Mathematically aggregating experts’ predictions of possible futures," PLOS ONE, Public Library of Science, vol. 16(9), pages 1-24, September.
  • Handle: RePEc:plo:pone00:0256919
    DOI: 10.1371/journal.pone.0256919

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0256919
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0256919&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0256919?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Anca M. Hanea & Marissa F. McBride & Mark A. Burgman & Bonnie C. Wintle, 2018. "The Value of Performance Weights and Discussion in Aggregated Expert Judgments," Risk Analysis, John Wiley & Sons, vol. 38(9), pages 1781-1794, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mauksch, Stefanie & von der Gracht, Heiko A. & Gordon, Theodore J., 2020. "Who is an expert for foresight? A review of identification methods," Technological Forecasting and Social Change, Elsevier, vol. 154(C).
    2. Ren, Xin & Nane, Gabriela F. & Terwel, Karel C. & van Gelder, Pieter H.A.J.M., 2024. "Measuring the impacts of human and organizational factors on human errors in the Dutch construction industry using structured expert judgement," Reliability Engineering and System Safety, Elsevier, vol. 244(C).



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.