
Towards improving the framework for probabilistic forecast evaluation

Author

Listed:
  • Leonard Smith
  • Emma Suckling
  • Erica Thompson
  • Trevor Maynard
  • Hailiang Du

Abstract

The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), the Proper Linear (PL) score, and I.J. Good’s logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and remains sensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counterintuitive evaluations by CRPS. Benchmark forecasts from empirical models such as Dynamic Climatology place the scores in context. Comparing scores for forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than comparing forecast systems based on similar physical simulation models only with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based upon HadGEM2, and reasons for these results are suggested. Forecasts of aggregate data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average “distance” between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved. Copyright The Author(s) 2015
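
Editor's note (not part of the published abstract): the behaviour of the scores named above can be illustrated with closed-form expressions for a Gaussian forecast density. The Python sketch below is a minimal illustration using hypothetical numbers; it is not the authors' code, and the quadratic score shown is only one common proper variant of a linear-in-density score, which may differ in normalisation from the paper's Proper Linear (PL) score. All scores are negatively oriented (smaller is better).

    # Minimal sketch: Ignorance, CRPS and a quadratic score for a Gaussian
    # forecast N(mu, sigma^2) evaluated at an outcome y.
    # Hypothetical numbers; not the authors' implementation.
    import numpy as np
    from scipy.stats import norm

    def ignorance(mu, sigma, y):
        # I.J. Good's logarithmic score: -log2 of the forecast density at the outcome.
        return -np.log2(norm.pdf(y, loc=mu, scale=sigma))

    def crps_gaussian(mu, sigma, y):
        # Closed-form CRPS for a Gaussian forecast (Gneiting & Raftery 2007).
        z = (y - mu) / sigma
        return sigma * (z * (2.0 * norm.cdf(z) - 1.0)
                        + 2.0 * norm.pdf(z)
                        - 1.0 / np.sqrt(np.pi))

    def quadratic_score(mu, sigma, y):
        # A quadratic proper score: integral of p^2 minus twice the density at y.
        int_p_squared = 1.0 / (2.0 * sigma * np.sqrt(np.pi))  # integral of N(mu, sigma)^2
        return int_p_squared - 2.0 * norm.pdf(y, loc=mu, scale=sigma)

    # A confident forecast that "busts" versus a broad climatology-like benchmark.
    y_obs = 0.6  # observed anomaly (made up)
    forecasts = {"sharp bust": (0.0, 0.1), "broad benchmark": (0.2, 0.5)}

    for name, (mu, sigma) in forecasts.items():
        print(f"{name:15s} IGN={ignorance(mu, sigma, y_obs):6.2f} "
              f"CRPS={crps_gaussian(mu, sigma, y_obs):6.3f} "
              f"QS={quadratic_score(mu, sigma, y_obs):6.3f}")

With these made-up inputs the over-confident forecast incurs roughly 24 bits of Ignorance against under one bit for the broad benchmark, while its CRPS penalty remains of the same order as the distance between forecast and outcome. This contrast is one way to see the abstract's remark that the logarithmic score remains sensitive to forecast busts.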

Suggested Citation

  • Leonard Smith & Emma Suckling & Erica Thompson & Trevor Maynard & Hailiang Du, 2015. "Towards improving the framework for probabilistic forecast evaluation," Climatic Change, Springer, vol. 132(1), pages 31-45, September.
  • Handle: RePEc:spr:climat:v:132:y:2015:i:1:p:31-45
    DOI: 10.1007/s10584-015-1430-2

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s10584-015-1430-2
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s10584-015-1430-2?utm_source=ideas
LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Leonard Smith & Hailiang Du & Emma Suckling & Falk Niehörster, 2014. "Probabilistic skill in ensemble seasonal forecasts," GRI Working Papers 151, Grantham Research Institute on Climate Change and the Environment.
    2. Suckling, Emma B. & Smith, Leonard A., 2013. "An evaluation of decadal probability forecasts from state-of-the-art climate models," LSE Research Online Documents on Economics 55142, London School of Economics and Political Science, LSE Library.
    3. Daniel Friedman, 1983. "Effective Scoring Rules for Probabilistic Forecasts," Management Science, INFORMS, vol. 29(4), pages 447-454, April.
    4. Gneiting, Tilmann & Raftery, Adrian E., 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    5. Reinhard Selten, 1998. "Axiomatic Characterization of the Quadratic Scoring Rule," Experimental Economics, Springer;Economic Science Association, vol. 1(1), pages 43-61, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Taillardat, Maxime & Fougères, Anne-Laure & Naveau, Philippe & de Fondeville, Raphaël, 2023. "Evaluating probabilistic forecasts of extremes using continuous ranked probability score distributions," International Journal of Forecasting, Elsevier, vol. 39(3), pages 1448-1459.
    2. Graziani, Carlo & Rosner, Robert & Adams, Jennifer M. & Machete, Reason L., 2021. "Probabilistic recalibration of forecasts," International Journal of Forecasting, Elsevier, vol. 37(1), pages 1-27.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Victor Richmond R. Jose & Robert F. Nau & Robert L. Winkler, 2008. "Scoring Rules, Generalized Entropy, and Utility Maximization," Operations Research, INFORMS, vol. 56(5), pages 1146-1157, October.
    2. Wheatcroft Edward, 2021. "Evaluating probabilistic forecasts of football matches: the case against the ranked probability score," Journal of Quantitative Analysis in Sports, De Gruyter, vol. 17(4), pages 273-287, December.
    3. Karl Schlag & James Tremewan & Joël Weele, 2015. "A penny for your thoughts: a survey of methods for eliciting beliefs," Experimental Economics, Springer;Economic Science Association, vol. 18(3), pages 457-490, September.
    4. Papakonstantinou, A. & Bogetoft, P., 2013. "Crowd-sourcing with uncertain quality - an auction approach," MPRA Paper 44236, University Library of Munich, Germany.
    5. Norde, Henk & Voorneveld, Mark, 2019. "Feasible best-response correspondences and quadratic scoring rules," SSE Working Paper Series in Economics 2019:2, Stockholm School of Economics.
    6. Plott, Charles R. & Salmon, Timothy C., 2004. "The simultaneous, ascending auction: dynamics of price adjustment in experiments and in the UK 3G spectrum auction," Journal of Economic Behavior & Organization, Elsevier, vol. 53(3), pages 353-383, March.
    7. Breitmoser, Yves & Tan, Jonathan H.W., 2020. "Why should majority voting be unfair?," Journal of Economic Behavior & Organization, Elsevier, vol. 175(C), pages 281-295.
    8. de Haan, Thomas, 2020. "Eliciting belief distributions using a random two-level partitioning of the state space," Working Papers in Economics 1/20, University of Bergen, Department of Economics.
    9. Radosveta Ivanova-Stenzel & Timothy C. Salmon, 2004. "Bidder Preferences among Auction Institutions," Economic Inquiry, Western Economic Association International, vol. 42(2), pages 223-236, April.
    10. Luisa Bisaglia & Matteo Grigoletto, 2021. "A new time-varying model for forecasting long-memory series," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 139-155, March.
    11. Wheatcroft, Edward, 2021. "Evaluating probabilistic forecasts of football matches: the case against the ranked probability score," LSE Research Online Documents on Economics 111494, London School of Economics and Political Science, LSE Library.
    12. Tommaso Proietti & Martyna Marczak & Gianluigi Mazzi, 2017. "EuroMInd-D: A Density Estimate of Monthly Gross Domestic Product for the Euro Area," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 32(3), pages 683-703, April.
    13. Krüger, Fabian & Pavlova, Lora, 2019. "Quantifying subjective uncertainty in survey expectations," Working Papers 0664, University of Heidelberg, Department of Economics.
    14. Fang, Fang & Stinchcombe, Maxwell B. & Whinston, Andrew B., 2010. "Proper scoring rules with arbitrary value functions," Journal of Mathematical Economics, Elsevier, vol. 46(6), pages 1200-1210, November.
    15. Arthur Carvalho & Kate Larson, 2012. "Sharing Rewards Among Strangers Based on Peer Evaluations," Decision Analysis, INFORMS, vol. 9(3), pages 253-273, September.
    16. Eva Regnier, 2018. "Probability Forecasts Made at Multiple Lead Times," Management Science, INFORMS, vol. 64(5), pages 2407-2426, May.
    17. Nolan Miller & Paul Resnick & Richard Zeckhauser, 2005. "Eliciting Informative Feedback: The Peer-Prediction Method," Management Science, INFORMS, vol. 51(9), pages 1359-1373, September.
    18. J. Eric Bickel, 2007. "Some Comparisons among Quadratic, Spherical, and Logarithmic Scoring Rules," Decision Analysis, INFORMS, vol. 4(2), pages 49-65, June.
    19. P. Schanbacher, 2014. "Measuring and adjusting for overconfidence," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 37(2), pages 423-452, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:climat:v:132:y:2015:i:1:p:31-45. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.