
How Well Do COP22 Attendees Understand Graphs on Climate Change Health Impacts from the Fifth IPCC Assessment Report?

Authors

Listed:
  • Helen Fischer

    (Department of Psychology, Heidelberg University, 69117 Heidelberg, Germany)

  • Stefanie Schütte

    (Centre Virchow-Villermé for Public Health, Paris 75004, France)

  • Anneliese Depoux

    (Centre Virchow-Villermé for Public Health, Paris 75004, France)

  • Dorothee Amelung

    (Department of Psychology, Heidelberg University, 69117 Heidelberg, Germany)

  • Rainer Sauerborn

    (Institute for Public Health, University Hospital, Heidelberg 69120, Germany)

Abstract

Graphs are prevalent in the reports of the Intergovernmental Panel on Climate Change (IPCC), often depicting key points and major results. However, the popularity of graphs in the IPCC reports contrasts with a neglect of empirical tests of their understandability. Here we put the understandability of three graphs taken from the Health chapter of the Fifth Assessment Report to an empirical test. We present a pilot study in which we evaluate objective understanding (mean accuracy in multiple-choice questions) and subjective understanding (self-assessed confidence in accuracy) in a sample of attendees of the United Nations Climate Change Conference in Marrakesh, 2016 (COP22), and a student sample. Results show a mean objective understanding of M = 0.33 for the COP22 sample and M = 0.38 for the student sample. Subjective and objective understanding were unrelated in the COP22 sample but associated in the student sample. These results suggest that (i) the understandability of the IPCC Health chapter graphs is insufficient, and (ii) COP22 attendees in particular lacked insight into which graphs they did and did not understand. Implications for the construction of graphs to communicate health impacts of climate change to decision-makers are discussed.
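
The abstract contrasts objective understanding (mean accuracy on multiple-choice questions) with subjective understanding (self-assessed confidence). As an illustration only, the following Python sketch shows how such scores and their association could be computed; the respondent data, the 0-1 confidence scale, and the use of a Pearson correlation are assumptions made for the example, not details taken from the study.

    # Minimal sketch (not the authors' analysis code) of scoring objective and
    # subjective understanding as described in the abstract.
    # All data below are illustrative placeholders, not values from the study.
    import numpy as np
    from scipy.stats import pearsonr

    # One row per respondent, one column per multiple-choice question:
    # 1 = correct answer, 0 = incorrect (hypothetical responses).
    answers = np.array([
        [1, 0, 0],
        [0, 1, 0],
        [1, 1, 0],
        [0, 0, 1],
    ])

    # Self-assessed confidence per respondent on a 0-1 scale (hypothetical).
    confidence = np.array([0.8, 0.4, 0.9, 0.3])

    # Objective understanding: mean accuracy over all questions and respondents.
    objective_understanding = answers.mean()
    print(f"Mean objective understanding: {objective_understanding:.2f}")

    # Association between subjective and objective understanding:
    # correlate each respondent's accuracy with their stated confidence.
    per_person_accuracy = answers.mean(axis=1)
    r, p = pearsonr(per_person_accuracy, confidence)
    print(f"Confidence-accuracy correlation: r = {r:.2f}, p = {p:.3f}")

A near-zero correlation on such data would correspond to the pattern the abstract reports for the COP22 sample (confidence unrelated to accuracy), whereas a positive correlation would correspond to the student sample.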

Suggested Citation

  • Helen Fischer & Stefanie Schütte & Anneliese Depoux & Dorothee Amelung & Rainer Sauerborn, 2018. "How Well Do COP22 Attendees Understand Graphs on Climate Change Health Impacts from the Fifth IPCC Assessment Report?," International Journal of Environmental Research and Public Health (IJERPH), MDPI, vol. 15(5), pages 1-11, April.
  • Handle: RePEc:gam:jijerp:v:15:y:2018:i:5:p:875-:d:143662

    Download full text from publisher

    File URL: https://www.mdpi.com/1660-4601/15/5/875/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1660-4601/15/5/875/
    Download Restriction: no

    References listed on IDEAS

    1. Carmen Keller & Michael Siegrist, 2009. "Effect of Risk Communication Formats on Risk Perception Depending on Numeracy," Medical Decision Making, vol. 29(4), pages 483-490, July.
    2. Keren, Gideon, 1987. "Facing uncertainty in the game of bridge: A calibration study," Organizational Behavior and Human Decision Processes, Elsevier, vol. 39(1), pages 98-114, February.
    3. Michael Siegrist & Pascale Orlow & Carmen Keller, 2008. "The Effect of Graphical and Numerical Presentation of Hypothetical Prenatal Diagnosis Results on Risk Perception," Medical Decision Making, vol. 28(4), pages 567-574, July.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rebecca Hess & Vivianne H.M. Visschers & Michael Siegrist & Carmen Keller, 2011. "How do people perceive graphical risk communication? The role of subjective numeracy," Journal of Risk Research, Taylor & Francis Journals, vol. 14(1), pages 47-61, January.
    2. Michael Siegrist & Carmen Keller, 2011. "Natural frequencies and Bayesian reasoning: the impact of formal education and problem context," Journal of Risk Research, Taylor & Francis Journals, vol. 14(9), pages 1039-1055, October.
    3. Carmen Keller, 2011. "Using a Familiar Risk Comparison Within a Risk Ladder to Improve Risk Understanding by Low Numerates: A Study of Visual Attention," Risk Analysis, John Wiley & Sons, vol. 31(7), pages 1043-1054, July.
    4. Brenner, Lyle & Griffin, Dale & Koehler, Derek J., 2005. "Modeling patterns of probability calibration with random support theory: Diagnosing case-based judgment," Organizational Behavior and Human Decision Processes, Elsevier, vol. 97(1), pages 64-81, May.
    5. McKenzie, Craig R.M. & Liersch, Michael J. & Yaniv, Ilan, 2008. "Overconfidence in interval estimates: What does expertise buy you?," Organizational Behavior and Human Decision Processes, Elsevier, vol. 107(2), pages 179-191, November.
    6. Oberlechner, Thomas & Osler, Carol, 2012. "Survival of Overconfidence in Currency Markets," Journal of Financial and Quantitative Analysis, Cambridge University Press, vol. 47(1), pages 91-113, February.
    7. repec:cup:judgdm:v:9:y:2014:i:5:p:420-432 is not listed on IDEAS
    8. Lybbert, Travis J. & Barrett, Christopher B. & McPeak, John G. & Luseno, Winnie K., 2007. "Bayesian Herders: Updating of Rainfall Beliefs in Response to External Forecasts," World Development, Elsevier, vol. 35(3), pages 480-497, March.
    9. Garcia-Retamero, Rocio & Hoffrage, Ulrich, 2013. "Visual representation of statistical information improves diagnostic inferences in doctors and their patients," Social Science & Medicine, Elsevier, vol. 83(C), pages 27-33.
    10. Dennis Dittrich & Werner Guth & Boris Maciejovsky, 2005. "Overconfidence in investment decisions: An experimental approach," The European Journal of Finance, Taylor & Francis Journals, vol. 11(6), pages 471-491.
    11. Zaleskiewicz, Tomasz, 2011. "Financial forecasts during the crisis: Were experts more accurate than laypeople?," Journal of Economic Psychology, Elsevier, vol. 32(3), pages 384-390, June.
    12. Bolger, Fergus & Wright, George, 2017. "Use of expert knowledge to anticipate the future: Issues, analysis and directions," International Journal of Forecasting, Elsevier, vol. 33(1), pages 230-243.
    13. Stephan Dickert & Janet Kleber & Ellen Peters & Paul Slovic, 2011. "Numeracy as a precursor to pro-social behavior: The impact of numeracy and presentation format on the cognitive mechanisms underlying donation decisions," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 6(7), pages 638-650, October.
    14. Michał Krawczyk & Maciej Wilamowski, 2022. "Calibration and incentives: evidence from contract bridge," Working Papers 2022-06, Faculty of Economic Sciences, University of Warsaw.
    15. repec:cup:judgdm:v:12:y:2017:i:4:p:369-381 is not listed on IDEAS
    16. Bender, Randall H., 1998. "Judgment and Response Processes across Two Knowledge Domains," Organizational Behavior and Human Decision Processes, Elsevier, vol. 75(3), pages 222-257, September.
    17. repec:cup:judgdm:v:7:y:2012:i:2:p:165-172 is not listed on IDEAS
    18. repec:cup:judgdm:v:12:y:2017:i:1:p:29-41 is not listed on IDEAS
    19. Carmen Keller & Christina Kreuzmair & Rebecca Leins-Hess & Michael Siegrist, 2014. "Numeric and graphic risk information processing of high and low numerates in the intuitive and deliberative decision modes: An eye-tracker study," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 9(5), pages 420-432, September.
    20. Mario Graziano & Daniele Schilirò, 2011. "Rationality and Choices in Economics: Behavioral and Evolutionary Approaches," Theoretical and Practical Research in the Economic Fields, ASERS Publishing, vol. 2(2), pages 182-195.
    21. Leitner, Stephan & Rausch, Alexandra & Behrens, Doris A., 2017. "Distributed investment decisions and forecasting errors: An analysis based on a multi-agent simulation model," European Journal of Operational Research, Elsevier, vol. 258(1), pages 279-294.
    22. Milou Kievik & Ellen F.J. ter Huurne & Jan M. Gutteling, 2012. "The action suited to the word? Use of the framework of risk information seeking to understand risk-related behaviors," Journal of Risk Research, Taylor & Francis Journals, vol. 15(2), pages 131-147, February.
    23. J. A. Garcia & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2020. "Confirmatory bias in peer review," Scientometrics, Springer; Akadémiai Kiadó, vol. 123(1), pages 517-533, April.
    24. Ehrlinger, Joyce & Johnson, Kerri & Banner, Matthew & Dunning, David & Kruger, Justin, 2008. "Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent," Organizational Behavior and Human Decision Processes, Elsevier, vol. 105(1), pages 98-121, January.
