
Predicting Experimental Results: Who Knows What?

Author

Listed:
  • Stefano DellaVigna
  • Devin Pope

Abstract

Academic experts frequently recommend policies and treatments. But how well do they anticipate the impact of different treatments? And how do their predictions compare to the predictions of non-experts? We analyze how 208 experts forecast the results of 15 treatments involving monetary and non-monetary motivators in a real-effort task. We compare these forecasts to those made by PhD students and non-experts: undergraduates, MBAs, and an online sample. We document seven main results. First, the average forecast of experts predicts the experimental results quite well. Second, there is a strong wisdom-of-crowds effect: the average forecast outperforms 96 percent of individual forecasts. Third, correlates of expertise (citations, academic rank, field, and contextual experience) do not improve forecasting accuracy. Fourth, experts as a group do better than non-experts, but not if accuracy is defined as rank-ordering the treatments. Fifth, measures of effort, confidence, and revealed ability are predictive of forecast accuracy to some extent, especially for non-experts. Sixth, using these measures we identify "superforecasters" among the non-experts who outperform the experts out of sample. Seventh, we document that these results on forecasting accuracy surprise the forecasters themselves. We present a simple model that organizes several of these results, and we stress the implications for the collection of forecasts of future experimental results.
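
To make the wisdom-of-crowds comparison concrete, here is a minimal sketch of how an average forecast can be scored against individual forecasts. It is not taken from the paper: the synthetic data and the mean-absolute-error loss are illustrative assumptions, and only the counts (208 forecasters, 15 treatments) mirror the abstract.

    import numpy as np

    # Illustrative, synthetic data: this is NOT the paper's data or code.
    # forecasts[i, j] = forecaster i's prediction for treatment j; actual[j] = realized result.
    rng = np.random.default_rng(0)
    actual = rng.uniform(1500.0, 2500.0, size=15)                 # 15 treatments, as in the abstract
    forecasts = actual + rng.normal(0.0, 300.0, size=(208, 15))   # 208 forecasters, as in the abstract

    # Score each individual forecaster by mean absolute error across treatments.
    individual_mae = np.abs(forecasts - actual).mean(axis=1)

    # Score the crowd: average the forecasts treatment by treatment, then compute that average's error.
    crowd_mae = np.abs(forecasts.mean(axis=0) - actual).mean()

    # Wisdom of crowds: the share of individual forecasters that the average forecast beats.
    share_beaten = (individual_mae > crowd_mae).mean()
    print(f"crowd MAE = {crowd_mae:.1f}, beats {share_beaten:.0%} of individual forecasts")

The same comparison can be run with squared error or with rank correlation across treatments; the essential point is that the crowd average is scored with the same loss function as each individual forecaster.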

Suggested Citation

  • Stefano DellaVigna & Devin Pope, 2016. "Predicting Experimental Results: Who Knows What?," NBER Working Papers 22566, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:22566
    Note: DEV ED EH LS PE PR

    Download full text from publisher

    File URL: http://www.nber.org/papers/w22566.pdf
    Download Restriction: no


    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Erik Snowberg & Justin Wolfers & Eric Zitzewitz, 2007. "Partisan Impacts on the Economy: Evidence from Prediction Markets and Close Elections," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(2), pages 807-829.
    3. Alberto Cavallo & Guillermo Cruces & Ricardo Perez-Truglia, 2017. "Inflation Expectations, Learning, and Supermarket Prices: Evidence from Survey Experiments," American Economic Journal: Macroeconomics, American Economic Association, vol. 9(3), pages 1-35, July.
    4. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    5. Kahneman, Daniel & Schkade, David & Sunstein, Cass R, 1998. "Shared Outrage and Erratic Awards: The Psychology of Punitive Damages," Journal of Risk and Uncertainty, Springer, vol. 16(1), pages 49-86, April.
    6. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    7. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    8. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    9. Jonah Berger & Devin Pope, 2011. "Can Losing Lead to Winning?," Management Science, INFORMS, vol. 57(5), pages 817-827, May.
    10. Roth, Alvin E. & Herzog, Stefan & Hau, Robin & Hertwig, Ralph & Erev, Ido & Ert, Eyal & Haruvy, Ernan & Stewart, Terrence & West, Robert & Lebiere, Christian, 2009. "A Choice Prediction Competition: Choices From Experience and From Description," Scholarly Articles 5343169, Harvard University Department of Economics.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Stefano DellaVigna & Devin Pope, 2022. "Stability of Experimental Results: Forecasts and Evidence," American Economic Journal: Microeconomics, American Economic Association, vol. 14(3), pages 889-925, August.
    2. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Stefano DellaVigna & Devin Pope, 2018. "What Motivates Effort? Evidence and Expert Forecasts," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(2), pages 1029-1069.
    5. Chadimová, Kateřina & Cahlíková, Jana & Cingl, Lubomír, 2022. "Foretelling what makes people pay: Predicting the results of field experiments on TV fee enforcement," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    6. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    7. Grewenig, Elisabeth & Lergetporer, Philipp & Werner, Katharina & Woessmann, Ludger, 2022. "Incentives, search engines, and the elicitation of subjective beliefs: Evidence from representative online survey experiments," Journal of Econometrics, Elsevier, vol. 231(1), pages 304-326.
    8. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    9. Cook, Nikolai & Heyes, Anthony, 2022. "Pollution pictures: Psychological exposure to pollution impacts worker productivity in a large-scale field experiment," Journal of Environmental Economics and Management, Elsevier, vol. 114(C).
    10. Peter Andre & Carlo Pizzinelli & Christopher Roth & Johannes Wohlfart, 2022. "Subjective Models of the Macroeconomy: Evidence From Experts and Representative Samples," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 89(6), pages 2958-2991.
    11. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    12. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    13. Peter Andre & Ingar Haaland & Christopher Roth & Mirko Wiederholt & Johannes Wohlfart, 2021. "Narratives about the Macroeconomy," ECONtribute Discussion Papers Series 127, University of Bonn and University of Cologne, Germany.
    14. Lena Dräger & Klaus Gründler & Niklas Potrafke, 2022. "Political Shocks and Inflation Expectations: Evidence from the 2022 Russian Invasion of Ukraine," ifo Working Paper Series 371, ifo Institute - Leibniz Institute for Economic Research at the University of Munich.
    15. Julien Senn & Jan Schmitz & Christian Zehnder, 2023. "Leveraging social comparisons: the role of peer assignment policies," ECON - Working Papers 427, Department of Economics - University of Zurich, revised Aug 2023.
    16. Saskia Opitz & Dirk Sliwka & Timo Vogelsang & Tom Zimmermann, 2022. "The Targeted Assignment of Incentive Schemes," ECONtribute Discussion Papers Series 187, University of Bonn and University of Cologne, Germany.
    17. Doerrenberg, Philipp & Duncan, Denvil & Löffler, Max, 2023. "Asymmetric labor-supply responses to wage changes: Experimental evidence from an online labor market," Labour Economics, Elsevier, vol. 81(C).
    18. Guo, Yiting & Shachat, Jason & Walker, Matthew J. & Wei, Lijia, 2023. "On the generalizability of using mobile devices to conduct economic experiments," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 106(C).
    19. Wladislaw Mill & Jonathan Staebler, 2023. "Spite in Litigation," CRC TR 224 Discussion Paper Series crctr224_2023_401, University of Bonn and University of Mannheim, Germany.
    20. Jonathan de Quidt & Francesco Fallucchi & Felix Kölle & Daniele Nosenzo & Simone Quercia, 2017. "Bonus versus penalty: How robust are the effects of contract framing?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 3(2), pages 174-182, December.

    More about this item

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C91 - Mathematical and Quantitative Methods - - Design of Experiments - - - Laboratory, Individual Behavior
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D03 - Microeconomics - - General - - - Behavioral Microeconomics: Underlying Principles

