Printed from https://ideas.repec.org/a/inm/ormnsc/v66y2020i11p4944-4957.html

The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy of Headlines Without Warnings

Author

Listed:
  • Gordon Pennycook

    (Hill and Levene Schools of Business, University of Regina, Regina, Saskatchewan S4S 0A2, Canada)

  • Adam Bear

    (Department of Psychology, Harvard University, Cambridge, Massachusetts 02138)

  • Evan T. Collins

    (School of Medicine, Yale University, New Haven, Connecticut 06510)

  • David G. Rand

    (Sloan School of Management, Massachusetts Institute of Technology, Cambridge, Massachusetts 02142; Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139; Institute for Data, Systems, and Society, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139)

Abstract

What can be done to combat political misinformation? One prominent intervention involves attaching warnings to headlines of news stories that have been disputed by third-party fact-checkers. Here we demonstrate a hitherto unappreciated potential consequence of such a warning: an implied truth effect, whereby false headlines that fail to get tagged are considered validated and thus are seen as more accurate. With a formal model, we demonstrate that Bayesian belief updating can lead to such an implied truth effect. In Study 1 (n = 5,271 MTurkers), we find that although warnings do lead to a modest reduction in perceived accuracy of false headlines relative to a control condition (particularly for politically concordant headlines), we also observed the hypothesized implied truth effect: the presence of warnings caused untagged headlines to be seen as more accurate than in the control. In Study 2 (n = 1,568 MTurkers), we find the same effects in the context of decisions about which headlines to consider sharing on social media. We also find that attaching verifications to some true headlines—which removes the ambiguity about whether untagged headlines have not been checked or have been verified—eliminates, and in fact slightly reverses, the implied truth effect. Together these results contest theories of motivated reasoning while identifying a potential challenge for the policy of using warning tags to fight misinformation—a challenge that is particularly concerning given that it is much easier to produce misinformation than it is to debunk it.
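The abstract's claim that Bayesian belief updating can produce an implied truth effect can be illustrated with a minimal sketch (this is a hypothetical simplification, not the authors' exact formal model): if a reader assumes true headlines are never tagged and false headlines are tagged with some probability, then the absence of a tag is evidence in favor of accuracy, pushing the posterior above the prior.

```python
# Hypothetical illustration of the Bayesian intuition behind the implied
# truth effect (a simplified sketch, not the paper's exact model).
# Assumption: true headlines never carry warning tags, and false headlines
# are tagged with probability p_tag by fact-checkers.

def posterior_true_given_no_tag(prior_true: float, p_tag: float) -> float:
    """P(headline is true | no warning tag), by Bayes' rule."""
    p_no_tag_given_true = 1.0            # true headlines are never tagged
    p_no_tag_given_false = 1.0 - p_tag   # false headlines sometimes escape tagging
    numerator = p_no_tag_given_true * prior_true
    denominator = numerator + p_no_tag_given_false * (1.0 - prior_true)
    return numerator / denominator

prior = 0.4      # assumed prior belief that a headline is accurate
coverage = 0.5   # assumed probability that a false headline gets tagged

updated = posterior_true_given_no_tag(prior, coverage)
print(f"prior: {prior:.2f} -> posterior given no tag: {updated:.3f}")
# When warnings exist but coverage is incomplete, an untagged headline's
# perceived accuracy rises above the prior: the implied truth effect.
# With coverage = 0, no tag carries no information and the posterior
# equals the prior (the control condition).
```

The sketch also shows why verifying some true headlines helps: once untagged headlines are no longer implicitly "checked and cleared," the absence of a tag stops functioning as evidence of accuracy.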

Suggested Citation

  • Gordon Pennycook & Adam Bear & Evan T. Collins & David G. Rand, 2020. "The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy of Headlines Without Warnings," Management Science, INFORMS, vol. 66(11), pages 4944-4957, November.
  • Handle: RePEc:inm:ormnsc:v:66:y:2020:i:11:p:4944-4957
    DOI: 10.1287/mnsc.2019.3478

    Download full text from publisher

    File URL: https://doi.org/10.1287/mnsc.2019.3478
    Download Restriction: no

    File URL: https://libkey.io/10.1287/mnsc.2019.3478?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.
    ---><---

    References listed on IDEAS

    1. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    2. Mullinix, Kevin J. & Leeper, Thomas J. & Druckman, James N. & Freese, Jeremy, 2015. "The Generalizability of Survey Experiments," Journal of Experimental Political Science, Cambridge University Press, vol. 2(2), pages 109-138, January.
    3. Sander van der Linden & Anthony Leiserowitz & Edward Maibach, 2018. "Scientific agreement can neutralize politicization of facts," Nature Human Behaviour, Nature, vol. 2(1), pages 2-3, January.
    4. Dan M. Kahan & Ellen Peters & Maggie Wittlin & Paul Slovic & Lisa Larrimore Ouellette & Donald Braman & Gregory Mandel, 2012. "The polarizing impact of science literacy and numeracy on perceived climate change risks," Nature Climate Change, Nature, vol. 2(10), pages 732-735, October.
    5. Berinsky, Adam J., 2017. "Rumors and Health Care Reform: Experiments in Political Misinformation," British Journal of Political Science, Cambridge University Press, vol. 47(2), pages 241-262, April.
    6. Krupnikov, Yanna & Levine, Adam Seth, 2014. "Cross-Sample Comparisons and External Validity," Journal of Experimental Political Science, Cambridge University Press, vol. 1(1), pages 59-80, April.
    7. Shalvi, Shaul & Dana, Jason & Handgraaf, Michel J.J. & De Dreu, Carsten K.W., 2011. "Justified ethicality: Observing desired counterfactuals modifies ethical perceptions and behavior," Organizational Behavior and Human Decision Processes, Elsevier, vol. 115(2), pages 181-190, July.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Alan C. Logan & Susan H. Berman & Brian M. Berman & Susan L. Prescott, 2021. "Healing Anthropocene Syndrome: Planetary Health Requires Remediation of the Toxic Post-Truth Environment," Challenges, MDPI, vol. 12(1), pages 1-25, January.
    2. Ugochukwu Etudo & Victoria Y. Yoon, 2024. "Ontology-Based Information Extraction for Labeling Radical Online Content Using Distant Supervision," Information Systems Research, INFORMS, vol. 35(1), pages 203-225, March.
    3. Guy Aridor & Rafael Jiménez-Durán & Ro'ee Levy & Lena Song, 2024. "The Economics of Social Media," CESifo Working Paper Series 10934, CESifo.
    4. Chopra, Felix & Haaland, Ingar & Roth, Christopher, 2021. "The Demand for Fact Checking," CAGE Online Working Paper Series 563, Competitive Advantage in the Global Economy (CAGE).
    5. Thomas Renault & David Restrepo Amariles & Aurore Troussel, 2024. "Collaboratively adding context to social media posts reduces the sharing of false news," Papers 2404.02803, arXiv.org.
    6. Garrett Morrow & Briony Swire‐Thompson & Jessica Montgomery Polny & Matthew Kopec & John P. Wihbey, 2022. "The emerging science of content labeling: Contextualizing social media content moderation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(10), pages 1365-1386, October.
    7. Chopra, Felix & Haaland, Ingar & Roth, Christopher, 2022. "Do people demand fact-checked news? Evidence from U.S. Democrats," Journal of Public Economics, Elsevier, vol. 205(C).
    8. Folco Panizza & Piero Ronzani & Tiffany Morisseau & Simone Mattavelli & Carlo Martini, 2023. "How do online users respond to crowdsourced fact-checking?," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-11, December.
    9. Emeric Henry & Ekaterina Zhuravskaya & Sergei Guriev, 2022. "Checking and Sharing Alt-Facts," American Economic Journal: Economic Policy, American Economic Association, vol. 14(3), pages 55-86, August.
    10. Rezaee, Arman & Hirshleifer, Sarojini & Naseem, Mustafa & Raza, Agha Ali, 2023. "The Spread of (Mis)information: A Social Media Experiment in Pakistan," Institute on Global Conflict and Cooperation, Working Paper Series qt53n4q35z, Institute on Global Conflict and Cooperation, University of California.
    11. Lusher, Lester & Ruberg, Tim, 2023. "Killer Alerts? Public Health Warnings and Heat Stroke in Japan," IZA Discussion Papers 16562, Institute of Labor Economics (IZA).
    12. Greta Castellini & Mariarosaria Savarese & Guendalina Graffigna, 2021. "Online Fake News about Food: Self-Evaluation, Social Influence, and the Stages of Change Moderation," IJERPH, MDPI, vol. 18(6), pages 1-13, March.
    13. van Gils, Freek & Müller, Wieland & Prüfer, Jens, 2020. "Big Data and Democracy," Other publications TiSEM ecc11d8d-1478-4dd2-b570-4, Tilburg University, School of Economics and Management.
    14. Gonzalo Cisternas & Jorge Vásquez, 2022. "Misinformation in Social Media: The Role of Verification Incentives," Staff Reports 1028, Federal Reserve Bank of New York.
    15. Gupta, Ashish & Li, Han & Farnoush, Alireza & Jiang, Wenting, 2022. "Understanding patterns of COVID infodemic: A systematic and pragmatic approach to curb fake news," Journal of Business Research, Elsevier, vol. 140(C), pages 670-683.
    16. Cameron Martel & Mohsen Mosleh & David G. Rand, 2021. "You’re Definitely Wrong, Maybe: Correction Style Has Minimal Effect on Corrections of Misinformation Online," Media and Communication, Cogitatio Press, vol. 9(1), pages 120-133.
    17. John M. Carey & Andrew M. Guess & Peter J. Loewen & Eric Merkley & Brendan Nyhan & Joseph B. Phillips & Jason Reifler, 2022. "The ephemeral effects of fact-checks on COVID-19 misperceptions in the United States, Great Britain and Canada," Nature Human Behaviour, Nature, vol. 6(2), pages 236-243, February.
    18. Patricia L. Moravec & Antino Kim & Alan R. Dennis & Randall K. Minas, 2022. "Do You Really Know if It’s True? How Asking Users to Rate Stories Affects Belief in Fake News on Social Media," Information Systems Research, INFORMS, vol. 33(3), pages 887-907, September.
    19. Kevin Matthe Caramancion & Yueqi Li & Elisabeth Dubois & Ellie Seoe Jung, 2022. "The Missing Case of Disinformation from the Cybersecurity Risk Continuum: A Comparative Assessment of Disinformation with Other Cyber Threats," Data, MDPI, vol. 7(4), pages 1-18, April.
    20. Sarah Spiekermann & Hanna Krasnova & Oliver Hinz & Annika Baumann & Alexander Benlian & Henner Gimpel & Irina Heimbach & Antonia Köster & Alexander Maedche & Björn Niehaves & Marten Risius & Manuel Tr, 2022. "Values and Ethics in Information Systems," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer;Gesellschaft für Informatik e.V. (GI), vol. 64(2), pages 247-264, April.
    21. Robert M. Ross & David G. Rand & Gordon Pennycook, 2021. "Beyond “fake news”: Analytic thinking and the detection of false and hyperpartisan news headlines," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 16(2), pages 484-504, March.
    22. Tuval Danenberg & Drew Fudenberg, 2024. "Endogenous Attention and the Spread of False News," Papers 2406.11024, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    2. Jay J. Van Bavel & Katherine Baicker & Paulo S. Boggio & Valerio Capraro & Aleksandra Cichocka & Mina Cikara & Molly J. Crockett & Alia J. Crum & Karen M. Douglas & James N. Druckman & John Drury & Oe, 2020. "Using social and behavioural science to support COVID-19 pandemic response," Nature Human Behaviour, Nature, vol. 4(5), pages 460-471, May.
    3. Blaine G. Robbins, 2017. "Status, identity, and ability in the formation of trust," Rationality and Society, , vol. 29(4), pages 408-448, November.
    4. Heinicke, Franziska & Rosenkranz, Stephanie & Weitzel, Utz, 2019. "The effect of pledges on the distribution of lying behavior: An online experiment," Journal of Economic Psychology, Elsevier, vol. 73(C), pages 136-151.
    5. Garbarino, Ellen & Slonim, Robert & Villeval, Marie Claire, 2019. "Loss aversion and lying behavior," Journal of Economic Behavior & Organization, Elsevier, vol. 158(C), pages 379-393.
    6. Hardin, Ashley E. & Bauman, Christopher W. & Mayer, David M., 2020. "Show me the … family: How photos of meaningful relationships reduce unethical behavior at work," Organizational Behavior and Human Decision Processes, Elsevier, vol. 161(C), pages 93-108.
    7. Bortolotti, Stefania & Kölle, Felix & Wenner, Lukas, 2022. "On the persistence of dishonesty," Journal of Economic Behavior & Organization, Elsevier, vol. 200(C), pages 1053-1065.
    8. Austin M Strange & Ryan D Enos & Mark Hill & Amy Lakeman, 2019. "Online volunteer laboratories for human subjects research," PLOS ONE, Public Library of Science, vol. 14(8), pages 1-13, August.
    9. Shari De Baets & Dilek Önkal & Wasim Ahmed, 2022. "Do Risky Scenarios Affect Forecasts of Savings and Expenses?," Forecasting, MDPI, vol. 4(1), pages 1-28, February.
    10. Ellen Garbarino & Robert Slonim & Marie Claire Villeval, 2016. "Loss Aversion and lying behavior: Theory, estimation and empirical evidence," Working Papers halshs-01404333, HAL.
    11. Laura Biziou-van-Pol & Jana Haenen & Arianna Novaro & Andrés Occhipinti & Valerio Capraro, 2015. "Does telling white lies signal pro-social preferences?," Judgment and Decision Making, Society for Judgment and Decision Making, vol. 10(6), pages 538-548, November.
    12. Exley, Christine L. & Petrie, Ragan, 2018. "The impact of a surprise donation ask," Journal of Public Economics, Elsevier, vol. 158(C), pages 152-167.
    13. MacFarlane, Douglas & Hurlstone, Mark J. & Ecker, Ullrich K.H., 2020. "Protecting consumers from fraudulent health claims: A taxonomy of psychological drivers, interventions, barriers, and treatments," Social Science & Medicine, Elsevier, vol. 259(C).
    14. David Johnson & John Barry Ryan, 2020. "Amazon Mechanical Turk workers can provide consistent and economically meaningful data," Southern Economic Journal, John Wiley & Sons, vol. 87(1), pages 369-385, July.
    15. Petrik Runst, 2018. "Does Immigration Affect Demand for Redistribution? – An Experimental Design," German Economic Review, Verein für Socialpolitik, vol. 19(4), pages 383-400, November.
    16. Adam Seth Levine & Reuben Kline, 2017. "A new approach for evaluating climate change communication," Climatic Change, Springer, vol. 142(1), pages 301-309, May.
    17. Bachmann, Kremena & Lot, Andre & Xu, Xiaogeng & Hens, Thorsten, 2023. "Experimental Research on Retirement Decision-Making: Evidence from Replications," Journal of Banking & Finance, Elsevier, vol. 152(C).
    18. Logan S. Casey & Jesse Chandler & Adam Seth Levine & Andrew Proctor & Dara Z. Strolovitch, 2017. "Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection," SAGE Open, , vol. 7(2), pages 21582440177, June.
    19. Shoots-Reinhard, Brittany & Goodwin, Raleigh & Bjälkebring, Pär & Markowitz, David M. & Silverstein, Michael C. & Peters, Ellen, 2021. "Ability-related political polarization in the COVID-19 pandemic," Intelligence, Elsevier, vol. 88(C).
    20. Kevin E. Levay & Jeremy Freese & James N. Druckman, 2016. "The Demographic and Political Composition of Mechanical Turk Samples," SAGE Open, , vol. 6(1), pages 21582440166, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:ormnsc:v:66:y:2020:i:11:p:4944-4957. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.