
Deciding what to replicate: A formal definition of “replication value” and a decision model for replication study selection

Author

  • Isager, Peder Mortvedt (Eindhoven University of Technology)
  • van Aert, Robbie Cornelis Maria
  • Bahník, Štěpán (University of Economics, Prague)
  • Brandt, Mark John (Tilburg University)
  • DeSoto, Kurt Andrew (Association for Psychological Science)
  • Giner-Sorolla, Roger
  • Krueger, Joachim
  • Perugini, Marco
  • Ropovik, Ivan (University of Presov)
  • van 't Veer, Anna Elisabeth (Leiden University)

Abstract

Robust scientific knowledge is contingent upon replication of original findings. However, researchers who conduct replication studies face a difficult problem: there are many more studies in need of replication than there are funds available to replicate them. To select studies for replication efficiently, we need to understand which studies are most in need of replication; in other words, which replication efforts have the highest expected utility. In this article, we propose a general rule for study selection in replication research based on the replication value of the claims considered for replication. The replication value of a claim is defined as the maximum expected utility we could gain by replicating the claim, and it is a function of (1) the value of being certain about the claim and (2) the uncertainty about the claim given current evidence. We formalize this definition in terms of a causal decision model, using concepts from decision theory and causal graph modeling. We discuss the validity of using replication value as a measure of expected utility gain, and we suggest approaches for deriving quantitative estimates of replication value.
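
To make the definition concrete, the sketch below ranks hypothetical claims by treating replication value as the product of (1) a value-of-certainty proxy and (2) an uncertainty proxy. This is only an illustration of the general idea described in the abstract, not the authors' formal causal decision model; the multiplicative form, the function name, and the toy proxies and numbers are assumptions made here for demonstration.

    # Minimal, hypothetical sketch (not the paper's formal model): rank claims for
    # replication by value-of-certainty multiplied by current uncertainty.
    def replication_value(value_of_certainty: float, uncertainty: float) -> float:
        """Toy estimate of the expected utility gain from replicating a claim."""
        return value_of_certainty * uncertainty

    # 'value' might proxy scientific or societal impact (e.g., citation count);
    # 'uncertainty' might proxy how inconclusive current evidence is (0 = settled, 1 = unknown).
    claims = {
        "claim_A": {"value": 120.0, "uncertainty": 0.8},  # high impact, weak evidence
        "claim_B": {"value": 300.0, "uncertainty": 0.1},  # high impact, strong evidence
        "claim_C": {"value": 15.0, "uncertainty": 0.9},   # low impact, weak evidence
    }

    ranked = sorted(
        claims,
        key=lambda c: replication_value(claims[c]["value"], claims[c]["uncertainty"]),
        reverse=True,
    )
    print(ranked)  # ['claim_A', 'claim_B', 'claim_C']

On this reading, a highly cited claim resting on weak evidence (claim_A) outranks both an equally prominent but well-established claim (claim_B) and an obscure, uncertain one (claim_C).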

Suggested Citation

  • Isager, Peder Mortvedt & van Aert, Robbie Cornelis Maria & Bahník, Štěpán & Brandt, Mark John & DeSoto, Kurt Andrew & Giner-Sorolla, Roger & Krueger, Joachim & Perugini, Marco & Ropovik, Ivan & van 't Veer, Anna Elisabeth, 2020. "Deciding what to replicate: A formal definition of “replication value” and a decision model for replication study selection," MetaArXiv 2gurz, Center for Open Science.
  • Handle: RePEc:osf:metaar:2gurz
    DOI: 10.31219/osf.io/2gurz

    Download full text from publisher

    File URL: https://osf.io/download/5f4f4314a392b9002f1d9576/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/2gurz?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 48(1), pages 62-83.
    2. Purkayastha, Amrita & Palmaro, Eleonora & Falk-Krzesinski, Holly J. & Baas, Jeroen, 2019. "Comparison of two article-level, field-independent citation metrics: Field-Weighted Citation Impact (FWCI) and Relative Citation Ratio (RCR)," Journal of Informetrics, Elsevier, vol. 13(2), pages 635-642.
    3. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    4. Bornmann, Lutz, 2014. "Validity of altmetrics data for measuring societal impact: A study using data from Altmetric and F1000Prime," Journal of Informetrics, Elsevier, vol. 8(4), pages 935-950.
    5. Stephan Lewandowsky & Klaus Oberauer, 2020. "Low replicability can support robust and efficient science," Nature Communications, Nature, vol. 11(1), pages 1-12, December.
    6. Jörn Block & Andreas Kuckertz, 2018. "Seven principles of effective replication studies: strengthening the evidence base of management research," Management Review Quarterly, Springer, vol. 68(4), pages 355-359, November.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    2. Lucy Semerjian & Kunle Okaiyeto & Mike O. Ojemaye & Temitope Cyrus Ekundayo & Aboi Igwaran & Anthony I. Okoh, 2021. "Global Systematic Mapping of Road Dust Research from 1906 to 2020: Research Gaps and Future Direction," Sustainability, MDPI, vol. 13(20), pages 1-21, October.
    3. Alberto Saracco, 2022. "Dr. Strangelove: Or How I Learned to Stop Worrying and Love the Citations," The Mathematical Intelligencer, Springer, vol. 44(4), pages 326-330, December.
    4. Tom Coupé & W. Robert Reed, 2021. "Do Negative Replications Affect Citations?," Working Papers in Economics 21/14, University of Canterbury, Department of Economics and Finance.
    5. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    6. Zhichao Wang & Valentin Zelenyuk, 2021. "Performance Analysis of Hospitals in Australia and its Peers: A Systematic Review," CEPA Working Papers Series WP012021, School of Economics, University of Queensland, Australia.
    7. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    8. Mehdi Rhaiem & Nabil Amara, 2020. "Determinants of research efficiency in Canadian business schools: evidence from scholar-level data," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 53-99, October.
    9. Jin Su & Mo Wang & Mohd Adib Mohammad Razi & Norlida Mohd Dom & Noralfishah Sulaiman & Lai-Wai Tan, 2023. "A Bibliometric Review of Nature-Based Solutions on Urban Stormwater Management," Sustainability, MDPI, vol. 15(9), pages 1-23, April.
    10. Yaxue Ma & Zhichao Ba & Yuxiang Zhao & Jin Mao & Gang Li, 2021. "Understanding and predicting the dissemination of scientific papers on social media: a two-step simultaneous equation modeling–artificial neural network approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 7051-7085, August.
    11. Katarzyna Piwowar‐Sulej, 2021. "Core functions of Sustainable Human Resource Management. A hybrid literature review with the use of H‐Classics methodology," Sustainable Development, John Wiley & Sons, Ltd., vol. 29(4), pages 671-693, July.
    12. Monika Blišťanová & Peter Koščák & Michaela Tirpáková & Magdaléna Ondicová, 2023. "A Cross-Comparative Analysis of Transportation Safety Research," Sustainability, MDPI, vol. 15(9), pages 1-14, May.
    13. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    14. Tang, Xuli & Li, Xin & Ding, Ying & Song, Min & Bu, Yi, 2020. "The pace of artificial intelligence innovations: Speed, talent, and trial-and-error," Journal of Informetrics, Elsevier, vol. 14(4).
    15. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    16. Zsolt Kohus & Márton Demeter & László Kun & Eszter Lukács & Katalin Czakó & Gyula Péter Szigeti, 2022. "A Study of the Relation between Byline Positions of Affiliated/Non-Affiliated Authors and the Scientific Impact of European Universities in Times Higher Education World University Rankings," Sustainability, MDPI, vol. 14(20), pages 1-14, October.
    17. Mark J. McCabe & Frank Mueller-Langer, 2019. "Does Data Disclosure Increase Citations? Empirical Evidence from a Natural Experiment in Leading Economics Journals," JRC Working Papers on Digital Economy 2019-02, Joint Research Centre.
    18. Horbach, Serge & Aagaard, Kaare & Schneider, Jesper W., 2021. "Meta-Research: How problematic citing practices distort science," MetaArXiv aqyhg, Center for Open Science.
    19. He-Li Sun & Yuan Feng & Qinge Zhang & Jia-Xin Li & Yue-Ying Wang & Zhaohui Su & Teris Cheung & Todd Jackson & Sha Sha & Yu-Tao Xiang, 2022. "The Microbiome–Gut–Brain Axis and Dementia: A Bibliometric Analysis," IJERPH, MDPI, vol. 19(24), pages 1-14, December.
    20. Martin Paldam, 2023. "Meta‐mining: The political economy of meta‐analysis," Kyklos, Wiley Blackwell, vol. 76(1), pages 125-140, February.

