
A toolbox to evaluate the trustworthiness of published findings

Author

Listed:
  • Adler, Susanne Jana
  • Röseler, Lukas
  • Schöniger, Martina Katharina

Abstract

During the past few years, researchers have criticized their professions for providing an entry point for false-positive results arising from publication bias and questionable research practices such as p-hacking (i.e., selectively reporting analyses that yield a p-value below 5%). Researchers are advocating replication studies and the implementation of open-science practices, like preregistration, to identify trustworthy effects. Nevertheless, because such consumer research developments are still emerging, most prior research findings have not been replicated, leaving researchers in the dark as to whether a line of research or a particular effect is trustworthy. We tackle this problem by providing a toolbox containing multiple heuristics to identify data patterns that might, from the information provided in published articles, indicate publication bias and p-hacking. Our toolbox is an easy-to-use instrument with which to initially assess a given set of findings.
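The abstract's notion of p-hacking (selectively reporting analyses with p-values just under .05) can be illustrated with one well-known heuristic from this literature, the caliper test. This is a minimal sketch, not the authors' actual toolbox; the function name, window width, and toy data are illustrative assumptions.

```python
# Hedged illustration (not the published toolbox): a simple "caliper test"
# heuristic. If far more reported p-values fall in a narrow window just
# below the .05 threshold than just above it, that asymmetry can hint at
# selective reporting -- a warning sign, not proof.

def caliper_test(p_values, threshold=0.05, width=0.01):
    """Count p-values in narrow windows just below and just above threshold.

    Returns (n_below, n_above). Absent selective reporting, the two counts
    should be roughly comparable.
    """
    below = sum(1 for p in p_values if threshold - width <= p < threshold)
    above = sum(1 for p in p_values if threshold < p <= threshold + width)
    return below, above

# Toy set of reported p-values, clustered suspiciously just under .05
reported = [0.049, 0.047, 0.043, 0.041, 0.048, 0.052, 0.21, 0.003]
n_below, n_above = caliper_test(reported)
print(n_below, n_above)  # 5 just below .05 vs. 1 just above
```

A strong surplus below the threshold flags a set of findings for closer scrutiny with more formal methods (e.g., p-curve analysis, cited in the references below).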

Suggested Citation

  • Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
  • Handle: RePEc:eee:jbrese:v:167:y:2023:i:c:s0148296323005489
    DOI: 10.1016/j.jbusres.2023.114189

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0148296323005489
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jbusres.2023.114189?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Simonson, Itamar, 1989. "Choice Based on Reasons: The Case of Attraction and Compromise Effects," Journal of Consumer Research, Journal of Consumer Research Inc., vol. 16(2), pages 158-174, September.
    2. Aparna A. Labroo & Natalie Mizik & Russell Winer, 2022. "Introducing Marketing Letters’ data policy," Marketing Letters, Springer, vol. 33(3), pages 361-364, September.
    3. Raphael Thomadsen & Robert P. Rooderkerk & On Amir & Neeraj Arora & Bryan Bollinger & Karsten Hansen & Leslie John & Wendy Liu & Aner Sela & Vishal Singh & K. Sudhir & Wendy Wood, 2018. "How Context Affects Choice," Customer Needs and Solutions, Springer;Institute for Sustainable Innovation and Growth (iSIG), vol. 5(1), pages 3-14, March.
    4. Daniel J. Benjamin & James O. Berger & Magnus Johannesson & Brian A. Nosek & E.-J. Wagenmakers & Richard Berk & Kenneth A. Bollen & Björn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Chr, 2018. "Redefine statistical significance," Nature Human Behaviour, Nature, vol. 2(1), pages 6-10, January.
      • Daniel Benjamin & James Berger & Magnus Johannesson & Brian Nosek & E. Wagenmakers & Richard Berk & Kenneth Bollen & Bjorn Brembs & Lawrence Brown & Colin Camerer & David Cesarini & Christopher Chambe, 2017. "Redefine Statistical Significance," Artefactual Field Experiments 00612, The Field Experiments Website.
    5. Adam Altmejd & Anna Dreber & Eskil Forsell & Juergen Huber & Taisuke Imai & Magnus Johannesson & Michael Kirchler & Gideon Nave & Colin Camerer, 2019. "Predicting the replicability of social science lab experiments," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-18, December.
    6. Xinshu Zhao & John G. Lynch & Qimei Chen, 2010. "Reconsidering Baron and Kenny: Myths and Truths about Mediation Analysis," Journal of Consumer Research, Journal of Consumer Research Inc., vol. 37(2), pages 197-206, August.
    7. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    8. Nosek, Brian A. & Ebersole, Charles R. & DeHaven, Alexander Carl & Mellor, David Thomas, 2018. "The Preregistration Revolution," OSF Preprints 2dxu5, Center for Open Science.
    9. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    10. Hubbard, Raymond & Lindsay, R. Murray, 2013. "The significant difference paradigm promotes bad science," Journal of Business Research, Elsevier, vol. 66(9), pages 1393-1397.
    11. Matthew J Page & Joanne E McKenzie & Patrick M Bossuyt & Isabelle Boutron & Tammy C Hoffmann & Cynthia D Mulrow & Larissa Shamseer & Jennifer M Tetzlaff & Elie A Akl & Sue E Brennan & Roger Chou & Jul, 2021. "The PRISMA 2020 statement: An updated guideline for reporting systematic reviews," PLOS Medicine, Public Library of Science, vol. 18(3), pages 1-15, March.
    12. Uri Simonsohn & Leif D Nelson & Joseph P Simmons, 2019. "P-curve won’t do your laundry, but it will distinguish replicable from non-replicable findings in observational research: Comment on Bruns & Ioannidis (2016)," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-5, March.
    13. Eric T. Bradlow & Peter N. Golder & Joel Huber & Sandy Jap & Aparna A. Labroo & Donald R. Lehmann & John Lynch & Natalie Mizik & Russell S. Winer, 2020. "Editorial: Relaunching Marketing Letters," Marketing Letters, Springer, vol. 31(4), pages 311-314, December.
    14. Lynch, John G. & Bradlow, Eric T. & Huber, Joel C. & Lehmann, Donald R., 2015. "Reflections on the replication corner: In praise of conceptual replications," International Journal of Research in Marketing, Elsevier, vol. 32(4), pages 333-342.
    15. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    16. Courtney K. Soderberg & Timothy M. Errington & Sarah R. Schiavone & Julia Bottesini & Felix Singleton Thorn & Simine Vazire & Kevin M. Esterling & Brian A. Nosek, 2021. "Initial evidence of research quality of registered reports compared with the standard publishing model," Nature Human Behaviour, Nature, vol. 5(8), pages 990-997, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Magno, Francesca & Cassia, Fabio, 2024. "Predicting restaurants’ surplus food platform continuance: Insights from the combined use of PLS-SEM and NCA and predictive model comparisons," Journal of Retailing and Consumer Services, Elsevier, vol. 79(C).
    2. Jun-Hwa Cheah (Jacky) & Francesca Magno & Fabio Cassia, 2024. "Reviewing the SmartPLS 4 software: the latest features and enhancements," Journal of Marketing Analytics, Palgrave Macmillan, vol. 12(1), pages 97-107, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Adler, Susanne Jana & Sharma, Pratyush Nidhi & Radomir, Lăcrămioara, 2023. "Toward open science in PLS-SEM: Assessing the state of the art and future perspectives," Journal of Business Research, Elsevier, vol. 169(C).
    2. Shaw, Steven D. & Nave, Gideon, 2023. "Don't hate the player, hate the game: Realigning incentive structures to promote robust science and better scientific practices in marketing," Journal of Business Research, Elsevier, vol. 167(C).
    3. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    4. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    5. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    6. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    8. Williams, Cole Randall, 2019. "How redefining statistical significance can worsen the replication crisis," Economics Letters, Elsevier, vol. 181(C), pages 65-69.
    9. Merl, Robert & Stöckl, Thomas & Palan, Stefan, 2023. "Insider trading regulation and shorting constraints. Evaluating the joint effects of two market interventions," Journal of Banking & Finance, Elsevier, vol. 154(C).
    10. Sarstedt, Marko & Adler, Susanne J., 2023. "An advanced method to streamline p-hacking," Journal of Business Research, Elsevier, vol. 163(C).
    11. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    12. Brinkerink, Jasper & De Massis, Alfredo & Kellermanns, Franz, 2022. "One finding is no finding: Toward a replication culture in family business research," Journal of Family Business Strategy, Elsevier, vol. 13(4).
    13. Sébastien Duchêne & Adrien Nguyen-Huu & Dimitri Dubois & Marc Willinger, 2022. "Risk-return trade-offs in the context of environmental impact: a lab-in-the-field experiment with finance professionals," CEE-M Working Papers hal-03883121, CEE-M, University of Montpellier, CNRS, INRA, Montpellier SupAgro.
    14. Schaerer, Michael & du Plessis, Christilene & Nguyen, My Hoang Bao & van Aert, Robbie C.M. & Tiokhin, Leo & Lakens, Daniël & Giulia Clemente, Elena & Pfeiffer, Thomas & Dreber, Anna & Johannesson, Mag, 2023. "On the trajectory of discrimination: A meta-analysis and forecasting survey capturing 44 years of field experiments on gender and hiring decisions," Organizational Behavior and Human Decision Processes, Elsevier, vol. 179(C).
    15. Strømland, Eirik, 2019. "Preregistration and reproducibility," Journal of Economic Psychology, Elsevier, vol. 75(PA).
    16. Cantone, Giulio Giacomo, 2023. "The multiversal methodology as a remedy of the replication crisis," MetaArXiv kuhmz, Center for Open Science.
    17. Logg, Jennifer M. & Dorison, Charles A., 2021. "Pre-registration: Weighing costs and benefits for researchers," Organizational Behavior and Human Decision Processes, Elsevier, vol. 167(C), pages 18-27.
    18. Yu Ding & Wayne S. DeSarbo & Dominique M. Hanssens & Kamel Jedidi & John G. Lynch & Donald R. Lehmann, 2020. "The past, present, and future of measurement and methods in marketing analysis," Marketing Letters, Springer, vol. 31(2), pages 175-186, September.
    19. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    20. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, vol. 47(2), pages 304-343, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jbrese:v:167:y:2023:i:c:s0148296323005489. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/jbusres .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.