
Actionable recommendations for narrowing the science-practice gap in open science

Author

Listed:
  • Aguinis, Herman
  • Banks, George C.
  • Rogelberg, Steven G.
  • Cascio, Wayne F.

Abstract

Efforts to promote open-science practices are, to a large extent, driven by a need to reduce questionable research practices (QRPs). There is ample evidence that QRPs are corrosive because they make research opaque and therefore challenge the credibility, trustworthiness, and usefulness of the scientific knowledge that is produced. A literature based on false-positive results that will not replicate is not only scientifically misleading but also worthless for anyone who wants to put knowledge to use. So, a question then arises: Why are these QRPs still so pervasive and why do gatekeepers of scientific knowledge such as journal editors, reviewers, funding-agency panel members, and board members of professional organizations in charge of journal policies not seem to be taking decisive actions about QRPs? We address these questions by using a science-practice gap analogy to identify the existence of a science-practice gap in open science. Specifically, although there is abundant research on how to reduce QRPs, many gatekeepers are not adopting this knowledge in their practices. Drawing upon the literatures on the more general science-practice gap and QRPs, we offer 10 actionable recommendations for narrowing the specific science-practice gap in open science. Our recommendations require little effort, time, and financial resources. Importantly, they are explicit about the resulting benefits for the various research-production stakeholders (i.e., authors and gatekeepers). By translating findings on open-science research into actionable recommendations for “practitioners of research”, we hope to encourage more transparent, credible, and reproducible research that can be trusted and used by consumers of that research.

Suggested Citation

  • Aguinis, Herman & Banks, George C. & Rogelberg, Steven G. & Cascio, Wayne F., 2020. "Actionable recommendations for narrowing the science-practice gap in open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 158(C), pages 27-35.
  • Handle: RePEc:eee:jobhdp:v:158:y:2020:i:c:p:27-35
    DOI: 10.1016/j.obhdp.2020.02.007

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S074959781930740X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.obhdp.2020.02.007?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Jelte M Wicherts & Marjan Bakker & Dylan Molenaar, 2011. "Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results," PLOS ONE, Public Library of Science, vol. 6(11), pages 1-7, November.
    3. Gerber, Alan & Malhotra, Neil, 2008. "Do Statistical Reporting Standards Affect What Is Published? Publication Bias in Two Leading Political Science Journals," Quarterly Journal of Political Science, now publishers, vol. 3(3), pages 313-326, October.
    4. Garret Christensen & Allan Dafoe & Edward Miguel & Don A Moore & Andrew K Rose, 2019. "A study of the impact of data sharing on article citations using journal policies as a natural experiment," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-13, December.
    5. Herman Aguinis & Angelo M. Solarino, 2019. "Transparency and replicability in qualitative research: The case of interviews with elite informants," Strategic Management Journal, Wiley Blackwell, vol. 40(8), pages 1291-1315, August.
    6. Aguinis, Herman & Ramani, Ravi S. & Campbell, P. Knight & Bernal-Turnes, Paloma & Drewry, Josiah M. & Edgerton, Brett T., 2017. "Most Frequently Cited Sources, Articles, and Authors in Industrial-Organizational Psychology Textbooks: Implications for the Science–Practice Divide, Scholarly Impact, and the Future of the Field," Industrial and Organizational Psychology, Cambridge University Press, vol. 10(4), pages 507-557, December.
    7. James Carpenter & Gerta Rücker & Guido Schwarzer, 2011. "Assessing the Sensitivity of Meta-analysis to Selection Bias: A Multiple Imputation Approach," Biometrics, The International Biometric Society, vol. 67(3), pages 1066-1072, September.
    8. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    9. Mellor, David Thomas & Nosek, Brian A., 2018. "Easy preregistration will benefit any research," MetaArXiv dhc2e, Center for Open Science.
    10. Daniele Fanelli, 2012. "Negative results are disappearing from most disciplines and countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 891-904, March.
    11. Marcus R. Munafò & Brian A. Nosek & Dorothy V. M. Bishop & Katherine S. Button & Christopher D. Chambers & Nathalie Percie du Sert & Uri Simonsohn & Eric-Jan Wagenmakers & Jennifer J. Ware & John P. A. Ioannidis, 2017. "A manifesto for reproducible science," Nature Human Behaviour, Nature, vol. 1(1), pages 1-9, January.
    12. David T. Mellor & Brian A. Nosek, 2018. "Easy preregistration will benefit any research," Nature Human Behaviour, Nature, vol. 2(2), pages 98-98, February.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Piers Steel & Sjoerd Beugelsdijk & Herman Aguinis, 2021. "The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 52(1), pages 23-44, February.
    2. Henrique Castro Martins, 2021. "Tutorial-Articles: The Importance of Data and Code Sharing," RAC - Revista de Administração Contemporânea (Journal of Contemporary Administration), ANPAD - Associação Nacional de Pós-Graduação e Pesquisa em Administração, vol. 25(1), pages 200212-2002.
    3. Mackey, Jeremy D. & Parker Ellen, B. & McAllister, Charn P. & Alexander, Katherine C., 2021. "The dark side of leadership: A systematic literature review and meta-analysis of destructive leadership research," Journal of Business Research, Elsevier, vol. 132(C), pages 705-718.
    4. Alejandra Manco, 2022. "A Landscape of Open Science Policies Research," SAGE Open, , vol. 12(4), pages 21582440221, December.
    5. Moore, Don A. & Thau, Stefan & Zhong, Chenbo & Gino, Francesca, 2022. "Open Science at OBHDP," Organizational Behavior and Human Decision Processes, Elsevier, vol. 168(C).
    6. Mackey, Jeremy D., 2021. "Why and how predators pick prey: Followers’ personality and performance as predictors of destructive leadership," Journal of Business Research, Elsevier, vol. 130(C), pages 159-169.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    2. Brian Fabo & Martina Jancokova & Elisabeth Kempf & Lubos Pastor, 2020. "Fifty Shades of QE: Conflicts of Interest in Economic Research," Working Papers 2020-128, Becker Friedman Institute for Research In Economics.
    3. Klaas Sijtsma, 2016. "Playing with Data—Or How to Discourage Questionable Research Practices and Stimulate Researchers to Do Things Right," Psychometrika, Springer;The Psychometric Society, vol. 81(1), pages 1-15, March.
    4. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    5. Oliver Braganza, 2020. "A simple model suggesting economically rational sample-size choice drives irreproducibility," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-19, March.
    6. Kraft-Todd, Gordon T. & Rand, David G., 2021. "Practice what you preach: Credibility-enhancing displays and the growth of open science," Organizational Behavior and Human Decision Processes, Elsevier, vol. 164(C), pages 1-10.
    7. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    8. Hensel, Przemysław G., 2019. "Supporting replication research in management journals: Qualitative analysis of editorials published between 1970 and 2015," European Management Journal, Elsevier, vol. 37(1), pages 45-57.
    9. Christian Heise & Joshua M. Pearce, 2020. "From Open Access to Open Science: The Path From Scientific Reality to Open Scientific Communication," SAGE Open, , vol. 10(2), pages 21582440209, May.
    10. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    11. David Spiegelhalter, 2017. "Trust in numbers," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(4), pages 948-965, October.
    12. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    13. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    14. Pierre J C Chuard & Milan Vrtílek & Megan L Head & Michael D Jennions, 2019. "Evidence that nonsignificant results are sometimes preferred: Reverse P-hacking or selective reporting?," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-7, January.
    15. Bernhard Voelkl & Lucile Vogt & Emily S Sena & Hanno Würbel, 2018. "Reproducibility of preclinical animal research improves with heterogeneity of study samples," PLOS Biology, Public Library of Science, vol. 16(2), pages 1-13, February.
    16. Markku Maula & Wouter Stam, 2020. "Enhancing Rigor in Quantitative Entrepreneurship Research," Entrepreneurship Theory and Practice, , vol. 44(6), pages 1059-1090, November.
    17. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    18. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2021. "Fifty shades of QE: Comparing findings of central bankers and academics," Journal of Monetary Economics, Elsevier, vol. 120(C), pages 1-20.
    19. Matteo Colombo & Georgi Duev & Michèle B Nuijten & Jan Sprenger, 2018. "Statistical reporting inconsistencies in experimental philosophy," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-12, April.

    More about this item


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jobhdp:v:158:y:2020:i:c:p:27-35. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/obhdp .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.