
Quality assessment of scientific manuscripts in peer review and education

Author

Listed:
  • Augusteijn, Hilde Elisabeth Maria (Tilburg University)
  • Wicherts, Jelte M. (Tilburg University)
  • Sijtsma, Klaas
  • van Assen, Marcel A. L. M.

Abstract

We report a vignette study and a survey investigating which study characteristics influence the quality ratings that academics give to articles submitted for publication, and that academics and students give to students' theses. In the vignette study, 800 respondents evaluated the quality of an abstract describing a study with a small or large sample size, with statistically significant or non-significant results, and with or without statistical reporting errors. In the survey, the same participants rated the importance of 29 manuscript characteristics, related to the study's theory, design, conduct, data analyses, and presentation, for assessing either the quality of a manuscript, its publishability (article), or its grade (thesis). Quality ratings were affected by sample size but not by statistical significance or by the presence of statistical reporting errors in the rated vignette, suggesting that researchers' assessments of manuscript quality are not responsible for publication bias. Furthermore, academics and students gave highly similar importance ratings to the various aspects relevant to quality assessment of articles and theses, suggesting that quality criteria for scientific manuscripts are already adopted by students and are similar for submitted manuscripts and theses.
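
The vignette study thus manipulates three binary factors (sample size, statistical significance, and the presence of reporting errors) and measures their effect on quality ratings. As a minimal illustration only, the sketch below shows how ratings from such a 2 × 2 × 2 design could be analysed with a full-factorial linear model; it is not the authors' analysis, and the data file and variable names are hypothetical.

    # Illustrative only: analysing quality ratings from a hypothetical
    # 2 x 2 x 2 vignette design (sample size x significance x reporting error).
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: one row per respondent, with the three manipulated
    # vignette factors and the quality rating that respondent gave.
    ratings = pd.read_csv("vignette_ratings.csv")
    # expected columns: quality, sample_size, significant, reporting_error

    # Full-factorial model: main effects and all interactions of the factors.
    model = smf.ols(
        "quality ~ C(sample_size) * C(significant) * C(reporting_error)",
        data=ratings,
    ).fit()
    print(model.summary())

In such a model, the pattern reported above would correspond to a reliable main effect of sample size and negligible effects of the other two factors.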

Suggested Citation

  • Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:7dc6a
    DOI: 10.31219/osf.io/7dc6a

    Download full text from publisher

    File URL: https://osf.io/download/63b40dc2e48ccc08404fdc77/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/7dc6a?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2010. "A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants," PLOS ONE, Public Library of Science, vol. 5(12), pages 1-10, December.
    2. Wicherts, Jelte M. & Veldkamp, Coosje Lisabet Sterre & Augusteijn, Hilde & Bakker, Marjan & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2016. "Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking," OSF Preprints umq8d, Center for Open Science.
    3. Denes Szucs & John P A Ioannidis, 2017. "Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature," PLOS Biology, Public Library of Science, vol. 15(3), pages 1-18, March.
    4. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    2. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
    3. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    4. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    5. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    6. Laura Muñoz-Bermejo & Jorge Pérez-Gómez & Fernando Manzano & Daniel Collado-Mateo & Santos Villafaina & José C Adsuar, 2019. "Reliability of isokinetic knee strength measurements in children: A systematic review and meta-analysis," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-15, December.
    7. David A. M. Peterson, 2020. "Dear Reviewer 2: Go F’ Yourself," Social Science Quarterly, Southwestern Social Science Association, vol. 101(4), pages 1648-1652, July.
    8. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    9. Lei Li & Yan Wang & Guanfeng Liu & Meng Wang & Xindong Wu, 2015. "Context-Aware Reviewer Assignment for Trust Enhanced Peer Review," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-28, June.
    10. Schweinsberg, Martin & Feldman, Michael & Staub, Nicola & van den Akker, Olmo R. & van Aert, Robbie C.M. & van Assen, Marcel A.L.M. & Liu, Yang & Althoff, Tim & Heer, Jeffrey & Kale, Alex & Mohamed, Z, 2021. "Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis," Organizational Behavior and Human Decision Processes, Elsevier, vol. 165(C), pages 228-249.
    11. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    12. Chin, Jason & Zeiler, Kathryn, 2021. "Replicability in Empirical Legal Research," LawArXiv 2b5k4, Center for Open Science.
    13. Denise Rousseau & Byeong Jo Kim & Ryan Splenda & Sarah Young & Jangbum Lee & Donna Beck, 2023. "Does chief executive compensation predict financial performance or inaccurate financial reporting in listed companies: A systematic review," Campbell Systematic Reviews, John Wiley & Sons, vol. 19(4), December.
    14. Filip Melinscak & Dominik R Bach, 2020. "Computational optimization of associative learning experiments," PLOS Computational Biology, Public Library of Science, vol. 16(1), pages 1-23, January.
    15. Auer, Tobias & Ulasik, Maria & Holzmeister, Felix, 2024. "A Comment on "Motivated Errors" by Exley and Kessler (2024)," I4R Discussion Paper Series 161, The Institute for Replication (I4R).
    16. David Gurwitz & Elena Milanesi & Thomas Koenig, 2014. "Grant Application Review: The Case of Transparency," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-6, December.
    17. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    18. Esteban Morales & Erin C McKiernan & Meredith T Niles & Lesley Schimanski & Juan Pablo Alperin, 2021. "How faculty define quality, prestige, and impact of academic journals," PLOS ONE, Public Library of Science, vol. 16(10), pages 1-13, October.
    19. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    20. Adler, Susanne Jana & Sharma, Pratyush Nidhi & Radomir, Lăcrămioara, 2023. "Toward open science in PLS-SEM: Assessing the state of the art and future perspectives," Journal of Business Research, Elsevier, vol. 169(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:7dc6a. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.