
Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?

Author

Listed:
  • Lutz Bornmann

    (Max Planck Society, Administrative Headquarters)

  • Markus Wolf

    (University Hospital Heidelberg)

  • Hans-Dieter Daniel

    (University of Zurich; ETH Zurich)

Abstract

Whereas in traditional, closed peer review (CPR) only a few selected scientists (peers) are involved in the manuscript review process, public peer review (PPR) includes, in addition to the invited reviewers, a wider circle of scientists who are interested in a manuscript and wish to comment on it. In this study, using data from two comprehensive evaluation studies of the CPR process at Angewandte Chemie—International Edition and the PPR process at Atmospheric Chemistry and Physics, we examined the language characteristics of comments written by invited reviewers in CPR and by invited reviewers and interested members of the scientific community in PPR. We used Linguistic Inquiry and Word Count (LIWC), a text analysis software program that counts words in meaningful categories (e.g., positive or negative emotions) using a standardized dictionary. We examined 599 comments from the reviews of 229 manuscripts. The results show that comments in PPR are much longer than comments in CPR, an indication that reviewing in PPR serves more of an improvement function, whereas reviewing in CPR serves more of a selection function. The results also show that CPR is not, as might be expected, more susceptible to the expression of negative emotions than PPR. On the contrary, positive emotion words are used statistically significantly more frequently in CPR than in PPR.
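The analysis rests on a simple idea: tally how often words from predefined emotion categories occur in each review comment and compare the relative frequencies (and comment lengths) between CPR and PPR. The Python sketch below illustrates that kind of word-category counting in a minimal way; it is not the LIWC program or its standardized dictionary, and the word lists and example comments are invented placeholders.

# Minimal sketch of a LIWC-style word-category count (illustration only; not the
# LIWC program or its standardized dictionary). It tallies how often words from
# small hand-made "positive emotion" and "negative emotion" lists occur in review
# comments and reports mean category rates and mean comment length per group.
import re
from typing import Dict, List

# Hypothetical mini-dictionary; the real LIWC dictionary is far larger.
CATEGORIES: Dict[str, set] = {
    "positive_emotion": {"excellent", "good", "interesting", "valuable", "clear"},
    "negative_emotion": {"weak", "unclear", "flawed", "poor", "doubtful"},
}

def category_rates(comment: str) -> Dict[str, float]:
    """Share of a comment's words that fall into each category."""
    words = re.findall(r"[a-z']+", comment.lower())
    total = len(words) or 1
    return {cat: sum(w in vocab for w in words) / total
            for cat, vocab in CATEGORIES.items()}

def summarize(comments: List[str]) -> Dict[str, float]:
    """Mean category rates and mean word count over a set of comments."""
    rates = [category_rates(c) for c in comments]
    summary = {cat: sum(r[cat] for r in rates) / len(rates) for cat in CATEGORIES}
    summary["mean_words"] = sum(len(re.findall(r"[a-z']+", c.lower()))
                                for c in comments) / len(comments)
    return summary

# Invented example comments standing in for CPR and PPR review texts.
cpr_comments = ["This is an excellent and clear study, although the sampling is weak."]
ppr_comments = ["The dataset is interesting, but several claims remain unclear and "
                "the statistical treatment looks flawed; a longer discussion is needed."]

print("CPR:", summarize(cpr_comments))
print("PPR:", summarize(ppr_comments))

In the study itself, category proportions across the 599 comments were compared between the two journals with significance tests; this sketch only produces descriptive means for two toy groups.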

Suggested Citation

  • Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.
  • Handle: RePEc:spr:scient:v:91:y:2012:i:3:d:10.1007_s11192-011-0569-5
    DOI: 10.1007/s11192-011-0569-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-011-0569-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-011-0569-5?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    2. Hren, Darko & Pina, David G. & Norman, Christopher R. & Marušić, Ana, 2022. "What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports," Journal of Informetrics, Elsevier, vol. 16(2).
    3. Carlos Vílchez-Román & Arístides Vara-Horna, 2021. "Usage, content and citation in open access publication: any interaction effects?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9457-9476, December.
    4. Sizo, Amanda & Lino, Adriano & Reis, Luis Paulo & Rocha, Álvaro, 2019. "An overview of assessing the quality of peer review reports of scientific articles," International Journal of Information Management, Elsevier, vol. 46(C), pages 286-293.
    5. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    6. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    7. Andrijana Perković Paloš & Antonija Mijatović & Ivan Buljan & Daniel Garcia-Costa & Elena Álvarez-García & Francisco Grimaldo & Ana Marušić, 2023. "Linguistic and semantic characteristics of articles and peer review reports in Social Sciences and Medical and Health Sciences: analysis of articles published in Open Research Central," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4707-4729, August.
    8. Shan Jiang, 2021. "Understanding authors' psychological reactions to peer reviews: a text mining approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 6085-6103, July.
    9. Lundmark, Erik & Milanov, Hana & Seigner, Benedikt David Christian, 2022. "Can it be measured? A quantitative assessment of critiques of the entrepreneurship literature," Journal of Business Venturing Insights, Elsevier, vol. 17(C).
    10. Sahar Vahdati & Said Fathalla & Christoph Lange & Andreas Behrend & Aysegul Say & Zeynep Say & Sören Auer, 2021. "A comprehensive quality assessment framework for scientific events," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 641-682, January.
    11. Cassidy R. Sugimoto & Blaise Cronin, 2013. "Citation gamesmanship: testing for evidence of ego bias in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 851-862, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jerome K. Vanclay, 2012. "Impact factor: outdated artefact or stepping-stone to journal certification?," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 211-238, August.
    2. Embiya Celik & Nuray Gedik & Güler Karaman & Turgay Demirel & Yuksel Goktas, 2014. "Mistakes encountered in manuscripts on education and their effects on journal rejections," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1837-1853, March.
    3. Louis Mesnard, 2010. "On Hochberg et al.’s “The tragedy of the reviewer commons”," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(3), pages 903-917, September.
    4. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    5. Lutz Bornmann & Christophe Weymuth & Hans-Dieter Daniel, 2010. "A content analysis of referees’ comments: how do comments on manuscripts rejected by a high-impact journal and later published in either a low- or high-impact journal differ?," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 493-506, May.
    6. Drahomira Herrmannova & Robert M. Patton & Petr Knoth & Christopher G. Stahl, 2018. "Do citations and readership identify seminal publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 239-262, April.
    7. Michael McAleer & Judit Olah & Jozsef Popp, 2018. "Pros and Cons of the Impact Factor in a Rapidly Changing Digital World," Tinbergen Institute Discussion Papers 18-014/III, Tinbergen Institute.
    8. Olgica Nedić & Aleksandar Dekanski, 2016. "Priority criteria in peer review of scientific articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(1), pages 15-26, April.
    9. Lutz Bornmann, 2013. "Research Misconduct—Definitions, Manifestations and Extent," Publications, MDPI, vol. 1(3), pages 1-12, October.
    10. Pardeep Sud & Mike Thelwall, 2014. "Evaluating altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1131-1143, February.
    11. Akram Osman & Naomie Salim & Faisal Saeed, 2019. "Quality dimensions features for identifying high-quality user replies in text forum threads using classification methods," PLOS ONE, Public Library of Science, vol. 14(5), pages 1-26, May.
    12. Jie Zhao & Jianfei Wang & Suping Fang & Peiquan Jin, 2018. "Towards Sustainable Development of Online Communities in the Big Data Era: A Study of the Causes and Possible Consequence of Voting on User Reviews," Sustainability, MDPI, vol. 10(9), pages 1-18, September.
    13. Makri, Katerina & Papadas, Karolos & Schlegelmilch, Bodo B., 2021. "Global social networking sites and global identity: A three-country study," Journal of Business Research, Elsevier, vol. 130(C), pages 482-492.
    14. Chetty, Krish & Aneja, Urvashi & Mishra, Vidisha & Gcora, Nozibele & Josie, Jaya, 2018. "Bridging the digital divide in the G20: Skills for the new age," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-20.
    15. SeungGwan Lee & DaeHo Lee, 2018. "A personalized channel recommendation and scheduling system considering both section video clips and full video clips," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-14, July.
    16. Caroline M. Hoxby, 2018. "The Productivity of US Postsecondary Institutions," NBER Chapters, in: Productivity in Higher Education, pages 31-66, National Bureau of Economic Research, Inc.
    17. Catalina Granda & Franz Hamann, 2015. "Informality, Saving and Wealth Inequality in Colombia," IDB Publications (Working Papers) 88196, Inter-American Development Bank.
    18. Zhan Wang, 2021. "Social media brand posts and customer engagement," Journal of Brand Management, Palgrave Macmillan, vol. 28(6), pages 685-699, November.
    19. Meva Bayrak Karsli & Sinem Karabey & Nergiz Ercil Cagiltay & Yuksel Goktas, 2018. "Comparison of the discussion sections of PhD dissertations in educational technology: the case of Turkey and the USA," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1381-1403, December.
    20. Ju Wen & Lei Lei, 2022. "Adjectives and adverbs in life sciences across 50 years: implications for emotions and readability in academic texts," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4731-4749, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:91:y:2012:i:3:d:10.1007_s11192-011-0569-5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.