
Are the confidence scores of reviewers consistent with the review content? Evidence from top conference proceedings in AI

Author

Listed:
  • Wenqing Wu

    (Nanjing University of Science and Technology)

  • Haixu Xi

    (Nanjing University of Science and Technology)

  • Chengzhi Zhang

    (Nanjing University of Science and Technology)

Abstract

Peer review is a critical process used in academia to assess the quality and validity of research articles. Top-tier conferences in artificial intelligence (e.g., ICLR and ACL) require reviewers to provide confidence scores to ensure the reliability of their review reports. However, existing studies on confidence scores have not measured the consistency between the comment text and the confidence score at a fine-grained level, which can overlook details in the text (such as the aspects a review discusses) and lead to an incomplete understanding and insufficiently objective analysis of the results. In this work, we propose assessing the consistency between the textual content of review reports and the assigned scores at a fine-grained level: the word, sentence, and aspect levels. The data used in this paper are derived from peer review comments of conferences in deep learning and natural language processing. We employed deep learning models to detect hedge sentences and their corresponding aspects. We then conducted statistical analyses of review report length, hedge-word frequency, number of hedge sentences, frequency of aspect mentions, and the associated sentiment to assess the consistency between textual content and confidence scores. Finally, we performed correlation analysis, significance tests, and regression analysis to examine the impact of confidence scores on paper outcomes. The results indicate that the textual content of review reports and their confidence scores are highly consistent at the word, sentence, and aspect levels. The regression results reveal a negative correlation between confidence scores and paper outcomes: higher confidence scores given by reviewers were associated with paper rejection.
This suggests that experts' overall assessments of a paper's content and quality are reliable, supporting the transparency and fairness of the peer review process. We release our data and associated code at https://github.com/njust-winchy/confidence_score .
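The word-level consistency analysis described in the abstract can be sketched as a rank correlation between a simple hedge-word frequency feature and reviewer confidence scores. The hedge lexicon, toy reviews, and scores below are illustrative placeholders, not the authors' released lexicon, data, or code:

```python
# Hedged sketch (not the authors' implementation): correlate the fraction
# of hedge words in a review with the reviewer's confidence score.
# HEDGE_WORDS and the reviews are hypothetical stand-ins.

HEDGE_WORDS = {"may", "might", "could", "possibly", "perhaps", "seems"}

def hedge_frequency(review_text):
    """Fraction of tokens in a review that are hedge words."""
    tokens = [t.strip(".,;") for t in review_text.lower().split()]
    return sum(t in HEDGE_WORDS for t in tokens) / len(tokens) if tokens else 0.0

def ranks(values):
    """Rank values (1 = smallest), averaging tied ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy reviews paired with confidence scores (1 = low, 5 = high).
reviews = [
    ("This result may possibly hold, but the proof seems incomplete.", 2),
    ("The method could perhaps work, though results might vary.", 3),
    ("The experiments are thorough and the claims are well supported.", 5),
    ("Clear contribution; the evaluation is convincing.", 4),
]
freqs = [hedge_frequency(text) for text, _ in reviews]
scores = [float(s) for _, s in reviews]
print(round(spearman(freqs, scores), 2))  # negative: more hedging, lower confidence
```

On this toy data the correlation is strongly negative, mirroring the intuition that heavily hedged reviews accompany lower confidence scores; the paper's actual analysis additionally covers sentence- and aspect-level features.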

Suggested Citation

  • Wenqing Wu & Haixu Xi & Chengzhi Zhang, 2024. "Are the confidence scores of reviewers consistent with the review content? Evidence from top conference proceedings in AI," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4109-4135, July.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:7:d:10.1007_s11192-024-05070-8
    DOI: 10.1007/s11192-024-05070-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05070-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05070-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    2. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    3. Narjes Vara & Mahdieh Mirzabeigi & Hajar Sotudeh & Seyed Mostafa Fakhrahmad, 2022. "Application of k-means clustering algorithm to improve effectiveness of the results recommended by journal recommender system," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3237-3252, June.
    4. Sun, Zhuanlan, 2024. "Textual features of peer review predict top-cited papers: An interpretable machine learning perspective," Journal of Informetrics, Elsevier, vol. 18(2).
    5. Zhang, Guangyao & Xu, Shenmeng & Sun, Yao & Jiang, Chunlin & Wang, Xianwen, 2022. "Understanding the peer review endeavor in scientific publishing," Journal of Informetrics, Elsevier, vol. 16(2).
    6. Sun, Zhuanlan & Pang, Ka Lok & Li, Yiwei, 2024. "The fading of status bias during the open peer review process," Journal of Informetrics, Elsevier, vol. 18(3).
    7. J. A. Garcia & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2021. "The interplay between the reviewer’s incentives and the journal’s quality standard," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3041-3061, April.
    8. Ivana Drvenica & Giangiacomo Bravo & Lucija Vejmelka & Aleksandar Dekanski & Olgica Nedić, 2018. "Peer Review of Reviewers: The Author’s Perspective," Publications, MDPI, vol. 7(1), pages 1-10, December.
    9. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    10. Alessandro Checco & Lorenzo Bracciale & Pierpaolo Loreti & Stephen Pinfield & Giuseppe Bianchi, 2021. "AI-assisted peer review," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-11, December.
    11. Kuklin, Alexander A. (Куклин, Александр) & Balyakina, Evgeniya A. (Балякина, Евгения), 2017. "Active policy as a key to success for an International Economic Periodical [Активная Политика — Залог Успеха Международного Экономического Журнала]," Ekonomicheskaya Politika / Economic Policy, Russian Presidential Academy of National Economy and Public Administration, vol. 6, pages 160-177, December.
    12. Vivian M Nguyen & Neal R Haddaway & Lee F G Gutowsky & Alexander D M Wilson & Austin J Gallagher & Michael R Donaldson & Neil Hammerschlag & Steven J Cooke, 2015. "How Long Is Too Long in Contemporary Peer Review? Perspectives from Authors Publishing in Conservation Biology Journals," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-20, August.
    13. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    14. Bianchi, Federico & Grimaldo, Francisco & Squazzoni, Flaminio, 2019. "The F3-index. Valuing reviewers for scholarly journals," Journal of Informetrics, Elsevier, vol. 13(1), pages 78-86.
    15. Paul Sebo & Jean Pascal Fournier & Claire Ragot & Pierre-Henri Gorioux & François R. Herrmann & Hubert Maisonneuve, 2019. "Factors associated with publication speed in general medical journals: a retrospective study of bibliometric data," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1037-1058, May.
    16. Michail Kovanis & Ludovic Trinquart & Philippe Ravaud & Raphaël Porcher, 2017. "Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 651-671, October.
    17. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    18. Maciej J Mrowinski & Piotr Fronczak & Agata Fronczak & Marcel Ausloos & Olgica Nedic, 2017. "Artificial intelligence in peer review: How can evolutionary computation support journal editors?," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
    19. Buljan, Ivan & Garcia-Costa, Daniel & Grimaldo, Francisco & Klein, Richard A. & Bakker, Marjan & Marušić, Ana, 2024. "Development and application of a comprehensive glossary for the identification of statistical and methodological concepts in peer review reports," Journal of Informetrics, Elsevier, vol. 18(3).
    20. Eirini Delikoura & Dimitrios Kouis, 2021. "Open Research Data and Open Peer Review: Perceptions of a Medical and Health Sciences Community in Greece," Publications, MDPI, vol. 9(2), pages 1-19, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:7:d:10.1007_s11192-024-05070-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.