Printed from https://ideas.repec.org/a/spr/scient/v123y2020i1d10.1007_s11192-020-03348-1.html

Arbitrariness in the peer review process

Authors

  • Elise S. Brezis (Bar-Ilan University)
  • Aliaksandr Birukou (Springer-Verlag GmbH; Peoples’ Friendship University of Russia (RUDN University))

Abstract

The purpose of this paper is to analyze the causes and effects of arbitrariness in the peer review process. The paper focuses on two main sources of arbitrariness. The first is that referees are not homogeneous and display homophily in their taste for and perception of innovative ideas. The second is that reviewers differ in the time they allocate to peer review. Our model replicates the NIPS experiment of 2014, showing that peer review ratings are not robust and that changing the set of reviewers has a dramatic impact on the ranking of papers. The paper also shows that innovative work is not highly ranked in the existing peer review process and is consequently often rejected.
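The mechanism the abstract describes (heterogeneous referees with homophily toward novel work, plus unequal time budgets, yielding unstable rankings) can be sketched in a toy simulation. Everything below — the scoring formula, the parameter values, the committee and panel sizes — is a hypothetical illustration, not the authors' calibrated model:

```python
import random


def review(papers, reviewers, rng, k):
    """Score each paper with a panel of 3 reviewers drawn at random,
    then accept the k papers with the highest mean score."""
    scores = []
    for quality, novel in papers:
        panel = rng.sample(reviewers, 3)
        marks = []
        for taste, effort in panel:
            noise = rng.gauss(0, 1.0 / effort)               # less time -> noisier mark
            penalty = 1.0 if novel and taste < 0.5 else 0.0  # homophily toward novelty
            marks.append(quality - penalty + noise)
        scores.append(sum(marks) / len(marks))
    ranked = sorted(range(len(papers)), key=lambda i: -scores[i])
    return set(ranked[:k])


rng = random.Random(42)
# 200 papers: latent quality ~ N(0, 1); roughly 20% are "innovative"
papers = [(rng.gauss(0, 1), rng.random() < 0.2) for _ in range(200)]
# Two independent committees; each reviewer has a taste for novelty
# in [0, 1] and a time budget ("effort") in [0.5, 2.0]
committee_a = [(rng.random(), rng.uniform(0.5, 2.0)) for _ in range(12)]
committee_b = [(rng.random(), rng.uniform(0.5, 2.0)) for _ in range(12)]

accepted_a = review(papers, committee_a, random.Random(1), k=45)
accepted_b = review(papers, committee_b, random.Random(2), k=45)
overlap = len(accepted_a & accepted_b) / 45
print(f"overlap between the two committees' accept lists: {overlap:.0%}")
```

Because the two committees disagree on a sizable fraction of accepted papers, the sketch mirrors the qualitative NIPS-2014 finding that re-running review changes outcomes; novel papers are also penalized whenever a low-taste reviewer lands on the panel.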

Suggested Citation

  • Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
  • Handle: RePEc:spr:scient:v:123:y:2020:i:1:d:10.1007_s11192-020-03348-1
    DOI: 10.1007/s11192-020-03348-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03348-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-020-03348-1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.


    References listed on IDEAS

    1. Terttu Luukkonen, 2012. "Conservatism and risk-taking in peer review: Emerging ERC practices," Research Evaluation, Oxford University Press, vol. 21(1), pages 48-60, February.
    2. Elise S Brezis, 2007. "Focal randomisation: An optimal mechanism for the evaluation of R&D projects," Science and Public Policy, Oxford University Press, vol. 34(10), pages 691-698, December.
    3. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    4. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    5. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
    6. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    7. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    8. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    9. Christoph Bartneck, 2017. "Reviewers’ scores do not predict impact: bibliometric analysis of the proceedings of the human–robot interaction conference," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 179-194, January.
    10. repec:nas:journl:v:115:y:2018:p:2952-2957 is not listed on IDEAS
    11. Michail Kovanis & Ludovic Trinquart & Philippe Ravaud & Raphaël Porcher, 2017. "Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 651-671, October.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    2. Jibang Wu & Haifeng Xu & Yifan Guo & Weijie Su, 2023. "A Truth Serum for Eliciting Self-Evaluations in Scientific Reviews," Papers 2306.11154, arXiv.org, revised Feb 2024.
    3. Xie, Yundong & Wu, Qiang & Wang, Yezhu & Hou, Li & Liu, Yuanyuan, 2024. "Does the handling time of scientific papers relate to their academic impact and social attention? Evidence from Nature, Science, and PNAS," Journal of Informetrics, Elsevier, vol. 18(2).
    4. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    5. Carol Nash, 2023. "Roles and Responsibilities for Peer Reviewers of International Journals," Publications, MDPI, vol. 11(2), pages 1-24, June.
    6. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    7. Tirthankar Ghosal & Sandeep Kumar & Prabhat Kumar Bharti & Asif Ekbal, 2022. "Peer review analyze: A novel benchmark resource for computational analysis of peer reviews," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    8. Eva Barlösius & Laura Paruschke & Axel Philipps, 2024. "Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany," Research Evaluation, Oxford University Press, vol. 32(4), pages 623-634.
    9. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    2. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    3. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    4. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    5. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    6. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    7. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    8. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    9. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    10. Albert Banal-Estañol & Ines Macho-Stadler & David Pérez-Castrillo, 2016. "Key Success Drivers in Public Research Grants: Funding the Seeds of Radical Innovation in Academia?," CESifo Working Paper Series 5852, CESifo.
    11. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    12. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    13. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    14. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    15. Marco Ottaviani, 2020. "Grantmaking," Working Papers 672, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    16. Christian Catalini & Christian Fons-Rosen & Patrick Gaulé, 2020. "How Do Travel Costs Shape Collaboration?," Management Science, INFORMS, vol. 66(8), pages 3340-3360, August.
    17. Weixi Xie & Pengfei Jia & Guangyao Zhang & Xianwen Wang, 2024. "Are reviewer scores consistent with citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4721-4740, August.
    18. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    19. Stephen Gallo & Lisa Thompson & Karen Schmaling & Scott Glisson, 2018. "Risk evaluation in peer review of grant applications," Environment Systems and Decisions, Springer, vol. 38(2), pages 216-229, June.
    20. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.

    More about this item

    Keywords

    Arbitrariness; Homophily; Peer review; Innovation

    JEL classification:

    • D73 - Microeconomics - - Analysis of Collective Decision-Making - - - Bureaucracy; Administrative Processes in Public Organizations; Corruption
    • G01 - Financial Economics - - General - - - Financial Crises
    • G18 - Financial Economics - - General Financial Markets - - - Government Policy and Regulation
    • L51 - Industrial Organization - - Regulation and Industrial Policy - - - Economics of Regulation



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.