
Arbitrariness in the Peer Review Process

Authors

  • Elise S. Brezis (Bar-Ilan University)
  • Aliaksandr Birukou

Abstract

This paper analyzes the causes and effects of arbitrariness in the peer review process, focusing on two main sources. The first is that referees are not homogeneous: they display homophily in their tastes and in their perception of innovative ideas. The second is that reviewers differ in the time they allocate to reviewing. Our model replicates the 2014 NIPS experiment, showing that peer review ratings are not robust and that changing the set of reviewers has a dramatic impact on the ranking of papers. The paper also shows that innovative work is not highly ranked under the existing peer review process and is consequently often rejected.
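
To make the mechanism concrete, here is a minimal Monte Carlo sketch in Python. It is not the authors' model: the functional forms are illustrative assumptions (a homophily penalty equal to the distance between a reviewer's taste and a paper's novelty, and score noise that shrinks with the time a reviewer allocates). Two independently drawn committees score the same papers, mirroring the split-committee design of the 2014 NIPS experiment.

    import numpy as np

    rng = np.random.default_rng(0)
    N_PAPERS, N_REVIEWERS, PANEL_SIZE = 100, 60, 3

    # Each paper has an intrinsic quality and a degree of novelty.
    quality = rng.normal(0.0, 1.0, N_PAPERS)
    novelty = rng.uniform(0.0, 1.0, N_PAPERS)

    # Homophily: each reviewer has a taste for novelty; mismatch lowers the score.
    taste = rng.uniform(0.0, 1.0, N_REVIEWERS)
    # Time allocation: reviewers who spend little time produce noisier scores.
    effort = rng.uniform(0.2, 1.0, N_REVIEWERS)

    def panel_scores(panel_seed):
        """Assign a random panel to every paper and return mean panel scores."""
        panel_rng = np.random.default_rng(panel_seed)
        scores = np.empty(N_PAPERS)
        for p in range(N_PAPERS):
            panel = panel_rng.choice(N_REVIEWERS, PANEL_SIZE, replace=False)
            s = (quality[p]
                 - np.abs(taste[panel] - novelty[p])            # homophily penalty
                 + panel_rng.normal(0.0, 1.0 - effort[panel]))  # effort-dependent noise
            scores[p] = s.mean()
        return scores

    # Two independent committees score the same papers (the NIPS-style split).
    scores_a, scores_b = panel_scores(1), panel_scores(2)

    # Rank each paper under both committees (0 = worst score).
    rank_a = scores_a.argsort().argsort()
    rank_b = scores_b.argsort().argsort()
    corr = np.corrcoef(rank_a, rank_b)[0, 1]
    print(f"Spearman rank correlation between panels: {corr:.2f}")

    # Are highly novel papers under-ranked? Compare mean ranks by novelty quartile.
    novel = novelty > np.quantile(novelty, 0.75)
    print(f"Mean rank: novel papers {rank_a[novel].mean():.1f}, "
          f"others {rank_a[~novel].mean():.1f}")

Under these assumed forms, the rank correlation between the two committees falls well below 1, and papers in the top novelty quartile receive systematically worse mean ranks (their distance to the average reviewer's taste is largest), qualitatively matching the abstract's two findings.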

Suggested Citation

  • Elise S. Brezis & Aliaksandr Birukou, 2019. "Arbitrariness in the Peer Review Process," Working Papers 2019-08, Bar-Ilan University, Department of Economics.
  • Handle: RePEc:biu:wpaper:2019-08

    Download full text from publisher

    File URL: https://econ.biu.ac.il/sites/econ/files/working-papers/2019-08.pdf
    File Function: Working paper
    Download Restriction: no

    References listed on IDEAS

    1. Terttu Luukkonen, 2012. "Conservatism and risk-taking in peer review: Emerging ERC practices," Research Evaluation, Oxford University Press, vol. 21(1), pages 48-60, February.
    2. Elise S Brezis, 2007. "Focal randomisation: An optimal mechanism for the evaluation of R&D projects," Science and Public Policy, Oxford University Press, vol. 34(10), pages 691-698, December.
    3. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    4. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    5. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
    6. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    7. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    8. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    9. Christoph Bartneck, 2017. "Reviewers’ scores do not predict impact: bibliometric analysis of the proceedings of the human–robot interaction conference," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 179-194, January.
    10. Elizabeth L. Pier & Markus Brauer & Amarette Filut & Anna Kaatz & Joshua Raclaw & Mitchell J. Nathan & Cecilia E. Ford & Molly Carnes, 2018. "Low agreement among reviewers evaluating the same NIH grant applications," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(12), pages 2952-2957, March.
    11. Michail Kovanis & Ludovic Trinquart & Philippe Ravaud & Raphaël Porcher, 2017. "Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 651-671, October.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Xie, Yundong & Wu, Qiang & Wang, Yezhu & Hou, Li & Liu, Yuanyuan, 2024. "Does the handling time of scientific papers relate to their academic impact and social attention? Evidence from Nature, Science, and PNAS," Journal of Informetrics, Elsevier, vol. 18(2).
    2. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    3. Carol Nash, 2023. "Roles and Responsibilities for Peer Reviewers of International Journals," Publications, MDPI, vol. 11(2), pages 1-24, June.
    4. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    5. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    6. Tirthankar Ghosal & Sandeep Kumar & Prabhat Kumar Bharti & Asif Ekbal, 2022. "Peer review analyze: A novel benchmark resource for computational analysis of peer reviews," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    7. Eva Barlösius & Laura Paruschke & Axel Philipps, 2024. "Peer review’s irremediable flaws: Scientists’ perspectives on grant evaluation in Germany," Research Evaluation, Oxford University Press, vol. 32(4), pages 623-634.
    8. Jibang Wu & Haifeng Xu & Yifan Guo & Weijie Su, 2023. "A Truth Serum for Eliciting Self-Evaluations in Scientific Reviews," Papers 2306.11154, arXiv.org, revised Feb 2024.
    9. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    2. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    3. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    4. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    5. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    6. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    7. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    8. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    9. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.
    10. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    11. Albert Banal-Estañol & Ines Macho-Stadler & David Pérez-Castrillo, 2016. "Key Success Drivers in Public Research Grants: Funding the Seeds of Radical Innovation in Academia?," CESifo Working Paper Series 5852, CESifo.
    12. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    13. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science: The Case of Academic Economists," Discussion Paper 2020-020, Tilburg University, Center for Economic Research.
    14. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    15. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    16. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    17. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    18. Marco Ottaviani, 2020. "Grantmaking," Working Papers 672, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    19. Christian Catalini & Christian Fons-Rosen & Patrick Gaulé, 2020. "How Do Travel Costs Shape Collaboration?," Management Science, INFORMS, vol. 66(8), pages 3340-3360, August.
    20. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).

    More about this item

    Keywords

    arbitrariness; homophily; peer review; innovation

    JEL classification:

    • D73 - Microeconomics - - Analysis of Collective Decision-Making - - - Bureaucracy; Administrative Processes in Public Organizations; Corruption
    • G01 - Financial Economics - - General - - - Financial Crises
    • G18 - Financial Economics - - General Financial Markets - - - Government Policy and Regulation
    • L51 - Industrial Organization - - Regulation and Industrial Policy - - - Economics of Regulation

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.