Printed from https://ideas.repec.org/a/spr/scient/v99y2014i2d10.1007_s11192-013-1198-y.html

A web application for aggregating conflicting reviewers’ preferences

Authors

  • J. A. García (Universidad de Granada)
  • Rosa Rodriguez-Sánchez (Universidad de Granada)
  • J. Fdez-Valdivia (Universidad de Granada)
  • F. Moya-Anegón (Centro de Ciencias Humanas y Sociales (CCHS))

Abstract

Drawing on social choice theory, we derive a rationale in which each reviewer is asked to provide his or her second, third, and fourth choices in addition to a first-choice recommendation regarding the acceptance, revision, or rejection of a given manuscript. All reviewers’ hierarchies of alternatives are collected and combined so that an overall ranking can be computed. Conflicting recommendations are thus resolved not by asking a third, adjudicating reviewer for a recommendation, as is usual editorial practice at many scientific journals, but by using more information from the judges already available. After a brief introduction to social choice theory and a description and justification of the maximum likelihood rule for ranking alternatives, we describe and demonstrate a publicly available web application that provides easy-to-use tools for applying these methods to aggregate conflicting reviewers’ recommendations. Editors can use this application to aid their decision process when they receive conflicting recommendations from their reviewers.
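The maximum likelihood rule the abstract refers to is commonly known as the Kemeny rule: choose the consensus ranking that minimises the total number of pairwise disagreements (Kendall tau distance) with the reviewers’ rankings, which under Young’s noise model is the maximum likelihood estimate of the “true” ranking. The sketch below is an illustration only, not the paper’s implementation; the reviewer rankings are hypothetical toy data, and brute-force search is feasible because peer review involves only a handful of alternatives.

```python
from itertools import permutations

# Hypothetical toy data: three reviewers each rank four editorial decisions
# from most to least preferred (not taken from the paper).
reviewer_rankings = [
    ["accept", "minor revision", "major revision", "reject"],
    ["minor revision", "accept", "major revision", "reject"],
    ["reject", "major revision", "minor revision", "accept"],
]

def kendall_tau_distance(ranking_a, ranking_b):
    """Count the pairs of alternatives that the two rankings order differently."""
    pos_a = {alt: i for i, alt in enumerate(ranking_a)}
    pos_b = {alt: i for i, alt in enumerate(ranking_b)}
    alts = list(pos_a)
    return sum(
        1
        for i in range(len(alts))
        for j in range(i + 1, len(alts))
        if (pos_a[alts[i]] < pos_a[alts[j]]) != (pos_b[alts[i]] < pos_b[alts[j]])
    )

def kemeny_aggregate(rankings):
    """Return the ranking minimising total Kendall tau distance to all inputs.

    Under the noise model analysed by Young (1995), this minimiser is the
    maximum likelihood estimate of the underlying true ranking.
    """
    return min(
        permutations(rankings[0]),
        key=lambda cand: sum(kendall_tau_distance(list(cand), r) for r in rankings),
    )

consensus = kemeny_aggregate(reviewer_rankings)
print(list(consensus))  # → ['minor revision', 'accept', 'major revision', 'reject']
```

In this toy example the first choices conflict (accept, minor revision, reject), yet every pairwise majority is consistent, so the Kemeny consensus simply follows the majority order; the rule becomes interesting when pairwise majorities cycle and the distance minimisation breaks the tie.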

Suggested Citation

  • J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia & F. Moya-Anegón, 2014. "A web application for aggregating conflicting reviewers’ preferences," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 523-539, May.
  • Handle: RePEc:spr:scient:v:99:y:2014:i:2:d:10.1007_s11192-013-1198-y
    DOI: 10.1007/s11192-013-1198-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-1198-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-1198-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jürgen Janger & Nicole Schmidt-Padickakudy & Anna Strauss-Kollin, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664.
    2. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," Critical Perspectives on Accounting, Elsevier, vol. 51(C), pages 24-46.
    3. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    4. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    5. Oleksiyenko, Anatoly V., 2023. "Geopolitical agendas and internationalization of post-soviet higher education: Discursive dilemmas in the realm of the prestige economy," International Journal of Educational Development, Elsevier, vol. 102(C).
    6. Randa Alsabahi, 2022. "English Medium Publications: Opening or Closing Doors to Authors with Non-English Language Backgrounds," English Language Teaching, Canadian Center of Science and Education, vol. 15(10), pages 1-18, October.
    7. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    8. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    9. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    10. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    11. Weinhold, Ines & Gurtner, Sebastian, 2014. "Understanding shortages of sufficient health care in rural areas," Health Policy, Elsevier, vol. 118(2), pages 201-214.
    12. Minhyeok Lee, 2023. "Game-Theoretical Analysis of Reviewer Rewards in Peer-Review Journal Systems: Analysis and Experimental Evaluation using Deep Reinforcement Learning," Papers 2305.12088, arXiv.org.
    13. Mohammadamin Erfanmanesh & Jaime A. Teixeira da Silva, 2019. "Is the soundness-only quality control policy of open access mega journals linked to a higher rate of published errors?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 917-923, August.
    14. Emanuel Kulczycki & Tim C. E. Engels & Janne Pölönen & Kasper Bruun & Marta Dušková & Raf Guns & Robert Nowotniak & Michal Petr & Gunnar Sivertsen & Andreja Istenič Starčič & Alesia Zuccala, 2018. "Publication patterns in the social sciences and humanities: evidence from eight European countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 463-486, July.
    15. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    16. Alhamami, Munassir, 2023. "Inequity, inequality, and language rights in English as a medium of instruction programs," Evaluation and Program Planning, Elsevier, vol. 99(C).
    17. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    18. Cheng, Xi & Wang, Haoran & Tang, Li & Jiang, Weiyan & Zhou, Maotian & Wang, Guoyan, 2024. "Open peer review correlates with altmetrics but not with citations: Evidence from Nature Communications and PLoS One," Journal of Informetrics, Elsevier, vol. 18(3).
    19. Wiltrud Kuhlisch & Magnus Roos & Jörg Rothe & Joachim Rudolph & Björn Scheuermann & Dietrich Stoyan, 2016. "A statistical approach to calibrating the scores of biased reviewers of scientific papers," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 79(1), pages 37-57, January.
    20. Gemma Elizabeth Derrick & Alessandra Zimmermann & Helen Greaves & Jonathan Best & Richard Klavans, 2024. "Targeted, actionable and fair: Reviewer reports as feedback and its effect on ECR career choices," Research Evaluation, Oxford University Press, vol. 32(4), pages 648-657.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:99:y:2014:i:2:d:10.1007_s11192-013-1198-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.