
Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes

Author

Listed:
  • Stephen A Gallo
  • Afton S Carpenter
  • Scott R Glisson

Abstract

Teleconferencing as a setting for scientific peer review is an attractive option for funding agencies, given the substantial environmental and cost savings. Despite this, there is a paucity of published data validating teleconference-based peer review against the face-to-face process. Our aim was to conduct a retrospective analysis of scientific peer review data to investigate whether review setting has an effect on review process and outcome measures. We analyzed reviewer scoring data from a research program that had recently changed its review setting from a face-to-face to a teleconference format, with minimal changes to the overall review procedures. The analysis covered approximately 1600 applications over a four-year period: two years of face-to-face panel meetings compared with two years of teleconference meetings. We measured average overall scientific merit scores, score distributions, standard deviations and reviewer inter-rater reliability statistics, as well as reviewer demographics and the length of time spent discussing applications. The data indicate few differences between the face-to-face and teleconference settings with regard to average overall scientific merit score, score distribution, standard deviation, reviewer demographics or inter-rater reliability; however, some difference was found in discussion time. These findings suggest that most review outcome measures are unaffected by review setting, which supports the trend toward teleconference reviews rather than face-to-face meetings. However, further studies are needed to assess any correlations among discussion time, application funding and the productivity of funded research projects.
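
The abstract summarizes a statistical comparison of scoring across the two review settings: average merit scores, score spread, and reviewer inter-rater reliability. The study's own data and analysis code are not part of this record, so the Python sketch below only illustrates the general shape of such a comparison on synthetic scores, using a one-way ICC(1) as a simple stand-in for whatever inter-rater reliability statistic the authors actually computed. All function names and numbers in it are hypothetical.

    # Illustrative sketch only: the study's data and analysis code are not
    # included in this record. On synthetic scores, compare mean merit score,
    # score spread, and a simple inter-rater reliability estimate between
    # two hypothetical review settings.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def simulate_panel(n_apps=400, n_reviewers=3, sd=0.8):
        """Synthetic merit scores: each application has a latent quality,
        and each assigned reviewer scores it with independent noise."""
        quality = rng.normal(loc=5.0, scale=1.0, size=n_apps)
        return quality[:, None] + rng.normal(0.0, sd, size=(n_apps, n_reviewers))

    def icc_1(scores):
        """One-way random-effects intraclass correlation, ICC(1), from a
        one-way ANOVA decomposition (applications as groups)."""
        n, k = scores.shape
        grand = scores.mean()
        ms_between = k * ((scores.mean(axis=1) - grand) ** 2).sum() / (n - 1)
        ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    face_to_face = simulate_panel()
    teleconference = simulate_panel()

    # Compare overall means (t-test) and score spread (Levene's test),
    # then report ICC(1) for each setting.
    t_stat, p_mean = stats.ttest_ind(face_to_face.mean(axis=1), teleconference.mean(axis=1))
    w_stat, p_var = stats.levene(face_to_face.ravel(), teleconference.ravel())
    print(f"mean-score difference: p = {p_mean:.3f}")
    print(f"variance difference:   p = {p_var:.3f}")
    print(f"ICC(1) face-to-face:   {icc_1(face_to_face):.2f}")
    print(f"ICC(1) teleconference: {icc_1(teleconference):.2f}")

On real data, the equivalent comparison would replace simulate_panel() with the observed reviewer-by-application score matrix for each review period.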

Suggested Citation

  • Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
  • Handle: RePEc:plo:pone00:0071693
    DOI: 10.1371/journal.pone.0071693

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0071693
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0071693&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0071693?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. OECD, 2008. "DAC Peer Review of Greece," OECD Journal on Development, OECD Publishing, vol. 7(4), pages 7-93.
    2. Michael Obrecht & Karl Tibelius & Guy D'Aloisio, 2007. "Examining the value added by committee discussion in the review of applications for research awards," Research Evaluation, Oxford University Press, vol. 16(2), pages 79-91, June.
    3. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    4. Harmon, Joel & Schneer, Joy A. & Hoffman, L. Richard, 1995. "Electronic Meetings and Established Decision Groups: Audioconferencing Effects on Performance and Structural Stability," Organizational Behavior and Human Decision Processes, Elsevier, vol. 61(2), pages 138-147, February.
    5. OECD, 2008. "DAC Peer Review of the European Community," OECD Journal on Development, OECD Publishing, vol. 8(4), pages 127-261.
    6. Michael R Martin & Andrea Kopstein & Joy M Janice, 2010. "An Analysis of Preliminary and Post-Discussion Priority Scores for Grant Applications Peer Reviewed by the Center for Scientific Review at the NIH," PLOS ONE, Public Library of Science, vol. 5(11), pages 1-6, November.
    7. OECD, 2008. "DAC Peer Review of Canada," OECD Journal on Development, OECD Publishing, vol. 8(4), pages 263-387.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    2. Stephen A Gallo & Afton S Carpenter & David Irwin & Caitlin D McPartland & Joseph Travis & Sofie Reynders & Lisa A Thompson & Scott R Glisson, 2014. "The Validation of Peer Review through Research Impact Measures and the Implications for Funding Strategies," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-9, September.
    3. Gemma E Derrick & Julie Bayley, 2022. "The Corona-Eye: Exploring the risks of COVID-19 on fair assessments of impact for REF2021," Research Evaluation, Oxford University Press, vol. 31(1), pages 93-103.
    4. Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
    5. Katie Meadmore & Kathryn Fackrell & Alejandra Recio-Saucedo & Abby Bull & Simon D S Fraser & Amanda Blatch-Jones, 2020. "Decision-making approaches used by UK and international health funding organisations for allocating research funds: A survey of current practice," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-17, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    2. Federico Bianchi & Flaminio Squazzoni, 2022. "Can transparency undermine peer review? A simulation model of scientist behavior under open peer review [Reviewing Peer Review]," Science and Public Policy, Oxford University Press, vol. 49(5), pages 791-800.
    3. Sun, Zhuanlan & Clark Cao, C. & Ma, Chao & Li, Yiwei, 2023. "The academic status of reviewers predicts their language use," Journal of Informetrics, Elsevier, vol. 17(4).
    4. Maciej J. Mrowinski & Agata Fronczak & Piotr Fronczak & Olgica Nedic & Aleksandar Dekanski, 2020. "The hurdles of academic publishing from the perspective of journal editors: a case study," Scientometrics, Springer; Akadémiai Kiadó, vol. 125(1), pages 115-133, October.
    5. Zhao, Zhi-Dan & Chen, Jiahao & Lu, Yichuan & Zhao, Na & Jiang, Dazhi & Wang, Bing-Hong, 2021. "Dynamic patterns of open review process," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 582(C).
    6. Besim Bilalli & Rana Faisal Munir & Alberto Abelló, 2021. "A framework for assessing the peer review duration of journals: case study in computer science," Scientometrics, Springer; Akadémiai Kiadó, vol. 126(1), pages 545-563, January.
    7. Siddarth Srinivasan & Jamie Morgenstern, 2021. "Auctions and Peer Prediction for Academic Peer Review," Papers 2109.00923, arXiv.org, revised May 2023.
    8. José Cendejas Bueno & Cecilia Font de Villanueva, 2015. "Convergence of inflation with a common cycle: estimating and modelling Spanish historical inflation from the 16th to the 18th centuries," Empirical Economics, Springer, vol. 48(4), pages 1643-1665, June.
    9. Lucas Rodriguez Forti & Luiz A. Solino & Judit K. Szabo, 2021. "Trade-off between urgency and reduced editorial capacity affect publication speed in ecological and medical journals during 2020," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-9, December.
    10. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    11. Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
    12. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    13. Louise Bedsworth, 2012. "California’s local health agencies and the state’s climate adaptation strategy," Climatic Change, Springer, vol. 111(1), pages 119-133, March.
    14. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
    15. Wen Luo & Oi-Man Kwok, 2010. "Proportional Reduction of Prediction Error in Cross-Classified Random Effects Models," Sociological Methods & Research, vol. 39(2), pages 188-205, November.
    16. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    17. Benedetto Lepori & Emanuela Reale & Stig Slipersaeter, 2011. "The Construction of New Indicators for Science and Innovation Policies: The Case of Project Funding Indicators," Chapters, in: Massimo G. Colombo & Luca Grilli & Lucia Piscitello & Cristina Rossi-Lamastra (ed.), Science and Innovation Policy for the New Knowledge Economy, chapter 2, Edward Elgar Publishing.
    18. Andreas Ortmann & Benoît Walraevens, 2012. "Adam Smith, Philosopher and Man of the World," Post-Print halshs-00756341, HAL.
    19. Martin, Nigel & Rice, John, 2015. "Improving Australia's renewable energy project policy and planning: A multiple stakeholder analysis," Energy Policy, Elsevier, vol. 84(C), pages 128-141.
    20. Manuel Bagues & Mauro Sylos-Labini & Natalia Zinovyeva, 2017. "Does the Gender Composition of Scientific Committees Matter?," American Economic Review, American Economic Association, vol. 107(4), pages 1207-1238, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0071693. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.