
Post-publication expert recommendations in faculty opinions (F1000Prime): Recommended articles and citations

Author

Listed:
  • Wang, Peiling
  • Su, Jing

Abstract

This exploratory study of post-publication expert recommendations (PPER) of biomedical articles in Faculty Opinions examined whether recommended articles were cited differently from other articles in the same journal. The data comprise 830 research articles published in Cell, Nature Genetics, Nature Medicine, and PLoS Biology in 2010 and their 205,976 citations in Web of Science (WoS) from 2010 to 2019. Of the 830 articles, 417 were recommended in Faculty Opinions. A recommendation made by a Faculty Member (FM) includes a star rating and optional classification tags and commentary. For Nature Genetics, Nature Medicine, and PLoS Biology, the recommended articles (dataset.FM) were cited significantly more than the other articles (dataset.other). Some correlations were found between recommendation level and citedness, but a scaled mapping showed no linear relationship between the two measures. Most articles reached a citation peak two years after publication. The most frequently assigned classification tags were New Finding, Interesting Hypothesis, Technical Advance, and Novel Drug Target. Sentiment analysis of the 118 recommendations of the 30 top articles found that FM ratings correlated with sentiment intensity. A repeated-measures ANOVA did not show a Matthew effect in citations. Suggestions include refining Faculty Opinions’ rating schema.
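To illustrate the kind of analysis the abstract describes, the sketch below compares citation counts of recommended versus non-recommended articles and checks whether recommendation level tracks citedness. It is a minimal Python illustration with made-up placeholder numbers, not the authors' data or pipeline; the Mann-Whitney U test and Spearman correlation are stand-ins for the group comparison and correlation analyses reported in the study.

    # Minimal sketch with hypothetical numbers; not the study's data or code.
    from scipy.stats import mannwhitneyu, spearmanr

    # Placeholder citation counts for one journal
    cited_fm = [412, 198, 655, 240, 510, 330, 275]   # recommended articles (dataset.FM)
    cited_other = [120, 95, 310, 60, 145, 88, 200]   # other articles (dataset.other)

    # Nonparametric test: are recommended articles cited more?
    u_stat, p_u = mannwhitneyu(cited_fm, cited_other, alternative="greater")
    print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.4f}")

    # Placeholder FM star ratings (1-3) paired with the recommended articles above
    ratings = [1, 1, 2, 2, 2, 3, 3]
    rho, p_rho = spearmanr(ratings, cited_fm)
    print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")

Rank-based statistics are used in this sketch because citation counts are heavily skewed; the study itself also reports a repeated-measures ANOVA over yearly citation counts, which is not reproduced here.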

Suggested Citation

  • Wang, Peiling & Su, Jing, 2021. "Post-publication expert recommendations in faculty opinions (F1000Prime): Recommended articles and citations," Journal of Informetrics, Elsevier, vol. 15(3).
  • Handle: RePEc:eee:infome:v:15:y:2021:i:3:s1751157721000456
    DOI: 10.1016/j.joi.2021.101174

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157721000456
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2021.101174?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.
    2. Andreas Thor & Lutz Bornmann & Werner Marx & Rüdiger Mutz, 2018. "Identifying single influential publications in a research field: new analysis opportunities of the CRExplorer," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 591-608, July.
    3. Peiling Wang & Marilyn Domas White, 1999. "A cognitive model of document use during a research project. Study II. Decisions at the reading and citing stages," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 50(2), pages 98-114.
    4. Anthony F J van Raan & Jos J Winnink, 2019. "The occurrence of ‘Sleeping Beauty’ publications in medical research: Their scientific impact and technological relevance," PLOS ONE, Public Library of Science, vol. 14(10), pages 1-34, October.
    5. Peiling Wang & Joshua Williams & Nan Zhang & Qiang Wu, 2020. "F1000Prime recommended articles and their citations: an exploratory study of four journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 933-955, February.
    6. Lutz Bornmann & Robin Haunschild, 2018. "Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data," PLOS ONE, Public Library of Science, vol. 13(5), pages 1-12, May.
    7. Ludo Waltman & Rodrigo Costas, 2014. "F1000 Recommendations as a Potential New Data Source for Research Evaluation: A Comparison With Citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 433-445, March.
    8. Drivas, Kyriakos & Kremmydas, Dimitris, 2020. "The Matthew effect of a journal's ranking," Research Policy, Elsevier, vol. 49(4).
    9. Howard D. White, 2001. "Author-centered bibliometrics through CAMEOs: Characterizations automatically made and edited online," Scientometrics, Springer;Akadémiai Kiadó, vol. 51(3), pages 607-637, July.
    10. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    11. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    12. Bornmann, Lutz & Haunschild, Robin, 2015. "Which people use which scientific papers? An evaluation of data from F1000 and Mendeley," Journal of Informetrics, Elsevier, vol. 9(3), pages 477-487.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Peiling Wang & Joshua Williams & Nan Zhang & Qiang Wu, 2020. "F1000Prime recommended articles and their citations: an exploratory study of four journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 933-955, February.
    2. Weixi Xie & Pengfei Jia & Guangyao Zhang & Xianwen Wang, 2024. "Are reviewer scores consistent with citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4721-4740, August.
    3. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    4. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    5. Jianhua Hou & Xiucai Yang & Yang Zhang, 2023. "The effect of social media knowledge cascade: an analysis of scientific papers diffusion," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5169-5195, September.
    6. Omar Hernando Avila-Poveda, 2014. "Technical report: the trend of author compound names and its implications for authorship identity identification," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 833-846, October.
    7. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    8. Félix Moya-Anegón & Benjamín Vargas-Quesada & Victor Herrero-Solana & Zaida Chinchilla-Rodríguez & Elena Corera-Álvarez & Francisco J. Munoz-Fernández, 2004. "A new technique for building maps of large scientific domains based on the cocitation of classes and categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 61(1), pages 129-145, September.
    9. Sepideh Fahimifar & Elmira Janavi & Fatemeh Fadaei, 2024. "Awakening the beauty: a journey through dormant gems in strategic management literature," Quality & Quantity: International Journal of Methodology, Springer, vol. 58(4), pages 3331-3362, August.
    10. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    11. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    12. Jianhua Hou & Hao Li & Yang Zhang, 2020. "Identifying the princes base on Altmetrics: An awakening mechanism of sleeping beauties from the perspective of social media," PLOS ONE, Public Library of Science, vol. 15(11), pages 1-28, November.
    13. Bornmann, Lutz, 2014. "Validity of altmetrics data for measuring societal impact: A study using data from Altmetric and F1000Prime," Journal of Informetrics, Elsevier, vol. 8(4), pages 935-950.
    14. Ehsan Mohammadi & Mike Thelwall & Stefanie Haustein & Vincent Larivière, 2015. "Who reads research articles? An altmetrics analysis of Mendeley user categories," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1832-1846, September.
    15. Howard D. White, 2004. "Reward, persuasion, and the Sokal Hoax: A study in citation identities," Scientometrics, Springer;Akadémiai Kiadó, vol. 60(1), pages 93-120, May.
    16. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.
    17. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    18. Shi, Xuanyu & Du, Jian, 2022. "Distinguishing transformative from incremental clinical evidence: A classifier of clinical research using textual features from abstracts and citing sentences," Journal of Informetrics, Elsevier, vol. 16(2).
    19. Robin Haunschild & Lutz Bornmann, 2018. "Field- and time-normalization of data with many zeros: an empirical analysis using citation and Twitter data," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 997-1012, August.
    20. Byungun Yoon & Sungjoo Lee & Gwanghee Lee, 2010. "Development and application of a keyword-based knowledge map for effective R&D planning," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(3), pages 803-820, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:15:y:2021:i:3:s1751157721000456. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.