
A method of measuring the article discriminative capacity and its distribution

Author

Listed:
  • Yuetong Chen (Nanjing University; Jiangsu Key Laboratory of Data Engineering and Knowledge Service)
  • Hao Wang (Nanjing University; Jiangsu Key Laboratory of Data Engineering and Knowledge Service)
  • Baolong Zhang (Zhengzhou University of Aeronautics)
  • Wei Zhang (Nanjing University; Jiangsu Key Laboratory of Data Engineering and Knowledge Service)

Abstract

Previous studies of the scientific literature have rarely considered discrimination, i.e., the extent to which the content of one piece of research differs from that of others. This paper contributes to the quantitative methods for studying the discrimination of article content by proposing the article discriminative capacity (ADC). Academic articles indexed in the Chinese Social Sciences Citation Index (CSSCI) in the discipline of Library and Information Science (LIS) are used as the research objects. First, the most suitable text representation model is chosen to better represent the content of the articles, thereby improving the performance of the ADC. Then, in-depth quantitative analyses and evaluations of the articles are conducted in conjunction with the ADC from the perspectives of source journals, publication years, authors, themes, and disciplines. The results demonstrate that combining the ADC with the BERT model better identifies individual articles with high discriminative capacity. Articles in the fields of Information Science and Cross-LIS are found to have relatively low average ADC values, whereas articles in the fields of Library Science and Archives Science have high average ADC values. Articles with high ADC values have diverse themes and distinctive keywords, and can reveal new methods and promote interdisciplinarity. By contrast, articles with low ADC values share similar research themes and favor traditional, commentary-style, and conventional research. Moreover, scholars with high discriminative capacity are more willing to explore new fields rather than being confined to traditional LIS research. This work may help promote the diversity of academic research and complement the evaluation system for academic articles. One major limitation of this study is that it used data only from Chinese databases.
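This page does not reproduce the authors' exact ADC formula, but the abstract frames discriminative capacity as how far an article's content diverges from the rest of the corpus. The following is a minimal illustrative sketch of one plausible reading, assuming an ADC-like score of one minus an article's mean cosine similarity to every other article's document embedding (e.g., a pooled BERT vector); the function name adc_scores, the scoring rule, and the use of NumPy here are assumptions for illustration, not the published method.

    # Illustrative sketch only -- the paper's exact ADC formula is not given on
    # this page. Here an ADC-like score is one minus an article's mean cosine
    # similarity to all other articles, computed over document embeddings
    # (e.g., pooled BERT vectors). Higher scores mean more distinctive content.
    import numpy as np

    def adc_scores(embeddings: np.ndarray) -> np.ndarray:
        """embeddings: (n_articles, dim) document vectors; returns one score per article."""
        # L2-normalize rows so that dot products equal cosine similarities.
        normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        sims = normed @ normed.T  # pairwise cosine similarity matrix
        n = sims.shape[0]
        # Mean similarity to the other n-1 articles (drop the self-similarity of 1).
        mean_other = (sims.sum(axis=1) - 1.0) / (n - 1)
        return 1.0 - mean_other   # in [0, 2]; higher = more discriminative

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        fake_bert = rng.normal(size=(5, 768))  # stand-in for real BERT embeddings
        print(adc_scores(fake_bert))

Under this reading, the corpus-level comparisons described in the abstract (by journal, year, author, theme, or discipline) reduce to averaging such per-article scores within each group.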

Suggested Citation

  • Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:6:d:10.1007_s11192-022-04371-0
    DOI: 10.1007/s11192-022-04371-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04371-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04371-0?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. Dunaiski, Marcel & Visser, Willem & Geldenhuys, Jaco, 2016. "Evaluating paper and author ranking algorithms using impact and contribution awards," Journal of Informetrics, Elsevier, vol. 10(2), pages 392-407.
    3. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    4. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    5. Adrian Mulligan & Louise Hall & Ellen Raphael, 2013. "Peer review in a changing world: An international study measuring the attitudes of researchers," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 132-161, January.
    6. Chaomei Chen & Timothy Cribbin & Robert Macredie & Sonali Morar, 2002. "Visualizing and tracking the growth of competing paradigms: Two case studies," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 53(8), pages 678-689.
    7. Lutz Bornmann, 2014. "How are excellent (highly cited) papers defined in bibliometrics? A quantitative analysis of the literature," Research Evaluation, Oxford University Press, vol. 23(2), pages 166-173.
    8. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    9. G. Salton & C. S. Yang & C. T. Yu, 1975. "A theory of term importance in automatic text analysis," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 26(1), pages 33-44, January.
    10. Pérez-Hornero, Patricia & Arias-Nicolás, José Pablo & Pulgarín, Antonio A. & Pulgarín, Antonio, 2013. "An annual JCR impact factor calculation based on Bayesian credibility formulas," Journal of Informetrics, Elsevier, vol. 7(1), pages 1-9.
    11. Xinning Su & Sanhong Deng & Si Shen, 2014. "The design and application value of the Chinese Social Science Citation Index," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1567-1582, March.
    12. Aickin, M. & Gensler, H., 1996. "Adjusting for multiple testing when reporting research results: The Bonferroni vs Holm methods," American Journal of Public Health, American Public Health Association, vol. 86(5), pages 726-728.
    13. Franceschet, Massimo & Costantini, Antonio, 2010. "The effect of scholar collaboration on impact and quality of academic papers," Journal of Informetrics, Elsevier, vol. 4(4), pages 540-553.
    14. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    15. Xie, Qing & Zhang, Xinyuan & Ding, Ying & Song, Min, 2020. "Monolingual and multilingual topic analysis using LDA and BERT embeddings," Journal of Informetrics, Elsevier, vol. 14(3).
    16. Yongjun Zhang & Jialin Ma & Zijian Wang & Bolun Chen & Yongtao Yu, 2018. "Collective topical PageRank: a model to evaluate the topic-dependent academic impact of scientific papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 1345-1372, March.
    17. Justin W. Flatt & Alessandro Blasimme & Effy Vayena, 2017. "Improving the Measurement of Scientific Success by Reporting a Self-Citation Index," Publications, MDPI, vol. 5(3), pages 1-6, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    2. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    3. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    4. Eirini Delikoura & Dimitrios Kouis, 2021. "Open Research Data and Open Peer Review: Perceptions of a Medical and Health Sciences Community in Greece," Publications, MDPI, vol. 9(2), pages 1-19, March.
    5. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    6. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    7. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    8. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    9. Wiltrud Kuhlisch & Magnus Roos & Jörg Rothe & Joachim Rudolph & Björn Scheuermann & Dietrich Stoyan, 2016. "A statistical approach to calibrating the scores of biased reviewers of scientific papers," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 79(1), pages 37-57, January.
    10. Zhang, Baolong & Wang, Hao & Deng, Sanhong & Su, Xinning, 2020. "Measurement and analysis of Chinese journal discriminative capacity," Journal of Informetrics, Elsevier, vol. 14(1).
    11. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    12. Sergio Copiello, 2018. "On the money value of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 613-620, April.
    13. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    14. García, J.A. & Montero-Parodi, J.J. & Rodriguez-Sánchez, Rosa & Fdez-Valdivia, J., 2023. "How to motivate a reviewer with a present bias to work harder," Journal of Informetrics, Elsevier, vol. 17(4).
    15. Carlos Olmeda-Gómez & Carlos Romá-Mateo & Maria-Antonia Ovalle-Perandones, 2019. "Overview of trends in global epigenetic research (2009–2017)," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1545-1574, June.
    16. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    17. Monica Aniela Zaharie & Marco Seeber, 2018. "Are non-monetary rewards effective in attracting peer reviewers? A natural experiment," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1587-1609, December.
    18. Wenqing Wu & Haixu Xi & Chengzhi Zhang, 2024. "Are the confidence scores of reviewers consistent with the review content? Evidence from top conference proceedings in AI," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4109-4135, July.
    19. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    20. J. A. Garcia & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2021. "The interplay between the reviewer’s incentives and the journal’s quality standard," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3041-3061, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:6:d:10.1007_s11192-022-04371-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.