
Two citation-based indicators to measure latent referential value of papers

Authors

Listed:
  • Zhi Li

    (Xi’an Jiaotong University)

  • Qinke Peng

    (Xi’an Jiaotong University)

  • Che Liu

    (Xi’an Jiaotong University)

Abstract

Traditional bibliometric indicators aim to help academic administrators or research investors measure the influence of publications. These indicators focus on quantifying and comparing the scientific output of researchers. However, little attention has been paid to the fact that bibliometric indicators can also help scientists find valuable referential papers. In this paper, we propose three points that characterize valuable referential papers: first, they are always high-quality research; second, they are closely related to a considerable number of recent papers; third, they lead to hotspots that attract successive follow-up papers. We extract from the original citation network a critical subnetwork that retains only the significant nodes and edges meeting these three points. We then present two indicators based on this critical subnetwork. The experimental results demonstrate that the papers recommended by our indicators are relatively new, and that our indicators have higher Spearman’s rank correlation coefficients with future citation counts than other bibliometric indicators such as the raw citation count.
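The abstract outlines a pipeline — prune the citation network to a critical subnetwork, score papers on it, and compare the scores with future citation counts via Spearman’s rank correlation — but not the exact definitions of the two indicators, which are given in the full text. The following Python sketch only illustrates the general shape of such a pipeline; the pruning thresholds (min_citations, recent_year) and the placeholder score in indicator are assumptions for illustration, not the authors’ indicators.

    # Minimal sketch of the pipeline outlined in the abstract.  The pruning
    # rule and the score below are illustrative placeholders, not the two
    # indicators defined in the paper.
    import networkx as nx
    from scipy.stats import spearmanr

    def critical_subnetwork(G, min_citations=5, recent_year=2012):
        """Keep nodes that loosely mirror the three points: well cited
        (quality) and cited by at least one recent paper (recency/hotspot)."""
        keep = set()
        for paper in G.nodes:
            citers = list(G.predecessors(paper))  # edge u -> v means u cites v
            recent = [c for c in citers if G.nodes[c].get("year", 0) >= recent_year]
            if len(citers) >= min_citations and recent:
                keep.add(paper)
        return G.subgraph(keep).copy()

    def indicator(G_crit, paper):
        """Placeholder score: citations received inside the critical subnetwork."""
        return G_crit.in_degree(paper) if paper in G_crit else 0

    def evaluate(G, future_citations):
        """Spearman rank correlation between indicator scores and future
        citation counts, the comparison reported in the abstract."""
        G_crit = critical_subnetwork(G)
        papers = sorted(future_citations)
        scores = [indicator(G_crit, p) for p in papers]
        future = [future_citations[p] for p in papers]
        rho, _ = spearmanr(scores, future)
        return rho

Given a networkx.DiGraph whose nodes carry a "year" attribute and a dictionary mapping papers to citations received after the evaluation date, evaluate returns the rank correlation used to compare an indicator against, for example, the raw citation count.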

Suggested Citation

  • Zhi Li & Qinke Peng & Che Liu, 2016. "Two citation-based indicators to measure latent referential value of papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1299-1313, September.
  • Handle: RePEc:spr:scient:v:108:y:2016:i:3:d:10.1007_s11192-016-2000-8
    DOI: 10.1007/s11192-016-2000-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2000-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-2000-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Philip M. Davis, 2008. "Eigenfactor: Does the principle of repeated improvement result in better estimates than raw citation counts?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(13), pages 2186-2188, November.
    2. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    3. Upul Senanayake & Mahendra Piraveenan & Albert Zomaya, 2015. "The Pagerank-Index: Going beyond Citation Counts in Quantifying Scientific Impact of Researchers," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-34, August.
    4. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    5. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    6. David Adam, 2002. "The counting house," Nature, Nature, vol. 415(6873), pages 726-729, February.
    7. Naoki Shibata & Yuya Kajikawa & Ichiro Sakata, 2012. "Link prediction in citation networks," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(1), pages 78-85, January.
    8. Chen, P. & Xie, H. & Maslov, S. & Redner, S., 2007. "Finding scientific gems with Google’s PageRank algorithm," Journal of Informetrics, Elsevier, vol. 1(1), pages 8-15.
    9. Giovanni Abramo & Corrado Costa & Ciriaco Andrea D’Angelo, 2015. "A multivariate stochastic model to assess research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1755-1772, February.
    10. Anthony F. J. Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    11. Tibor Braun & Wolfgang Glänzel & András Schubert, 2006. "A Hirsch-type index for journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 169-173, October.
    12. Fiorenzo Franceschini & Maurizio Galetto & Domenico Maisano & Luca Mastrogiacomo, 2012. "The success-index: an alternative approach to the h-index for evaluating an individual’s research output," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(3), pages 621-641, September.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for updates on this item.


    Cited by:

    1. Mingyang Wang & Shi Li & Guangsheng Chen, 2017. "Detecting latent referential articles based on their vitality performance in the latest 2 years," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1557-1571, September.
    2. Binglu Wang & Yi Bu & Yang Xu, 2018. "A quantitative exploration on reasons for citing articles from the perspective of cited authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 675-687, August.
    3. Mingyang Wang & Jiaqi Zhang & Shijia Jiao & Tianyu Zhang, 2019. "Evaluating the impact of citations of articles based on knowledge flow patterns hidden in the citations," PLOS ONE, Public Library of Science, vol. 14(11), pages 1-19, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a rough sketch of this overlap count appears after the list.
    1. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    2. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    3. Giovanni Abramo & Ciriaco Andrea D’Angelo & Fulvio Viel, 2013. "The suitability of h and g indexes for measuring the research performance of institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 555-570, December.
    4. Judit Bar-Ilan & Mark Levene, 2015. "The hw-rank: an h-index variant for ranking web pages," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2247-2253, March.
    5. Yuanyuan Liu & Qiang Wu & Shijie Wu & Yong Gao, 2021. "Weighted citation based on ranking-related contribution: a new index for evaluating article impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8653-8672, October.
    6. Fiorenzo Franceschini & Domenico Maisano, 2011. "Bibliometric positioning of scientific manufacturing journals: a comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(2), pages 463-485, February.
    7. Tobias Opthof & Loet Leydesdorff, 2011. "A comment to the paper by Waltman et al., Scientometrics, 87, 467–481, 2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 1011-1016, September.
    8. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    9. Antonis Sidiropoulos & Dimitrios Katsaros & Yannis Manolopoulos, 2007. "Generalized Hirsch h-index for disclosing latent facts in citation networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 72(2), pages 253-280, August.
    10. Gangan Prathap, 2018. "Eugene Garfield: from the metrics of science to the science of metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 637-650, February.
    11. Kuan, Chung-Huei & Huang, Mu-Hsuan & Chen, Dar-Zen, 2013. "Cross-field evaluation of publications of research institutes using their contributions to the fields’ MVPs determined by h-index," Journal of Informetrics, Elsevier, vol. 7(2), pages 455-468.
    12. Dinesh Pradhan & Partha Sarathi Paul & Umesh Maheswari & Subrata Nandi & Tanmoy Chakraborty, 2017. "C³-index: a PageRank based multi-faceted metric for authors’ performance measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 253-273, January.
    13. Xu, Shuqi & Mariani, Manuel Sebastian & Lü, Linyuan & Medo, Matúš, 2020. "Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data," Journal of Informetrics, Elsevier, vol. 14(1).
    14. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2013. "The effect of database dirty data on h-index calculation," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 1179-1188, June.
    15. Hao Wang & Hua-Wei Shen & Xue-Qi Cheng, 2016. "Scientific credit diffusion: Researcher level or paper level?," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 827-837, November.
    16. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    17. Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
    18. Vieira, E.S. & Gomes, J.A.N.F., 2010. "A research impact indicator for institutions," Journal of Informetrics, Elsevier, vol. 4(4), pages 581-590.
    19. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    20. Themis Lazaridis, 2010. "Ranking university departments using the mean h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(2), pages 211-216, February.
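
    As an assumed illustration of this overlap-based notion of relatedness (the exact weighting IDEAS/CitEc uses is not documented here), the count could look like:

        # Hedged sketch: relatedness as shared references (bibliographic coupling)
        # plus shared citing papers (co-citation).  The function name and the
        # unweighted sum are assumptions, not the ranking IDEAS actually uses.
        def relatedness(refs_a, refs_b, citers_a, citers_b):
            """Count works cited by both items plus works that cite both items."""
            return len(set(refs_a) & set(refs_b)) + len(set(citers_a) & set(citers_b))

        # Two items sharing two references and one citing paper score 3.
        print(relatedness({"r1", "r2", "r3"}, {"r2", "r3"}, {"c1"}, {"c1", "c2"}))  # -> 3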

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:108:y:2016:i:3:d:10.1007_s11192-016-2000-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.