IDEAS — Printed from https://ideas.repec.org/a/spr/scient/v125y2020i2d10.1007_s11192-020-03581-8.html

On the use of journal classification in social sciences and humanities: evidence from an Italian database

Author

Listed:
  • Tindaro Cicero

    (National Agency for the Evaluation of Universities and Research Institutes (ANVUR))

  • Marco Malgarini

    (National Agency for the Evaluation of Universities and Research Institutes (ANVUR))

Abstract

In the social sciences and humanities (SSH), a two-tier journal classification is currently used in Italy in the context of the National Habilitation programme; peer-review assessments are also available for a large number of articles published in the same journals, collected in the framework of the last national evaluation exercise (VQR 2011–2014). We combine these two rich datasets to check whether journals classified as top class by scientific experts show higher impact, and whether articles published in those journals receive higher marks in peer-reviewed evaluation exercises than articles published in other journals. Our main result is that the ANVUR classification offers, on average, a reliable proxy for journal quality, as measured both by journal indicators and by the assessments of independent experts evaluating individual articles published in those journals. While peer review is still to be considered the main evaluation method in the humanities and social sciences, our analysis supports the view that journal classification can be a useful tool to support peer review even in SSH.

Suggested Citation

  • Tindaro Cicero & Marco Malgarini, 2020. "On the use of journal classification in social sciences and humanities: evidence from an Italian database," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1689-1708, November.
  • Handle: RePEc:spr:scient:v:125:y:2020:i:2:d:10.1007_s11192-020-03581-8
    DOI: 10.1007/s11192-020-03581-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03581-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    2. Thomas Franssen & Paul Wouters, 2019. "Science and its significant other: Representing the humanities in bibliometric scholarship," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(10), pages 1124-1137, October.
    3. Antonio Ferrara & Andrea Bonaccorsi, 2016. "How robust is journal rating in Humanities and Social Sciences? Evidence from a large-scale, multi-method exercise," Research Evaluation, Oxford University Press, vol. 25(3), pages 279-291.
    4. Alessio Ancaiani & Alberto F. Anfossi & Anna Barbara & Sergio Benedetto & Brigida Blasi & Valentina Carletti & Tindaro Cicero & Alberto Ciolfi & Filippo Costa & Giovanna Colizza & Marco Costantini & F, 2015. "Evaluating scientific research in Italy: The 2004–10 research evaluation exercise," Research Evaluation, Oxford University Press, vol. 24(3), pages 242-255.
    5. Simon Hix, 2004. "A Global Ranking of Political Science Departments," Political Studies Review, Political Studies Association, vol. 2(3), pages 293-313, September.
    6. David Pontille & Didier Torny, 2010. "The controversial policies of journal ratings: evaluating social sciences and humanities," Research Evaluation, Oxford University Press, vol. 19(5), pages 347-360, December.
    7. Kalaitzidakis, Pantelis & Mamuneas, Theofanis P. & Stengos, Thanasis, 1999. "European economics: An analysis based on publications in the core journals," European Economic Review, Elsevier, vol. 43(4-6), pages 1150-1168, April.
    8. Alberto Anfossi & Alberto Ciolfi & Filippo Costa & Giorgio Parisi & Sergio Benedetto, 2016. "Large-scale assessment of research outputs through a weighted combination of bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 671-683, May.
    9. Richard Williams, 2012. "Using the margins command to estimate and interpret adjusted predictions and marginal effects," Stata Journal, StataCorp LP, vol. 12(2), pages 308-331, June.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Mike Thelwall & Kayvan Kousha & Meiko Makita & Mahshid Abdoli & Emma Stuart & Paul Wilson & Jonathan Levitt, 2023. "In which fields do higher impact journals publish higher quality articles?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3915-3933, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    2. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    3. Jürgen Janger & Nicole Schmidt-Padickakudy & Anna Strauss-Kollin, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, April.
    4. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    5. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    6. Elena Veretennik & Maria Yudkevich, 2023. "Inconsistent quality signals: evidence from the regional journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3675-3701, June.
    7. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," Critical Perspectives on Accounting, Elsevier, vol. 51(C), pages 24-46.
    8. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    9. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    10. David Card & Stefano DellaVigna, 2017. "What do Editors Maximize? Evidence from Four Leading Economics Journals," NBER Working Papers 23282, National Bureau of Economic Research, Inc.
    11. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2016. "Why the referees’ reports I receive as an editor are so much better than the reports I receive as an author?," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 967-986, March.
    12. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    13. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    14. Oleksiyenko, Anatoly V., 2023. "Geopolitical agendas and internationalization of post-soviet higher education: Discursive dilemmas in the realm of the prestige economy," International Journal of Educational Development, Elsevier, vol. 102(C).
    15. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2018. "Editorial decisions with informed and uninformed reviewers," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 25-43, October.
    16. Randa Alsabahi, 2022. "English Medium Publications: Opening or Closing Doors to Authors with Non-English Language Backgrounds," English Language Teaching, Canadian Center of Science and Education, vol. 15(10), pages 1-18, October.
    17. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    18. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    19. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    20. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.

    More about this item

    Keywords

    Journal classification; Social sciences and humanities; Peer review; Italy

    JEL classification:

    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.