Printed from https://ideas.repec.org/a/pal/palcom/v8y2021i1d10.1057_s41599-021-00929-0.html

Ten ways to improve academic CVs for fairer research assessment

Authors

Listed:
  • Michaela Strinzel

    (Swiss National Science Foundation)

  • Josh Brown

    (MoreBrains Cooperative)

  • Wolfgang Kaltenbrunner

    (Leiden University)

  • Sarah de Rijcke

    (Leiden University)

  • Michael Hill

    (Swiss National Science Foundation)

Abstract

Academic CVs are ubiquitous and play an integral role in the assessment of researchers. They define and portray which activities and achievements are considered important in the scientific system. Developing their content and structure beyond the traditional, publication-focused CV has the potential to make research careers more diverse and their assessment fairer and more transparent. This comment presents ten ways to further develop the content and structure of academic CVs. The recommendations are inspired by a workshop of the CV Harmonization Group (H-Group), a joint initiative between researchers who study research, academic data infrastructure organizations, and representatives from more than 15 funding organizations. The proposed improvements aim to inspire development and innovation in academic CVs for funding agencies and hiring committees.

Suggested Citation

  • Michaela Strinzel & Josh Brown & Wolfgang Kaltenbrunner & Sarah de Rijcke & Michael Hill, 2021. "Ten ways to improve academic CVs for fairer research assessment," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-4, December.
  • Handle: RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-021-00929-0
    DOI: 10.1057/s41599-021-00929-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-021-00929-0
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-021-00929-0?utm_source=ideas
    LibKey link: if access is restricted and your library subscribes to this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. David Moher & Lex Bouter & Sabine Kleinert & Paul Glasziou & Mai Har Sham & Virginia Barbour & Anne-Marie Coriat & Nicole Foeger & Ulrich Dirnagl, 2020. "The Hong Kong Principles for assessing researchers: Fostering research integrity," PLOS Biology, Public Library of Science, vol. 18(7), pages 1-14, July.
    2. Ruth Müller & Sarah de Rijcke, 2017. "Exploring the epistemic impacts of academic performance indicators in the life sciences," Research Evaluation, Oxford University Press, vol. 26(3), pages 157-168.
    3. Ruth Müller & Sarah de Rijcke, 2017. "Thinking with Indicators. Exploring the Epistemic Impacts of Academic Performance Indicators in the Life Sciences," Research Evaluation, Oxford University Press, vol. 26(4), pages 361-361.
    4. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    5. Martin Klein & Herbert Van de Sompel & Robert Sanderson & Harihar Shankar & Lyudmila Balakireva & Ke Zhou & Richard Tobin, 2014. "Scholarly Context Not Found: One in Five Articles Suffers from Reference Rot," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-39, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Julian Hamann & Wolfgang Kaltenbrunner, 2022. "Biographical representation, from narrative to list: The evolution of curricula vitae in the humanities, 1950 to 2010," Research Evaluation, Oxford University Press, vol. 31(4), pages 438-451.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tony Ross-Hellauer & Thomas Klebel & Petr Knoth & Nancy Pontika, 2024. "Value dissonance in research(er) assessment: individual and perceived institutional priorities in review, promotion, and tenure," Science and Public Policy, Oxford University Press, vol. 51(3), pages 337-351.
    2. Alexander Schniedermann, 2021. "A comparison of systematic reviews and guideline-based systematic reviews in medical studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9829-9846, December.
    3. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    4. Frank J. Rijnsoever & Laurens K. Hessels, 2021. "How academic researchers select collaborative research projects: a choice experiment," The Journal of Technology Transfer, Springer, vol. 46(6), pages 1917-1948, December.
    5. Ginevra Peruginelli & Janne Pölönen, 2024. "The legal foundation of responsible research assessment: An overview on European Union and Italy," Research Evaluation, Oxford University Press, vol. 32(4), pages 670-682.
    6. Chris H. J. Hartgerink & Marino Van Zelst, 2018. "“As-You-Go” Instead of “After-the-Fact”: A Network Approach to Scholarly Communication and Evaluation," Publications, MDPI, vol. 6(2), pages 1-10, April.
    7. Lai Ma, 2023. "Information, platformized," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 273-282, February.
    8. Diane (DeDe) Dawson & Esteban Morales & Erin C McKiernan & Lesley A Schimanski & Meredith T Niles & Juan Pablo Alperin, 2022. "The role of collegiality in academic review, promotion, and tenure," PLOS ONE, Public Library of Science, vol. 17(4), pages 1-17, April.
    9. Julian Hamann & Wolfgang Kaltenbrunner, 2022. "Biographical representation, from narrative to list: The evolution of curricula vitae in the humanities, 1950 to 2010," Research Evaluation, Oxford University Press, vol. 31(4), pages 438-451.
    10. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    11. Fabio Zagonari, 2019. "Scientific Production and Productivity for Characterizing an Author’s Publication History: Simple and Nested Gini’s and Hirsch’s Indexes Combined," Publications, MDPI, vol. 7(2), pages 1-30, May.
    12. Gabriel-Alexandru Vȋiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    13. Lucas Brunet & Ruth Müller, 2022. "Making the cut: How panel reviewers use evaluation devices to select applications at the European Research Council," Research Evaluation, Oxford University Press, vol. 31(4), pages 486-497.
    14. Sandra Rousseau & Ronald Rousseau, 2021. "Bibliometric Techniques And Their Use In Business And Economics Research," Journal of Economic Surveys, Wiley Blackwell, vol. 35(5), pages 1428-1451, December.
    15. Piotr Śpiewanowski & Oleksandr Talavera, 2021. "Journal rankings and publication strategy," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3227-3242, April.
    16. Bryce, Cormac & Dowling, Michael & Lucey, Brian, 2020. "The journal quality perception gap," Research Policy, Elsevier, vol. 49(5).
    17. Domingo Docampo & Lawrence Cram, 2019. "Highly cited researchers: a moving target," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1011-1025, March.
    18. Sten F Odenwald, 2020. "A citation study of earth science projects in citizen science," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-26, July.
    19. Alexander Kalgin & Olga Kalgina & Anna Lebedeva, 2019. "Publication Metrics as a Tool for Measuring Research Productivity and Their Relation to Motivation," Voprosy obrazovaniya / Educational Studies Moscow, National Research University Higher School of Economics, issue 1, pages 44-86.
    20. Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:8:y:2021:i:1:d:10.1057_s41599-021-00929-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: https://www.nature.com/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.