
Measuring diversity in disciplinary collaboration in research teams: An ecological perspective

Authors

  • Arsev U. Aydinoglu
  • Suzie Allard
  • Chad Mitchell

Abstract

This study proposes an alternative and complementary method to bibliometric analysis for measuring disciplinary diversity in research teams. Shannon’s entropy index, used in ecology to measure biodiversity in habitats, is adapted to measure the disciplinary diversity of a research team (habitats become teams, and biodiversity becomes disciplinary diversity). Data come from the National Aeronautics and Space Administration Astrobiology Institute, which funded 14 interdisciplinary virtual research teams in 2012. The authors examined not only the team rosters but also the project rosters (167 projects for 2012) of each team to calculate disciplinary diversity. Results suggest that the intended diversity is being achieved for some teams. However, for more than half of the teams, disciplinary diversity scores are lower at the project level than at the overall team level, which suggests that for these teams the intended diversity is not being achieved.
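
As a rough illustration of the measure described above, the sketch below computes Shannon’s entropy index over the disciplines represented on a team (or project) roster, together with Pielou’s evenness as an optional normalization. The discipline labels and roster are hypothetical, and the paper’s exact operationalization (for example, how members affiliated with several disciplines are counted) may differ.

```python
import math
from collections import Counter

def shannon_diversity(disciplines):
    """Shannon's entropy index H = -sum(p_i * ln p_i) over discipline proportions."""
    counts = Counter(disciplines)
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total) for n in counts.values())

def evenness(disciplines):
    """Pielou's evenness J = H / ln(S), where S is the number of distinct disciplines."""
    s = len(set(disciplines))
    return shannon_diversity(disciplines) / math.log(s) if s > 1 else 0.0

# Hypothetical team roster: one discipline label per member (illustrative only).
team = ["astronomy", "astronomy", "geology", "biology", "biology", "chemistry"]
print(round(shannon_diversity(team), 3))  # higher H = more disciplinary diversity
print(round(evenness(team), 3))           # 1.0 = members spread evenly across disciplines
```

Higher entropy indicates that members are spread across more disciplines and more evenly; comparing a team-level score with the scores of its individual project rosters mirrors the team-versus-project comparison reported in the abstract.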

Suggested Citation

  • Arsev U. Aydinoglu & Suzie Allard & Chad Mitchell, 2016. "Measuring diversity in disciplinary collaboration in research teams: An ecological perspective," Research Evaluation, Oxford University Press, vol. 25(1), pages 18-36.
  • Handle: RePEc:oup:rseval:v:25:y:2016:i:1:p:18-36.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/reseval/rvv028
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Blaise Cronin, 2001. "Hyperauthorship: A postmodern perversion or evidence of a structural shift in scholarly communication practices?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 52(7), pages 558-569.
    2. Wagner, Caroline S. & Roessner, J. David & Bobb, Kamau & Klein, Julie Thompson & Boyack, Kevin W. & Keyton, Joann & Rafols, Ismael & Börner, Katy, 2011. "Approaches to understanding and measuring interdisciplinary scientific research (IDR): A review of the literature," Journal of Informetrics, Elsevier, vol. 5(1), pages 14-26.

    Citations

    Citations are extracted by the CitEc Project; you can subscribe to its RSS feed for this item.

    Cited by:

    1. Hajibabaei, Anahita & Schiffauerova, Andrea & Ebadi, Ashkan, 2022. "Gender-specific patterns in the artificial intelligence scientific ecosystem," Journal of Informetrics, Elsevier, vol. 16(2).
    2. Berea, Anamaria & Denning, Kathryn & Vidaurri, Monica & Arcand, Kimberly & Oman-Reagan, Michael P. & Bellovary, Jillian & Aydinoglu, Arsev Umur & Lupisella, Mark, 2019. "The Social Sciences Interdisciplinarity for Astronomy and Astrophysics - Lessons from the History of NASA and Related Fields," SocArXiv pfvw2, Center for Open Science.
    3. repec:oup:rseval:v:32:y:2024:i:2:p:213-227. is not listed on IDEAS
    4. Susan Roelofs & Nancy Edwards & Sarah Viehbeck & Cody Anderson, 2019. "Formative, embedded evaluation to strengthen interdisciplinary team science: Results of a 4-year, mixed methods, multi-country case study," Research Evaluation, Oxford University Press, vol. 28(1), pages 37-50.
    5. Yoshi-aki Shimada & Naotoshi Tsukada & Jun Suzuki, 2017. "Promoting diversity in science in Japan through mission-oriented research grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1415-1435, March.
    6. Wang, Chun-Chieh & Lin, Jia-Tian & Chen, Dar-Zen & Lo, Szu-Chia, 2023. "A New Look at National Diversity of Inventor Teams within Organizations," Journal of Informetrics, Elsevier, vol. 17(1).
    7. Bethany K Laursen & Nicole Motzer & Kelly J Anderson, 2022. "Pathways for assessing interdisciplinarity: A systematic review," Research Evaluation, Oxford University Press, vol. 31(3), pages 326-343.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one. (A rough sketch of this matching idea follows the list below.)
    1. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    2. Nadine Desrochers & Adèle Paul‐Hus & Jen Pecoskie, 2017. "Five decades of gratitude: A meta‐synthesis of acknowledgments research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(12), pages 2821-2833, December.
    3. Su, Hsin-Ning & Moaniba, Igam M., 2017. "Investigating the dynamics of interdisciplinary evolution in technology developments," Technological Forecasting and Social Change, Elsevier, vol. 122(C), pages 12-23.
    4. Seongkyoon Jeong & Jong-Chan Kim & Jae Young Choi, 2015. "Technology convergence: What developmental stage are we in?," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 841-871, September.
    5. Zuo, Zhiya & Zhao, Kang, 2018. "The more multidisciplinary the better? – The prevalence and interdisciplinarity of research collaborations in multidisciplinary institutions," Journal of Informetrics, Elsevier, vol. 12(3), pages 736-756.
    6. Rafols, Ismael & Leydesdorff, Loet & O’Hare, Alice & Nightingale, Paul & Stirling, Andy, 2012. "How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management," Research Policy, Elsevier, vol. 41(7), pages 1262-1282.
    7. Olle Persson & Wolfgang Glänzel, 2014. "Discouraging honorific authorship," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1417-1419, February.
    8. Jo Royle & Louisa Coles & Dorothy Williams & Paul Evans, 2007. "Publishing in international journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 71(1), pages 59-86, April.
    9. Sándor Soós & Zsófia Vida & András Schubert, 2018. "Long-term trends in the multidisciplinarity of some typical natural and social sciences, and its implications on the SSH versus STM distinction," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 795-822, March.
    10. Ran Xu & Navid Ghaffarzadegan, 2018. "Neuroscience bridging scientific disciplines in health: Who builds the bridge, who pays for it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(2), pages 1183-1204, November.
    11. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    12. Gibson, Elizabeth & Daim, Tugrul U. & Dabic, Marina, 2019. "Evaluating university industry collaborative research centers," Technological Forecasting and Social Change, Elsevier, vol. 146(C), pages 181-202.
    13. Pitambar Gautam & Ryuichi Yanagiya, 2012. "Reflection of cross-disciplinary research at Creative Research Institution (Hokkaido University) in the Web of Science database: appraisal and visualization using bibliometry," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(1), pages 101-111, October.
    14. Chung-Souk Han, 2011. "On the demographical changes of U.S. research doctorate awardees and corresponding trends in research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 845-865, December.
    15. Dorta-González, P. & Dorta-González, M.I., 2013. "Impact maturity times and citation time windows: The 2-year maximum journal impact factor," Journal of Informetrics, Elsevier, vol. 7(3), pages 593-602.
    16. Leo Egghe & Ronald Rousseau, 2023. "Global impact measures," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 699-707, January.
    17. Perianes-Rodriguez, Antonio & Waltman, Ludo & van Eck, Nees Jan, 2016. "Constructing bibliometric networks: A comparison between full and fractional counting," Journal of Informetrics, Elsevier, vol. 10(4), pages 1178-1195.
    18. Haoye Sun & Thorsten Teichert, 2024. "Scarcity in today's consumer markets: scoping the research landscape by author keywords," Management Review Quarterly, Springer, vol. 74(1), pages 93-120, February.
    19. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2014. "The citer-success-index: a citer-based indicator to select a subset of elite papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 963-983, November.
    20. Julia Heuritsch, 2023. "The Evaluation Gap in Astronomy—Explained through a Rational Choice Framework," Publications, MDPI, vol. 11(2), pages 1-26, June.
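
As a hedged illustration of how such a related-items list can be assembled (this is an assumed sketch, not RePEc’s actual implementation), the snippet below ranks candidate items by how many references they share with the target article (bibliographic coupling) plus how many citing items they share with it (co-citation). The handles and overlaps are purely hypothetical.

```python
def relatedness(target_refs, target_citers, cand_refs, cand_citers):
    """Count shared references plus shared citing items between two publications."""
    return len(set(target_refs) & set(cand_refs)) + len(set(target_citers) & set(cand_citers))

# Hypothetical items identified by RePEc-style handles (illustrative only).
target = {"refs": {"h:cronin2001", "h:wagner2011"}, "citers": {"h:laursen2022", "h:roelofs2019"}}
candidates = {
    "h:zuo2018":     {"refs": {"h:wagner2011", "h:rafols2012"}, "citers": {"h:laursen2022"}},
    "h:persson2014": {"refs": {"h:cronin2001"},                 "citers": set()},
}

ranked = sorted(
    candidates,
    key=lambda h: relatedness(target["refs"], target["citers"],
                              candidates[h]["refs"], candidates[h]["citers"]),
    reverse=True,
)
print(ranked)  # candidates ordered by shared references and shared citers
```

Shared references and shared citers may be weighted or combined differently in practice; this sketch only shows the simplest overlap count.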

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:oup:rseval:v:25:y:2016:i:1:p:18-36.. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Oxford University Press. General contact details of provider: https://academic.oup.com/rev.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.