
Evaluating medical conferences: the emerging need for a quality metric

Author

Listed:
  • Raynell Lang

    (University of Calgary)

  • Kholoud Porter

    (University College London)

  • Hartmut B. Krentz

    (University of Calgary)

  • M. John Gill

    (University of Calgary)

Abstract

Scientific medical conferences have proliferated in recent years, but little data are available to assess their effectiveness in achieving their commonly stated aims "to educate, advance science, and establish evidence-based policy". The recent expansion of what has been labeled 'predatory academia' has heightened concerns about the quality of both published and conference "science". A journal's impact factor (JIF) became one accepted metric for the quality of published science, but no comparable indicator, such as a conference impact factor (CIF), exists for medical scientific conferences. To explore the feasibility of implementing a CIF metric for such conferences, we tested a tool that establishes a ranking system to help both attendees and funders identify quality. Using abstracts presented from 2013 to 2016 at an annual meeting (International Workshop on HIV/Hepatitis Observational Databases), we determined how many were subsequently published in peer-reviewed journals. We then calculated a CIF by dividing the number of peer-reviewed published papers by the number of abstracts presented at each conference, then multiplying by the median JIF of the publishing journals. We propose that a CIF, although limited in scope, can act as a tool for attendees and funders to prioritize their time and resources when evaluating the quality of a scientific conference.
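The CIF calculation described in the abstract can be sketched in a few lines of Python. The numbers in the example below are purely illustrative, not the workshop's actual data:

```python
from statistics import median

def conference_impact_factor(n_published, n_presented, journal_jifs):
    """CIF as described in the abstract: the fraction of presented
    abstracts that reached peer-reviewed publication, scaled by the
    median JIF of the journals in which they appeared."""
    publication_rate = n_published / n_presented
    return publication_rate * median(journal_jifs)

# Hypothetical conference: 40 of 100 abstracts published,
# in journals with JIFs of 2.0, 3.5, and 5.0 (median 3.5)
cif = conference_impact_factor(40, 100, [2.0, 3.5, 5.0])
print(round(cif, 2))  # 0.4 * 3.5 = 1.4
```

Scaling by the median rather than the mean JIF keeps one outlier journal from dominating the score.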

Suggested Citation

  • Raynell Lang & Kholoud Porter & Hartmut B. Krentz & M. John Gill, 2020. "Evaluating medical conferences: the emerging need for a quality metric," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 759-764, January.
  • Handle: RePEc:spr:scient:v:122:y:2020:i:1:d:10.1007_s11192-019-03291-w
    DOI: 10.1007/s11192-019-03291-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03291-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03291-w?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    2. A Cecile J W Janssens & Michael Goodman & Kimberly R Powell & Marta Gwinn, 2017. "A critical evaluation of the algorithm behind the Relative Citation Ratio (RCR)," PLOS Biology, Public Library of Science, vol. 15(10), pages 1-5, October.
    3. Adrian G Barnett & Pauline Zardo & Nicholas Graves, 2018. "Randomly auditing research labs could be an affordable way to improve research quality: A simulation study," PLOS ONE, Public Library of Science, vol. 13(4), pages 1-17, April.
    4. Mohammed S. Alqahtani & Mohamed Abbas & Mohammed Abdul Muqeet & Hussain M. Almohiy, 2022. "Research Productivity in Terms of Output, Impact, and Collaboration for University Researchers in Saudi Arabia: SciVal Analytics and t -Tests Statistical Based Approach," Sustainability, MDPI, vol. 14(23), pages 1-21, December.
    5. Thelwall, Mike, 2018. "Dimensions: A competitor to Scopus and the Web of Science?," Journal of Informetrics, Elsevier, vol. 12(2), pages 430-435.
    6. Li, Heyang & Wu, Meijun & Wang, Yougui & Zeng, An, 2022. "Bibliographic coupling networks reveal the advantage of diversification in scientific projects," Journal of Informetrics, Elsevier, vol. 16(3).
    7. Michael D. Bordo & Edward S. Prescott, 2019. "Federal Reserve Structure, Economic Ideas, and Monetary and Financial Policy," NBER Working Papers 26098, National Bureau of Economic Research, Inc.
    8. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    9. Corrêa Jr., Edilson A. & Silva, Filipi N. & da F. Costa, Luciano & Amancio, Diego R., 2017. "Patterns of authors contribution in scientific manuscripts," Journal of Informetrics, Elsevier, vol. 11(2), pages 498-510.
    10. Torres-Salinas, Daniel & Valderrama-Baca, Pilar & Arroyo-Machado, Wenceslao, 2022. "Is there a need for a new journal metric? Correlations between JCR Impact Factor metrics and the Journal Citation Indicator—JCI," Journal of Informetrics, Elsevier, vol. 16(3).
    11. Joseph Staudt & Huifeng Yu & Robert P Light & Gerald Marschke & Katy Börner & Bruce A Weinberg, 2018. "High-impact and transformative science (HITS) metrics: Definition, exemplification, and comparison," PLOS ONE, Public Library of Science, vol. 13(7), pages 1-23, July.
    12. Heng Huang & Donghua Zhu & Xuefeng Wang, 2022. "Evaluating scientific impact of publications: combining citation polarity and purpose," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5257-5281, September.
    13. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    14. Jay Bhattacharya & Mikko Packalen, 2020. "Stagnation and Scientific Incentives," NBER Working Papers 26752, National Bureau of Economic Research, Inc.
    15. Latefa Ali Dardas & Malik Sallam & Amanda Woodward & Nadia Sweis & Narjes Sweis & Faleh A. Sawair, 2023. "Evaluating Research Impact Based on Semantic Scholar Highly Influential Citations, Total Citations, and Altmetric Attention Scores: The Quest for Refined Measures Remains Illusive," Publications, MDPI, vol. 11(1), pages 1-16, January.
    16. Loet Leydesdorff & Jordan A. Comins & Aaron A. Sorensen & Lutz Bornmann & Iina Hellsten, 2016. "Cited references and Medical Subject Headings (MeSH) as two different knowledge representations: clustering and mappings at the paper level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2077-2091, December.
    17. John P A Ioannidis & Kevin Boyack & Paul F Wouters, 2016. "Citation Metrics: A Primer on How (Not) to Normalize," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-7, September.
    18. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
    19. Živan Živković & Marija Panić, 2020. "Development of science and education in the Western Balkan countries: competitiveness with the EU," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(3), pages 2319-2339, September.
    20. Michael Taylor, 2023. "Slow, slow, quick, quick, slow: five altmetric sources observed over a decade show evolving trends, by research age, attention source maturity and open access status," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2175-2200, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:122:y:2020:i:1:d:10.1007_s11192-019-03291-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.