
Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis

Author

Listed:
  • Liao, Hongjing
  • Hitchcock, John

Abstract

This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details; and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description.

Suggested Citation

  • Liao, Hongjing & Hitchcock, John, 2018. "Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 157-165.
  • Handle: RePEc:eee:epplan:v:68:y:2018:i:c:p:157-165
    DOI: 10.1016/j.evalprogplan.2018.03.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718917302410
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2018.03.005?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Michèle Lamont & Grégoire Mallard & Joshua Guetzkow, 2006. "Beyond blind faith: overcoming the obstacles to interdisciplinary evaluation," Research Evaluation, Oxford University Press, vol. 15(1), pages 43-55, April.
    2. Ulrich Schmoch & Torben Schubert & Dorothea Jansen & Richard Heidler & Regina von Görtz, 2010. "How to use indicators to measure scientific performance: a balanced approach," Research Evaluation, Oxford University Press, vol. 19(1), pages 2-18, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonio Fernández-Cano & Manuel Torralbo & Mónica Vallejo, 2012. "Time series of scientific growth in Spanish doctoral theses (1848–2009)," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(1), pages 15-36, April.
    2. Gibson, Elizabeth & Daim, Tugrul U. & Dabic, Marina, 2019. "Evaluating university industry collaborative research centers," Technological Forecasting and Social Change, Elsevier, vol. 146(C), pages 181-202.
    3. Corey J A Bradshaw & Justin M Chalker & Stefani A Crabtree & Bart A Eijkelkamp & John A Long & Justine R Smith & Kate Trinajstic & Vera Weisbecker, 2021. "A fairer way to compare researchers at any career stage and in any discipline using open-access citation data," PLOS ONE, Public Library of Science, vol. 16(9), pages 1-15, September.
    4. Tasso Brandt & Torben Schubert, 2014. "Is the university model an organizational necessity? Scale and agglomeration effects in science," Chapters, in: Andrea Bonaccorsi (ed.), Knowledge, Diversity and Performance in European Higher Education, chapter 8, pages iii-iii, Edward Elgar Publishing.
    5. Franc Mali, 2013. "Why an Unbiased External R&D Evaluation System is Important for the Progress of Social Sciences—the Case of a Small Social Science Community," Social Sciences, MDPI, vol. 2(4), pages 1-14, December.
    6. Nabil Amara & Réjean Landry, 2012. "Counting citations in the field of business and management: why use Google Scholar rather than the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 553-581, December.
    7. Tasso Brandt & Torben Schubert, 2013. "Is the university model an organizational necessity? Scale and agglomeration effects in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(2), pages 541-565, February.
    8. Zeug, Walther & Bezama, Alberto & Thrän, Daniela, 2020. "Towards a holistic and integrated Life Cycle Sustainability Assessment of the bioeconomy: Background on concepts, visions and measurements," UFZ Discussion Papers 7/2020, Helmholtz Centre for Environmental Research (UFZ), Division of Social Sciences (ÖKUS).
    9. Pimentel, Erica & Cho, Charles H. & Bothello, Joel, 2023. "The blind spots of interdisciplinarity in addressing grand challenges," CRITICAL PERSPECTIVES ON ACCOUNTING, Elsevier, vol. 93(C).
    10. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
    11. Fiorenzo Franceschini & Elisa Turina, 2013. "Quality improvement and redesign of performance measurement systems: an application to the academic field," Quality & Quantity: International Journal of Methodology, Springer, vol. 47(1), pages 465-483, January.
    12. Yagi, Michiyuki & Managi, Shunsuke, 2018. "Shadow price of patent stock as knowledge stock: Time and country heterogeneity," Economic Analysis and Policy, Elsevier, vol. 60(C), pages 43-61.
    13. Moss, Todd W. & Renko, Maija & Block, Emily & Meyskens, Moriah, 2018. "Funding the story of hybrid ventures: Crowdfunder lending preferences and linguistic hybridity," Journal of Business Venturing, Elsevier, vol. 33(5), pages 643-659.
    14. Katarina Rojko & Brankica Bratić & Borut Lužar, 2020. "The Bologna reform’s impacts on the scientific publication performance of Ph.D. graduates—the case of Slovenia," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 329-356, July.
    15. Davit Gondauri & Ekaterine Mikautadze, 2024. "Impact of R&D and AI Investments on Economic Growth and Credit Rating," Papers 2411.07817, arXiv.org.
    16. Andrea Bonaccorsi & Nicola Melluso & Francesco Alessandro Massucci, 2022. "Exploring the antecedents of interdisciplinarity at the European Research Council: a topic modeling approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6961-6991, December.
    17. Kroll, Henning & Schubert, Torben, 2014. "On universities' long-term effects on regional value creation and unemployment: The case of Germany," Working Papers "Firms and Region" R1/2014, Fraunhofer Institute for Systems and Innovation Research (ISI).
    18. Francesco Giovanni Avallone & Alberto Quagli & Paola Ramassa, 2022. "Interdisciplinary research by accounting scholars: An exploratory study," FINANCIAL REPORTING, FrancoAngeli Editore, vol. 2022(2), pages 5-34.
    19. Giulio Giacomo Cantone, 2024. "How to measure interdisciplinary research? A systemic design for the model of measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4937-4982, August.
    20. Torben Schubert & Henning Kroll, 2016. "Universities’ effects on regional GDP and unemployment: The case of Germany," Papers in Regional Science, Wiley Blackwell, vol. 95(3), pages 467-489, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:68:y:2018:i:c:p:157-165. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.