
Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences

Authors

  • Harman, Elena
  • Azzam, Tarek

Abstract

This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript, identify whether each program theory component (Activities and Outcomes) is discussed, and highlight the most relevant passage about that component. The findings indicate that crowdsourced analysis of qualitative data can differentiate between program theory components that are supported by a participant’s experience and those that are not. This approach expands the range of tools available for validating program theory with qualitative data, thus strengthening the theory-driven approach.
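The abstract describes the crowdsourced pattern matching step only at a high level. As a minimal sketch of how individual crowd judgments could be aggregated into per-component support rates, consider the Python fragment below; the data layout, the component names, and the 60% support threshold are illustrative assumptions, not details taken from the article.

```python
from collections import defaultdict

# Hypothetical crowd responses: each worker reads the same interview
# transcript and flags whether a program theory component (an Activity
# or an Outcome) is discussed in it.
responses = [
    {"worker": "w1", "component": "Activity: mentoring sessions", "discussed": True},
    {"worker": "w2", "component": "Activity: mentoring sessions", "discussed": True},
    {"worker": "w3", "component": "Activity: mentoring sessions", "discussed": False},
    {"worker": "w1", "component": "Outcome: improved grades", "discussed": False},
    {"worker": "w2", "component": "Outcome: improved grades", "discussed": False},
    {"worker": "w3", "component": "Outcome: improved grades", "discussed": True},
]

SUPPORT_THRESHOLD = 0.6  # illustrative cutoff, not taken from the article

def component_support(responses):
    """Aggregate individual judgments into a support rate per component."""
    tallies = defaultdict(lambda: [0, 0])  # component -> [discussed count, total]
    for r in responses:
        tallies[r["component"]][0] += int(r["discussed"])
        tallies[r["component"]][1] += 1
    return {c: hits / total for c, (hits, total) in tallies.items()}

for component, rate in component_support(responses).items():
    verdict = "supported" if rate >= SUPPORT_THRESHOLD else "not supported"
    print(f"{component}: {rate:.0%} of workers -> {verdict}")
```

With these toy inputs, the Activity is flagged as supported (67%) and the Outcome as unsupported (33%), mirroring the kind of differentiation between supported and unsupported components that the study reports.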

Suggested Citation

  • Harman, Elena & Azzam, Tarek, 2018. "Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 183-194.
  • Handle: RePEc:eee:epplan:v:66:y:2018:i:c:p:183-194
    DOI: 10.1016/j.evalprogplan.2017.08.008

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718917301970
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.evalprogplan.2017.08.008?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Harman, Elena & Azzam, Tarek, 2018. "Incorporating public values into evaluative criteria: Using crowdsourcing to identify criteria and standards," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 68-82.
    2. Park, Chul Hyun & Welch, Eric W. & Sriraj, P.S., 2016. "An integrative theory-driven framework for evaluating travel training programs," Evaluation and Program Planning, Elsevier, vol. 59(C), pages 7-20.
    3. Haas, Nicholas & Hassan, Mazen & Mansour, Sarah & Morton, Rebecca B., 2021. "Polarizing information and support for reform," Journal of Economic Behavior & Organization, Elsevier, vol. 185(C), pages 883-901.
    4. Cantarella, Michele & Strozzi, Chiara, 2019. "Workers in the Crowd: The Labour Market Impact of the Online Platform Economy," IZA Discussion Papers 12327, Institute of Labor Economics (IZA).
    5. Atalay, Kadir & Bakhtiar, Fayzan & Cheung, Stephen & Slonim, Robert, 2014. "Savings and prize-linked savings accounts," Journal of Economic Behavior & Organization, Elsevier, vol. 107(PA), pages 86-106.
    6. Azzam, Tarek & Harman, Elena, 2016. "Crowdsourcing for quantifying transcripts: An exploratory study," Evaluation and Program Planning, Elsevier, vol. 54(C), pages 63-73.
    7. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    8. Gandullia, Luca & Lezzi, Emanuela & Parciasepe, Paolo, 2020. "Replication with MTurk of the experimental design by Gangadharan, Grossman, Jones & Leister (2018): Charitable giving across donor types," Journal of Economic Psychology, Elsevier, vol. 78(C).
    9. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    10. Valerio Capraro & Hélène Barcelo, 2021. "Punishing defectors and rewarding cooperators: Do people discriminate between genders?," Journal of the Economic Science Association, Springer; Economic Science Association, vol. 7(1), pages 19-32, September.
    11. Lefgren, Lars J. & Sims, David P. & Stoddard, Olga B., 2016. "Effort, luck, and voting for redistribution," Journal of Public Economics, Elsevier, vol. 143(C), pages 89-97.
    12. Benson, Paul R. & Fisher, Gene A. & Diana, Augusto & Simon, Lorna & Gamache, Gail & Tessler, Richard C. & McDermeit, Melissa, 1996. "A state network of family support services: The Massachusetts family support demonstration project," Evaluation and Program Planning, Elsevier, vol. 19(1), pages 27-39, February.
    13. Kent, Douglas R. & Donaldson, Stewart I. & Wyrick, Phelan A. & Smith, Peggy J., 2000. "Evaluating criminal justice programs designed to reduce crime by targeting repeat gang offenders," Evaluation and Program Planning, Elsevier, vol. 23(1), pages 115-124, February.
    14. Tim Straub & Henner Gimpel & Florian Teschner & Christof Weinhardt, 2015. "How (not) to Incent Crowd Workers," Business & Information Systems Engineering: The International Journal of WIRTSCHAFTSINFORMATIK, Springer; Gesellschaft für Informatik e.V. (GI), vol. 57(3), pages 167-179, June.
    15. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    16. Carpiano, Richard M. & Fitz, Nicholas S., 2017. "Public attitudes toward child undervaccination: A randomized experiment on evaluations, stigmatizing orientations, and support for policies," Social Science & Medicine, Elsevier, vol. 185(C), pages 127-136.
    17. Chandler, Dana & Kapelner, Adam, 2013. "Breaking monotony with meaning: Motivation in crowdsourcing markets," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 123-133.
    18. Thomas W. H. Ng & Lorenzo Lucianetti & Dennis Y. Hsu & Frederick H. K. Yim & Kelly L. Sorensen, 2021. "You Speak, I Speak: The Social‐Cognitive Mechanisms of Voice Contagion," Journal of Management Studies, Wiley Blackwell, vol. 58(6), pages 1569-1608, September.
    19. Mahmood, Ammara & Luffarelli, Jonathan & Mukesh, Mudra, 2019. "What's in a logo? The impact of complex visual cues in equity crowdfunding," Journal of Business Venturing, Elsevier, vol. 34(1), pages 41-62.
    20. Schwaiger, Rene & Hueber, Laura, 2021. "Do MTurkers exhibit myopic loss aversion?," Economics Letters, Elsevier, vol. 209(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:66:y:2018:i:c:p:183-194. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.