
Teaching programme evaluation: A problem of knowledge

Author

Listed:
  • Arbour, Ghislain

Abstract

This article conceptualises the problem of selecting teaching content that supports the practice of programme evaluation. Knowledge for evaluation practice falls within one of three categories, defined by the different roles they play in supporting that practice. First, core knowledge relates to the defining activity of evaluation practice: it informs the intellectual task of determining a programme’s value. Second, accessory knowledge informs activities that support and facilitate the realisation of that defining activity in a delivery context (e.g., stakeholder participation, evaluation use, project management). Third and finally, supplementary knowledge informs activities that may, on occasion, occur during evaluation practice but that do not relate to the determination of value, either inherently or in a support role. The selection of knowledge for the teaching of evaluation must match the knowledge needed for effective evaluation practice across these three categories: core, accessory, and supplementary. The specifics of these needs ultimately depend on the characteristics of a given practice. The selection of content for the teaching of evaluation should ideally address these specific needs with the best knowledge available, regardless of its disciplinary origins.

Suggested Citation

  • Arbour, Ghislain, 2020. "Teaching programme evaluation: A problem of knowledge," Evaluation and Program Planning, Elsevier, vol. 83(C).
  • Handle: RePEc:eee:epplan:v:83:y:2020:i:c:s0149718920301762
    DOI: 10.1016/j.evalprogplan.2020.101872

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718920301762
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2020.101872?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Gullickson, Amy M., 2020. "The whole elephant: Defining evaluation," Evaluation and Program Planning, Elsevier, vol. 79(C).
    2. Patton, Michael Quinn & Horton, Douglas, 2008. "Utilization-focused evaluation for agricultural innovation," ILAC Briefs 52533, Institutional Learning and Change (ILAC) Initiative.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Downes, Jenni & Gullickson, Amy M., 2022. "What does it mean for an evaluation to be ‘valid’? A critical synthesis of evaluation literature," Evaluation and Program Planning, Elsevier, vol. 91(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lifshitz, Chen Chana, 2017. "Fostering employability among youth at-risk in a multi-cultural context: Insights from a pilot intervention program," Children and Youth Services Review, Elsevier, vol. 76(C), pages 20-34.
    2. Kupiec, Tomasz, 2022. "Does evaluation quality matter? Quantitative analysis of the use of evaluation findings in the field of cohesion policy in Poland," Evaluation and Program Planning, Elsevier, vol. 93(C).
    3. Delahais, Thomas & Ottaviani, Fiona & Berthaud, Annabelle & Clot, Hélène, 2023. "Bridging the gap between wellbeing and evaluation: Lessons from IBEST, a French experience," Evaluation and Program Planning, Elsevier, vol. 97(C).
    4. Gullickson, Amy M. & King, Jean A. & LaVelle, John M. & Clinton, Janet M., 2019. "The current state of evaluator education: A situation analysis and call to action," Evaluation and Program Planning, Elsevier, vol. 75(C), pages 20-30.
    5. Harman, Elena & Azzam, Tarek, 2018. "Incorporating public values into evaluative criteria: Using crowdsourcing to identify criteria and standards," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 68-82.
    6. Pleasant, Andrew & O’Leary, Catina & Carmona, Richard H., 2020. "Using formative research to tailor a community intervention focused on the prevention of chronic disease," Evaluation and Program Planning, Elsevier, vol. 78(C).
    7. Gagnon, France & Aubry, Tim & Cousins, J. Bradley & Goh, Swee C. & Elliott, Catherine, 2018. "Validation of the evaluation capacity in organizations questionnaire," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 166-175.
    8. Szijarto, Barbara & Milley, Peter & Svensson, Kate & Cousins, J. Bradley, 2018. "On the evaluation of social innovations and social enterprises: Recognizing and integrating two solitudes in the empirical knowledge base," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 20-32.
    9. Neumann, Jan & Robson, Andrew & Sloan, Diane, 2018. "Monitoring and evaluation of strategic change programme implementation—Lessons from a case analysis," Evaluation and Program Planning, Elsevier, vol. 66(C), pages 120-132.
    10. Alemdar, Meltem & Cappelli, Christopher J. & Criswell, Brett A. & Rushton, Gregory T., 2018. "Evaluation of a Noyce program: Development of teacher leaders in STEM education," Evaluation and Program Planning, Elsevier, vol. 71(C), pages 1-11.
    11. Bourgeois, Isabelle & Whynot, Jane, 2018. "The influence of evaluation recommendations on instrumental and conceptual uses: A preliminary analysis," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 13-18.
    12. Kalpazidou Schmidt, Evanthia & Graversen, Ebbe Krogh, 2020. "Developing a conceptual evaluation framework for gender equality interventions in research and innovation," Evaluation and Program Planning, Elsevier, vol. 79(C).
    13. Arasanz, Carla & Nylen, Kirk, 2020. "The theory of change of the evaluation support program: Enhancing the role of community organizations in providing an ecology of care for neurological disorders," Evaluation and Program Planning, Elsevier, vol. 80(C).
    14. Schalock, Robert L. & Lee, Tim & Verdugo, Miguel & Swart, Kees & Claes, Claudia & van Loon, Jos & Lee, Chun-Shin, 2014. "An evidence-based approach to organization evaluation and change in human service organizations evaluation and program planning," Evaluation and Program Planning, Elsevier, vol. 45(C), pages 110-118.
    15. Renger, Ralph & Foltysova, Jirina & Becker, Karin L. & Souvannasacd, Eric, 2015. "The power of the context map: Designing realistic outcome evaluation strategies and other unanticipated benefits," Evaluation and Program Planning, Elsevier, vol. 52(C), pages 118-125.
    16. Rorrer, Audrey S., 2016. "An evaluation capacity building toolkit for principal investigators of undergraduate research experiences: A demonstration of transforming theory into practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 103-111.
    17. Abboud, Rida & Claussen, Caroline, 2016. "The use of Outcome Harvesting in learning-oriented and collaborative inquiry approaches to evaluation: An example from Calgary, Alberta," Evaluation and Program Planning, Elsevier, vol. 59(C), pages 47-54.
    18. Charles N. Herrick, 2018. "Self-Identity and Sense of Place: Some Thoughts regarding Climate Change Adaptation Policy Formulation," Environmental Values, vol. 27(1), pages 81-102, February.
    19. Salabarría-Peña, Yamir & Robinson, William T., 2022. "Going beyond performance measures in HIV-prevention: A funder-recipient expedition," Evaluation and Program Planning, Elsevier, vol. 90(C).
    20. Rider W. Foley & Leanna M. Archambault & Annie E. Hale & Hsiang-Kai Dong, 2017. "Learning Outcomes in Sustainability Education Among Future Elementary School Teachers," Journal of Education for Sustainable Development, vol. 11(1), pages 33-51, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:83:y:2020:i:c:s0149718920301762. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.