Printed from https://ideas.repec.org/a/eee/epplan/v76y2019ic2.html

Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory

Author

Listed:
  • Peterson, Christina
  • Skolits, Gary

Abstract

Several evaluation models exist for investigating unintended outcomes, including goal-free and systems evaluation. Yet methods for collecting and analyzing data on unintended outcomes remain under-utilized. Ripple Effects Mapping (REM) is a promising qualitative evaluation method with a wide range of program planning and evaluation applications. In situations where program results are likely to occur over time within complex settings, this method is useful for uncovering both intended and unintended outcomes. REM applies an Appreciative Inquiry facilitation technique to engage stakeholders in visually mapping sequences of program outcomes. Although it has been used to evaluate community development and health promotion initiatives, further methodological guidance for applying REM is still needed. The purpose of this paper is to contribute to the methodological development of evaluating unintended outcomes and extend the foundations of REM by describing steps for integrating it with grounded theory.

Suggested Citation

  • Peterson, Christina & Skolits, Gary, 2019. "Evaluating unintended program outcomes through Ripple Effects Mapping (REM): Application of REM using grounded theory," Evaluation and Program Planning, Elsevier, vol. 76(C), pages 1-1.
  • Handle: RePEc:eee:epplan:v:76:y:2019:i:c:2
    DOI: 10.1016/j.evalprogplan.2019.101677

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0149718919300072
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.evalprogplan.2019.101677?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Sherrill, Sam, 1984. "Identifying and measuring unintended outcomes," Evaluation and Program Planning, Elsevier, vol. 7(1), pages 27-34, January.
    2. Rick Davies, 2018. "Representing theories of change: technical challenges with evaluation consequences," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 10(4), pages 438-461, October.
    3. Joanna Coast, 1999. "The appropriate uses of qualitative methods in health economics," Health Economics, John Wiley & Sons, Ltd., vol. 8(4), pages 345-353, June.
    4. Rachel Welborn & Laura Downey & Patricia Hyjer Dyk & Pamela A. Monroe & Crystal Tyler-Mackey & Sheri L. Worthy, 2016. "Turning the Tide on Poverty: Documenting impacts through Ripple Effect Mapping," Community Development, Taylor & Francis Journals, vol. 47(3), pages 385-402, July.
    5. Eric P. S. Baumer & David Mimno & Shion Guha & Emily Quan & Geri K. Gay, 2017. "Comparing grounded theory and topic modeling: Extreme divergence or unlikely convergence?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(6), pages 1397-1410, June.
    6. Carol H. Weiss, 1997. "How Can Theory-Based Evaluation Make Greater Headway?," Evaluation Review, vol. 21(4), pages 501-524, August.
    7. Jabeen, Sumera, 2016. "Do we really care about unintended outcomes? An analysis of evaluation theory and practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 144-154.
    8. Thayer, Colette E. & Fine, Allison H., 2001. "Evaluation and outcome measurement in the non-profit sector: stakeholder participation," Evaluation and Program Planning, Elsevier, vol. 24(1), pages 103-108, February.
    9. Bamberger, Michael & Tarsilla, Michele & Hesse-Biber, Sharlene, 2016. "Why so many “rigorous” evaluations fail to identify unintended consequences of development programs: How mixed methods can contribute," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 155-162.
    10. Patton, Michael Quinn & Horton, Douglas, 2008. "Utilization-focused evaluation for agricultural innovation," ILAC Briefs 52533, Institutional Learning and Change (ILAC) Initiative.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Jabeen, Sumera, 2016. "Do we really care about unintended outcomes? An analysis of evaluation theory and practice," Evaluation and Program Planning, Elsevier, vol. 55(C), pages 144-154.
    2. Jabeen, Sumera, 2018. "Unintended outcomes evaluation approach: A plausible way to evaluate unintended outcomes of social development programmes," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 262-274.
    3. Koch, Dirk-Jan & Schulpen, Lau, 2018. "Introduction to the special issue ‘unintended effects of international cooperation’," Evaluation and Program Planning, Elsevier, vol. 68(C), pages 202-209.
    4. de Alteriis, Martin, 2020. "What can we learn about unintended consequences from a textual analysis of monitoring reports and evaluations for U.S. foreign assistance programs?," Evaluation and Program Planning, Elsevier, vol. 79(C).
    5. Davidson, Angus Alexander & Young, Michael Denis & Leake, John Espie & O’Connor, Patrick, 2022. "Aid and forgetting the enemy: A systematic review of the unintended consequences of international development in fragile and conflict-affected situations," Evaluation and Program Planning, Elsevier, vol. 92(C).
    6. Ofek, Yuval, 2017. "Evaluating social exclusion interventions in university-community partnerships," Evaluation and Program Planning, Elsevier, vol. 60(C), pages 46-55.
    7. Smith, Jonathan D., 2017. "Positioning Missionaries in Development Studies, Policy, and Practice," World Development, Elsevier, vol. 90(C), pages 63-76.
    8. Schuster, Roseanne C. & Brewis, Alexandra & Wutich, Amber & Safi, Christelle & Vanrespaille, Teresa Elegido & Bowen, Gina & SturtzSreetharan, Cindi & McDaniel, Anne & Ochandarena, Peggy, 2023. "Individual interviews versus focus groups for evaluations of international development programs: Systematic testing of method performance to elicit sensitive information in a justice study in Haiti," Evaluation and Program Planning, Elsevier, vol. 97(C).
    9. von dem Knesebeck, Olaf & Joksimovic, Ljiljana & Badura, Bernhard & Siegrist, Johannes, 2002. "Evaluation of a community-level health policy intervention," Health Policy, Elsevier, vol. 61(1), pages 111-122, July.
    10. Massey, Oliver T., 2011. "A proposed model for the analysis and interpretation of focus groups in evaluation research," Evaluation and Program Planning, Elsevier, vol. 34(1), pages 21-28, February.
    11. Harris, Kevin & Adams, Andrew, 2016. "Power and discourse in the politics of evidence in sport for development," Sport Management Review, Elsevier, vol. 19(2), pages 97-106.
    12. Hart, Diane & Paucar-Caceres, Alberto, 2017. "A utilisation focussed and viable systems approach for evaluating technology supported learning," European Journal of Operational Research, Elsevier, vol. 259(2), pages 626-641.
    13. Lifshitz, Chen Chana, 2017. "Fostering employability among youth at-risk in a multi-cultural context: Insights from a pilot intervention program," Children and Youth Services Review, Elsevier, vol. 76(C), pages 20-34.
    14. Mohamed M. Mostafa, 2023. "A one-hundred-year structural topic modeling analysis of the knowledge structure of international management research," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(4), pages 3905-3935, August.
    15. David Kernick, 2002. "Health economics: an evolving paradigm but sailing in the wrong direction? A view from the front line," Health Economics, John Wiley & Sons, Ltd., vol. 11(1), pages 87-88, January.
    16. LaVelle, John M. & Davies, Randall, 2021. "Seeking consensus: Defining foundational concepts for a graduate level introductory program evaluation course," Evaluation and Program Planning, Elsevier, vol. 88(C).
    17. Sarah Chapman & Adiilah Boodhoo & Carren Duffy & Suki Goodman & Maria Michalopoulou, 2023. "Theory of Change in Complex Research for Development Programmes: Challenges and Solutions from the Global Challenges Research Fund," The European Journal of Development Research, Palgrave Macmillan;European Association of Development Research and Training Institutes (EADI), vol. 35(2), pages 298-322, April.
    18. Melz, Heidi & Fromknecht, Anne E. & Masters, Loren D. & Richards, Tammy & Sun, Jing, 2023. "Incorporating multiple data sources to assess changes in organizational capacity in child welfare systems," Evaluation and Program Planning, Elsevier, vol. 97(C).
    19. Natasha Palmer & Anne Mills, 2003. "Classical versus relational approaches to understanding controls on a contract with independent GPs in South Africa," Health Economics, John Wiley & Sons, Ltd., vol. 12(12), pages 1005-1020, December.
    20. Stirling Bryan & David Parry, 2002. "Structural reliability of conjoint measurement in health care: an empirical investigation," Applied Economics, Taylor & Francis Journals, vol. 34(5), pages 561-567.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:epplan:v:76:y:2019:i:c:2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/evalprogplan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.