Authors:
- Rebecca A. Maynard
- Rebecca N. Baelen
- David Fein
- Phomdaen Souvanna
Abstract
Background: This article offers a case example of how experimental evaluation methods can be coupled with principles of design-based implementation research (DBIR), improvement science (IS), and rapid-cycle evaluation (RCE) methods to provide relatively quick, low-cost, credible assessments of strategies designed to improve programs, policies, or practices.
Objectives: This article demonstrates the feasibility and benefits of blending DBIR, IS, and RCE practices with embedded randomized controlled trials (RCTs) to improve the pace and efficiency of program improvement.
Research design: This article describes a two-cycle experimental test of staff-designed strategies for improving a workforce development program. Youth enrolled in Year Up’s Professional Training Corps (PTC) programs were randomly assigned to “improvement strategies” designed to boost academic success and persistence through the 6-month learning and development (L&D) phase of the program, when participants spend most of their program-related time in courses offered by partner colleges.
Subjects: The study sample includes 317 youth from three PTC program sites.
Measures: The primary outcome measures are completion of the program’s L&D phase and continued college enrollment beyond the L&D phase.
Results: The improvement strategies designed and tested during the study increased program retention through L&D by nearly 10 percentage points and increased college persistence following L&D by 13 percentage points.
Conclusion: Blending DBIR, IS, and RCE principles with a multi-cycle RCT generated highly credible estimates of the efficacy of the tested improvement strategies within a relatively short period of time (18 months) at modest cost and with reportedly low burden for program staff.
Suggested Citation
Rebecca A. Maynard & Rebecca N. Baelen & David Fein & Phomdaen Souvanna, 2022.
"Using Iterative Experimentation to Accelerate Program Improvement: A Case Example,"
Evaluation Review, vol. 46(5), pages 469-516, October.
Handle:
RePEc:sae:evarev:v:46:y:2022:i:5:p:469-516
DOI: 10.1177/0193841X20923199