
Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures

Author

Listed:
  • Kerwin, Jason Theodore
  • Thornton, Rebecca

Abstract

This paper demonstrates the acute sensitivity of education program effectiveness to the choice of inputs and outcome measures, using a randomized evaluation of a mother-tongue literacy program. The program raises reading scores by 0.64 SDs and writing scores by 0.45 SDs. A reduced-cost version instead yields statistically insignificant reading gains and some large negative effects (-0.33 SDs) on advanced writing. We combine a conceptual model of education production with detailed classroom observations to examine the mechanisms driving the results; we show that they could be driven by the program initially lowering productivity before raising it, and potentially by missing complementary inputs in the reduced-cost version.

Suggested Citation

  • Kerwin, Jason Theodore & Thornton, Rebecca, 2020. "Making the Grade: The Sensitivity of Education Program Effectiveness to Input Choices and Outcome Measures," SocArXiv ct9sj, Center for Open Science.
  • Handle: RePEc:osf:socarx:ct9sj
    DOI: 10.31219/osf.io/ct9sj

    Download full text from publisher

    File URL: https://osf.io/download/5f160a990870f2012109b329/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/ct9sj?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription


    References listed on IDEAS

    1. Angus Deaton, 2010. "Instruments, Randomization, and Learning about Development," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 424-455, June.
    2. Abhijit V. Banerjee & Shawn Cole & Esther Duflo & Leigh Linden, 2007. "Remedying Education: Evidence from Two Randomized Experiments in India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(3), pages 1235-1264.
    3. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    4. A. Colin Cameron & Jonah B. Gelbach & Douglas L. Miller, 2008. "Bootstrap-Based Improvements for Inference with Clustered Errors," The Review of Economics and Statistics, MIT Press, vol. 90(3), pages 414-427, August.
    5. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    6. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    7. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    8. Duflo, Esther & Dupas, Pascaline & Kremer, Michael, 2015. "School governance, teacher incentives, and pupil–teacher ratios: Experimental evidence from Kenyan primary schools," Journal of Public Economics, Elsevier, vol. 123(C), pages 92-110.
    9. Pritchett, Lant & Filmer, Deon, 1999. "What education production functions really show: a positive theory of education expenditures," Economics of Education Review, Elsevier, vol. 18(2), pages 223-239, April.
    10. David K. Evans & Anna Popova, 2016. "What Really Works to Improve Learning in Developing Countries? An Analysis of Divergent Findings in Systematic Reviews," The World Bank Research Observer, World Bank, vol. 31(2), pages 242-270.
    12. Friedman, Jerome H. & Hastie, Trevor & Tibshirani, Rob, 2010. "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 33(1).
    13. Karthik Muralidharan & Venkatesh Sundararaman, 2013. "Contract Teachers: Experimental Evidence from India," NBER Working Papers 19440, National Bureau of Economic Research, Inc.
    14. Hainmueller, Jens & Hazlett, Chad, 2014. "Kernel Regularized Least Squares: Reducing Misspecification Bias with a Flexible and Interpretable Machine Learning Approach," Political Analysis, Cambridge University Press, vol. 22(2), pages 143-168, April.
    15. Jacobus Cilliers & Brahm Fleisch & Cas Prinsloo & Stephen Taylor, 2020. "How to Improve Teaching Practice?: An Experimental Comparison of Centralized Training and In-Classroom Coaching," Journal of Human Resources, University of Wisconsin Press, vol. 55(3), pages 926-962.
    16. Daniel O. Gilligan & Naureen Karachiwalla & Ibrahim Kasirye & Adrienne M. Lucas & Derek Neal, 2022. "Educator Incentives and Educational Triage in Rural Primary Schools," Journal of Human Resources, University of Wisconsin Press, vol. 57(1), pages 79-111.
    17. Petrenko, Anna, 2016. "Labelling of Finished Products as a Component of Information Support for the Marketing Activities of Vegetable-Product Subcomplex Enterprises," Agricultural and Resource Economics: International Scientific E-Journal, vol. 2(1), March.
    18. Jeffrey R Kling & Jeffrey B Liebman & Lawrence F Katz, 2007. "Experimental Analysis of Neighborhood Effects," Econometrica, Econometric Society, vol. 75(1), pages 83-119, January.
    19. Glewwe, Paul & Kremer, Michael & Moulin, Sylvie & Zitzewitz, Eric, 2004. "Retrospective vs. prospective analyses of school inputs: the case of flip charts in Kenya," Journal of Development Economics, Elsevier, vol. 74(1), pages 251-268, June.
    20. Simon Heß, 2017. "Randomization inference with Stata: A guide and software," Stata Journal, StataCorp LP, vol. 17(3), pages 630-651, September.
    21. Melody M. Chao & Rajeev Dehejia & Anirban Mukhopadhyay & Sujata Visaria, 2015. "Unintended Negative Consequences of Rewards for Student Attendance: Results from a Field Experiment in Indian Classrooms," HKUST IEMS Working Paper Series 2015-22, HKUST Institute for Emerging Market Studies, revised Apr 2015.
    22. David S. Lee, 2009. "Training, Wages, and Sample Selection: Estimating Sharp Bounds on Treatment Effects," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 76(3), pages 1071-1102.
    23. David Roodman & James G. MacKinnon & Morten Ørregaard Nielsen & Matthew D. Webb, 2019. "Fast and wild: Bootstrap inference in Stata using boottest," Stata Journal, StataCorp LP, vol. 19(1), pages 4-60, March.
    24. Glewwe, P. & Muralidharan, K., 2016. "Improving Education Outcomes in Developing Countries," Handbook of the Economics of Education, Elsevier.
    25. Isaac Mbiti & Karthik Muralidharan & Mauricio Romero & Youdi Schipper & Constantine Manda & Rakesh Rajani, 2019. "Inputs, Incentives, and Complementarities in Education: Experimental Evidence from Tanzania," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 134(3), pages 1627-1673.
    26. Jere R. Behrman & Susan W. Parker & Petra E. Todd & Kenneth I. Wolpin, 2015. "Aligning Learning Incentives of Students and Teachers: Results from a Social Experiment in Mexican High Schools," Journal of Political Economy, University of Chicago Press, vol. 123(2), pages 325-364.
    27. Nadel, Sara & Pritchett, Lant, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Paper Series rwp16-041, Harvard University, John F. Kennedy School of Government.
    28. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    29. Acharya, Avidit & Blackwell, Matthew & Sen, Maya, 2016. "Explaining Causal Findings Without Bias: Detecting and Assessing Direct Effects," American Political Science Review, Cambridge University Press, vol. 110(3), pages 512-529, August.
    30. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    31. Bold, Tessa & Kimenyi, Mwangi & Mwabu, Germano & Ng’ang’a, Alice & Sandefur, Justin, 2018. "Experimental evidence on scaling up education reforms in Kenya," Journal of Public Economics, Elsevier, vol. 168(C), pages 1-20.
    32. Popova, Anna & Evans, David & Arancibia, Violeta, 2016. "Training teachers on the job: what works and how to measure it," Policy Research Working Paper Series 7834, The World Bank.
    33. Piper, Benjamin & Simmons Zuilkowski, Stephanie & Dubeck, Margaret & Jepkemei, Evelyn & King, Simon J., 2018. "Identifying the essential ingredients to literacy and numeracy improvement: Teacher professional development and coaching, student textbooks, and structured teachers’ guides," World Development, Elsevier, vol. 106(C), pages 324-336.
    34. Dan A. Black & Jeffrey A. Smith, 2006. "Estimating the Returns to College Quality with Multiple Proxies for Quality," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 701-728, July.
    35. Sara Nadel & Lant Pritchett, 2016. "Searching for the Devil in the Details: Learning about Development Program Design," Working Papers 434, Center for Global Development.
    36. Steven D. Levitt & John A. List, 2007. "What Do Laboratory Experiments Measuring Social Preferences Reveal About the Real World?," Journal of Economic Perspectives, American Economic Association, vol. 21(2), pages 153-174, Spring.
    37. Wilbur Townsend, 2017. "ELASTICREGRESS: Stata module to perform elastic net regression, lasso regression, ridge regression," Statistical Software Components S458397, Boston College Department of Economics, revised 16 Apr 2018.
    38. Jonathan M.V. Davis & Jonathan Guryan & Kelly Hallberg & Jens Ludwig, 2017. "The Economics of Scale-Up," NBER Working Papers 23925, National Bureau of Economic Research, Inc.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Annie Duflo & Jessica Kiessel & Adrienne Lucas, 2020. "Experimental Evidence on Alternative Policies to Increase Learning at Scale," NBER Working Papers 27298, National Bureau of Economic Research, Inc.
    2. Eble, Alex & Frost, Chris & Camara, Alpha & Bouy, Baboucarr & Bah, Momodou & Sivaraman, Maitri & Hsieh, Pei-Tseng Jenny & Jayanty, Chitra & Brady, Tony & Gawron, Piotr & Vansteelandt, Stijn & Boone, Peter, 2021. "How much can we remedy very low learning levels in rural parts of low-income countries? Impact and generalizability of a multi-pronged para-teacher intervention from a cluster-randomized trial in the Gambia," Journal of Development Economics, Elsevier, vol. 148(C).
    3. Blimpo, Moussa P. & Pugatch, Todd, 2021. "Entrepreneurship education and teacher training in Rwanda," Journal of Development Economics, Elsevier, vol. 149(C).
    4. Fazzio, Ila & Eble, Alex & Lumsdaine, Robin L. & Boone, Peter & Bouy, Baboucarr & Hsieh, Pei-Tseng Jenny & Jayanty, Chitra & Johnson, Simon & Silva, Ana Filipa, 2021. "Large learning gains in pockets of extreme poverty: Experimental evidence from Guinea Bissau," Journal of Public Economics, Elsevier, vol. 199(C).
    5. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    6. Masselus, Lise & Petrik, Christina & Ankel-Peters, Jörg, 2024. "Lost in the design space? Construct validity in the microfinance literature," Ruhr Economic Papers 1097, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    7. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    8. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    9. Sabrin Beg & Waqas Halim & Adrienne M. Lucas & Umar Saif, 2022. "Engaging Teachers with Technology Increased Achievement, Bypassing Teachers Did Not," American Economic Journal: Economic Policy, American Economic Association, vol. 14(2), pages 61-90, May.
    10. Mo, Di & Bai, Yu & Shi, Yaojiang & Abbey, Cody & Zhang, Linxiu & Rozelle, Scott & Loyalka, Prashant, 2020. "Institutions, implementation, and program effectiveness: Evidence from a randomized evaluation of computer-assisted learning in rural China," Journal of Development Economics, Elsevier, vol. 146(C).
    11. Mariella Gonzales & Gianmarco León-Ciliotta & Luis R. Martínez, 2022. "How Effective Are Monetary Incentives to Vote? Evidence from a Nationwide Policy," American Economic Journal: Applied Economics, American Economic Association, vol. 14(1), pages 293-326, January.
    12. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2018. "NGOs and the effectiveness of interventions," WIDER Working Paper Series wp-2018-59, World Institute for Development Economic Research (UNU-WIDER).
    13. Eduard Marinov, 2019. "The 2019 Nobel Prize in Economics," Economic Thought journal, Bulgarian Academy of Sciences - Economic Research Institute, issue 6, pages 78-116.
    14. Faraz Usmani & Marc Jeuland & Subhrendu K. Pattanayak, 2024. "NGOs and the Effectiveness of Interventions," The Review of Economics and Statistics, MIT Press, vol. 106(6), pages 1690-1708, November.
    15. Jules Gazeaud & Claire Ricard, 2021. "Conditional cash transfers and the learning crisis: evidence from Tayssir scale-up in Morocco," NOVAFRICA Working Paper Series wp2102, Universidade Nova de Lisboa, Nova School of Business and Economics, NOVAFRICA.
    16. Cruz Aguayo, Yyannu & Carneiro, Pedro & Intriago, Ruthy & Ponce, Juan & Schady, Norbert & Schodt, Sarah, 2022. "When Promising Interventions Fail: Personalized Coaching for Teachers in a Middle-Income Country," IZA Discussion Papers 15021, Institute of Labor Economics (IZA).
    17. Alex Eble & Peter Boone & Diana Elbourne, 2017. "On Minimizing the Risk of Bias in Randomized Controlled Trials in Economics," The World Bank Economic Review, World Bank, vol. 31(3), pages 687-707.
    18. Andrew Dustan & Stanislao Maldonado & Juan Manuel Hernandez-Agramonte, 2018. "Motivating bureaucrats with non-monetary incentives when state capacity is weak: Evidence from large-scale field experiments in Peru," Working Papers 136, Peruvian Economic Association.
    19. Cristina Corduneanu-Huci & Michael T. Dorsch & Paul Maarek, 2017. "Learning to constrain: Political competition and randomized controlled trials in development," THEMA Working Papers 2017-24, THEMA (THéorie Economique, Modélisation et Applications), Université de Cergy-Pontoise.
    20. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
