Addressing validity and generalizability concerns in field experiments

Authors

  • Riener, Gerhard
  • Schneider, Sebastian
  • Wagner, Valentin

Abstract

In this paper, we systematically analyze the empirical importance of standard conditions for the validity and generalizability of field experiments: the internal and external overlap and unconfoundedness conditions. We experimentally varied the degree of overlap in disjoint sub-samples drawn from a recruitment experiment with more than 3,000 public schools, mimicking small-scale field experiments. We achieved this variation through different treatment assignment techniques, applying both standard methods, such as pure randomization, and the novel minMSE treatment assignment method, which is designed to improve overlap by balancing covariate dependencies and variances rather than individual mean values. We assess the relevance of the overlap condition by linking estimation precision in the disjoint sub-samples to measures of overlap and of balance more generally. We address unconfoundedness by using a rich set of administrative data on institution and municipality characteristics to study potential self-selection. We find no evidence of a violation of unconfoundedness, and we establish that the improved overlap and balance achieved by the minMSE method reduce the bias of the treatment-effect estimate by more than 35% compared to pure randomization, illustrating the importance of addressing overlap in (field) experiments and suggesting a way to do so.
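To make the assignment logic described in the abstract concrete, the following is a minimal, hypothetical Python sketch of a minMSE-style procedure: it searches over candidate treatment assignments and keeps the one that best balances covariate means, variances, and dependencies (covariances) between treatment and control. The quadratic criterion and the rerandomization-style search are illustrative assumptions for exposition, not the authors' exact minMSE objective, which targets the mean squared error of the treatment-effect estimator.

import numpy as np

def balance_criterion(X, assign):
    """Illustrative balance measure (an assumption, not the paper's exact
    objective): squared gaps in covariate means plus squared gaps in
    covariance matrices (variances and dependencies) between the
    treatment group (assign == 1) and the control group (assign == 0)."""
    Xt, Xc = X[assign == 1], X[assign == 0]
    mean_gap = np.sum((Xt.mean(axis=0) - Xc.mean(axis=0)) ** 2)
    cov_gap = np.sum((np.cov(Xt, rowvar=False) - np.cov(Xc, rowvar=False)) ** 2)
    return mean_gap + cov_gap

def min_mse_style_assignment(X, n_draws=10_000, seed=0):
    """Rerandomization-style stand-in for the minMSE search: draw many
    random half-splits and keep the one with the lowest criterion."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    base = np.array([1] * (n // 2) + [0] * (n - n // 2))
    best_assign, best_val = None, np.inf
    for _ in range(n_draws):
        candidate = rng.permutation(base)
        val = balance_criterion(X, candidate)
        if val < best_val:
            best_assign, best_val = candidate, val
    return best_assign, best_val

# Toy comparison on simulated covariates (60 units, 4 covariates):
# the optimized split vs. a single pure-randomization draw.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 4))
assign, val = min_mse_style_assignment(X)
pure = rng.permutation(np.array([1] * 30 + [0] * 30))
print(f"optimized criterion: {val:.4f}, pure randomization: {balance_criterion(X, pure):.4f}")

A lower criterion value for the optimized split corresponds, in the spirit of the paper, to better overlap and balance than a typical pure-randomization draw delivers.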

Suggested Citation

  • Riener, Gerhard & Schneider, Sebastian & Wagner, Valentin, 2020. "Addressing validity and generalizability concerns in field experiments," DICE Discussion Papers 345, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
  • Handle: RePEc:zbw:dicedp:345

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/222345/1/1724895958.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    2. Joseph G. Altonji & Todd E. Elder & Christopher R. Taber, 2005. "Selection on Observed and Unobserved Variables: Assessing the Effectiveness of Catholic Schools," Journal of Political Economy, University of Chicago Press, vol. 113(1), pages 151-184, February.
    3. Asako Ohinata & Jan C. van Ours, 2013. "How Immigrant Children Affect the Academic Achievement of Native Dutch Children," Economic Journal, Royal Economic Society, vol. 123(570), pages F308-F331, August.
    4. Jonathan Schulz & Uwe Sunde & Petra Thiemann & Christian Thoeni, 2019. "Selection into Experiments: Evidence from a Population of Students," Discussion Papers 2019-09, The Centre for Decision Research and Experimental Economics, School of Economics, University of Nottingham.
    5. Francesco Avvisati & Marc Gurgand & Nina Guyon & Eric Maurin, 2014. "Getting Parents Involved: A Field Experiment in Deprived Schools," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 81(1), pages 57-83.
    6. Omar Al-Ubaydli & John A. List & Dana Suskind, 2019. "The Science of Using Science: Towards an Understanding of the Threats to Scaling Experiments," NBER Working Papers 25848, National Bureau of Economic Research, Inc.
    7. Johannes Abeler & Daniele Nosenzo, 2015. "Self-selection into laboratory experiments: pro-social motives versus monetary incentives," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 195-214, June.
    8. Kasy, Maximilian, 2016. "Why Experimenters Might Not Always Want to Randomize, and What They Could Do Instead," Political Analysis, Cambridge University Press, vol. 24(3), pages 324-338, July.
    9. Edward P. Lazear & Ulrike Malmendier & Roberto A. Weber, 2012. "Sorting in Experiments with Application to Social Preferences," American Economic Journal: Applied Economics, American Economic Association, vol. 4(1), pages 136-163, January.
    10. Kraft, Matthew A. & Rogers, Todd, 2015. "The underutilized potential of teacher-to-parent communication: Evidence from a field experiment," Economics of Education Review, Elsevier, vol. 47(C), pages 49-63.
    11. Gerhard Riener & Valentin Wagner, 2019. "On the design of non-monetary incentives in schools," Education Economics, Taylor & Francis Journals, vol. 27(3), pages 223-240, May.
    12. Fischer, Mira & Wagner, Valentin, 2018. "Effects of timing and reference frame of feedback: Evidence from a field experiment," Discussion Papers, Research Unit: Market Behavior SP II 2018-206, WZB Berlin Social Science Center.
    13. Hotz, V. Joseph & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    14. Miriam Bruhn & David McKenzie, 2009. "In Pursuit of Balance: Randomization in Practice in Development Field Experiments," American Economic Journal: Applied Economics, American Economic Association, vol. 1(4), pages 200-232, October.
    15. Christoph Rothe, 2017. "Robust Confidence Intervals for Average Treatment Effects Under Limited Overlap," Econometrica, Econometric Society, vol. 85, pages 645-660, March.
    16. Wagner, Valentin & Riener, Gerhard, 2015. "Peers or parents? On non-monetary incentives in schools," DICE Discussion Papers 203, Heinrich Heine University Düsseldorf, Düsseldorf Institute for Competition Economics (DICE).
    17. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    18. Ohinata, Asako & van Ours, Jan C., 2011. "How Immigrant Children Affect the Academic Achievement of Native Dutch Children," IZA Discussion Papers 6212, Institute of Labor Economics (IZA).
    19. Fischer, Mira & Wagner, Valentin, 2019. "Effects of Timing and Reference Frame of Feedback," Rationality and Competition Discussion Paper Series 150, CRC TRR 190 Rationality and Competition.

    Citations

    Cited by:

    1. Charness, Gary & Cobo-Reyes, Ramón & Eyster, Erik & Katz, Gabriel & Sánchez, Ángela & Sutter, Matthias, 2023. "Improving children's food choices: Experimental evidence from the field," European Economic Review, Elsevier, vol. 159(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Damgaard, Mette Trier & Nielsen, Helena Skyt, 2018. "Nudging in education," Economics of Education Review, Elsevier, vol. 64(C), pages 313-342.
    2. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    3. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Thiemann, Petra & Schulz, Jonathan & Sunde, Uwe & Thöni, Christian, 2022. "Selection into experiments: New evidence on the role of preferences, cognition, and recruitment protocols," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 98(C).
    6. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    7. Jason Fletcher & Jinho Kim & Jenna Nobles & Stephen Ross & Irina Shaorshadze, 2021. "The Effects of Foreign-Born Peers in US High Schools and Middle Schools," Journal of Human Capital, University of Chicago Press, vol. 15(3), pages 432-468.
    8. John List, 2020. "Non est Disputandum de Generalizability? A Glimpse into The External Validity Trial," Artefactual Field Experiments 00711, The Field Experiments Website.
    9. Abhijit Banerjee & Sylvain Chassang & Erik Snowberg, 2016. "Decision Theoretic Approaches to Experiment Design and External Validity," NBER Working Papers 22167, National Bureau of Economic Research, Inc.
    10. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    11. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    12. Clifton-Sprigg, Joanna, 2014. "Educational spillovers and parental migration."
    13. Cattaneo, Maria Alejandra & Wolter, Stefan C., 2012. "Migration Policy Can Boost PISA Results: Findings from a Natural Experiment," IZA Discussion Papers 6300, Institute of Labor Economics (IZA).
    14. Aufenanger, Tobias, 2017. "Machine learning to improve experimental design," FAU Discussion Papers in Economics 16/2017, Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics, revised 2017.
    15. Julia Bredtmann & Sebastian Otten & Christina Vonnahme, 2021. "Linguistic diversity in the classroom, student achievement, and social integration," Education Economics, Taylor & Francis Journals, vol. 29(2), pages 121-142, March.
    16. Girum Abebe & A Stefano Caria & Marcel Fafchamps & Paolo Falco & Simon Franklin & Simon Quinn, 2021. "Anonymity or Distance? Job Search and Labour Market Exclusion in a Growing African City [Endogenous Stratification in Randomized Experiments]," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 88(3), pages 1279-1310.
    17. Seah, Kelvin, 2016. "The Impact of Immigrant Peers on Native Students' Academic Achievement in Countries Where Parents of Immigrants Are Relatively Skilled," IZA Discussion Papers 10065, Institute of Labor Economics (IZA).
    18. Yao, Yuxin & Ohinata, Asako & van Ours, Jan, 2016. "The Education Consequences of Language Proficiency for Young Children," Other publications TiSEM 55d080a9-861e-4372-b542-e, Tilburg University, School of Economics and Management.
    19. Timothy B. Armstrong & Shu Shen, 2013. "Inference on Optimal Treatment Assignments," Cowles Foundation Discussion Papers 1927RR, Cowles Foundation for Research in Economics, Yale University, revised Apr 2015.
    20. Escarce, José J. & Rocco, Lorenzo, 2018. "Immigration and the Health of Older Natives in Western Europe," GLO Discussion Paper Series 228, Global Labor Organization (GLO).

    More about this item

    Keywords

    external validity; field experiments; generalizability; treatment effect; overlap; balance; precision; treatment assignment; unconfoundedness; self-selection bias; site-selection bias;

    JEL classification:

    • C9 - Mathematical and Quantitative Methods - - Design of Experiments
    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • D04 - Microeconomics - - General - - - Microeconomic Policy: Formulation; Implementation; Evaluation
