
A Metadata Schema for Data from Experiments in the Social Sciences

Author

Listed:
  • Cavanagh, Jack
  • Fliegner, Jasmin Claire
  • Kopper, Sarah
  • Sautmann, Anja

Abstract

The use of randomized controlled trials (RCTs) in the social sciences has greatly expanded, resulting in newly abundant, high-quality data that can be reused to perform methods research in program evaluation, to systematize evidence for policymakers, and for replication and training purposes. However, potential users of RCT data often face significant barriers to discovery and reuse. This paper proposes a metadata schema that standardizes RCT data documentation and can serve as the basis for one (or many, interoperable) data catalogs that make such data easily findable, searchable, and comparable, and thus more readily reusable for secondary research. The schema is designed to document the unique properties of RCT data. Its set of fields and associated encoding schemes (acceptable formats and values) can be used to describe any dataset associated with a social science RCT. The paper also makes recommendations for implementing a catalog or database based on this metadata schema.
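To make the idea of "fields and associated encoding schemes" concrete, the sketch below shows how a catalog implementation might pair each metadata field with a validator that enforces its acceptable formats and values. The field names, allowed values, and registry ID pattern here are invented for illustration; they are not the fields of the schema actually proposed in the paper.

```python
# Hypothetical sketch: a metadata field set where each field carries an
# encoding scheme (a validator enforcing acceptable formats and values).
# Field names and rules are illustrative assumptions, not the paper's schema.
import re

SCHEMA = {
    # free-text field: any non-empty string
    "title": lambda v: isinstance(v, str) and len(v) > 0,
    # controlled vocabulary: one of a fixed set of values
    "unit_of_randomization": lambda v: v in {"individual", "household", "cluster"},
    # patterned identifier (format modeled loosely on trial-registry IDs)
    "registration_id": lambda v: isinstance(v, str)
        and re.fullmatch(r"AEARCTR-\d{7}", v) is not None,
    # list field with a per-element format (three-letter country codes)
    "countries": lambda v: isinstance(v, list)
        and all(isinstance(c, str) and re.fullmatch(r"[A-Z]{3}", c) for c in v),
}

def validate(record: dict) -> list[str]:
    """Return human-readable problems; an empty list means the record conforms."""
    problems = []
    for field, check in SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not check(record[field]):
            problems.append(f"invalid value for {field}: {record[field]!r}")
    return problems

record = {
    "title": "Job Search Assistance for Refugees",
    "unit_of_randomization": "individual",
    "registration_id": "AEARCTR-0001234",
    "countries": ["JOR"],
}
print(validate(record))  # prints [] because the record conforms
```

A catalog built this way can reject malformed records at ingestion time, which is what keeps datasets comparable and searchable across (interoperable) catalogs.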

Suggested Citation

  • Cavanagh, Jack & Fliegner, Jasmin Claire & Kopper, Sarah & Sautmann, Anja, 2023. "A Metadata Schema for Data from Experiments in the Social Sciences," Policy Research Working Paper Series 10296, The World Bank.
  • Handle: RePEc:wbk:wbrwps:10296

    Download full text from publisher

    File URL: http://documents.worldbank.org/curated/en/099945502062327217/pdf/IDU081c960a8049b504197099ff0d12be0b95375.pdf
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    2. Vivian C. Wong & Peter M. Steiner & Kylie L. Anglin, 2018. "What Can Be Learned From Empirical Evaluations of Nonexperimental Methods?," Evaluation Review, , vol. 42(2), pages 147-175, April.
    3. Masselus, Lise & Petrik, Christina & Ankel-Peters, Jörg, 2024. "Lost in the Design Space? Construct Validity in the Microfinance Literature," OSF Preprints nwp8k_v1, Center for Open Science.
    4. Galiani, Sebastian & Quistorff, Brian, 2024. "Assessing external validity in practice," Research in Economics, Elsevier, vol. 78(3).
    5. Takuya Ishihara & Toru Kitagawa, 2021. "Evidence Aggregation for Treatment Choice," Papers 2108.06473, arXiv.org, revised Jul 2024.
    6. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    7. Jeffrey Smith & Arthur Sweetman, 2016. "Viewpoint: Estimating the causal effects of policies and programs," Canadian Journal of Economics, Canadian Economics Association, vol. 49(3), pages 871-905, August.
    8. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    9. Masselus, Lise & Petrik, Christina & Ankel-Peters, Jörg, 2024. "Lost in the design space? Construct validity in the microfinance literature," Ruhr Economic Papers 1097, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    10. Paul Hunermund & Elias Bareinboim, 2019. "Causal Inference and Data Fusion in Econometrics," Papers 1912.09104, arXiv.org, revised Mar 2023.
    11. Andrews, Isaiah & Oster, Emily, 2019. "A simple approximation for evaluating external validity bias," Economics Letters, Elsevier, vol. 178(C), pages 58-62.
    12. Sant’Anna, Pedro H.C. & Zhao, Jun, 2020. "Doubly robust difference-in-differences estimators," Journal of Econometrics, Elsevier, vol. 219(1), pages 101-122.
    13. A Stefano Caria & Grant Gordon & Maximilian Kasy & Simon Quinn & Soha Osman Shami & Alexander Teytelboym, 2024. "An Adaptive Targeted Field Experiment: Job Search Assistance for Refugees in Jordan," Journal of the European Economic Association, European Economic Association, vol. 22(2), pages 781-836.
    14. Higney, Anthony & Hanley, Nick & Moro, Mirko, 2022. "The lead-crime hypothesis: A meta-analysis," Regional Science and Urban Economics, Elsevier, vol. 97(C).
    15. Rahul Singh & Liyuan Xu & Arthur Gretton, 2020. "Kernel Methods for Causal Functions: Dose, Heterogeneous, and Incremental Response Curves," Papers 2010.04855, arXiv.org, revised Oct 2022.
    16. Pietro Emilio Spini, 2021. "Robustness, Heterogeneous Treatment Effects and Covariate Shifts," Papers 2112.09259, arXiv.org, revised Aug 2024.
    17. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    18. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    19. Goodman-Bacon, Andrew, 2021. "Difference-in-differences with variation in treatment timing," Journal of Econometrics, Elsevier, vol. 225(2), pages 254-277.
    20. Abramovsky, Laura & Augsburg, Britta & Lührmann, Melanie & Oteiza, Francisco & Rud, Juan Pablo, 2023. "Community matters: Heterogeneous impacts of a sanitation intervention," World Development, Elsevier, vol. 165(C).



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.