
How and how much does expert error matter? Implications for quantitative peace research

Author

Listed:
  • Kyle L Marquardt

    (School of Politics and Governance & International Center for the Study of Institutions and Development, National Research University Higher School of Economics)

Abstract

Expert-coded datasets provide scholars with otherwise unavailable data on important concepts. However, expert coders vary in their reliability and scale perception, potentially resulting in substantial measurement error. These concerns are acute in expert coding of key concepts for peace research. Here I examine (1) the implications of these concerns for applied statistical analyses, and (2) the degree to which different modeling strategies ameliorate them. Specifically, I simulate expert-coded country-year data with different forms of error and then regress civil conflict onset on these data, using five different modeling strategies. Three of these strategies involve regressing conflict onset on point estimate aggregations of the simulated data: the mean and median over expert codings, and the posterior median from a latent variable model. The remaining two strategies incorporate measurement error from the latent variable model into the regression process by using multiple imputation and a structural equation model. Analyses indicate that expert-coded data are relatively robust: across simulations, almost all modeling strategies yield regression results roughly in line with the assumed true relationship between the expert-coded concept and outcome. However, the introduction of measurement error to expert-coded data generally results in attenuation of the estimated relationship between the concept and conflict onset. The level of attenuation varies across modeling strategies: a structural equation model is the most consistently robust estimation technique, while the median over expert codings and multiple imputation are the least robust.
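The simulation design described above lends itself to a compact illustration. The following is a minimal, hypothetical sketch (Python with numpy and statsmodels; not the article's replication code) of the core mechanic: noisy expert codings of a latent country-year concept are aggregated by the mean and by the median, and a logit of conflict onset on each aggregate is compared against a logit on the true concept to show the attenuation the abstract describes. The latent variable model, multiple imputation, and structural equation strategies are omitted, and all parameter values are illustrative assumptions.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    n_obs = 2000      # country-years (illustrative)
    n_experts = 5     # expert coders per observation (illustrative)
    beta_true = 1.0   # assumed true effect of the latent concept on onset

    # Latent concept and a civil-conflict onset generated from it (logit link)
    z = rng.normal(size=n_obs)
    onset = rng.binomial(1, 1.0 / (1.0 + np.exp(-(-2.0 + beta_true * z))))

    # Expert codings: each coder adds a personal scale shift (differing scale
    # perception) and idiosyncratic noise (differing reliability)
    shift = rng.normal(0.0, 0.5, size=n_experts)
    noise_sd = rng.uniform(0.5, 1.5, size=n_experts)
    codings = (z[:, None] + shift[None, :]
               + rng.normal(0.0, 1.0, size=(n_obs, n_experts)) * noise_sd[None, :])

    def logit_slope(x, y):
        # Standardize x so slopes are comparable across aggregation strategies
        X = sm.add_constant((x - x.mean()) / x.std())
        return sm.Logit(y, X).fit(disp=0).params[1]

    print("true concept:     ", round(logit_slope(z, onset), 3))
    print("mean of codings:  ", round(logit_slope(codings.mean(axis=1), onset), 3))
    print("median of codings:", round(logit_slope(np.median(codings, axis=1), onset), 3))

Under classical-error assumptions such as these, the slopes estimated from the aggregated codings fall below the slope estimated from the true concept. The article's fuller design replaces this Gaussian error structure with a latent variable model for expert-coded data (see Marquardt & Pemstein, 2018, in the references below) and additionally evaluates multiple-imputation and structural equation strategies, which are not sketched here.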

Suggested Citation

  • Kyle L Marquardt, 2020. "How and how much does expert error matter? Implications for quantitative peace research," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 692-700, November.
  • Handle: RePEc:sae:joupea:v:57:y:2020:i:6:p:692-700
    DOI: 10.1177/0022343320959121

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0022343320959121
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0022343320959121?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Maestas, Cherie D. & Buttice, Matthew K. & Stone, Walter J., 2014. "Extracting Wisdom from Experts and Small Crowds: Strategies for Improving Informant-based Measures of Political Concepts," Political Analysis, Cambridge University Press, vol. 22(3), pages 354-373, July.
    2. Christopher Hare & David A. Armstrong & Ryan Bakker & Royce Carroll & Keith T. Poole, 2015. "Using Bayesian Aldrich‐McKelvey Scaling to Study Citizens' Ideological Preferences and Perceptions," American Journal of Political Science, John Wiley & Sons, vol. 59(3), pages 759-774, July.
    3. Miriam Barnum & James Lo, 2020. "Is the NPT unraveling? Evidence from text analysis of review conference statements," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 740-751, November.
    4. Shor, Boris & Bafumi, Joseph & Keele, Luke & Park, David, 2007. "A Bayesian Multilevel Modeling Approach to Time-Series Cross-Sectional Data," Political Analysis, Cambridge University Press, vol. 15(2), pages 165-181, April.
    5. Clinton, Joshua D. & Lewis, David E., 2008. "Expert Opinion, Agency Characteristics, and Agency Preferences," Political Analysis, Cambridge University Press, vol. 16(1), pages 3-20, January.
    6. John G. Cragg, 1994. "Making Good Inferences from Bad Data," Canadian Journal of Economics, Canadian Economics Association, vol. 27(4), pages 776-800, November.
    7. King, Gary & Wand, Jonathan, 2007. "Comparing Incomparable Survey Responses: Evaluating and Selecting Anchoring Vignettes," Political Analysis, Cambridge University Press, vol. 15(1), pages 46-66, January.
    8. Jerry Hausman, 2001. "Mismeasured Variables in Econometric Analysis: Problems from the Right and Problems from the Left," Journal of Economic Perspectives, American Economic Association, vol. 15(4), pages 57-67, Fall.
    9. Christopher J Fariss & Michael R Kenwick & Kevin Reuning, 2020. "Estimating one-sided-killings from a robust measurement model of human rights," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 801-814, November.
    10. Fearon, James D. & Laitin, David D., 2003. "Ethnicity, Insurgency, and Civil War," American Political Science Review, Cambridge University Press, vol. 97(1), pages 75-90, February.
    11. Susanne M. Schennach, 2016. "Recent Advances in the Measurement Error Literature," Annual Review of Economics, Annual Reviews, vol. 8(1), pages 341-377, October.
    12. Matthew Blackwell & James Honaker & Gary King, 2017. "A Unified Approach to Measurement Error and Missing Data: Overview and Applications," Sociological Methods & Research, , vol. 46(3), pages 303-341, August.
    13. K Chad Clay & Ryan Bakker & Anne-Marie Brook & Daniel W Hill Jr & Amanda Murdie, 2020. "Using practitioner surveys to measure human rights: The Human Rights Measurement Initiative’s civil and political rights metrics," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 715-727, November.
    14. Marquardt, Kyle L. & Pemstein, Daniel, 2018. "IRT Models for Expert-Coded Panel Data," Political Analysis, Cambridge University Press, vol. 26(4), pages 431-456, October.
    15. Zhanna Terechshenko, 2020. "Hot under the collar: A latent measure of interstate hostility," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 764-776, November.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lindberg, Staffan I. & Lo Bue, Maria C. & Sen, Kunal, 2022. "Clientelism, corruption and the rule of law," World Development, Elsevier, vol. 158(C).
    2. Christopher J Fariss & Michael R Kenwick & Kevin Reuning, 2020. "Estimating one-sided-killings from a robust measurement model of human rights," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 801-814, November.
    3. Christopher J Fariss & James Lo, 2020. "Innovations in concepts and measurement for the study of peace and conflict," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 669-678, November.
    4. Florencia Montal & Carly Potz-Nielsen & Jane Lawrence Sumner, 2020. "What states want: Estimating ideal points from international investment treaty content," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 679-691, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Florencia Montal & Carly Potz-Nielsen & Jane Lawrence Sumner, 2020. "What states want: Estimating ideal points from international investment treaty content," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 679-691, November.
    2. Christopher J Fariss & James Lo, 2020. "Innovations in concepts and measurement for the study of peace and conflict," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 669-678, November.
    3. Jule Krüger & Ragnhild Nordås, 2020. "A latent variable approach to measuring wartime sexual violence," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 728-739, November.
    4. Andrea Bastianin & Paolo Castelnovo & Massimo Florio, 2017. "The Empirics of Regulatory Reforms Proxied by Categorical Variables: Recent Findings and Methodological Issues," ETA: Economic Theory and Applications 257877, Fondazione Eni Enrico Mattei (FEEM).
    5. Christopher J Fariss & Michael R Kenwick & Kevin Reuning, 2020. "Estimating one-sided-killings from a robust measurement model of human rights," Journal of Peace Research, Peace Research Institute Oslo, vol. 57(6), pages 801-814, November.
    6. Stephen A Meserve & Sivagaminathan Palani & Daniel Pemstein, 2018. "Measuring candidate selection mechanisms in European elections: Comparing formal party rules to candidate survey responses," European Union Politics, , vol. 19(1), pages 185-202, March.
    7. Depetris-Chauvin, Emilio & Özak, Ömer, 2023. "(De facto) Historical Ethnic Borders and Contemporary Conflict in Africa," MPRA Paper 116868, University Library of Munich, Germany.
    8. Bastianin, Andrea & Castelnovo, Paolo & Florio, Massimo, 2018. "Evaluating regulatory reform of network industries: a survey of empirical models based on categorical proxies," Utilities Policy, Elsevier, vol. 55(C), pages 115-128.
    9. Hachmi Ben Ameur & Fredj Jawadi & Abdoulkarim Idi Cheffou & Wael Louhichi, 2018. "Measurement errors in stock markets," Annals of Operations Research, Springer, vol. 262(2), pages 287-306, March.
    10. Lin, Zhongjian & Hu, Yingyao, 2024. "Binary choice with misclassification and social interactions, with an application to peer effects in attitude," Journal of Econometrics, Elsevier, vol. 238(1).
    11. Eric Blankmeyer, 2018. "Measurement Errors as Bad Leverage Points," Papers 1807.02814, arXiv.org, revised Mar 2020.
    12. Stefanie Heidrich, 2017. "Intergenerational mobility in Sweden: a regional perspective," Journal of Population Economics, Springer;European Society for Population Economics, vol. 30(4), pages 1241-1280, October.
    13. Yingyao Hu & Zhongjian Lin, 2018. "Misclassification and the hidden silent rivalry," CeMMAP working papers CWP12/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    14. Vanessa A Boese, 2019. "How (not) to measure democracy," International Area Studies Review, Center for International Area Studies, Hankuk University of Foreign Studies, vol. 22(2), pages 95-127, June.
    15. Heidrich, Stefanie, 2015. "Intergenerational Mobility in Sweden: a Regional Perspective," Umeå Economic Studies 916, Umeå University, Department of Economics.
    16. Carletto,Calogero & Dillon,Andrew S. & Zezza,Alberto, 2021. "Agricultural Data Collection to Minimize Measurement Error and Maximize Coverage," Policy Research Working Paper Series 9745, The World Bank.
    17. Mochen Yang & Edward McFowland & Gordon Burtch & Gediminas Adomavicius, 2022. "Achieving Reliable Causal Inference with Data-Mined Variables: A Random Forest Approach to the Measurement Error Problem," INFORMS Journal on Data Science, INFORMS, vol. 1(2), pages 138-155, October.
    18. Tobias Risse, 2024. "External threats and state support for arms control," Journal of Peace Research, Peace Research Institute Oslo, vol. 61(2), pages 214-227, March.
    19. Robert MacCulloch & Silvia Pezzini, 2010. "The Roles of Freedom, Growth, and Religion in the Taste for Revolution," Journal of Law and Economics, University of Chicago Press, vol. 53(2), pages 329-358, May.
    20. Adele Bergin, 2015. "Employer Changes and Wage Changes: Estimation with Measurement Error in a Binary Variable," LABOUR, CEIS, vol. 29(2), pages 194-223, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:joupea:v:57:y:2020:i:6:p:692-700. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below). General contact details of provider: http://www.prio.no/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.