
Uncovering a Blind Spot in Sensitive Question Research: False Positives Undermine the Crosswise-Model RRT

Author

Listed:
  • Höglinger, Marc
  • Diekmann, Andreas

Abstract

Validly measuring sensitive issues such as norm violations or stigmatizing traits through self-reports in surveys is often problematic. Special techniques for sensitive questions like the Randomized Response Technique (RRT) and, among its variants, the recent crosswise model should generate more honest answers by providing full response privacy. Different types of validation studies have examined whether these techniques actually improve data validity, with varying results. Yet, most of these studies did not consider the possibility of false positives, i.e., that respondents are misclassified as having a sensitive trait even though they actually do not. Assuming that respondents only falsely deny but never falsely admit possessing a sensitive trait, higher prevalence estimates have typically been interpreted as more valid estimates. If false positives occur, however, conclusions drawn under this assumption might be misleading. We present a comparative validation design that is able to detect false positives without the need for an individual-level validation criterion — which is often unavailable. Results show that the most widely used crosswise-model implementation produced false positives to a nonignorable extent. This defect was not revealed by several previous validation studies that did not consider false positives — apparently a blind spot in past sensitive question research.
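To make the mechanism concrete: in the standard crosswise-model setup, respondents report only whether their answers to the sensitive question and to a non-sensitive randomizing question with known prevalence p (for example, a birthday question) are the same or different, and the prevalence of the sensitive trait is recovered from the observed share of "same" answers. The following Python sketch uses illustrative numbers and names that are not taken from the article, and it models false positives as a deliberately simplified misclassification at the trait level; it only illustrates why, once false positives can occur, a higher prevalence estimate can no longer be read as a more valid one.

# Minimal sketch of the standard crosswise-model estimator and of how
# false positives can inflate the prevalence estimate. All names and
# numbers are illustrative, not taken from the article.

def crosswise_estimate(share_same: float, p: float) -> float:
    """Prevalence estimate pi_hat from the observed share of 'same'
    answers, given the known prevalence p of the non-sensitive
    randomizing question (requires p != 0.5)."""
    # Under truthful answering: share_same = pi*p + (1 - pi)*(1 - p)
    return (share_same + p - 1) / (2 * p - 1)

pi_true, p = 0.10, 0.25  # true prevalence of the sensitive trait; randomizer prevalence

# With fully truthful answering, the estimator recovers the true prevalence.
share_same = pi_true * p + (1 - pi_true) * (1 - p)
print(round(crosswise_estimate(share_same, p), 3))         # 0.1

# Classic assumption: some carriers falsely deny, so estimates can only be
# too low. If, in addition, some non-carriers end up classified as carriers
# (false positives), the estimate can overshoot the truth instead.
false_negative_rate, false_positive_rate = 0.20, 0.05
pi_misclassified = pi_true * (1 - false_negative_rate) + (1 - pi_true) * false_positive_rate
share_same_biased = pi_misclassified * p + (1 - pi_misclassified) * (1 - p)
print(round(crosswise_estimate(share_same_biased, p), 3))  # 0.125 > 0.1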

Suggested Citation

  • Höglinger, Marc & Diekmann, Andreas, 2017. "Uncovering a Blind Spot in Sensitive Question Research: False Positives Undermine the Crosswise-Model RRT," Political Analysis, Cambridge University Press, vol. 25(1), pages 131-137, January.
  • Handle: RePEc:cup:polals:v:25:y:2017:i:01:p:131-137_00

    Download full text from publisher

    File URL: https://www.cambridge.org/core/product/identifier/S104719871600005X/type/journal_article
    File Function: link to article abstract page
    Download Restriction: no



    Citations


    Cited by:

    1. Chuang, Erica & Dupas, Pascaline & Huillery, Elise & Seban, Juliette, 2021. "Sex, lies, and measurement: Consistency tests for indirect response survey methods," Journal of Development Economics, Elsevier, vol. 148(C).
    2. Marc Höglinger & Ben Jann, 2018. "More is not always better: An experimental individual-level validation of the randomized response technique and the crosswise model," PLOS ONE, Public Library of Science, vol. 13(8), pages 1-22, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burgstaller, Lilith & Feld, Lars P. & Pfeil, Katharina, 2022. "Working in the shadow: Survey techniques for measuring and explaining undeclared work," Journal of Economic Behavior & Organization, Elsevier, vol. 200(C), pages 661-671.
    2. Ivar Krumpal & Thomas Voss, 2020. "Sensitive Questions and Trust: Explaining Respondents’ Behavior in Randomized Response Surveys," SAGE Open, , vol. 10(3), pages 21582440209, July.
    3. Ben Jann, 2017. "Creating HTML or Markdown documents from within Stata using webdoc," Stata Journal, StataCorp LP, vol. 17(1), pages 3-38, March.
    4. Sylvain Chassang & Christian Zehnder, 2019. "Secure Survey Design in Organizations: Theory and Experiments," Working Papers 2019-22, Princeton University, Economics Department.
    5. Jacques Muthusi & Samuel Mwalili & Peter Young, 2019. "%svy_logistic_regression: A generic SAS macro for simple and multiple logistic regression and creating quality publication-ready tables using survey or non-survey data," PLOS ONE, Public Library of Science, vol. 14(9), pages 1-14, September.
    6. David Roodman, 2022. "Schooling and Labor Market Consequences of School Construction in Indonesia: Comment," Papers 2207.09036, arXiv.org, revised Mar 2024.
    7. Sebastian Butschek & Jan Sauermann, 2024. "The Effect of Employment Protection on Firms’ Worker Selection," Journal of Human Resources, University of Wisconsin Press, vol. 59(6), pages 1981-2020.
    8. Ben Jann, 2013. "Plotting regression coefficients and other estimates in Stata," University of Bern Social Sciences Working Papers 1, University of Bern, Department of Social Sciences, revised 18 Sep 2017.
    9. Ciani, Emanuele, 2016. "Retirement, pension eligibility and home production," Labour Economics, Elsevier, vol. 38(C), pages 106-120.
    10. Jakob, Martina & Combet, Benita, 2020. "Educational aspirations and decision-making in a context of poverty. A test of rational choice models in El Salvador," SocArXiv w9bkq, Center for Open Science.
    11. Hannes Kröger & Johan Fritzell & Rasmus Hoffmann, 2016. "The Association of Levels of and Decline in Grip Strength in Old Age with Trajectories of Life Course Occupational Position," PLOS ONE, Public Library of Science, vol. 11(5), pages 1-16, May.
    12. Carsten Sauer & Peter Valet & Stefan Liebig, 2013. "The Impact of within and between Occupational Inequalities on People's Justice Perceptions towards Their Own Earnings," SOEPpapers on Multidisciplinary Panel Data Research 567, DIW Berlin, The German Socio-Economic Panel (SOEP).
    13. Hlavac, Marek, 2016. "TableMaker: An Excel Macro for Publication-Quality Tables," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 70(c03).
    14. Vincenzo Atella & Federico Belotti & Ludovico Carrino & Andrea Piano Mortari, 2017. "The future of Long Term Care in Europe. An investigation using a dynamic microsimulation model," CEIS Research Paper 405, Tor Vergata University, CEIS, revised 08 May 2017.
    15. Zhao, Yuejun, 2023. "Job displacement and the mental health of households: Burden sharing counteracts spillover," Labour Economics, Elsevier, vol. 81(C).
    16. Dang, Hai-Anh & Lanjouw, Peter & Luoto, Jill & McKenzie, David, 2014. "Using repeated cross-sections to explore movements into and out of poverty," Journal of Development Economics, Elsevier, vol. 107(C), pages 112-128.
    17. Pier Francesco Perri & Eleni Manoli & Tasos C. Christofides, 2023. "Assessing the effectiveness of indirect questioning techniques by detecting liars," Statistical Papers, Springer, vol. 64(5), pages 1483-1506, October.
    18. Daniel Homocianu, 2023. "Exploring the Predictors of Co-Nationals’ Preference over Immigrants in Accessing Jobs—Evidence from World Values Survey," Mathematics, MDPI, vol. 11(3), pages 1-29, February.
    19. Diriwaechter, Patric & Shvartsman, Elena, 2018. "The anticipation and adaptation effects of intra- and interpersonal wage changes on job satisfaction," Journal of Economic Behavior & Organization, Elsevier, vol. 146(C), pages 116-140.
    20. S. Rinken & S. Pasadas-del-Amo & M. Rueda & B. Cobo, 2021. "No magic bullet: estimating anti-immigrant sentiment and social desirability bias with the item-count technique," Quality & Quantity: International Journal of Methodology, Springer, vol. 55(6), pages 2139-2159, December.

    More about this item

    JEL classification:

    • C81 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Methodology for Collecting, Estimating, and Organizing Microeconomic Data; Data Access
    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods
    • C42 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics - - - Survey Methods


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:cup:polals:v:25:y:2017:i:01:p:131-137_00. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kirk Stebbing (email available below). General contact details of provider: https://www.cambridge.org/pan.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.