
Comparison of Preferences and Data Quality between Discrete Choice Experiments Conducted in Online and Face-to-Face Respondents

Authors
  • Ruixuan Jiang

    (Center for Observational and Real-World Evidence, Merck & Co., Inc., Rahway, NJ, USA)

  • Eleanor Pullenayegum

    (Child Health Evaluative Sciences, Hospital for Sick Children, Toronto, Canada)

  • James W. Shaw

    (Patient-reported Outcomes Assessment, Bristol-Myers Squibb, Princeton, NJ, USA)

  • Axel Mühlbacher

    (Duke Department of Population Health Sciences and Duke Global Health Institute, Duke University, Durham, NC, USA, Germany)

  • Todd A. Lee

    (Department of Pharmacy Systems, Outcomes, and Policy, University of Illinois at Chicago College of Pharmacy, Chicago, IL, USA)

  • Surrey Walton

    (Department of Pharmacy Systems, Outcomes, and Policy, University of Illinois at Chicago College of Pharmacy, Chicago, IL, USA)

  • Thomas Kohlmann

    (Institute for Community Medicine, Medical University Greifswald, Greifswald, Germany)

  • Richard Norman

    (Curtin University School of Public Health, Perth, Australia)

  • A. Simon Pickard

    (Department of Pharmacy Systems, Outcomes, and Policy, University of Illinois at Chicago College of Pharmacy, Chicago, IL, USA)

Abstract

Introduction: Discrete choice experiments (DCEs) are increasingly conducted using online panels. However, the comparability of DCE-based preferences elicited online with those from traditional modes of data collection (e.g., in person) is not well established. In this study, a supervised, face-to-face DCE was compared with its unsupervised, online facsimile on face validity, respondent behavior, and modeled preferences.

Methods: Data from face-to-face and online EQ-5D-5L health state valuation studies, each using the same experimental design and quota sampling procedure, were compared. Respondents completed 7 binary DCE tasks comparing 2 EQ-5D-5L health states presented side by side (health states A and B). Face validity was assessed by comparing preference patterns as a function of the severity difference between the 2 health states within a task. The prevalence of potentially suspicious choice patterns (i.e., all As, all Bs, and alternating As/Bs) was compared between studies. Preference data were modeled using multinomial logit regression and compared on each dimension's contribution to overall scale and on the importance ranking of dimension levels.

Results: 1,005 online respondents and 1,099 face-to-face screened (F2F-S) respondents were included in the main comparison of DCE tasks. Online respondents reported more problems on all EQ-5D dimensions except Mobility. Face validity of the data was similar between comparators. Online respondents had a greater prevalence of potentially suspicious DCE choice patterns (online: 5.3%; F2F-S: 2.9%; P = 0.005). When modeled, the relative contribution of each EQ-5D dimension differed between modes of administration: online respondents weighted Mobility more heavily and Anxiety/Depression less heavily.

Discussion: Although assessments of face validity were similar between the online and F2F-S studies, modeled preferences differed. Future analyses are needed to clarify whether these differences are attributable to variation in preferences or in data quality between modes of data collection.
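The modeling approach the abstract describes — binary DCE tasks analyzed with a (multinomial/conditional) logit — can be illustrated with a small, self-contained sketch. This is not the article's code or data: the dimension weights, number of respondents, and severity levels below are hypothetical, and the real analysis used the full EQ-5D-5L attribute structure. With two alternatives per task, the conditional logit reduces to a logistic model in the utility difference between health states A and B.

```python
# Illustrative sketch (hypothetical values, not the article's analysis):
# a binary DCE modeled as a conditional logit. Each task shows two health
# states, A and B; P(choose A) is logistic in the difference of their
# attribute levels weighted by disutility coefficients beta.
import math
import random

random.seed(1)

TRUE_BETA = [-0.8, -0.5]  # hypothetical disutility weights for 2 dimensions


def choice_prob_a(x_a, x_b, beta):
    """P(choose A) under a conditional logit: logistic in the utility difference."""
    diff = sum(b * (xa - xb) for b, xa, xb in zip(beta, x_a, x_b))
    return 1.0 / (1.0 + math.exp(-diff))


# Simulate 7 binary tasks for each of 500 hypothetical respondents.
tasks = []
for _ in range(500 * 7):
    x_a = [random.randint(0, 4) for _ in TRUE_BETA]  # severity levels 0-4
    x_b = [random.randint(0, 4) for _ in TRUE_BETA]
    y = 1 if random.random() < choice_prob_a(x_a, x_b, TRUE_BETA) else 0
    tasks.append((x_a, x_b, y))

# Recover beta by maximizing the log-likelihood with plain gradient ascent.
beta = [0.0, 0.0]
for _ in range(300):
    grad = [0.0, 0.0]
    for x_a, x_b, y in tasks:
        p = choice_prob_a(x_a, x_b, beta)
        for k in range(len(beta)):
            grad[k] += (y - p) * (x_a[k] - x_b[k])
    beta = [b + 0.5 * g / len(tasks) for b, g in zip(beta, grad)]

print("estimated weights:", beta)  # should be close to TRUE_BETA
```

In practice, a study like this would fit the model with a statistical package rather than hand-rolled gradient ascent, and comparing modes of administration (online vs. face-to-face) amounts to fitting the model to each sample and contrasting the estimated dimension weights, as the abstract reports.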

Suggested Citation

  • Ruixuan Jiang & Eleanor Pullenayegum & James W. Shaw & Axel Mühlbacher & Todd A. Lee & Surrey Walton & Thomas Kohlmann & Richard Norman & A. Simon Pickard, 2023. "Comparison of Preferences and Data Quality between Discrete Choice Experiments Conducted in Online and Face-to-Face Respondents," Medical Decision Making, vol. 43(6), pages 667-679, August.
  • Handle: RePEc:sae:medema:v:43:y:2023:i:6:p:667-679
    DOI: 10.1177/0272989X231171912

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0272989X231171912
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0272989X231171912?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Frode Alfnes & Chengyan Yue & Helen H. Jensen, 2010. "Cognitive dissonance as a means of reducing hypothetical bias," European Review of Agricultural Economics, Oxford University Press and the European Agricultural and Applied Economics Publications Foundation, vol. 37(2), pages 147-163, June.
    2. McCabe, Christopher & Brazier, John & Gilks, Peter & Tsuchiya, Aki & Roberts, Jennifer & O'Hagan, Anthony & Stevens, Katherine, 2006. "Using rank data to estimate health state utility models," Journal of Health Economics, Elsevier, vol. 25(3), pages 418-431, May.
    3. Hui Li & Robert P. Berrens & Alok K. Bohara & Hank C. Jenkins-Smith & Carol L. Silva & David L. Weimer, 2005. "Exploring the Beta Model Using Proportional Budget Information in a Contingent Valuation Study," Economics Bulletin, AccessEcon, vol. 17(8), pages 1-9.
    4. Drevs, Florian & Tscheulin, Dieter K. & Lindenmeier, Jörg & Renner, Simone, 2014. "Crowding-in or crowding out: An empirical analysis on the effect of subsidies on individual willingness-to-pay for public transportation," Transportation Research Part A: Policy and Practice, Elsevier, vol. 59(C), pages 250-261.
    5. Lindhjem, Henrik & Navrud, Ståle, 2011. "Using Internet in Stated Preference Surveys: A Review and Comparison of Survey Modes," International Review of Environmental and Resource Economics, now publishers, vol. 5(4), pages 309-351, September.
    6. Mandy Ryan & Emmanouil Mentzakis & Catriona Matheson & Christine Bond, 2020. "Survey modes comparison in contingent valuation: Internet panels and mail surveys," Health Economics, John Wiley & Sons, Ltd., vol. 29(2), pages 234-242, February.
    7. Baumgarth, Carsten & Yildiz, Özlem, 2016. "Discovery of brand image by the arts: Empirical comparison of Arts-Based Brand Research Methods (ABBR)," Working Papers 91, Berlin School of Economics and Law, Institute of Management Berlin (IMB).
    8. Paul Dolan & Georgios Kavetsos, 2012. "Happy Talk: Mode of Administration Effects on Subjective Well-Being," CEP Discussion Papers dp1159, Centre for Economic Performance, LSE.
    9. Stöckel, Jannis & van Exel, Job & Brouwer, Werner B.F., 2023. "Adaptation in life satisfaction and self-assessed health to disability - Evidence from the UK," Social Science & Medicine, Elsevier, vol. 328(C).
    10. Eline Poelmans & Sandra Rousseau, 2017. "Beer and Organic Labels: Do Belgian Consumers Care?," Sustainability, MDPI, vol. 9(9), pages 1-15, August.
    11. Jackson, Louise & Al-Janabi, Hareth & Roberts, Tracy & Ross, Jonthan, 2021. "Exploring young people's preferences for STI screening in the UK: A qualitative study and discrete choice experiment," Social Science & Medicine, Elsevier, vol. 279(C).
    12. Richard Melstrom, 2014. "Valuing historic battlefields: an application of the travel cost method to three American Civil War battlefields," Journal of Cultural Economics, Springer;The Association for Cultural Economics International, vol. 38(3), pages 223-236, August.
    13. Eline D'Haene & Juan Tur Cardona & Stijn Speelman & Koen Schoors & Marijke D'Haese, 2021. "Unraveling preferences for religious ties in food transactions: A consumer perspective," Agricultural Economics, International Association of Agricultural Economists, vol. 52(4), pages 701-716, July.
    14. Vandermersch, Mieke & Mathijs, Erik, 2004. "Consumer Willingness To Pay For Domestic Milk," Working Papers 31829, Katholieke Universiteit Leuven, Centre for Agricultural and Food Economics.
    15. Johnston, Andrew C., 2021. "Preferences, Selection, and the Structure of Teacher Pay," IZA Discussion Papers 14831, Institute of Labor Economics (IZA).
    16. Carol Mansfield & Daniel J. Phaneuf & F. Reed Johnson & Jui-Chen Yang & Robert Beach, 2008. "Preferences for Public Lands Management under Competing Uses: The Case of Yellowstone National Park," Land Economics, University of Wisconsin Press, vol. 84(2), pages 282-305.
    17. John A. List & Michael K. Price, 2016. "Editor's Choice The Use of Field Experiments in Environmental and Resource Economics," Review of Environmental Economics and Policy, Association of Environmental and Resource Economists, vol. 10(2), pages 206-225.
    18. Simon Deeming & Kim Edmunds & Alice Knight & Andrew Searles & Anthony P. Shakeshaft & Christopher M. Doran, 2022. "A Benefit-Cost Analysis of BackTrack, a Multi-Component, Community-Based Intervention for High-Risk Young People in a Rural Australian Setting," IJERPH, MDPI, vol. 19(16), pages 1-12, August.
    19. Cai, Zhen & Aguilar, Francisco X., 2013. "Meta-analysis of consumer's willingness-to-pay premiums for certified wood products," Journal of Forest Economics, Elsevier, vol. 19(1), pages 15-31.
    20. Zawojska, Ewa & Gastineau, Pascal & Mahieu, Pierre-Alexandre & Cheze, Benoit & Paris, Anthony, 2021. "Measuring policy consequentiality perceptions in stated preference surveys," 2021 Annual Meeting, August 1-3, Austin, Texas 313977, Agricultural and Applied Economics Association.
