
The Performance of Kaizen Tasks Across Three Online Discrete Choice Experiment Surveys: An Evidence Synthesis

Authors

Listed:
  • Benjamin Matthew Craig (University of South Florida)
  • Maksat Jumamyradov (University of South Florida)
  • Oliver Rivero-Arias (University of Oxford)
Abstract

Background: Kaizen is a Japanese term for continuous improvement (kai ~ change, zen ~ good). In a kaizen task, a respondent makes sequential choices to improve an object’s profile, revealing a preference path. Including kaizen tasks in a discrete choice experiment has the advantage of collecting greater preference evidence than pick-one tasks, such as paired comparisons.

Objective and Methods: So far, three online discrete choice experiments have included kaizen tasks: the 2020 US COVID-19 vaccination (CVP) study, the 2021 UK Children’s Surgery Outcome Reporting (CSOR) study, and the 2023 US EQ-5D-Y-3L valuation (Y-3L) study. In this evidence synthesis, we describe the performance of the kaizen tasks in terms of response behaviors, conditional logit and Zermelo–Bradley–Terry (ZBT) estimates, and their standard errors in each of the surveys.

Results: Comparing the CVP and Y-3L, including hold-outs (i.e., attributes shared by all alternatives) seems to reduce positional behavior by half. The CVP tasks excluded multi-level improvements; therefore, we could not estimate logit main effects directly. In the CSOR, only 12 of the 21 logit estimates are significantly positive (p
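The conditional logit estimation mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the attribute coding, sample size, and "true" weights are invented and do not come from the three surveys, and the sketch uses a simple two-alternative (paired-comparison) likelihood rather than the authors' actual model specification.

```python
# Hypothetical sketch of a conditional logit fit for choice-task data.
# All data below are simulated for illustration; nothing here reproduces
# the CVP, CSOR, or Y-3L analyses.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
true_beta = np.array([1.0, 0.5, -0.8])  # assumed attribute weights

# Simulate 2000 paired comparisons: each task offers two alternatives
# described by three binary attribute indicators (difference coding).
n_tasks, n_attrs = 2000, 3
x_a = rng.integers(0, 2, size=(n_tasks, n_attrs)).astype(float)
x_b = rng.integers(0, 2, size=(n_tasks, n_attrs)).astype(float)
diff = x_a - x_b
p_choose_a = 1.0 / (1.0 + np.exp(-diff @ true_beta))
y = (rng.random(n_tasks) < p_choose_a).astype(float)  # 1 = chose A

def neg_log_lik(beta):
    """Negative conditional logit log-likelihood for two-alternative tasks."""
    eta = diff @ beta
    # log P(choice) = y*eta - log(1 + exp(eta)); logaddexp keeps it stable
    return -np.sum(y * eta - np.logaddexp(0.0, eta))

res = minimize(neg_log_lik, np.zeros(n_attrs), method="BFGS")
print(res.x)  # estimates should land near true_beta
```

The inverse of the Hessian returned by BFGS (`res.hess_inv`) gives approximate standard errors, which is the kind of precision comparison the synthesis reports across surveys.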

Suggested Citation

  • Benjamin Matthew Craig & Maksat Jumamyradov & Oliver Rivero-Arias, 2024. "The Performance of Kaizen Tasks Across Three Online Discrete Choice Experiment Surveys: An Evidence Synthesis," The Patient: Patient-Centered Outcomes Research, Springer;International Academy of Health Preference Research, vol. 17(6), pages 635-644, November.
  • Handle: RePEc:spr:patien:v:17:y:2024:i:6:d:10.1007_s40271-024-00708-4
    DOI: 10.1007/s40271-024-00708-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40271-024-00708-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s40271-024-00708-4?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bansback, Nick & Hole, Arne Risa & Mulhern, Brendan & Tsuchiya, Aki, 2014. "Testing a discrete choice experiment including duration to value health states for large descriptive systems: Addressing design and sampling issues," Social Science & Medicine, Elsevier, vol. 114(C), pages 38-48.
    2. Benjamin M. Craig & Jan J. V. Busschbach, 2011. "Toward a more universal approach in health valuation," Health Economics, John Wiley & Sons, Ltd., vol. 20(7), pages 864-875, July.
    3. Lidia Engel & Nick Bansback & Stirling Bryan & Mary M. Doyle-Waters & David G. T. Whitehurst, 2016. "Exclusion Criteria in National Health State Valuation Studies," Medical Decision Making, vol. 36(7), pages 798-810, October.
    4. Mônica Viegas Andrade & Kenya Noronha & Paul Kind & Carla de Barros Reis & Lucas Resende de Carvalho, 2016. "Logical Inconsistencies in 3 Preference Elicitation Methods for EQ-5D Health States," Medical Decision Making, vol. 36(2), pages 242-252, February.
    5. Eve Wittenberg & Lisa Prosser, 2011. "Ordering errors, objections and invariance in utility survey responses," Applied Health Economics and Health Policy, Springer, vol. 9(4), pages 225-241, July.


