Printed from https://ideas.repec.org/a/vrs/offsta/v39y2023i1p79-101n1.html

Using Eye-Tracking Methodology to Study Grid Question Designs in Web Surveys

Authors

Listed:
  • Neuert Cornelia E.

    (GESIS – Leibniz Institute for the Social Sciences, P.O. Box 12 21 55, 68072 Mannheim, Germany.)

  • Roßmann Joss

    (GESIS – Leibniz Institute for the Social Sciences, P.O. Box 12 21 55, 68072 Mannheim, Germany.)

  • Silber Henning

    (GESIS – Leibniz Institute for the Social Sciences, P.O. Box 12 21 55, 68072 Mannheim, Germany.)

Abstract

Grid questions are frequently employed in web surveys because of their assumed response efficiency, and many previous studies have indeed found shorter response times for grid questions than for item-by-item formats. We contribute to this literature by investigating how altering the question format affects response behavior and the depth of cognitive processing in both grid and item-by-item formats. To this end, we implemented an experiment with three questions in an eye-tracking study. Each question consisted of a set of ten items that respondents answered either on a single page (large grid), on two pages with five items each (small grid), or on ten separate pages (item-by-item). Overall, we did not find substantial differences in cognitive processing, although processing of the question stem and the response scale labels was significantly higher for the item-by-item design than for the large grid in all three questions. We did find, however, that when answering an item in a grid question, respondents often refer to surrounding items when making a judgment. We discuss the findings and limitations of our study and provide suggestions for practical design decisions.

Suggested Citation

  • Neuert Cornelia E. & Roßmann Joss & Silber Henning, 2023. "Using Eye-Tracking Methodology to Study Grid Question Designs in Web Surveys," Journal of Official Statistics, Sciendo, vol. 39(1), pages 79-101, March.
  • Handle: RePEc:vrs:offsta:v:39:y:2023:i:1:p:79-101:n:1
    DOI: 10.2478/jos-2023-0004

    Download full text from publisher

    File URL: https://doi.org/10.2478/jos-2023-0004
    Download Restriction: no

    File URL: https://libkey.io/10.2478/jos-2023-0004?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lindhjem, Henrik & Navrud, Ståle, 2011. "Using Internet in Stated Preference Surveys: A Review and Comparison of Survey Modes," International Review of Environmental and Resource Economics, now publishers, vol. 5(4), pages 309-351, September.
    2. de Bruijne, M.A., 2015. "Designing web surveys for the multi-device internet," Other publications TiSEM 19e4d446-a62b-4a95-8691-8, Tilburg University, School of Economics and Management.
    3. Bart Buelens & Jan A. van den Brakel, 2015. "Measurement Error Calibration in Mixed-mode Sample Surveys," Sociological Methods & Research, vol. 44(3), pages 391-426, August.
    4. Chatpong Tangmanee & Phattharaphong Niruttinanon, 2015. "Effects of Forced Responses and Question Display Styles on Web Survey Response Rates," International Journal of Research in Business and Social Science (2147-4478), Center for the Strategic Studies in Business and Finance, vol. 4(2), pages 54-62, April.
    5. Chatpong Tangmanee & Phattharaphong Niruttinanon, 2019. "Web Survey’s Completion Rates: Effects of Forced Responses, Question Display Styles, and Subjects’ Attitude," International Journal of Research in Business and Social Science (2147-4478), Center for the Strategic Studies in Business and Finance, vol. 8(1), pages 20-29, January.
    6. Tobias Gummer & Tanja Kunz, 2022. "Relying on External Information Sources When Answering Knowledge Questions in Web Surveys," Sociological Methods & Research, vol. 51(2), pages 816-836, May.
    7. Carina Cornesse & Annelies G. Blom, 2023. "Response Quality in Nonprobability and Probability-based Online Panels," Sociological Methods & Research, vol. 52(2), pages 879-908, May.
    8. Fabo, B., 2017. "Towards an understanding of job matching using web data," Other publications TiSEM b8b877f2-ae6a-495f-b6cc-9, Tilburg University, School of Economics and Management.
    9. Dana Garbarski & Nora Cate Schaeffer & Jennifer Dykema, 2019. "The Effects of Features of Survey Measurement on Self-Rated Health: Response Option Order and Scale Orientation," Applied Research in Quality of Life, Springer;International Society for Quality-of-Life Studies, vol. 14(2), pages 545-560, April.
    10. Tobias Gummer & Joss Roßmann & Henning Silber, 2021. "Using Instructed Response Items as Attention Checks in Web Surveys: Properties and Implementation," Sociological Methods & Research, vol. 50(1), pages 238-264, February.
    11. Weijters, Bert & Millet, Kobe & Cabooter, Elke, 2021. "Extremity in horizontal and vertical Likert scale format responses. Some evidence on how visual distance between response categories influences extreme responding," International Journal of Research in Marketing, Elsevier, vol. 38(1), pages 85-103.
    12. Yüksel, Atila, 2017. "A critique of “Response Bias” in the tourism, travel and hospitality research," Tourism Management, Elsevier, vol. 59(C), pages 376-384.
    13. Riccardo Testa & Giorgio Schifani & Giuseppina Migliore, 2021. "Understanding Consumers’ Convenience Orientation. An Exploratory Study of Fresh-Cut Fruit in Italy," Sustainability, MDPI, vol. 13(3), pages 1-13, January.
    14. Brauner, Jacob, 2020. "Are Smileys Valid Answers? Survey Data Quality with Innovative Item Formats," SocArXiv dk9bc, Center for Open Science.
    15. Anna DeCastellarnau, 2018. "A classification of response scale characteristics that affect data quality: a literature review," Quality & Quantity: International Journal of Methodology, Springer, vol. 52(4), pages 1523-1559, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:vrs:offsta:v:39:y:2023:i:1:p:79-101:n:1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Peter Golla (email available below). General contact details of provider: https://www.sciendo.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.