
Intracluster Homogeneity Selection Problem in a Business Survey

Author

Listed:
  • Žmuk Berislav

    (Faculty of Economics and Business, University of Zagreb, Croatia)

Abstract

Background: In cluster sampling, many parameters influence survey costs, and one of the most important is intracluster homogeneity. Objectives: The goal of the paper is to find the optimal value of intracluster homogeneity when two or more questions or variables play a key role in the research. Methods/Approach: Five key variables were selected from a business survey conducted in Croatia, and results for a two-stage cluster sampling design were simulated. The calculated intracluster homogeneity values were compared across all five observed questions, and survey costs and precision levels were inspected. Results: In the new cluster sampling design, for a fixed precision level, the lowest survey costs would be achieved by using the intracluster homogeneity value closest to the average intracluster homogeneity value across all key questions. Similar results were obtained when survey costs were held fixed. Conclusions: If there is more than one key question in the survey, the best solution is to use the average intracluster homogeneity value. However, in that case the minimum survey costs would not be reached, but precision levels would increase for all key questions.
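The cost-versus-precision trade-off described in the abstract rests on the standard (Kish) design-effect formula for two-stage cluster sampling, deff = 1 + (b - 1) * roh, where b is the average number of sampled units per cluster and roh is the intracluster homogeneity (rate of homogeneity). The sketch below illustrates that relationship only; the roh values, cluster size, and sample size are hypothetical and are not taken from the paper's data.

```python
# Textbook design-effect calculation for two-stage cluster sampling.
# All numeric values below are illustrative, not the paper's results.

def design_effect(roh: float, b: float) -> float:
    """Variance inflation of cluster sampling relative to simple random sampling."""
    return 1 + (b - 1) * roh

def effective_sample_size(n: float, roh: float, b: float) -> float:
    """SRS sample size that would give the same precision as n clustered units."""
    return n / design_effect(roh, b)

# Hypothetical roh values for five key questions, as in the paper's setting
# of multiple key variables with differing homogeneity.
rohs = [0.02, 0.05, 0.08, 0.10, 0.15]
avg_roh = sum(rohs) / len(rohs)   # the compromise value the paper suggests
b = 10                            # assumed units sampled per cluster
n = 1000                          # assumed total sample size

for roh in rohs + [avg_roh]:
    deff = design_effect(roh, b)
    print(f"roh={roh:.3f}  deff={deff:.2f}  "
          f"n_eff={effective_sample_size(n, roh, b):.1f}")
```

Designing to a single question's roh optimizes precision (or cost) for that question alone; using the average roh, as the paper concludes, spreads the compromise across all key questions.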

Suggested Citation

  • Žmuk Berislav, 2016. "Intracluster Homogeneity Selection Problem in a Business Survey," Business Systems Research, Sciendo, vol. 7(2), pages 91-103, September.
  • Handle: RePEc:bit:bsrysr:v:7:y:2016:i:2:p:91-103
    DOI: 10.1515/bsrj-2016-0015

    Download full text from publisher

    File URL: https://doi.org/10.1515/bsrj-2016-0015
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Early Kirstin & Mankoff Jennifer & Fienberg Stephen E., 2017. "Dynamic Question Ordering in Online Surveys," Journal of Official Statistics, Sciendo, vol. 33(3), pages 625-657, September.
    2. Chun Asaph Young & Schouten Barry & Wagner James, 2017. "JOS Special Issue on Responsive and Adaptive Survey Design: Looking Back to See Forward – Editorial: In Memory of Professor Stephen E. Fienberg, 1942–2016," Journal of Official Statistics, Sciendo, vol. 33(3), pages 571-577, September.
    3. Reza C. Daniels, 2012. "A Framework for Investigating Micro Data Quality, with Application to South African Labour Market Household Surveys," SALDRU Working Papers 90, Southern Africa Labour and Development Research Unit, University of Cape Town.
    4. Reist, Benjamin M. & Rodhouse, Joseph B. & Ball, Shane T. & Young, Linda J., 2019. "Subsampling of Nonrespondents in the 2017 Census of Agriculture," NASS Research Reports 322826, United States Department of Agriculture, National Agricultural Statistics Service.
    5. Lewis Taylor, 2017. "Univariate Tests for Phase Capacity: Tools for Identifying When to Modify a Survey’s Data Collection Protocol," Journal of Official Statistics, Sciendo, vol. 33(3), pages 601-624, September.
    6. Jiayun Jin & Caroline Vandenplas & Geert Loosveldt, 2019. "The Evaluation of Statistical Process Control Methods to Monitor Interview Duration During Survey Data Collection," SAGE Open, , vol. 9(2), pages 21582440198, June.
    7. Andy Peytchev, 2013. "Consequences of Survey Nonresponse," The ANNALS of the American Academy of Political and Social Science, , vol. 645(1), pages 88-111, January.
    8. Roger Tourangeau & J. Michael Brick & Sharon Lohr & Jane Li, 2017. "Adaptive and responsive survey designs: a review and assessment," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(1), pages 203-223, January.
    10. Roberts Caroline & Vandenplas Caroline & Herzing Jessica M.E., 2020. "A Validation of R-Indicators as a Measure of the Risk of Bias using Data from a Nonresponse Follow-Up Survey," Journal of Official Statistics, Sciendo, vol. 36(3), pages 675-701, September.
    11. Böhme, Marcus & Stöhr, Tobias, 2012. "Guidelines for the use of household interview duration analysis in CAPI survey management," Kiel Working Papers 1779, Kiel Institute for the World Economy (IfW Kiel).
    12. Mario Callegaro & Charlotte Steeh & Trent D. Buskirk & Vasja Vehovar & Vesa Kuusela & Linda Piekarski, 2007. "Fitting disposition codes to mobile phone surveys: experiences from studies in Finland, Slovenia and the USA," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 170(3), pages 647-670, July.
    13. Durrant Gabriele B. & Maslovskaya Olga & Smith Peter W. F., 2017. "Using Prior Wave Information and Paradata: Can They Help to Predict Response Outcomes and Call Sequence Length in a Longitudinal Study?," Journal of Official Statistics, Sciendo, vol. 33(3), pages 801-833, September.
    14. Raphael Nishimura & James Wagner & Michael Elliott, 2016. "Alternative Indicators for the Risk of Non-response Bias: A Simulation Study," International Statistical Review, International Statistical Institute, vol. 84(1), pages 43-62, April.
    15. Holly Matulewicz & Eric Grau & Arif Mamun & Gina Livermore, "undated". "Promoting Readiness of Minors in Supplemental Security Income (PROMISE): PROMISE 60-Month Sampling and Survey Plan," Mathematica Policy Research Reports be402161c12e402392af9182e, Mathematica Policy Research.
    16. G. Blom, Annelies, 2008. "Measuring nonresponse cross-nationally," ISER Working Paper Series 2008-41, Institute for Social and Economic Research.
    17. Sofie Marien & Marc Hooghe & Ellen Quintelier, 2010. "Inequalities in Non‐institutionalised Forms of Political Participation: A Multi‐level Analysis of 25 countries," Political Studies, Political Studies Association, vol. 58(1), pages 187-213, February.
    18. Lipps Oliver & Voorpostel Marieke, 2020. "Can Interviewer Evaluations Predict Short-Term and Long-Term Participation in Telephone Panels?," Journal of Official Statistics, Sciendo, vol. 36(1), pages 117-136, March.
    19. Willems, Jurgen, 2015. "Individual perceptions on the participant and societal functionality of non-formal education for youth: Explaining differences across countries based on the human development index," International Journal of Educational Development, Elsevier, vol. 44(C), pages 11-20.
    20. van Berkel Kees & van der Doef Suzanne & Schouten Barry, 2020. "Implementing Adaptive Survey Design with an Application to the Dutch Health Survey," Journal of Official Statistics, Sciendo, vol. 36(3), pages 609-629, September.
    21. Stephanie Coffey, PhD. & Jaya Damineni & John Eltinge, PhD. & Anup Mathur, PhD. & Kayla Varela & Allison Zotti, 2023. "Some Open Questions on Multiple-Source Extensions of Adaptive-Survey Design Concepts and Methods," Working Papers 23-03, Center for Economic Studies, U.S. Census Bureau.

    More about this item

    Keywords

    business survey; cluster sampling; complex survey sampling design; design effect; key survey question; rate of homogeneity; survey costs;

    JEL classification:

    • C83 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - Survey Methods; Sampling Methods



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.