
Wiki Surveys: Open and Quantifiable Social Data Collection

Author

Listed:
  • Matthew J Salganik
  • Karen E C Levy

Abstract

In the social sciences, there is a longstanding tension between data collection methods that facilitate quantification and those that are open to unanticipated information. Advances in technology now enable new, hybrid methods that combine some of the benefits of both approaches. Drawing inspiration from online information aggregation systems like Wikipedia and from traditional survey research, we propose a new class of research instruments called wiki surveys. Just as Wikipedia evolves over time based on contributions from participants, we envision an evolving survey driven by contributions from respondents. We develop three general principles that underlie wiki surveys: they should be greedy, collaborative, and adaptive. Building on these principles, we develop methods for data collection and data analysis for one type of wiki survey, a pairwise wiki survey. Using two proof-of-concept case studies involving our free and open-source website www.allourideas.org, we show that pairwise wiki surveys can yield insights that would be difficult to obtain with other methods.
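
For readers new to the approach, the sketch below illustrates one simple way that pairwise votes of the kind collected on www.allourideas.org could be aggregated into item scores, using a basic Bradley–Terry fit. This is a minimal illustration only, not the estimation procedure developed in the paper; the vote data and all names in the code are hypothetical.

    # Rough illustration only: aggregate pairwise votes into item scores with a
    # simple Bradley-Terry fit (Zermelo/MM iteration). This is NOT the method
    # developed in the paper; the data and names below are hypothetical.
    from collections import defaultdict

    def bradley_terry_scores(votes, n_iter=200, tol=1e-8):
        """votes: list of (winner, loser) pairs from pairwise comparisons."""
        wins = defaultdict(float)         # total wins per item
        pair_counts = defaultdict(float)  # comparisons per unordered pair
        items = set()
        for winner, loser in votes:
            items.update((winner, loser))
            wins[winner] += 1.0
            pair_counts[frozenset((winner, loser))] += 1.0

        strength = {i: 1.0 for i in items}  # start with equal strengths
        for _ in range(n_iter):
            new_strength = {}
            for i in items:
                denom = 0.0
                for j in items:
                    if i != j and pair_counts[frozenset((i, j))]:
                        denom += pair_counts[frozenset((i, j))] / (strength[i] + strength[j])
                new_strength[i] = wins[i] / denom if denom else strength[i]
            total = sum(new_strength.values())  # scores are defined only up to scale
            new_strength = {i: s / total for i, s in new_strength.items()}
            if max(abs(new_strength[i] - strength[i]) for i in items) < tol:
                strength = new_strength
                break
            strength = new_strength
        return strength

    # Hypothetical votes: each tuple is (idea chosen, idea not chosen).
    votes = [("bike lanes", "more parking"), ("bike lanes", "new playground"),
             ("new playground", "more parking"), ("more parking", "new playground"),
             ("bike lanes", "more parking")]
    print(bradley_terry_scores(votes))

The returned strengths are normalized to sum to one, so items can be compared directly; the article itself describes the authors' own data-collection and analysis methods.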

Suggested Citation

  • Matthew J Salganik & Karen E C Levy, 2015. "Wiki Surveys: Open and Quantifiable Social Data Collection," PLOS ONE, Public Library of Science, vol. 10(5), pages 1-17, May.
  • Handle: RePEc:plo:pone00:0123483
    DOI: 10.1371/journal.pone.0123483

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0123483
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0123483&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0123483?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    References listed on IDEAS

    1. Daria Dzyabura & John R. Hauser, 2011. "Active Machine Learning for Consideration Heuristics," Marketing Science, INFORMS, vol. 30(5), pages 801-819, September.
    2. Margaret E. Roberts & Brandon M. Stewart & Dustin Tingley & Christopher Lucas & Jetson Leder‐Luis & Shana Kushner Gadarian & Bethany Albertson & David G. Rand, 2014. "Structural Topic Models for Open‐Ended Survey Responses," American Journal of Political Science, John Wiley & Sons, vol. 58(4), pages 1064-1082, October.
    3. Montgomery, Jacob M. & Cutler, Josh, 2013. "Computerized Adaptive Testing for Public Opinion Surveys," Political Analysis, Cambridge University Press, vol. 21(2), pages 172-192, April.
    4. Robert M. Groves & Steven G. Heeringa, 2006. "Responsive design for household surveys: tools for actively controlling survey errors and costs," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 169(3), pages 439-457, July.
    5. Olivier Toubia & Laurent Florès, 2007. "Adaptive Idea Screening Using Consumers," Marketing Science, INFORMS, vol. 26(3), pages 342-360, May-June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Weichen Wu & Nynke Niezink & Brian Junker, 2022. "A diagnostic framework for the Bradley–Terry model," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 185(S2), pages 461-484, December.
    2. Buil-Gil, David & Solymosi, Reka & Moretti, Angelo, 2019. "Non-parametric bootstrap and small area estimation to mitigate bias in crowdsourced data. Simulation study and application to perceived safety," SocArXiv 8hgjt, Center for Open Science.
    3. Gafari Lukumon & Mark Klein, 2023. "Crowd-sourced idea filtering with Bag of Lemons: the impact of the token budget size," DECISION: Official Journal of the Indian Institute of Management Calcutta, Springer;Indian Institute of Management Calcutta, vol. 50(2), pages 205-219, June.
    4. Sergio Alonso & Rosana Montes & Daniel Molina & Iván Palomares & Eugenio Martínez-Cámara & Manuel Chiachio & Juan Chiachio & Francisco J. Melero & Pablo García-Moral & Bárbara Fernández & Cristina Mor, 2021. "Ordering Artificial Intelligence Based Recommendations to Tackle the SDGs with a Decision-Making Model Based on Surveys," Sustainability, MDPI, vol. 13(11), pages 1-27, May.
    5. Michael Park & Erin Leahey & Russell Funk, 2021. "The decline of disruptive science and technology," Papers 2106.11184, arXiv.org, revised Jul 2022.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a minimal sketch of this relatedness notion appears after the list.
    1. Kirstin Early & Jennifer Mankoff & Stephen E. Fienberg, 2017. "Dynamic Question Ordering in Online Surveys," Journal of Official Statistics, Sciendo, vol. 33(3), pages 625-657, September.
    2. Sandra Wankmüller, 2023. "A comparison of approaches for imbalanced classification problems in the context of retrieving relevant documents for an analysis," Journal of Computational Social Science, Springer, vol. 6(1), pages 91-163, April.
    3. James Agarwal & Wayne DeSarbo & Naresh K. Malhotra & Vithala Rao, 2015. "An Interdisciplinary Review of Research in Conjoint Analysis: Recent Developments and Directions for Future Research," Customer Needs and Solutions, Springer;Institute for Sustainable Innovation and Growth (iSIG), vol. 2(1), pages 19-40, March.
    4. Minchul Lee & Min Song, 2020. "Incorporating citation impact into analysis of research trends," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1191-1224, August.
    5. Grajzl, Peter & Murrell, Peter, 2021. "A machine-learning history of English caselaw and legal ideas prior to the Industrial Revolution I: generating and interpreting the estimates," Journal of Institutional Economics, Cambridge University Press, vol. 17(1), pages 1-19, February.
    6. Asaph Young Chun & Barry Schouten & James Wagner, 2017. "JOS Special Issue on Responsive and Adaptive Survey Design: Looking Back to See Forward – Editorial: In Memory of Professor Stephen E. Fienberg, 1942–2016," Journal of Official Statistics, Sciendo, vol. 33(3), pages 571-577, September.
    7. Reza C. Daniels, 2012. "A Framework for Investigating Micro Data Quality, with Application to South African Labour Market Household Surveys," SALDRU Working Papers 90, Southern Africa Labour and Development Research Unit, University of Cape Town.
    8. Reist, Benjamin M. & Rodhouse, Joseph B. & Ball, Shane T. & Young, Linda J., 2019. "Subsampling of Nonrespondents in the 2017 Census of Agriculture," NASS Research Reports 322826, United States Department of Agriculture, National Agricultural Statistics Service.
    9. Dehler-Holland, Joris & Schumacher, Kira & Fichtner, Wolf, 2021. "Topic Modeling Uncovers Shifts in Media Framing of the German Renewable Energy Act," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 2(1).
    10. Marcel Fratzscher & Tobias Heidland & Lukas Menkhoff & Lucio Sarno & Maik Schmeling, 2023. "Foreign Exchange Intervention: A New Database," IMF Economic Review, Palgrave Macmillan;International Monetary Fund, vol. 71(4), pages 852-884, December.
    11. Bokyong Shin & Chaitawat Boonjubun, 2021. "Media and the Meanings of Land: A South Korean Case Study," American Journal of Economics and Sociology, Wiley Blackwell, vol. 80(2), pages 381-425, March.
    12. Parijat Chakrabarti & Margaret Frye, 2017. "A mixed-methods framework for analyzing text data: Integrating computational techniques with qualitative methods in demography," Demographic Research, Max Planck Institute for Demographic Research, Rostock, Germany, vol. 37(42), pages 1351-1382.
    13. Li Tang & Jennifer Kuzma & Xi Zhang & Xinyu Song & Yin Li & Hongxu Liu & Guangyuan Hu, 2023. "Synthetic biology and governance research in China: a 40-year evolution," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5293-5310, September.
    14. Taylor Lewis, 2017. "Univariate Tests for Phase Capacity: Tools for Identifying When to Modify a Survey’s Data Collection Protocol," Journal of Official Statistics, Sciendo, vol. 33(3), pages 601-624, September.
    15. Jiayun Jin & Caroline Vandenplas & Geert Loosveldt, 2019. "The Evaluation of Statistical Process Control Methods to Monitor Interview Duration During Survey Data Collection," SAGE Open, , vol. 9(2), pages 21582440198, June.
    16. Benjamin E. Bagozzi & Daniel Berliner & Zack W. Almquist, 2021. "When does open government shut? Predicting government responses to citizen information requests," Regulation & Governance, John Wiley & Sons, vol. 15(2), pages 280-297, April.
    17. Asim Ansari & Yang Li & Jonathan Z. Zhang, 2018. "Probabilistic Topic Model for Hybrid Recommender Systems: A Stochastic Variational Bayesian Approach," Marketing Science, INFORMS, vol. 37(6), pages 987-1008, November.
    18. Han, Chunjia & Yang, Mu & Piterou, Athena, 2021. "Do news media and citizens have the same agenda on COVID-19? An empirical comparison of Twitter posts," Technological Forecasting and Social Change, Elsevier, vol. 169(C).
    19. Andy Peytchev, 2013. "Consequences of Survey Nonresponse," The ANNALS of the American Academy of Political and Social Science, , vol. 645(1), pages 88-111, January.
    20. Roger Tourangeau & J. Michael Brick & Sharon Lohr & Jane Li, 2017. "Adaptive and responsive survey designs: a review and assessment," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 180(1), pages 203-223, January.
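
    A minimal sketch (hypothetical data and names, not RePEc's actual ranking code) of the relatedness notion described above: two papers count as related when they share many cited references and are cited by many of the same later works.

        # Minimal sketch of a bibliographic-coupling / co-citation style relatedness score.
        # Hypothetical data and names; not RePEc's actual matching or ranking code.
        def relatedness(refs_a, refs_b, citers_a, citers_b):
            """Count overlaps in cited references and in citing works between two papers."""
            shared_refs = len(set(refs_a) & set(refs_b))        # bibliographic coupling
            shared_citers = len(set(citers_a) & set(citers_b))  # co-citation
            return shared_refs + shared_citers

        # Hypothetical handles standing in for items in a bibliographic database.
        paper_x = {"refs": ["r1", "r2", "r3"], "citers": ["c1", "c2"]}
        paper_y = {"refs": ["r2", "r3", "r4"], "citers": ["c2", "c3"]}
        print(relatedness(paper_x["refs"], paper_y["refs"],
                          paper_x["citers"], paper_y["citers"]))  # -> 3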

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0123483. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.