
How to prepare data for the automatic classification of politically related beliefs expressed on Twitter? The consequences of researchers’ decisions on the number of coders, the algorithm learning procedure, and the pre-processing steps on the performance of supervised models

Author

Listed:
  • Paweł Matuszewski

    (Collegium Civitas)

Abstract

Owing to recent advances in natural language processing, social scientists use automatic text classification methods increasingly often. The article asks how researchers’ subjective decisions affect the performance of supervised deep learning models. The aim is to deliver practical advice for researchers concerning: (1) whether it is more efficient to monitor coders’ work to ensure a high-quality training dataset or to have every document coded once and obtain a larger dataset instead; (2) whether lemmatisation improves model performance; (3) whether a passive learning or an active learning approach works better; and (4) whether the answers depend on the model’s classification task. The models were trained to detect whether a tweet concerns current affairs or political issues, what its subject matter is, and what stance its author takes on that subject. The study uses a sample of 200,000 manually coded tweets published by Polish political opinion leaders in 2019. The consequences of these decisions under different conditions were assessed by simulating 52,800 results with the fastText algorithm (dependent variable: F1-score). Linear regression analysis suggests that researchers’ choices not only strongly affect model performance but may also lead, in the worst-case scenario, to a waste of funds.
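
For illustration, the following is a minimal sketch of the kind of pipeline the abstract describes, not the author’s actual code: it trains a fastText supervised classifier on labelled tweets and reports a macro-averaged F1-score, the dependent variable of the study. The file names, label scheme and hyperparameters below are assumptions made for the example.

    # Minimal sketch with assumed file names, labels and hyperparameters (not the article's settings).
    import fasttext
    from sklearn.metrics import f1_score

    # fastText expects one document per line, prefixed with its label, e.g.
    # "__label__political Nowy projekt ustawy trafił do Sejmu"
    model = fasttext.train_supervised(
        input="tweets_train.txt",  # hypothetical training file
        epoch=25,
        lr=0.5,
        wordNgrams=2,
        dim=100,
    )

    # Score a held-out file in the same one-line-per-tweet format.
    y_true, y_pred = [], []
    with open("tweets_test.txt", encoding="utf-8") as fh:
        for line in fh:
            label, _, text = line.strip().partition(" ")
            y_true.append(label)
            y_pred.append(model.predict(text)[0][0])  # first predicted label, e.g. "__label__political"

    # Macro-averaged F1: per-class 2*precision*recall/(precision+recall), averaged over classes.
    print("F1:", f1_score(y_true, y_pred, average="macro"))

Whether the text written to tweets_train.txt is lemmatised or raw, how many coders labelled each tweet, and whether additional training examples are selected passively or actively are exactly the design choices whose effect on this F1-score the article quantifies.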

Suggested Citation

  • Paweł Matuszewski, 2023. "How to prepare data for the automatic classification of politically related beliefs expressed on Twitter? The consequences of researchers’ decisions on the number of coders, the algorithm learning procedure, and the pre-processing steps on the performance of supervised models," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 301-321, February.
  • Handle: RePEc:spr:qualqt:v:57:y:2023:i:1:d:10.1007_s11135-022-01372-2
    DOI: 10.1007/s11135-022-01372-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11135-022-01372-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11135-022-01372-2?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Igor Mozetič & Miha Grčar & Jasmina Smailović, 2016. "Multilingual Twitter Sentiment Classification: The Role of Human Annotators," PLOS ONE, Public Library of Science, vol. 11(5), pages 1-26, May.
    2. Grimmer, Justin & Stewart, Brandon M., 2013. "Text as Data: The Promise and Pitfalls of Automatic Content Analysis Methods for Political Texts," Political Analysis, Cambridge University Press, vol. 21(3), pages 267-297, July.
    3. Denny, Matthew J. & Spirling, Arthur, 2018. "Text Preprocessing For Unsupervised Learning: Why It Matters, When It Misleads, And What To Do About It," Political Analysis, Cambridge University Press, vol. 26(2), pages 168-189, April.
    4. Daniel J. Hopkins & Gary King, 2010. "A Method of Automated Nonparametric Content Analysis for Social Science," American Journal of Political Science, John Wiley & Sons, vol. 54(1), pages 229-247, January.
    5. Miller, Blake & Linder, Fridolin & Mebane, Walter R., 2020. "Active Learning Approaches for Labeling Text: Review and Assessment of the Performance of Active Learning Approaches," Political Analysis, Cambridge University Press, vol. 28(4), pages 532-551, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tim Schatto-Eckrodt & Robin Janzik & Felix Reer & Svenja Boberg & Thorsten Quandt, 2020. "A Computational Approach to Analyzing the Twitter Debate on Gaming Disorder," Media and Communication, Cogitatio Press, vol. 8(3), pages 205-218.
    2. Mohamed M. Mostafa, 2023. "A one-hundred-year structural topic modeling analysis of the knowledge structure of international management research," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(4), pages 3905-3935, August.
    3. Latifi, Albina & Naboka-Krell, Viktoriia & Tillmann, Peter & Winker, Peter, 2024. "Fiscal policy in the Bundestag: Textual analysis and macroeconomic effects," European Economic Review, Elsevier, vol. 168(C).
    4. Giovanni Di Franco & Michele Santurro, 2021. "Machine learning, artificial neural networks and social research," Quality & Quantity: International Journal of Methodology, Springer, vol. 55(3), pages 1007-1025, June.
    5. Zhang, Han, 2021. "How Using Machine Learning Classification as a Variable in Regression Leads to Attenuation Bias and What to Do About It," SocArXiv 453jk, Center for Open Science.
    6. Purwoko Haryadi Santoso & Edi Istiyono & Haryanto & Wahyu Hidayatulloh, 2022. "Thematic Analysis of Indonesian Physics Education Research Literature Using Machine Learning," Data, MDPI, vol. 7(11), pages 1-41, October.
    7. Rauh, Christian, 2018. "Validating a sentiment dictionary for German political language—a workbench note," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 15(4), pages 319-343.
    8. Camilla Salvatore & Silvia Biffignandi & Annamaria Bianchi, 2022. "Corporate Social Responsibility Activities Through Twitter: From Topic Model Analysis to Indexes Measuring Communication Characteristics," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 164(3), pages 1217-1248, December.
    9. Martin Haselmayer & Marcelo Jenny, 2017. "Sentiment analysis of political communication: combining a dictionary approach with crowdcoding," Quality & Quantity: International Journal of Methodology, Springer, vol. 51(6), pages 2623-2646, November.
    10. Jason Anastasopoulos & George J. Borjas & Gavin G. Cook & Michael Lachanski, 2018. "Job Vacancies, the Beveridge Curve, and Supply Shocks: The Frequency and Content of Help-Wanted Ads in Pre- and Post-Mariel Miami," NBER Working Papers 24580, National Bureau of Economic Research, Inc.
    11. Angela Chang & Peter J. Schulz & Angus Wenghin Cheong, 2020. "Online Newspaper Framing of Non-Communicable Diseases: Comparison of Mainland China, Taiwan, Hong Kong and Macao," IJERPH, MDPI, vol. 17(15), pages 1-15, August.
    12. Seraphine F. Maerz & Carsten Q. Schneider, 2020. "Comparing public communication in democracies and autocracies: automated text analyses of speeches by heads of government," Quality & Quantity: International Journal of Methodology, Springer, vol. 54(2), pages 517-545, April.
    13. Iasmin Goes, 2023. "Examining the effect of IMF conditionality on natural resource policy," Economics and Politics, Wiley Blackwell, vol. 35(1), pages 227-285, March.
    14. Lehotský, Lukáš & Černoch, Filip & Osička, Jan & Ocelík, Petr, 2019. "When climate change is missing: Media discourse on coal mining in the Czech Republic," Energy Policy, Elsevier, vol. 129(C), pages 774-786.
    15. Yeomans, Michael, 2021. "A concrete example of construct construction in natural language," Organizational Behavior and Human Decision Processes, Elsevier, vol. 162(C), pages 81-94.
    16. Weifeng Zhong, 2016. "The candidates in their own words: A textual analysis of 2016 presidential primary debates," AEI Economic Perspectives, American Enterprise Institute, April.
    17. Karell, Daniel & Freedman, Michael Raphael, 2019. "Rhetorics of Radicalism," SocArXiv yfzsh, Center for Open Science.
    18. Sara Kahn-Nisser, 2019. "When the targets are members and donors: Analyzing inter-governmental organizations’ human rights shaming," The Review of International Organizations, Springer, vol. 14(3), pages 431-451, September.
    19. Justyna Klejdysz & Robin L. Lumsdaine, 2023. "Shifts in ECB Communication: A Textual Analysis of the Press Conference," International Journal of Central Banking, International Journal of Central Banking, vol. 19(2), pages 473-542, June.
    20. Javier De la Hoz-M & Mª José Fernández-Gómez & Susana Mendes, 2021. "LDAShiny: An R Package for Exploratory Review of Scientific Literature Based on a Bayesian Probabilistic Model and Machine Learning Tools," Mathematics, MDPI, vol. 9(14), pages 1-21, July.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:qualqt:v:57:y:2023:i:1:d:10.1007_s11135-022-01372-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.