Printed from https://ideas.repec.org/a/spr/scient/v127y2022i8d10.1007_s11192-022-04427-1.html

Readability is decreasing in language and linguistics

Author

Listed:
  • Shan Wang

    (University of Macau)

  • Xiaojun Liu

    (University of Macau)

  • Jie Zhou

    (University of Macau)

Abstract

Readability reflects the ease with which a text can be read; high readability indicates an easy text. Based on a corpus of 71,628 abstracts published in SSCI journals in language and linguistics from 1991 to 2020, this paper employs nine readability indexes to analyze the readability of these abstracts and its relationship with citations. The results show that the readability of abstracts in language and linguistics journals is low and that it has been decreasing over the past 30 years. Moreover, readability is significantly negatively correlated with the number of citations, although the effect size is very small. These results suggest that abstracts are very difficult to read, that they are becoming more difficult over time, and that the abstracts of more highly cited articles tend to be less readable. Faced with decreasing readability, scholars are advised to make themselves understood even when expressing their ideas with jargon. This study not only offers scholars implications for using linguistic features to improve readability, but also provides quantitative support for research on readability.
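The nine indexes used in the paper are not enumerated on this page. As an illustration of how such an index is computed, here is a minimal sketch of the widely used Flesch Reading Ease score; the regex tokenizer and vowel-group syllable counter below are simplified heuristics assumed for this example, not the authors' implementation:

```python
import re

def count_syllables(word: str) -> int:
    # Heuristic: count groups of consecutive vowels, with a minimum of 1 per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Higher scores indicate easier text; dense academic prose often scores below 30.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

score = flesch_reading_ease("Readability reflects the ease of reading a text.")
```

Longer sentences and longer (more polysyllabic) words both lower the score, which is why jargon-heavy abstracts tend to score poorly on indexes of this family.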

Suggested Citation

  • Shan Wang & Xiaojun Liu & Jie Zhou, 2022. "Readability is decreasing in language and linguistics," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4697-4729, August.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:8:d:10.1007_s11192-022-04427-1
    DOI: 10.1007/s11192-022-04427-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04427-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04427-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. James Hartley & James W. Pennebaker & Claire Fox, 2003. "Abstracts, introductions and discussions: How far do they differ in style?," Scientometrics, Springer;Akadémiai Kiadó, vol. 57(3), pages 389-398, July.
    2. Dolnicar, Sara & Chapple, Alexander, 2015. "The readability of articles in tourism journals," Annals of Tourism Research, Elsevier, vol. 52(C), pages 161-166.
    3. Jonathan Knight, 2003. "Clear as mud," Nature, Nature, vol. 423(6938), pages 376-378, May.
    4. Dowling, Michael & Hammami, Helmi & Zreik, Ousayna, 2018. "Easy to read, easy to cite?," Economics Letters, Elsevier, vol. 173(C), pages 100-103.
    5. Lei Lei & Sheng Yan, 2016. "Readability and citations in information science: evidence from abstracts and articles of four journals (2003–2012)," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1155-1169, September.
    6. Michael Dowling & Helmi Hammami & Ousayna Zreik, 2018. "Easy to read, easy to cite?," Post-Print hal-01958017, HAL.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Farrell, Michael & Murphy, Dermot & Painter, Marcus & Zhang, Guangli, 2023. "The complexity yield puzzle: A textual analysis of municipal bond disclosures," Working Papers 338, The University of Chicago Booth School of Business, George J. Stigler Center for the Study of the Economy and the State.
    2. Xi Zhao & Li Li & Wei Xiao, 2023. "The diachronic change of research article abstract difficulty across disciplines: a cognitive information-theoretic approach," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-12, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ju Wen & Lei Lei, 2022. "Adjectives and adverbs in life sciences across 50 years: implications for emotions and readability in academic texts," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4731-4749, August.
    2. Xi Zhao & Li Li & Wei Xiao, 2023. "The diachronic change of research article abstract difficulty across disciplines: a cognitive information-theoretic approach," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-12, December.
    3. Ante, Lennart, 2022. "The relationship between readability and scientific impact: Evidence from emerging technology discourses," Journal of Informetrics, Elsevier, vol. 16(1).
    4. Dowling, Michael & Hammami, Helmi & Zreik, Ousayna, 2018. "Easy to read, easy to cite?," Economics Letters, Elsevier, vol. 173(C), pages 100-103.
    5. Diego Marino Fages, 2020. "Write better, publish better," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1671-1681, March.
    6. Omar Mubin & Dhaval Tejlavwala & Mudassar Arsalan & Muneeb Ahmad & Simeon Simoff, 2018. "An assessment into the characteristics of award winning papers at CHI," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 1181-1201, August.
    7. Don Watson & Manfred Krug & Claus-Christian Carbon, 2022. "The relationship between citations and the linguistic traits of specific academic discourse communities identified by using social network analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1755-1781, April.
    8. Rose, Michael E. & Opolot, Daniel C. & Georg, Co-Pierre, 2022. "Discussants," Research Policy, Elsevier, vol. 51(10).
    9. McCannon, Bryan C., 2019. "Readability and research impact," Economics Letters, Elsevier, vol. 180(C), pages 76-79.
    10. Song, Ningyuan & Chen, Kejun & Zhao, Yuehua, 2023. "Understanding writing styles of scientific papers in the IS-LS domain: Evidence from abstracts over the past three decades," Journal of Informetrics, Elsevier, vol. 17(1).
    11. Burke, Matt & Fry, John, 2019. "How easy is it to understand consumer finance?," Economics Letters, Elsevier, vol. 177(C), pages 1-4.
    12. Lei Lei & Sheng Yan, 2016. "Readability and citations in information science: evidence from abstracts and articles of four journals (2003–2012)," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1155-1169, September.
    13. Tan Jin & Huiqiong Duan & Xiaofei Lu & Jing Ni & Kai Guo, 2021. "Do research articles with more readable abstracts receive higher online attention? Evidence from Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8471-8490, October.
    14. Sungbin Youk & Hee Sun Park, 2019. "Where and what do they publish? Editors’ and editorial board members’ affiliated institutions and the citation counts of their endogenous publications in the field of communication," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(3), pages 1237-1260, September.
    15. Zhou-min Yuan & Mingxin Yao, 2022. "Is academic writing becoming more positive? A large-scale diachronic case study of Science research articles across 25 years," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(11), pages 6191-6207, November.
    16. Lorenz Graf-Vlachy, 2022. "Is the readability of abstracts decreasing in management research?," Review of Managerial Science, Springer, vol. 16(4), pages 1063-1084, May.
    17. Meva Bayrak Karsli & Sinem Karabey & Nergiz Ercil Cagiltay & Yuksel Goktas, 2018. "Comparison of the discussion sections of PhD dissertations in educational technology: the case of Turkey and the USA," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1381-1403, December.
    18. Christos Alexakis & Michael Dowling & Konstantinos Eleftheriou & Michael Polemis, 2021. "Textual Machine Learning: An Application to Computational Economics Research," Computational Economics, Springer;Society for Computational Economics, vol. 57(1), pages 369-385, January.
    19. Feld, Jan & Lines, Corinna & Ross, Libby, 2024. "Writing matters," Journal of Economic Behavior & Organization, Elsevier, vol. 217(C), pages 378-397.
    20. Lutz Bornmann & Markus Wolf & Hans-Dieter Daniel, 2012. "Closed versus open reviewing of journal manuscripts: how far do comments differ in language use?," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 843-856, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:8:d:10.1007_s11192-022-04427-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.