Printed from https://ideas.repec.org/a/gam/jscscx/v12y2023i8p435-d1208555.html

What ChatGPT Tells Us about Gender: A Cautionary Tale about Performativity and Gender Biases in AI

Author

Listed:
  • Nicole Gross

    (National College of Ireland, School of Business, D01Y300 Dublin, Ireland)

Abstract

Large language models and generative AI, such as ChatGPT, have gained influence over people’s personal lives and work since their launch, and are expected to scale even further. While the promises of generative artificial intelligence are compelling, this technology harbors significant biases, including those related to gender. Gender biases create patterns of behavior and stereotypes that put women, men and gender-diverse people at a disadvantage. Gender inequalities and injustices affect society as a whole. As a social practice, gendering is achieved through the repeated citation of rituals, expectations and norms. Shared understandings are often captured in scripts, including those emerging in and from generative AI, which means that gendered views and gender biases get grafted back into social, political and economic life. This paper’s central argument is that large language models work performatively, which means that they perpetuate and perhaps even amplify old and non-inclusive understandings of gender. Examples from ChatGPT are used here to illustrate some gender biases in AI. However, this paper also puts forward that AI can work to mitigate biases and act to ‘undo gender’.

Suggested Citation

  • Nicole Gross, 2023. "What ChatGPT Tells Us about Gender: A Cautionary Tale about Performativity and Gender Biases in AI," Social Sciences, MDPI, vol. 12(8), pages 1-15, August.
  • Handle: RePEc:gam:jscscx:v:12:y:2023:i:8:p:435-:d:1208555

    Download full text from publisher

    File URL: https://www.mdpi.com/2076-0760/12/8/435/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2076-0760/12/8/435/
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are most often cited by the same works that cite this one.
    1. Jiao, Yong & Wang, Gaofei & Li, Chengyou & Pan, Jia, 2024. "Digital inclusive finance, factor flow and industrial structure upgrading: Evidence from the yellow river basin," Finance Research Letters, Elsevier, vol. 62(PA).
    2. Jiang, Yirui & Tran, Trung Hieu & Williams, Leon, 2023. "Machine learning and mixed reality for smart aviation: Applications and challenges," Journal of Air Transport Management, Elsevier, vol. 111(C).
    3. Yunyu Xiao & Edward Pinkney & Tianzi Li & Paul S. F. Yip, 2023. "Breaking through the glass ceiling: unveiling women’s representation by gender and race in the higher education hierarchy," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-7, December.
    4. Reza Banai, 2024. "How Does the Neighborhood Unit Inform Community Revitalization?," Land, MDPI, vol. 13(6), pages 1-14, May.
    5. Jai Mohan Pandit & Bino Paul, 2023. "Gender Diversity, Sustainable Development Goals and Human Resource Management Practices in Higher Education," Indian Journal of Human Development, vol. 17(1), pages 111-130, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jscscx:v:12:y:2023:i:8:p:435-:d:1208555. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.