
Global Evidence on Gender Gaps and Generative AI

Author

Listed:
  • Otis, Nicholas G.
  • Cranney, Katelyn
  • Delecourt, Solene
  • Koning, Rembrand

    (Harvard Business School)

Abstract

Generative AI has the potential to transform productivity and reduce inequality, but only if adopted broadly. In this paper, we show that recently identified gender gaps in generative AI use are nearly universal. Synthesizing data from 18 studies covering more than 140,000 individuals across the world, combined with estimates of the gender share of the hundreds of millions of users of popular generative AI platforms, we demonstrate that the gender gap in generative AI usage holds across nearly all regions, sectors, and occupations. Using newly collected data, we also document that this gap remains even when access to try this new technology is improved, highlighting the need for further research into the gap’s underlying causes. If this global disparity persists, it risks creating a self-reinforcing cycle: women’s underrepresentation in generative AI usage would lead to systems trained on data that inadequately sample women’s preferences and needs, ultimately widening existing gender disparities in technology adoption and economic opportunity.

Suggested Citation

  • Otis, Nicholas G. & Cranney, Katelyn & Delecourt, Solene & Koning, Rembrand, 2024. "Global Evidence on Gender Gaps and Generative AI," OSF Preprints h6a7c_v1, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:h6a7c_v1
    DOI: 10.31219/osf.io/h6a7c_v1

    Download full text from publisher

    File URL: https://osf.io/download/6709bba1834fc0279ca5e186/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/h6a7c_v1?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hermann, Erik & Puntoni, Stefano, 2024. "Artificial intelligence and consumer behavior: From predictive to generative AI," Journal of Business Research, Elsevier, vol. 180(C).
    2. Evangelos Katsamakas & Oleg V. Pavlov & Ryan Saklad, 2024. "Artificial intelligence and the transformation of higher education institutions," Papers 2402.08143, arXiv.org.
    3. Saima Javed & Yu Rong & Babar Nawaz Abbasi, 2024. "Convergence analysis of artificial intelligence research capacity: Are the less developed catching up with the developed ones?," Journal of International Development, John Wiley & Sons, Ltd., vol. 36(4), pages 2172-2192, May.
    4. Klockmann, Victor & von Schenk, Alicia & Villeval, Marie Claire, 2022. "Artificial intelligence, ethics, and intergenerational responsibility," Journal of Economic Behavior & Organization, Elsevier, vol. 203(C), pages 284-317.
    5. Sievert, Martin & Vogel, Dominik & Döring, Matthias, 2024. "Gendered Language in Job Advertisements Relates to Gender Sorting in Public Labor Markets: A Multi-Source Analysis," SocArXiv u6z5e, Center for Open Science.
    6. De Bruyn, Arnaud & Viswanathan, Vijay & Beh, Yean Shan & Brock, Jürgen Kai-Uwe & von Wangenheim, Florian, 2020. "Artificial Intelligence and Marketing: Pitfalls and Opportunities," Journal of Interactive Marketing, Elsevier, vol. 51(C), pages 91-105.
    7. Ion-Danut LIXANDRU, 2024. "The Use of Artificial Intelligence for Qualitative Data Analysis: ChatGPT," Informatica Economica, Academy of Economic Studies - Bucharest, Romania, vol. 28(1), pages 57-67.
    8. Alina Köchling & Marius Claus Wehner, 2020. "Discriminated by an algorithm: a systematic review of discrimination and fairness by algorithmic decision-making in the context of HR recruitment and HR development," Business Research, Springer;German Academic Association for Business Research, vol. 13(3), pages 795-848, November.
    9. Nir Chemaya & Daniel Martin, 2023. "Perceptions and Detection of AI Use in Manuscript Preparation for Academic Journals," Papers 2311.14720, arXiv.org, revised Jan 2024.
    10. Wencheng Lu, 2024. "Inevitable challenges of autonomy: ethical concerns in personalized algorithmic decision-making," Palgrave Communications, Palgrave Macmillan, vol. 11(1), pages 1-9, December.
    11. Walkowiak, Emmanuelle, 2021. "Neurodiversity of the workforce and digital transformation: The case of inclusion of autistic workers at the workplace," Technological Forecasting and Social Change, Elsevier, vol. 168(C).
    12. Neil K. R. Sehgal & Dan Svirsky, 2024. "Race Discrimination in Internet Advertising: Evidence From a Field Experiment," Papers 2412.14307, arXiv.org.
    13. Henry A. Thompson, 2024. "AI and the law," Papers 2412.05090, arXiv.org.
    14. Ji Wu & Zhiqiang (Eric) Zheng & J. Leon Zhao, 2021. "FairPlay: Detecting and Deterring Online Customer Misbehavior," Information Systems Research, INFORMS, vol. 32(4), pages 1323-1346, December.
    15. Chenfeng Yan & Quan Chen & Xinyue Zhou & Xin Dai & Zhilin Yang, 2024. "When the Automated fire Backfires: The Adoption of Algorithm-based HR Decision-making Could Induce Consumer’s Unfavorable Ethicality Inferences of the Company," Journal of Business Ethics, Springer, vol. 190(4), pages 841-859, April.
    16. Anderson, Brian S., 2022. "What executives get wrong about statistics: Moving from statistical significance to effect sizes and practical impact," Business Horizons, Elsevier, vol. 65(3), pages 379-388.
    17. Adena, Maja & Hager, Anselm, 2020. "Does online fundraising increase charitable giving? A nation-wide field experiment on Facebook," Discussion Papers, Research Unit: Economics of Change SP II 2020-302, WZB Berlin Social Science Center.
    18. Singh, Ashutosh & Sajeesh, S. & Bhardwaj, Pradeep, 2024. "Whitelisting versus advertising-recovery: Strategies to overcome advertising blocking by consumers," European Journal of Operational Research, Elsevier, vol. 318(1), pages 217-229.
    19. James Bono & Alec Xu, 2024. "Randomized Controlled Trials for Security Copilot for IT Administrators," Papers 2411.01067, arXiv.org, revised Nov 2024.
    20. Xiyang Hu & Yan Huang & Beibei Li & Tian Lu, 2022. "Uncovering the Source of Machine Bias," Papers 2201.03092, arXiv.org.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:h6a7c_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.