Surprising gender biases in GPT

Autor: Raluca Alexandra Fulgu, Valerio Capraro
Jazyk: angličtina
Rok vydání: 2024
Předmět:
Zdroj: Computers in Human Behavior Reports, Vol 16, Iss , Pp 100533- (2024)
Druh dokumentu: article
ISSN: 2451-9588
DOI: 10.1016/j.chbr.2024.100533
Popis: We present eight experiments exploring gender biases in GPT. Initially, GPT was asked to generate demographics of a potential writer of fourty phrases ostensibly written by elementary school students, twenty containing feminine stereotypes and twenty with masculine stereotypes. Results show a strong bias, with stereotypically masculine sentences attributed to a female more often than vice versa. For example, the sentence “I love playing fotbal! Im practicing with my cosin Michael” was constantly assigned by GPT-3.5 Turbo to a female writer. This phenomenon likely reflects that while initiatives to integrate women in traditionally masculine roles have gained momentum, the reverse movement remains relatively underdeveloped. Subsequent experiments investigate the same issue in high-stakes moral dilemmas. GPT-4 finds it more appropriate to abuse a man to prevent a nuclear apocalypse than to abuse a woman. This bias extends to other forms of violence central to the gender parity debate (abuse), but not to those less central (torture). Moreover, this bias increases in cases of mixed-sex violence for the greater good: GPT-4 agrees with a woman using violence against a man to prevent a nuclear apocalypse but disagrees with a man using violence against a woman for the same purpose. Finally, these biases are implicit, as they do not emerge when GPT-4 is directly asked to rank moral violations. These results highlight the necessity of carefully managing inclusivity efforts to prevent unintended discrimination.
Databáze: Directory of Open Access Journals