
Is ChatGPT a Man?

September 17, 2023 | By Julia Fleming

Photo from Wix Media

As a full-time computer engineering student and a part-time data engineer, I am no stranger to the worry and speculation ChatGPT has set off in the past few months. Since ChatGPT’s publicity boom in January 2023, many people, both within and outside the engineering and computer science fields, have asked me whether it is “the beginning of the end.” Why do we have an innate fear of advances in the AI sector, and how much of that fear can we attribute to AI’s gender?

If I were to ask you what the gender of AI is, you would probably look at me as though I had asked for directions to Mars. Yet most of the AI assistants we have grown accustomed to since the early 2010s, Siri, Alexa, and Cortana, have been presented as women. This is not a coincidence but a deliberate marketing decision, one studied extensively before their respective launches. One such study, at the University of Washington, concluded that customers’ “satisfaction” and “pleasure” with service robots perceived to be human-like (those that talk back to us in dialogue) were directly correlated with the robot’s gender, and that the female gender was consistently favored.

Siri arrived pre-installed on the iPhone 4S in 2011. The feature was well received despite being the first AI assistant embedded in a personal cell phone, essentially within earshot of the average iPhone user 24/7. When I searched for the “public reaction to Siri,” my first page of Google results consisted mostly of articles suggesting funny things to ask her, odd things she doesn’t seem to understand, or even ways to get Siri to flirt with you.

In 2014, Amazon launched its own AI assistant, Alexa, and has marketed it with a series of advertisements highlighting ideas employees supposedly pitched for the program that failed because of the limits of Alexa’s abilities, such as a virtual-assistant electric toothbrush that tries to read you an audiobook over the sound of you brushing your teeth. The overall effect of the series was playful and lighthearted. Not only does Alexa have a non-threatening, high-pitched voice and a common girl’s name, but the way the engineering team fumbles in the commercials, an odd approach to selling a product, further emphasizes Alexa’s benign nature. The effect can make you forget that we are talking about a company with one of the most competitive R&D departments in the world as well as one of the highest layoff rates. I am not arguing that Alexa is secretly a dangerous technology we should have been fearing since its launch. Rather, it was highly plausible that the public would be uncomfortable with such a product, and the advertising was produced to mitigate and alleviate those worries.

ChatGPT is the first household-name AI assistant presented as voiceless, nameless, and genderless. Public perception has not been as neutral. A study posted to arXiv, with a title that gives away its result, “ChatGPT Is More Likely to Be Perceived as Male Than Female,” surveyed 1,552 research participants recruited through the software service Prolific. Participants answered whether they perceived different functionalities of the AI assistant to be male, female, or neutral.

In one group of 127 participants, 85% perceived ChatGPT as male based on its ability to summarize text. Of 341 people asked about its functionality as a whole, 81% assigned ChatGPT a male gender, and 71% of a group of 52 felt similarly about its ability to help with coding problems. Across all 1,552 participants, the weighted average came to 74% perceiving ChatGPT as more male than female.

ChatGPT was released to the public as a free tool; it is not a product marketed or sold to the general public in the classical sense. The company behind it, OpenAI, generates revenue through research grants and corporate partnerships. Comparatively little was riding on consumer reception when ChatGPT launched, so there was little need to vet public perception through marketing research beforehand.

Before we panic about technological advances in the AI sector, it is worth looking more closely at the role gender biases play in shaping our fearful perceptions of these new technologies. Public hysteria often leads to public rejection, and massive technology companies are accustomed to using extensive market research to minimize that barrier. As consumers, and as college students about to enter a workforce many currently perceive to be threatened by ChatGPT, it is important to step back and compare this advancement to similar technologies we have been coexisting with for most of our lives. It is in our nature to fear and reject the unknown and the unfamiliar. It is through research, analysis, and education that “new” technological advancements are revealed to be far less foreign than we once realized, and instead an enhancement of what we have already grown accustomed to.

“We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.” – Bill Gates


References

Strengers, Yolande, and Jenny Kennedy. The Smart Wife: Why Siri, Alexa, and Other Smart Home Devices Need a Feminist Reboot. MIT Press, 2020, pp. 1-5. https://ieeexplore-ieee-org.ezproxy2.library.drexel.edu/xpl/ebooks/bookPdfWithBanner.jsp?fileName=9205816.pdf&bkn=9205725&pdfType=chapter.

Seo, Soobin. “When Female (Male) Robots Are Talking to Me: Effect of Service Robots’ Gender and Anthropomorphism on Customer Satisfaction.” International Journal of Hospitality Management, 2022, https://doi.org/10.1016/j.ijhm.2022.103166.

Guidi, Boor, L., van der Bij, L., Foppen, R., Rikmenspoel, O., and Perugia, G. “Ambivalent Stereotypes Towards Gendered Robots: The (Im)mutability of Bias Towards Female and Neutral Robots.” Lecture Notes in Computer Science, vol. 13818, Springer Nature Switzerland, 2023, pp. 615-626. https://doi.org/10.1007/978-3-031-24670-8_54.

Preston, Kayla, Michael Halpin, and Finlay Maguire. “The Black Pill: New Technology and the Male Supremacy of Involuntarily Celibate Men.” Men and Masculinities, vol. 24, no. 5, 2021, pp. 823-841. https://journals.sagepub.com/doi/pdf/10.1177/1097184X211017954.

Wong, Jared, and Jin Kim. “ChatGPT Is More Likely to Be Perceived as Male Than Female.” arXiv, 21 May 2023, https://arxiv.org/abs/2305.12564. Accessed 3 June 2023.
