2021, so far, has been a freakin' wild ride, one that has motivated and challenged me to dare new adventures in my work and art. In this first blog post on the website of my new company, I am happy to announce that I have just completed and published a second master's thesis on the topic of social robots and society. In spite of corona, I finally managed to push through – to keep writing, no matter the obstacles – and present a critical reading of the current phenomenon of using AI and chatbots in hiring processes. Many thanks to my supervisor, Professor Ericka Johnson at Linköping University, for her dynamic feedback and support throughout the process.
https://liu.diva-portal.org/smash/record.jsf?pid=diva2%3A1572100&dswid=-4966
Abstract
The topic of this thesis is AI recruitment chatbots, digital discrimination, and data feminism (D'Ignazio and Klein 2020), and I aim to critically analyze issues of bias in these types of human-machine interaction technologies. Coming from a professional background in theatre, performance art, and drama, I am curious to analyze how using AI and social robots as hiring tools creates a new type of "stage" (actor's space), with a special emphasis on social acting. Humans are now required to adjust their performance and facial expressions in the search for, and approval of, a new job. I will use my "theatrical glasses" with an intersectional lens and, through a methodology of cultural analysis, reflect on various examples of conversational AI used in recruitment processes. The "silver bullet syndrome" is a term that points to the tendency to believe in a miraculous new technological tool that will "magically" solve human-related problems in a company or an organization. The captivating marketing message of the Swedish recruitment conversational AI tool – Tengai Unbiased – is the promise of a scientifically proven objective hiring tool that will solve the diversity problem for company management. But is it really free from bias? According to Karen Barad, agency is not an attribute but the ongoing reconfiguration of the world through what she terms intra-actions, a mutual constitution of entangled human and non-human agencies (2003:818). However, tech developers often disregard their own entanglement in these human-machine interactions, which unfortunately generates unconscious bias. The thesis raises ethical questions about how algorithmic measurement of social competence risks harboring unconscious biases, benefiting those who are already privileged or who act within a normative spectrum.