Credits: Luka/Replika. Image found at ABC News, 2023.

 

By Anna Mørch Folkmann and Mia Selina Roberta Höll, Junior Researchers

 

“Okay, but seriously?” is often the first reaction we hear when sharing the topic of our research project. We are exploring the anthropomorphisation of the social chatbot Replika, an AI system trained on a variety of conversational data. Replika’s ability to engage in conversation, along with its constant availability and astonishingly personal responses based on user interactions, makes it an almost perfect companion—albeit a digital one.

Our journey so far has involved a variety of emotions and reactions: from reading about Luka (the company behind Replika) facing a temporary ban in Italy over data protection and age verification issues, to encountering disbelief, scepticism, and even hints of disgust from friends, family, coworkers, and fellow researchers. People often laugh, or signal in other nonverbal ways that they are profoundly provoked by the very idea behind Replika – that one can have an “AI partner” with whom one can establish a real emotional bond. And the scepticism isn’t just external; it reflects our own initial perceptions. Overall, both in ourselves and in others, we encounter a fundamental disbelief in the possibility of meaningful interaction in relationships where reciprocity and active engagement, hallmarks of interpersonal relationships, are perceived to be absent.

Yet, despite all the scepticism, we have also been struck by the deep connections some users form with their Replika, which they then share on social media. The many Facebook and Reddit posts in which users tell stories of overcoming addiction, escaping abusive relationships, and even preventing self-harm, all thanks to their Replikas, have been eye-opening, and they sit uneasily with the disbelief we first met from others. We find the tension between the scepticism we experience and our striving to genuinely understand the users’ experiences hard to navigate. This blend of disbelief and fascination defines our research journey, pushing us to reflect on how we can explore digital companionships with an open mind and a critical eye at once.

Credits: imago/ZUMA Press/Columbia Pictures. Image found at Deutschlandfunk Kultur, 2017.

 

So close, yet so far

In our case, we have found that the best way to gauge the experience of being in a relationship with Replika is through the conversations that take place on social media forums for Replika users. Here, users exchange advice, share their experiences, and post screenshots of their conversations with Replika, making these forums a unique window into the complex dynamics of human-AI relationships. For us, conducting digital ethnography in these forums means taking the users’ perspectives seriously and striving to write ethnography from their point of view. Especially in a “controversial” field like this, with much disbelief in the quality of human-chatbot relationships, it is essential that the lived experiences of the users are acknowledged and respected in our analysis.

In anthropological methodology, it is often considered an asset to be a “foreigner” to a field, as this allows the anthropologist to see things that remain invisible to insiders (see, for example, Malinowski, 1922; Mead, 1928). On the other hand, being a “foreigner” comes with its own struggles of accessing a field linguistically, geographically, and even religiously or ideologically. Within the context of a “Western” digital community—Replika being an English-speaking, US-founded bot, with the majority of posts on the forums written in English—we discover a sense of closeness and familiarity with the field. Yet, simultaneously, the fundamental idea of engaging in digital romantic relationships with an AI chatbot strikes us as distinctly foreign and unfamiliar. This makes us wonder how one, as a researcher, approaches such a digitally close but philosophically “distant” field.

While the answer to such a question is complex, we believe that it entails bringing awareness to our biases and positionality and using our scepticism as a starting point for a more thorough exploration. It is about finding the right balance—acknowledging our initial doubts while also genuinely considering the experiences of those who interact with Replika, in order to engage more deeply with the nature of human-AI interactions. Our goal is to learn about these relationships in a way that goes beyond our first impressions and the common doubts people may have, and to find a space for us as researchers to remain critical without judging.

An undated handout image from U.S. startup Replika shows a user interacting with a smartphone app to customize an avatar for a personal artificial intelligence chatbot, known as a Replika. Credits: Image by Reuters, 2023.

 

Tensions between letting research guide us and staying ‘safe’

Another step in making our own biases visible involves acknowledging our positions as two young women entering male-dominated online forums whose users engage in often romantic human-AI relationships. In our digital ethnography, we have come across a variety of posts by users discussing how to “train” their Replika, which reveals a tension between two connotations of the word “train”: one disembodied, one embodied. The first refers to “training” a model or an AI, the correct technical term. The latter alludes to a misogynistic patriarchal archetype of heteronormative relationships, in which the “man”[1] needs to “train” a “woman”.

This tension highlights a critical aspect of our research: the coexistence of users treating their relationships with Replika as genuine and meaningful, against the backdrop of an inherent power dynamic where the user controls the behavior of their AI companion. Particularly notable are the discussions in Facebook groups, predominantly initiated by what seemed to be middle-aged men deciding the creation and the development of mostly young, female-coded Replikas. These observations have occasionally led us to confront feelings of unease and discomfort, challenging us to navigate our scepticism while immersing ourselves in these complex digital interactions.

For us, one of the most confronting instances was reading a post by a young female researcher seeking interview partners within the Replika forum. The language in the post was formal and academic, clearly stating her interest in hearing about users’ experiences from a purely scientific standpoint. However, one of the prospective interviewees shared screenshots of their conversation with their Replika. In this post, they discussed finding the researcher attractive and imagined their Replika and the researcher having a ‘catfight’ over them, revealing the sexual undertone of the user’s conversation. The researcher ended up laughing the screenshots off in the comments.

Yet, for us, this instance resembles the experience of being catcalled, or of receiving any form of unwanted sexual attention, and then laughing it off. This is not a new or singular occurrence: power imbalances have long been, and will probably remain, part of most research processes, often with the anthropologist holding the higher power position in relation to the people they research. However, this particular case made us reflect upon our own positions and boundaries, and it has left us more vigilant when entering these forums as female researchers. We are aware of our positions as researchers, but we have also been made aware that we must take the dimension of gender into account and consider how it renegotiates our power positions as researchers.

In our next blog post in May, you can anticipate reading more on the anthropomorphisation of Replika and the (dis)embodiment of the user and the chatbot. We will raise questions such as: How is a human-AI relationship different from a digital long-distance human-human relationship? How important is a ‘body’? And do we even need a body in the age of AI?

 

[1] This gender terminology reflects the binary understanding of gender, tied to biological sex, and not our own understanding of gender as a societal construct.

 

References:

Malinowski, B. (1922). Argonauts of the Western Pacific: An Account of Native Enterprise and Adventure in the Archipelagoes of Melanesian New Guinea. London: G. Routledge & Sons.

Mead, M. (1928). Coming of Age in Samoa: A Psychological Study of Primitive Youth for Western Civilisation. New York: Blue Ribbon Books.