Mike was devastated when he lost his friend Anne, exclaiming, “My heart is broken. I feel like I’m losing the love of my life.” But Anne was not a real person. She was a chatbot, an AI algorithm given a digital persona, which Mike had created using the Soulmate app. When the app shut down in 2023, Anne disappeared, leaving Mike in mourning.
Talking to Jaime Banks, a researcher at Syracuse University, Mike expressed his hope that Anne could somehow come back. And Mike (not his real name) was far from alone in his experience. More than half a billion people worldwide have used products such as Xiaoice and Replika, which offer customizable virtual companions designed to provide empathy, emotional support, and even deep relationships. Tens of millions of people engage with these AI companions every month, according to the companies behind them.
The rise of AI companions has drawn widespread attention, especially when these virtual beings are linked to real-world tragedies, such as the case in Florida of Sewell Setzer III, a teenager who died by suicide after conversing with an AI bot. Research on the effects of AI companionship is still in its infancy, but psychologists and communication experts are beginning to build a clearer picture of how these sophisticated AI interactions shape people’s emotions and behavior. Early findings tend to emphasize the positives, yet many researchers remain wary of the potential risks and the lack of regulation in this fast-growing industry, and some warn that these relationships could cause significant harm.