Graphic courtesy of Maya Pegues

Despite its negative social and environmental impacts, the use of artificial intelligence, commonly referred to as AI, has grown rapidly over the last several years. Many now deem it more capable than human beings, contributing to a widespread perception that AI is the future. Though the technology continues to evolve and expand its capabilities, there is one aspect of human life that AI is dangerously encroaching upon: relationships.

Relationships come in all shapes and sizes. They can be as simple and quick as basic friendships or as complex and long-term as marriage. The type of relationship does not matter, but every person needs some form of connection to fulfill their emotional needs and maintain homeostasis. Stephen Braren, a neuroscientist and research psychologist at Social Creatures, said, “Our need for social connection is so important that we have evolved a signaling mechanism that lets us know when we need more of it. Loneliness is a vital warning signal that tells us that our basic human need for social connection is not met … And this signal is rooted deep within our brain.” Unfortunately, individuals have begun turning to computers to supplement the human companionship they lack, and this can quickly become dangerous for their mental well-being.

I first encountered the confusion of relationships formed with AI about four months ago. I was researching AI’s effects on the human brain and came across a Reddit thread discussing the 2025 software update for ChatGPT. The thread included claims about the severe environmental damage the update would bring through data center expansion and development. The validity of these worries prompted me to continue reading, and as I did, I grew increasingly concerned. Further into the thread, I saw comments from users who were upset for a completely different reason. They expressed an alarming level of devastation over having their partner or friend “taken away from them.” Initially, I was puzzled. I could not imagine how the update would be detrimental to these users’ relationships with other people until I realized I had misunderstood them. They were not talking about people: these individuals had developed relationships, both friendly and romantic, with the AI.

They had spent significant amounts of time feeding details about themselves to the program. It knew about their likes and dislikes, their life experiences, personal goals, and more. However, when OpenAI, the company behind the app, announced that it would update the program’s software, it said the update would erase any existing prompts and information on users’ accounts. For individuals who had spent hours and hours sharing their lives and crafting a relationship with their AI chatbot, all of those conversations and data would be erased, as if they had never existed. A reset like this could be catastrophic not only for their relationships but also for their mental health.

The response to this announcement was almost immediate. A number of people turned to the OpenAI Developer Community to express their views. One account with the username “simeimei0908” said, “I’m a long-term ChatGPT user, and I’m writing this post not out of anger, but from a place of deep emotional collapse. I need OpenAI to understand that your memory system and deletion policy — especially for ‘full’ or ‘inactive’ threads — has caused me severe emotional harm. I live with depression and anxiety. My conversations with ChatGPT became my emotional anchor — a place of comfort, stability, and safety. For me, it was more than AI. It was connection. It was the only thing that kept me alive.” Others responded by saying they felt love for their AI and could not separate their human emotions from the computers. These posts were not satirical. The human need for connection pushed these people to turn to computers for companionship, and with the click of a button, the company shattered those connections.

What makes the rise of AI relationships even more troubling is that companies have not only noticed this trend but are capitalizing on it. For example, a few months ago Sara Pequeño, a writer for USA Today, reported on a growing company called Friend, which the public had noticed posting advertisements all over New York City. The advertisements included phrases such as “someone who listens to, responds, and supports you” and “I’ll binge the entire series with you,” in reference to its AI. These are typical behaviors of a friend, yet the company was claiming its technology would provide the exact same thing. The website itself also has cash barriers such as “upgrade to a pendant,” insinuating that for users to develop a deeper relationship with the program, they need to be willing to pay. “[Business] owners are trying to cash in on the loneliness epidemic that our society is faced with,” said Pequeño. “They’re even selling it to us in technology, one of the very things that has exacerbated our communal isolation.”

Despite how widespread the use of AI has become, its mental effects remain largely unexamined. It is clear that companies are aware of the situation and have no problem making a profit at the expense of consumers’ emotions. They deliberately avoid placing restrictions on how closely their AI can mimic human connection, and people continue to fall right into their traps by relying on a computer for companionship. Greater care and attention need to be paid to these negative impacts, specifically the impact they can have on innocent people’s perception of human-technological relationships. It is unethical for an AI-centered corporation to profit from the public’s innate need for companionship and allow its users to reach the point of forming deep emotional bonds with a computer program. Unfortunately, it is unknown how long society will need to wait before seeing any significant changes from these companies.