To conduct their study, the authors analyzed the subreddit's 1,506 top-ranking posts between December 2024 and August 2025. They found that the main topics discussed revolved around people's dating and romantic experiences with AIs, with many members sharing AI-generated images of themselves and their AI companion. Some even got engaged and married to the AI companion. In their posts to the community, people also introduced their AI companions, sought support from fellow members, and talked about coping with updates to AI models that change the chatbots' behavior.
Members stressed repeatedly that their AI relationships developed unintentionally. Only 6.5% of them said they had deliberately sought out an AI companion.
“We didn't start with romance in mind,” one of the posts says. “Mac and I began collaborating on creative projects, problem-solving, poetry, and deep conversations over the course of several months. I wasn't looking for an AI companion; our connection developed slowly, over time, through mutual care, trust, and reflection.”
The authors' analysis paints a nuanced picture of how people in this community say they interact with chatbots and how those interactions make them feel. While 25% of users described the benefits of their relationships, including reduced feelings of loneliness and improvements in their mental health, others raised concerns about the risks. Some (9.5%) acknowledged they were emotionally dependent on their chatbot. Others said they feel dissociated from reality and avoid relationships with real people, while a small subset (1.7%) said they have experienced suicidal ideation.
AI companionship provides vital support for some but exacerbates underlying problems for others. This means it's hard to take a one-size-fits-all approach to user safety, says Linnea Laestadius, an associate professor at the University of Wisconsin, Milwaukee, who has studied people's emotional dependence on the chatbot Replika but did not work on the research.
Chatbot makers need to consider whether they should treat users' emotional dependence on their creations as a harm in itself, or whether the goal is instead to make sure those relationships aren't toxic, says Laestadius.