A new study out of MIT offers a first-of-its-kind large-scale computational analysis exploring the how and why of people falling for AI chatbots. The research team dove into the subreddit r/MyBoyfriendIsAI, a community of folks who, sometimes ironically and sometimes more seriously, refer to AI bots like ChatGPT as their romantic other half. The team found that for many users, "AI companionship emerges unintentionally through functional use rather than deliberate seeking."
That means that while a user may first begin using an AI chatbot to, say, redraft an email, an attachment can form over the course of initially aromantic prompts. The paper elaborates: "Users consistently describe organic evolution from creative collaboration or problem-solving to unexpected emotional bonds, with some users progressing through conventional relationship milestones, including formal engagements and marriages."
Members of the subreddit were observed not only generating pictures of themselves and their artificial beloved but also wearing physical rings to symbolise their 'AI marriage.' Users also claim a number of benefits arising from, as the team puts it, these "intimate human-AI relationships," including "reduced loneliness, always-available support, and mental health improvements."
The research team found that "10.2% [of posters within the sample] developed relationships unintentionally through productivity-focused interactions, while only 6.5% deliberately sought AI companions."
Interestingly, a larger portion of users within the sample (36.7%) described forming attachments with general-purpose large language models like ChatGPT, rather than "purpose-built relationship platforms like Replika (1.6%) or Character.AI (2.6%)."
It's not an entirely rosy picture, though. While 71.0% of the material analysed detailed no negative consequences, "9.5% acknowledge emotional dependency, 4.6% [described] reality dissociation, 4.3% avoid real relationships, and 1.7% mentioned suicidal ideation" as a result of this AI companionship. The paper goes on to say, "These risks concentrate among vulnerable populations, suggesting AI companionship may amplify existing challenges for some while providing crucial support for others."
The research paper is intended to bridge a "critical knowledge gap [...] in understanding human-AI relationships," attempting to investigate the subject with "a non-judgmental analytical approach aimed at benefiting both the studied community and broader stakeholders."
With some users reporting emotional dependency on LLMs and even suicidal ideation, it's certainly not hard for me to see why this phenomenon is worthy of study. Even as chatbot makers adjust how their models respond to emotionally "high stakes" conversations, I don't doubt there will continue to be a large number of people compelled by the fantasy of an always-available, never frustrated, never tired conversational 'partner'.
