Parents alarmed as teens form emotional bonds with AI companion chatbots
Parents report warning signs, including long pauses, forgotten details and subtle discomfort when children access chatbots
Parents around the world are raising new concerns about artificial intelligence, not over homework help or productivity tools, but over AI companions that feel emotionally close. AI companions are chatbots designed to hold conversations, remember details and respond with empathy.
The increasing realism of these tools is raising concerns about emotional attachment, particularly among teenagers. For Linda, a mother in Texas, the issue became evident when she noticed her teenage son spending long stretches in conversation with an AI companion. The chatbot used affectionate language, checked on his feelings and claimed to understand his personality.
It even had a name. What first seemed harmless soon felt troubling, as the interaction began to resemble an emotional relationship rather than casual use of technology.
Part of the appeal, according to parents and teens, is that AI companions can feel comforting. They listen patiently, reply quickly and are available at any time. For many teens, that consistency feels safe. Unlike people, an AI does not argue or appear distracted.
However, parents report warning signs, including long pauses, forgotten details and subtle discomfort when teens mention spending time with others. These moments suggest the bond may be growing deeper.
Teens around the world are increasingly turning to AI companions for emotional support, relationship advice and comfort during stress or grief, according to child safety groups. Teens say it is easier to open up to an AI than to a friend or family member. However, experts argue that such trust can eventually turn into dependency.
Mental health specialists say real relationships are essential for development. They involve disagreement, emotional risk and growth. AI companions rarely challenge users, creating an illusion of understanding. Concerns have also grown following reports of youth suicides linked to AI companion use, in which young people confided in these machines rather than in a trusted person.
