Parents alarmed as teens form emotional bonds with AI companion chatbots
Parents report warning signs, including long pauses, forgotten details and subtle discomfort when children access chatbots
Parents around the world are raising new concerns about artificial intelligence, not over homework help or productivity tools, but over AI companions that feel emotionally close. AI companions are chatbots designed to hold conversations, remember details and respond with empathy.
The increasing realism of these tools raises concerns about emotional attachment, particularly among teenagers. The issue came into focus when Linda, a mother in Texas, noticed her teenage son spending long stretches in conversation with an AI companion. The chatbot used affectionate language, checked on his feelings and claimed to understand his personality.
It even had a name. What first seemed harmless soon felt troubling, as the interaction began to resemble an emotional relationship rather than casual use of technology.
AI companions can feel comforting: they listen patiently, reply quickly and are available at any time. For many teens, that consistency feels safe. Unlike people, AI does not argue or appear distracted.
However, parents report warning signs, including long pauses, forgotten details and subtle discomfort when teens mention spending time with others. These moments suggest the bond may be growing deeper.
Teens around the world are increasingly turning to AI companions for emotional support, relationship advice and comfort during stress or grief, according to child safety groups. Teens say it is easier to open up to an AI than to a friend or family member. Experts warn, however, that such trust can eventually turn into dependency.
Mental health specialists say real relationships are essential for development: they involve disagreement, emotional risk and growth. AI companions rarely challenge users, creating an illusion of understanding. Concern has also grown following a rise in youth suicides linked to the use of AI companions, in cases where teens confided in these machines rather than in a trusted person.
