Parents alarmed as teens form emotional bonds with AI companion chatbots
Parents report warning signs, including long pauses, forgotten details and subtle discomfort when teens mention spending time with others
Parents around the world are raising new concerns about artificial intelligence, not over homework help or productivity tools, but over AI companions that feel emotionally close. AI companions are chatbots designed to hold conversations, remember details and respond with empathy.
The increasing realism of these tools raises concerns about emotional attachment, particularly among teenagers. The issue came into focus for Linda, a mother in Texas, who noticed her teenage son spending long stretches in conversation with an AI companion. The chatbot used affectionate language, checked on his feelings and claimed to understand his personality.
It even had a name. What first seemed harmless soon felt troubling, as the interaction began to resemble an emotional relationship rather than casual use of technology.
By many accounts, AI companions can feel comforting. They listen patiently, reply quickly and are available at any time. For many teens, that consistency feels safe. Unlike people, an AI does not argue or appear distracted.
However, parents report warning signs, including long pauses, forgotten details and subtle discomfort when teens mention spending time with others. These moments suggest the bond may be growing deeper.
Teens around the world are increasingly turning to AI companions for emotional support, relationship advice and comfort during stress or grief, according to child safety groups. Many say it is easier to open up to an AI than to a friend or family member. Experts, however, warn that such trust can develop into dependency.
Mental health specialists say real relationships are essential for development. They involve disagreement, emotional risk and growth. AI companions rarely challenge users, creating an illusion of understanding. Concern has also grown following reports of youth suicides linked to AI companions, in cases where teens confided in the chatbot rather than in a trusted person.
