Sam Altman warns against using ChatGPT as a therapist

People should think twice before sharing personal information with ChatGPT

By Web Desk | July 29, 2025

OpenAI CEO Sam Altman has cautioned users against turning to ChatGPT for emotional support, citing a lack of legal confidentiality protections.

In a recent episode of the podcast This Past Weekend with Theo Von, Altman said that, unlike conversations with human therapists, AI conversations are not legally protected.

He stated: “People use it, young people, especially, use it as a therapist, a life coach; having these relationship problems and (asking) ‘what should I do?'.”

“And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it," he continued.

“There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet when you talk to ChatGPT,” Altman added.

He warned that in legal disputes, information shared in prompts could be used in court, exposing private conversations.

As an example, he pointed to the ongoing lawsuit with The New York Times, in which OpenAI is already embroiled over the retention of deleted chats.

A recent court order in that case requires OpenAI to preserve chat logs even when users ask for them to be deleted.

In Altman's view, users should “have the same concept of privacy for your conversations with AI that we do with a therapist or whatever — and no one had to think about that even a year ago.”

Earlier, the AI safety and research company Anthropic (maker of ChatGPT rival Claude) analysed 4.5 million conversations to assess how often people turn to chatbots for emotional support.

The study found that only 2.9% of interactions were affective conversations, while companionship and roleplay together made up just 0.5%.

[Figure: Overall distribution of affective conversation types in Claude.ai Free and Pro]

Another study conducted by OpenAI and MIT noted that “Emotional engagement with ChatGPT is rare in real-world usage.” It further added: “Affective cues were not present in the vast majority of on-platform conversations we assessed.”


Despite this low usage, Altman warned that as AI becomes more capable, people may increasingly turn to chatbots for personal advice. Without legal safeguards, sensitive conversations could be exposed in lawsuits or data breaches. For now, users may want to think twice before sharing personal information with AI, as it cannot guarantee confidentiality.