
Sam Altman Warns: ChatGPT Not Confidential for Therapy

July 25, 2025

The Privacy Risks of Using ChatGPT for Sensitive Discussions

People using ChatGPT for emotional support or therapy-like conversations should exercise caution. OpenAI CEO Sam Altman has warned that the artificial intelligence industry has not yet solved the problem of protecting user privacy in these particularly sensitive interactions.

No Confidentiality Equivalent to Doctor-Patient Privilege

Altman explained during an appearance on Theo Von’s podcast, “This Past Weekend w/ Theo Von,” that a significant issue stems from the absence of a defined legal and policy framework governing AI. Consequently, user conversations lack the legal confidentiality protections afforded to interactions with human professionals.

He noted that users frequently share deeply personal information with ChatGPT, particularly younger individuals who are employing it as a substitute for a therapist or life coach. They seek guidance on relationship issues and other personal dilemmas.

Currently, discussions with therapists, lawyers, or doctors are shielded by legal privilege, including doctor-patient confidentiality. However, a comparable safeguard does not yet exist for conversations held with ChatGPT.

Potential Legal Implications for Users

Altman cautioned that this lack of privacy could pose a risk to users involved in legal disputes. OpenAI could be legally compelled to disclose these conversations if requested.

He called the current situation “very screwed up” and argued that conversations with AI should receive the same privacy protections as conversations with healthcare professionals.

OpenAI's Awareness and Ongoing Challenges

The company acknowledges that privacy concerns may hinder wider adoption of its AI services. Beyond the substantial data requirements for AI training, OpenAI is also facing requests to provide user chat data in certain legal proceedings.

Currently, OpenAI is contesting a court order in the lawsuit brought against it by The New York Times. The order would require the company to preserve chat logs from hundreds of millions of ChatGPT users, except for ChatGPT Enterprise customers.

Broader Implications of Data Privacy

OpenAI views the court order as an “overreach” and fears that allowing the court to override its data privacy decisions could lead to increased demands for legal discovery or law enforcement access.

Tech companies routinely receive subpoenas for user data to assist in criminal investigations. However, recent years have seen growing concerns regarding digital data privacy, particularly with the enactment of laws restricting previously established rights.

For example, following the overturning of Roe v. Wade, many individuals transitioned to more secure period-tracking applications or Apple Health, which offers data encryption.

User Caution and the Need for Clarity

Altman asked the podcast host about his own use of ChatGPT, and Theo Von said that privacy concerns had led him to limit his interactions with the AI chatbot.

Altman agreed that this cautious approach is reasonable, emphasizing the importance of establishing legal clarity around privacy before using ChatGPT extensively for sensitive matters.

  • Privacy is a key concern when using AI for sensitive topics.
  • There is currently no legal confidentiality for ChatGPT conversations.
  • OpenAI is actively working to address these privacy challenges.
#ChatGPT #SamAltman #OpenAI #AITherapy #Confidentiality #MentalHealth