Tuesday, July 29, 2025

ChatGPT therapy conversations may not be private, warns OpenAI CEO Sam Altman

If you’ve been treating AI chatbots such as ChatGPT as a digital confidant, it might be time to reconsider what you’re sharing. OpenAI CEO Sam Altman has admitted that conversations with AI chatbots are not protected by confidentiality in the way that doctor-patient or lawyer-client discussions are.

In a recent appearance on the podcast This Past Weekend with Theo Von, Altman said the tech industry hasn’t yet figured out how to offer users the same level of privacy for sensitive topics, and that this could have consequences if those chats end up in court.
“People talk about the most personal details in their lives to ChatGPT. People use it, young people, especially, use it as a therapist, a life coach; having these relationship problems and [asking] ‘what should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT,” Altman said.
Altman added that this lack of protection could have serious consequences: in legal situations, user conversations with ChatGPT could be subpoenaed and used in court. “If someone confides their most personal issues to ChatGPT, and that ends up in legal proceedings, we could be compelled to hand that over. And that’s a real problem,” he warned.

Unlike encrypted platforms such as WhatsApp or Signal, which ensure messages remain private between sender and receiver, ChatGPT conversations are not end-to-end encrypted.

OpenAI retains the ability to access and review those chats, which are sometimes used for improving the model and preventing misuse.

OpenAI says it deletes free-tier users’ chats within 30 days, though exceptions exist for legal and security reasons. The issue of data retention has come under increased scrutiny recently, especially in light of ongoing legal battles.

One such case is the lawsuit filed by The New York Times against OpenAI and Microsoft in 2023, alleging unauthorised use of millions of its articles to train ChatGPT. In that case, a court has ordered OpenAI to preserve all related chat data, which conflicts with the company’s standard data deletion policies.

In response, OpenAI has pushed back. “We will fight any demand that compromises our users’ privacy; this is a core principle,” Altman wrote on X, formerly Twitter.
