
Sam Altman warns ChatGPT is not your therapist and your secrets are not legally private


OpenAI CEO Sam Altman has warned users that conversations with ChatGPT lack legal privacy and could be disclosed in court until clear privacy laws are established.


OpenAI CEO Sam Altman (Credit: Reuters/Jonathan Ernst)

In short

  • OpenAI CEO Sam Altman admitted that AI chats lack the privacy of a conversation with a therapist
  • He confirmed that users’ conversations with ChatGPT can be revealed under a court order
  • Deleted chats are removed within 30 days unless OpenAI is legally required to retain them

If you have been pouring your heart out to ChatGPT, you may want to pause for a moment, or at least think carefully about what you are typing. OpenAI CEO Sam Altman has admitted that, for now, AI chats do not enjoy the same privacy as a conversation with a therapist, lawyer or doctor. Appearing on comedian Theo Von’s podcast last weekend, Altman revealed that the AI industry has not yet caught up when it comes to protecting deeply personal interactions with users. And that could have consequences if those chats end up in court.


“People talk about the most personal details in their lives,” Altman said. “People use it, young people in particular, as a therapist, a life coach; having these relationship problems and (asking) ‘What should I do?’ And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it.”

Altman warned that, as things stand, a user’s interactions with ChatGPT can be revealed if a court orders it. “That could create a privacy concern for users in the case of a lawsuit,” he said, noting that OpenAI would currently be legally bound to produce those records.

“I think that’s very screwed up. I think we should have the same concept of privacy for your conversations with AI that we do with a therapist or whatever, and no one had to think about that even a year ago,” he said.

Legal gray area

The comments come as OpenAI finds itself in the midst of a high-profile court fight with The New York Times. In June, the newspaper and other plaintiffs sought a court order demanding that OpenAI retain all user conversations, even deleted ones, indefinitely, as part of an ongoing copyright case.

OpenAI has described the request as an “overreach” and confirmed that it is appealing, arguing that letting courts dictate data retention would open the floodgates to future demands from law enforcement and legal teams.


Unlike end-to-end encrypted messaging apps such as WhatsApp, ChatGPT conversations can be accessed by OpenAI employees. This is partly so that they can fine-tune the model, and partly to keep an eye on misuse.

That level of access has become a sticking point for some users, especially in a world of growing scrutiny over digital privacy. For example, after the Supreme Court overturned Roe v. Wade, millions of women moved away from unencrypted period-tracking apps towards more secure options such as Apple Health.

Therapy? Not yet


Altman’s warning may hit home for those who use ChatGPT as a sounding board for their emotional ups and downs. Without a legal framework, AI simply does not offer the same protections as a professional human counsellor.

“I think it makes sense to really want the privacy clarity before you use (ChatGPT) a lot, like the legal clarity,” Altman told Von, who admitted that he largely avoids using the chatbot for that reason.

So while ChatGPT can feel like a non-judgmental friend, the legal system does not see it that way. At least, not yet.

