OpenAI is facing a complaint after its chatbot invented a "horror story" about a man from Norway, falsely describing him as a murderer, a privacy campaign group said on Thursday.
The US tech giant has faced a series of complaints that its chatbot gives false information that can damage people's reputations.
"OpenAI's highly popular chatbot, ChatGPT, regularly gives false information about people without offering any way to correct it," Vienna-based Noyb ("None Of Your Business") said in a press release.
It said ChatGPT has falsely accused people of "corruption, child abuse, or even murder", as was the case with the Norwegian user, Hjalmar Holmen.
Hjalmar Holmen "was confronted with a made-up horror story" when he wanted to find out whether ChatGPT had any information about him, Noyb said.
The chatbot presented him as a convicted criminal who had killed two of his children and attempted to kill his third son.
"To make matters worse, the fake story included real elements of his personal life," Noyb said.
"Some people think that 'there is no smoke without fire'. The fact that someone could read this output and believe it is true is what scares me the most," Hjalmar Holmen was quoted as saying.
In its complaint filed with the Norwegian Data Protection Authority, Noyb wants the agency to order OpenAI to delete the defamatory output and to fine-tune its model to eliminate inaccurate results, as well as to impose a fine.
Noyb data protection lawyer Joakim Söderberg said that the European Union's data protection rules require personal data to be accurate.
"And if it is not, users have the right to have it changed to reflect the truth," he said, adding that showing ChatGPT users a small disclaimer that the chatbot may make mistakes "is clearly not enough".
Following an update, ChatGPT now also searches the internet for information, and Hjalmar Holmen is no longer identified as a killer, Noyb said.
But the false information still remains in the system, Noyb said.
OpenAI did not immediately respond to AFP's request for comment.
Noyb had already filed a complaint against ChatGPT last year in Austria, claiming the "hallucinating" flagship AI tool had invented wrong answers that OpenAI could not correct.
(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)