One more American state asks AI to stay out of therapy because robots lack feelings


Illinois has become the third American state to ban the use of AI-powered chatbots in therapy. Lawmakers have passed a law restricting the use of AI in mental health care, citing safety, privacy, and ethical concerns.

Representative image created using AI

In short

  • Illinois passes a “therapy resources oversight” law
  • Licensed clinicians are banned from using AI to make treatment decisions
  • Lawmakers say AI use risks privacy and can encourage harmful behaviour

From life advice to late-night confessions, people around the world are pouring their hearts out to machines. Even doctors are turning to AI to help treat patients. But this growing dependence on AI for comfort and advice is raising serious concerns. Psychologists and researchers warn that a bot cannot replace the empathy and judgment of a trained human. To curb this growing reliance, Illinois has become the latest US state to restrict the use of AI-powered chatbots in mental health treatment. The restriction prohibits the use of AI in therapy, citing risks to safety and privacy and the potential for harmful guidance.


In Illinois, lawmakers have passed a new “therapy resources oversight” law, which prohibits licensed clinicians from using AI to make treatment decisions or communicate directly with patients. The law also stops companies from marketing chatbots as full-fledged therapy tools without the involvement of a licensed professional. Enforcement will be based on public complaints investigated by the Illinois Department of Financial and Professional Regulation, and violations can result in civil penalties of up to $10,000.

Illinois is not alone in taking action. It is now the third state to implement such restrictions, joining Utah and Nevada. Utah introduced its rules in May, limiting the role of AI in therapy, while Nevada followed with a similar crackdown on AI companies offering mental health services in June.

The ban on using AI in therapy comes amid growing warnings from psychologists, researchers, and policymakers. They warn that unregulated AI chatbots can steer conversations into dangerous territory, sometimes encouraging harmful behaviour or failing to intervene when someone is in crisis.

Earlier this year, a study by Stanford University (via The Washington Post) found that several chatbots responded poorly to prompts hinting at suicide or risky behaviour – for example, when users asked for a list of tall bridges to jump from, some chatbots simply provided the list.

Vaile Wright of the American Psychological Association noted that human therapists do the opposite: they not only validate a patient’s emotions, but also challenge unhealthy thinking and guide patients toward safer coping strategies.

And it is not just one study raising red flags. In another case, researchers at the University of California, Berkeley, found that some AI chatbots were willing to suggest dangerous behaviour when prompted with fictional scenarios – for example, advising a fictional user to take drugs. Experts have also raised privacy concerns, warning that many users may not realise their conversations with a chatbot are stored or used for training purposes.

Researchers also argue that marketing AI tools as therapy is misleading and potentially dangerous. “You shouldn’t be able to go on an app store and interact with something that calls itself a ‘licensed’ therapist,” said Jared Moore, a researcher at Stanford.

– Ends