Social media is being flooded with digitally created “deepfake” videos using the trusted identities of famous doctors to promote dangerous miracle cures for serious health problems, experts have warned.
Videos circulating on Facebook and Instagram exploit the credibility of star TV doctors to advertise untested “natural” syrups for diabetes, even claiming that metformin, a proven first-line drug, can “kill” patients.
Experts say these scams can put people’s lives at risk, particularly because they imitate popular health experts such as British TV presenter Michael Mosley, who died earlier this year.
“People believe these videos,” British doctor John Cormack told AFP.
“A lot of these media doctors have spent a lot of time building up their image of credibility, so they are believed even when they make incredible claims,” said Cormack, who has worked on the subject with the British Medical Journal (BMJ).
Artificial intelligence (AI) expert Henry Ajder said doctor deepfakes had “really taken off this year”.
Ajder said the AI videos typically target older audiences by faking the identities of doctors who regularly appear on daytime television.
French doctor Michel Cymes, who frequently appears on TV in France, told AFP in May he was taking legal action against Facebook owner Meta over “scams” carried out using his image.
British doctor Hilary Jones even hired an investigator to track down deepfakes that misused his likeness.
One video showed Jones selling a bogus cure for high blood pressure — as well as cannabis gummies — on a U.K. TV show on which he regularly appears.
“Even if they are removed, they appear the next day under another name,” Jones lamented in the BMJ.
‘Game of cat and mouse’
French academic and AI expert Frederic Jurie pointed out that recent advances in AI have greatly improved the quality of deepfake images, audio and video.
“Today we have access to billions of images, and we are able to create algorithms that can model everything we see in the images and recreate them. This is what we call generative AI,” he said.
It is not only the likenesses of widely respected doctors that are being misused.
Controversial French researcher Didier Raoult, who has been accused of spreading misleading information about COVID-19 drugs, has also had his image used in several deepfake videos.
Australian naturopath Barbara O’Neill, who has been roundly condemned for claiming that baking soda can cure cancer, has been falsely shown selling “blood vessel-clearing” pills in a TikTok video.
Contacted by AFP, her husband Michael O’Neill lamented that “a lot of unscrupulous people” were using his wife’s name “to sell products that she wouldn’t recommend, and in some cases, they are straight-up scams”.
Some fake videos are even more deceptive, falsely claiming that O’Neill has died while promoting a “miracle” oil sold on Amazon.
AI expert Ajder is not surprised that even such controversial health figures are falling victim to deepfakes.
“They are overly trusted by people drawn to unconventional or conspiratorial views,” he said.
Experts were not optimistic that newly developed AI detection tools would be able to counter the onslaught of deepfakes.
“It’s a game of cat and mouse,” Jurie said.
Instead of trying to track down every fake video, he pointed to technology that can “guarantee that content has not been altered, as with messaging, using software that generates a digital signature, like a certificate”.
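As a rough illustration of the signature idea Jurie describes, the sketch below is written in Python using the third-party cryptography package; the library choice and file handling are assumptions for the example, not details given by Jurie or AFP. A publisher signs a video file’s bytes with a private key, and anyone holding the matching public key can later confirm the content has not been altered.

    # Illustrative only: sign content with a private key, then verify it.
    # Assumes the third-party "cryptography" package is installed.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Publisher side: create a key pair and sign the original video bytes.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()
    video_bytes = b"...original video file contents..."
    signature = private_key.sign(video_bytes)

    # Viewer side: verification passes only if the bytes are unchanged.
    try:
        public_key.verify(signature, video_bytes)
        print("Signature valid: content is unaltered.")
    except InvalidSignature:
        print("Signature check failed: content was modified.")

    # Changing even one byte after signing breaks the check.
    try:
        public_key.verify(signature, video_bytes + b"!")
    except InvalidSignature:
        print("Tampered copy rejected.")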