Researchers have developed a new AI model that can detect diseases by taking a picture of your tongue

Researchers from Iraq and Australia have developed an AI model that diagnoses diseases by analysing images of the tongue.

Representative image created using AI

Researchers from Iraq and Australia have reportedly developed a new artificial intelligence (AI) model that is capable of diagnosing a variety of diseases by analysing images of a person’s tongue. The model has achieved 98 per cent accuracy in tests by simply analysing the colour of the human tongue.

The AI model was developed through a collaboration between the Middle Technical University (MTU) in Baghdad and the University of South Australia (UniSA), reports Newswise. The researchers explain that diagnosing health conditions by examining the tongue has been a key component of traditional Chinese medicine for more than 2,000 years: practitioners read specific colours and textures of the tongue as indicators of particular ailments. Building on this ancient technique, the research team, led by Ali Al-Naji, adjunct associate professor at both MTU and UniSA, set out to use AI to bring the diagnostic method into the 21st century.


“Typically, people with diabetes have a pale tongue, while cancer patients may have a purple tongue with a thick greasy coating. Acute stroke patients often have an abnormally red tongue,” Professor Al-Naji explained. He added that a white tongue can be a sign of anaemia, that people with severe COVID-19 often have a dark red tongue, and that an indigo or violet tongue may indicate vascular or gastrointestinal problems or asthma.

The researchers trained the model on a dataset of 5,260 tongue images, each carefully labelled with the corresponding medical condition. This training enabled the AI to recognise the subtle differences in tongue colour and texture that are key indicators of health problems.
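The article does not describe the researchers' actual architecture or training pipeline, so the following is only a minimal sketch of how such a supervised image classifier might be trained in PyTorch. The `tongue_images` folder layout, the choice of ResNet-18, and all hyperparameters are illustrative assumptions, not details from the study.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Illustrative preprocessing: resize and normalise the tongue photos.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: tongue_images/<condition_name>/*.jpg,
# one subfolder per labelled medical condition.
dataset = datasets.ImageFolder("tongue_images", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Small pretrained CNN with its final layer replaced so it predicts
# one of the labelled conditions instead of ImageNet classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```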

To further validate the model's accuracy, the researchers also tested it on 60 tongue images from patients at two teaching hospitals in the Middle East. Each patient sat about 20 centimetres (roughly 8 inches) from a laptop fitted with a webcam, which captured photos of their tongue. The AI model then analysed the images and correctly identified the associated medical condition in almost all cases.
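For illustration, the screening step described above could look something like the sketch below, which reuses the `model`, `transform` and `dataset` names from the previous example. This is an assumed webcam-capture workflow, not the researchers' actual code.

```python
import cv2
import torch
from PIL import Image

# Capture a single frame from the laptop webcam (patient ~20 cm away).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()

if ok:
    # OpenCV returns BGR; convert to RGB and apply the same
    # preprocessing used during training (see sketch above).
    image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    model.eval()
    with torch.no_grad():
        logits = model(transform(image).unsqueeze(0))
    predicted = dataset.classes[logits.argmax(dim=1).item()]
    print(f"Predicted condition: {predicted}")
```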

The findings of this study, published in the journal Technologies, showed that AI-powered tongue analysis could become a safe, efficient and user-friendly method for disease screening. The researchers envision a future where this technology could be integrated into smartphone apps, allowing users to receive instant health assessments by taking a picture of their tongue.

UniSA professor and study co-author Javaan Chahl emphasised the potential of this technology to complement and advance modern medical practices. “These results confirm that computerised tongue analysis is a safe, efficient, user-friendly and cost-effective method for disease screening that supports modern methods with age-old practice,” Chahl said.

However, the researchers acknowledged that there are still challenges to be overcome before this technology can be widely adopted. Chief among them are addressing patient concerns about data privacy and ensuring that reflections from the camera do not hinder the accuracy of the AI model.
