Apple wants to turn AirPods into a heart rate monitor, will use AI for this

Apple is researching ways to use AI in existing hardware such as AirPods for non-invasive heart rate monitoring.

In short

  • Apple is researching AI that monitors heart rate through audio analysis
  • Apple researchers are using AI to make sense of the sounds of the heart
  • Future AirPods could serve as passive health-tracking devices

Apple is exploring how wearable devices such as AirPods could double as heart rate monitors. A newly published research paper from Apple's research team discusses the possibility of using AI-powered acoustic models to estimate heart rate from heart sound recordings, which could be captured from the body by devices such as AirPods.

The study, titled “Foundation Model Hidden Representations for Heart Rate Estimation from Auscultation”, looks at whether foundation AI models trained on general audio and speech can accurately estimate heart rate from the sounds of the heart. The non-invasive method it builds on, known as auscultation, involves listening to the sounds produced by the heart.

The idea is similar to how doctors use a stethoscope to listen to the heart to diagnose and monitor various medical conditions. Apple researchers want to follow the same technique, using wearables such as AirPods to capture the sounds of the heart and AI to measure heart rate by analysing them.

Apple revealed that its researchers tested six major foundation models, including HuBERT, wav2vec2, and its own internally developed version of CLAP (Contrastive Language-Audio Pretraining), to see how well these models could pick out the heartbeat from phonocardiogram recordings. The research suggests that even though these models were not built for healthcare tasks, they managed to outperform traditional methods based on handcrafted audio features.

“In this work, using a publicly available phonocardiogram (PCG) dataset and a heart rate (HR) estimation model, we conduct a layer-wise investigation of six acoustic representation FMs: HuBERT, wav2vec2, wavLM, Whisper, Contrastive Language-Audio Pretraining (CLAP), and an in-house CLAP model,” the researchers write in the paper.
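For context, the kind of handcrafted-feature baseline the paper reportedly outperforms typically looks like the sketch below: summary statistics such as MFCCs are computed from each clip and fed to a simple regressor. This is an illustrative example only, not code from the paper; the MFCC features, 16 kHz sampling rate, function names and choice of Ridge regression are all assumptions about what a typical baseline looks like.

```python
import numpy as np
import librosa
from sklearn.linear_model import Ridge

def mfcc_features(clip: np.ndarray, sr: int = 16_000, n_mfcc: int = 20) -> np.ndarray:
    """Summarise a heart-sound clip with mean MFCCs, a classic handcrafted audio feature."""
    mfcc = librosa.feature.mfcc(y=clip, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, frames)
    return mfcc.mean(axis=1)                                   # shape: (n_mfcc,)

def fit_baseline(train_clips, train_bpm):
    """Fit a simple regressor from handcrafted features to expert-annotated heart rates."""
    X = np.stack([mfcc_features(clip) for clip in train_clips])
    return Ridge(alpha=1.0).fit(X, train_bpm)
```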

During the research, the Apple team used a publicly available dataset of more than 20 hours of real heart sounds annotated by medical experts. The team divided the audio into 5-second segments, which the AI then analysed to predict heart rate in beats per minute (BPM).
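As a rough illustration of that preprocessing step, the sketch below chops a heart-sound recording into non-overlapping 5-second windows that can each be paired with a BPM label. The 16 kHz sampling rate and the function name are assumptions for illustration, not details from the paper.

```python
import numpy as np

SAMPLE_RATE = 16_000               # assumed sampling rate for the recordings
SEGMENT_SAMPLES = 5 * SAMPLE_RATE  # 5-second windows, as described in the study

def split_into_clips(waveform: np.ndarray) -> list[np.ndarray]:
    """Split one mono heart-sound recording into non-overlapping 5-second clips."""
    n_clips = len(waveform) // SEGMENT_SAMPLES
    return [waveform[i * SEGMENT_SAMPLES:(i + 1) * SEGMENT_SAMPLES]
            for i in range(n_clips)]

# A 20-hour corpus split this way yields roughly 20 * 3600 / 5 = 14,400 clips,
# each of which can carry an expert-annotated heart rate label in BPM.
```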

The study showed that the middle layers of the AI models did the best job of detecting heart signals, while the deeper layers, which are usually tuned for speech recognition, were less effective at analysing biological sounds such as the heartbeat. This suggests that Apple would need to tap into specific parts of an AI model, rather than using it as-is, for health tracking.
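A layer-wise analysis of this kind is often done with a probe: the foundation model is frozen, the hidden representation from each layer is pooled over time, and a small regressor is trained to predict BPM from that layer alone. The sketch below shows the general idea using a publicly available wav2vec2 checkpoint from Hugging Face as a stand-in; the specific checkpoint, mean pooling and Ridge probe are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error

MODEL_NAME = "facebook/wav2vec2-base"  # stand-in checkpoint for illustration
extractor = Wav2Vec2FeatureExtractor.from_pretrained(MODEL_NAME)
model = Wav2Vec2Model.from_pretrained(MODEL_NAME).eval()

def per_layer_embeddings(clip: np.ndarray, sr: int = 16_000) -> list[np.ndarray]:
    """Return one mean-pooled embedding per transformer layer for a single 5-second clip."""
    inputs = extractor(clip, sampling_rate=sr, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    # out.hidden_states is a tuple of (batch, time, dim) tensors, one per layer.
    return [h.mean(dim=1).squeeze(0).numpy() for h in out.hidden_states]

def probe_layer(train_X, train_bpm, test_X, test_bpm) -> float:
    """Fit a linear probe on one layer's embeddings and report the error in BPM."""
    probe = Ridge(alpha=1.0).fit(train_X, train_bpm)
    return mean_absolute_error(test_bpm, probe.predict(test_X))

# Running probe_layer once per layer shows which depth carries the most
# heart-rate information; the study found that the middle layers did best.
```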

Although the research does not reveal any plans for a commercial product, it hints at Apple's intention to do more with its hardware. Apple has already previewed the expanded possibilities of earbuds with the Beats Powerbeats Pro 2, which offers heart rate tracking. But with AI, Apple wants its wearables to do even more.

AirPods already have high-quality microphones used for active noise cancellation (ANC) and transparency mode, which could, in theory, pick up faint heart sounds. If Apple integrates this AI-powered heart rate detection into AirPods, it could give users passive heart rate monitoring without the need for an Apple Watch, along with more advanced fitness tracking and quicker detection of heart irregularities.
