New Apple study shows how your AirPods might one day double as an AI heart monitor
Apple Research has published a study showing how AI models can estimate heart rate from heart sound recordings.
The Apple Research team has published a pretty interesting study that investigated whether AI models can estimate heart rate from stethoscope recordings, even though they weren’t specifically trained with that purpose in mind. Looking ahead, the researchers say they plan to keep refining the models for health applications, build lighter versions that could run on low-power devices, and explore other body-related sounds that might be worth listening to.

“In the future, we plan to: (i) explore combining acoustic features with FM representations, using feature concatenation before the downstream model or through late fusion methods within the model, for improved performance and investigate if such methods are able to capture complementary information and be more robust to individual variabilities; (ii) investigate fine-tuning the FMs to the target domains to reduce the domain mismatch and hence explore if such adaptation translates to improved performance, better mitigate the challenges in HR estimation, and capture complex pathological characteristics; (iii) assess their applicability to other downstream tasks and physiological parameters, including pathological conditions; (iv) augment and adapt more data that’s clinically significant; (v) compare them with other bioacoustic foundation models, such as HeAR [30]; and (vi) explore model simplification strategies, such as pruning, distillation, and lightweight encoder design, to enable deployable solutions with lower computational cost while maintaining performance.”
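To get a feel for the underlying task, here is a minimal sketch of heart-rate estimation from a periodic sound signal using a classical autocorrelation baseline. To be clear, this is not the paper’s foundation-model approach and the signal here is synthetic; it only illustrates what “estimating heart rate from a sound recording” means in practice.

```python
import numpy as np

def estimate_hr_bpm(signal, sr, min_bpm=40, max_bpm=200):
    """Estimate heart rate (beats per minute) from a 1-D audio signal
    by finding its dominant periodicity via autocorrelation."""
    x = signal - signal.mean()
    # Full autocorrelation, keeping non-negative lags only.
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    # Restrict the search to physiologically plausible beat intervals.
    min_lag = int(sr * 60 / max_bpm)
    max_lag = int(sr * 60 / min_bpm)
    lag = min_lag + np.argmax(ac[min_lag:max_lag])
    return 60.0 * sr / lag

# Synthetic "heart sound": a narrow pulse train at 72 beats per minute.
sr = 1000                      # samples per second
t = np.arange(0, 10, 1 / sr)   # 10 seconds of signal
beat_period = 60 / 72          # seconds per beat
sig = np.exp(-((t % beat_period) ** 2) / 0.001)

print(round(estimate_hr_bpm(sig, sr)))  # ~72
```

The foundation models in the study instead learn representations from audio and feed them to a downstream regressor, but the goal is the same: recover the beat period hidden in the recording.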