
SensorLM: Learning the Language of Wearable Sensors


July 28, 2025
Yuzhe Yang, Visiting Faculty Researcher, and Kumar Ayush, Senior Research Engineer, Google Research

We present SensorLM, a new family of sensor–language foundation models trained on 60 million hours of data, connecting multimodal wearable sensor signals to natural language for a deeper understanding of our health and activities. Wearable devices, from smartwatches to fitness trackers, have become ubiquitous, continuously capturing a rich stream of data about our lives.

To overcome the annotation bottleneck, we developed a novel hierarchical pipeline that automatically generates descriptive text captions by calculating statistics, identifying trends, and describing events from the sensor data itself. Our research establishes a foundation for unlocking the understanding of wearable sensor data through natural language, enabled by this hierarchical captioning pipeline and the largest sensor–language dataset to date.
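To make the hierarchical captioning idea concrete, here is a minimal sketch of the three levels described above: compute summary statistics, identify a trend, and describe salient events, then compose them into a caption. The function name, thresholds, and heart-rate example are illustrative assumptions, not SensorLM's actual pipeline.

```python
from statistics import mean, pstdev

def caption_heart_rate(samples, spike_threshold=2.0):
    """Compose a text caption for a heart-rate window (beats per minute)."""
    # Level 1: summary statistics over the window.
    mu, sigma = mean(samples), pstdev(samples)
    stats = f"mean heart rate {mu:.0f} bpm (min {min(samples)}, max {max(samples)})"
    # Level 2: coarse trend, comparing the first and second halves of the window.
    half = len(samples) // 2
    delta = mean(samples[half:]) - mean(samples[:half])
    trend = "rising" if delta > 2 else "falling" if delta < -2 else "stable"
    # Level 3: salient events, here samples more than `spike_threshold`
    # standard deviations above the window mean.
    spikes = [s for s in samples if sigma > 0 and (s - mu) / sigma > spike_threshold]
    events = f"{len(spikes)} spike(s) detected" if spikes else "no notable spikes"
    return f"{stats}; trend {trend}; {events}."

print(caption_heart_rate([62, 64, 63, 65, 90, 66, 64, 65]))
# → mean heart rate 67 bpm (min 62, max 90); trend rising; 1 spike(s) detected.
```

A real pipeline at this scale would operate on multimodal signals and far richer event vocabularies, but the structure, statistics feeding trends feeding event descriptions, is the same hierarchy the text describes.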
