SensorLM: Learning the Language of Wearable Sensors
July 28, 2025
Yuzhe Yang, Visiting Faculty Researcher, and Kumar Ayush, Senior Research Engineer, Google Research

Wearable devices, from smartwatches to fitness trackers, have become ubiquitous, continuously capturing a rich stream of data about our lives.
We present SensorLM, a new family of sensor–language foundation models trained on 60 million hours of data, connecting multimodal wearable sensor signals to natural language for a deeper understanding of our health and activities. To overcome the annotation bottleneck, we developed a novel hierarchical pipeline that automatically generates descriptive text captions by calculating statistics, identifying trends, and describing events from the sensor data itself. Our research establishes a foundation for understanding wearable sensor data through natural language, enabled by this hierarchical captioning pipeline and the largest sensor–language dataset to date.
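The post does not include code, but the three-level idea behind the captioning pipeline (statistics, then trends, then events) is easy to illustrate. Below is a minimal Python sketch of such a hierarchical captioner for a one-hour window of heart-rate and step data. The function name, input fields, and thresholds are hypothetical for illustration; this is not SensorLM's actual pipeline.

```python
import numpy as np


def caption_sensor_window(heart_rate: np.ndarray, steps: np.ndarray) -> str:
    """Compose a caption hierarchically: statistics -> trends -> events.

    Hypothetical sketch; fields and thresholds are illustrative,
    not SensorLM's actual captioning pipeline.
    """
    parts = []

    # Level 1: summary statistics over the window.
    parts.append(
        f"Mean heart rate {heart_rate.mean():.0f} bpm "
        f"(range {heart_rate.min():.0f}-{heart_rate.max():.0f}); "
        f"{int(steps.sum())} total steps."
    )

    # Level 2: trend, via a least-squares slope on heart rate.
    slope = np.polyfit(np.arange(len(heart_rate)), heart_rate, deg=1)[0]
    if abs(slope) > 0.05:  # bpm per sample; illustrative threshold
        parts.append(f"Heart rate is {'rising' if slope > 0 else 'falling'}.")

    # Level 3: events, e.g., a sustained bout of stepping activity.
    if (steps > 20).sum() >= 10:  # >=10 active minutes; illustrative threshold
        parts.append("Includes a sustained bout of walking.")

    return " ".join(parts)


# Example: a 60-minute window sampled once per minute,
# with slowly rising heart rate and walking in the second half.
rng = np.random.default_rng(0)
hr = 70 + 0.3 * np.arange(60) + rng.normal(0, 2, 60)
st = np.where(np.arange(60) > 30, 90, 0) + rng.integers(0, 5, 60)
print(caption_sensor_window(hr, st))
```

Structuring the caption this way, from coarse statistics down to discrete events, mirrors how the pipeline can produce descriptions at multiple levels of detail without any human annotation.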