Hacker News
SensorLM: Learning the Language of Wearable Sensors (research.google)
20 points by smusamashah 3 months ago | 5 comments


This seems very interesting: they took a big sensor dataset and generated text from it. I'd guess the captions cover things like maximum values, mean values, maybe simple trends, and whether the person was walking, biking, etc. It would be interesting to see whether the model identifies things that were not so easily provided in the training data. Otherwise this is just teaching the model to approximate the mean from sensor data instead of using tools to do it.
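To make the point concrete, here is a minimal sketch of the kind of statistical captioning the comment describes: turning a window of sensor readings into a short text string from its mean, peak, and a crude trend. Everything here (the function name, the thresholds, the wording) is invented for illustration; the actual SensorLM caption pipeline is not public.

```python
def caption_window(values, sensor="heart rate", unit="bpm"):
    """Produce a simple text caption from one window of sensor values.

    Hypothetical example: summarizes mean, peak, and a crude trend
    (comparing the first and second halves of the window).
    """
    mean = sum(values) / len(values)
    peak = max(values)
    half = len(values) // 2
    first, second = values[:half], values[half:]
    delta = sum(second) / len(second) - sum(first) / len(first)
    # Call the window "rising"/"falling" only if the halves differ
    # by more than 5% of the overall mean (arbitrary threshold).
    if delta > 0.05 * mean:
        trend = "rising"
    elif delta < -0.05 * mean:
        trend = "falling"
    else:
        trend = "stable"
    return (f"{sensor} averaged {mean:.0f} {unit} "
            f"(peak {peak:.0f} {unit}), {trend} over the window")

print(caption_window([70, 72, 75, 90, 110, 115]))
```

A caption like this only restates summary statistics the model could be taught to compute, which is exactly the commenter's worry: the interesting result would be labels that are *not* derivable this way.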


Completely agree. I'm more interested in how this could be replicated in other domains. Any ideas for how to put sensor data in context?


Very interesting. We will soon see a rise of training assistants that read our wearable sensor data.

Sadly, it seems these foundation models are still not open to the public. I can't find any links on the research page or in the paper that would let you tinker a little bit...


I have not read the paper yet, but here is something related from Apple: https://machinelearning.apple.com/research/beyond-sensor


I'm confused: surely classification models already exist that, given multidimensional sensor data, output the most likely "activity" type? Why would a text modality be a better choice?



