Turning AirPods into a fitness tracker
A fun side project for a great cause featuring Core Motion, SwiftUI, a little help from AI, and a pair of AirPods to count 100 push-ups a day.
I know that the Core Motion framework provides access to sensor data streams, and thanks to my quiet hobby of learning about esoteric Apple APIs (hello, CNLabelContactRelationYoungerCousinMothersSiblingsDaughterOrFathersSistersDaughter), I remembered the potentially useful CMHeadphoneMotionManager, introduced in iOS 14 back in 2020, when we were all trapped inside during the pandemic wondering if we'd ever be allowed to do push-ups together again.

I started by training a custom GPT on Core Motion, and wrote a prompt instructing it what to do with that information: how to cite references, and to be helpful, specific, and concise while sharing its reasoning. Acting as a detector, it is also responsible for interpreting the raw data from the motion manager, such as whether the person is in the prone position or not. This is useful for deciding whether we should be counting push-ups at all: if they are standing or otherwise vertical, we shouldn't be.
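To make the prone-detection idea concrete, here is a minimal sketch of how one might gate a rep counter on head orientation. CMHeadphoneMotionManager delivers CMDeviceMotion updates whose attitude includes the head's pitch; in a plank, the wearer faces the floor, so pitch moves well away from level. The threshold, sign convention, and function names below are hypothetical illustrations, not the article's actual implementation:

```swift
import Foundation

// Orientation of the wearer's head, inferred from AirPods motion data.
enum Posture {
    case prone    // facing the floor, as in a plank/push-up position
    case upright  // standing or sitting
}

// Classify posture from the head's pitch angle in radians, as would be
// reported by CMDeviceMotion.attitude.pitch from a CMHeadphoneMotionManager.
// The -0.6 rad (~-35°) cutoff and its sign are assumed for illustration;
// a real app would calibrate this against on-device readings.
func posture(fromPitch pitch: Double) -> Posture {
    pitch < -0.6 ? .prone : .upright
}

// Only count reps while the wearer is prone.
func shouldCountPushUps(pitch: Double) -> Bool {
    posture(fromPitch: pitch) == .prone
}
```

In the live app, something like `manager.startDeviceMotionUpdates(to: .main) { motion, _ in ... }` would feed `motion.attitude.pitch` into this classifier before any rep counting happens.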