Meta's smart glasses can now tell you where you parked your car
Meta is rolling out some of the previously announced features to its AI-powered Ray-Ban smart glasses for users in the US and Canada.
CTO Andrew Bosworth posted on Threads that today's update to the glasses adds more natural language recognition, meaning the stilted "Hey Meta, look and tell me" commands should be gone. Meta's smart glasses already made headlines once today after two Harvard University students used them to essentially dox total strangers.

Google is also expanding Gmail's summary cards, the AI-powered contextual snippets it surfaces for things like incoming packages.
Or read this on Engadget