These smart glasses can read menus and 'see for you', thanks to AI
Envision, an accessible tech company, just released the Ally Solos smart glasses, which use multimodal AI to describe your surroundings, read text, and even recognize people.
The glasses can read text, describe the wearer's surroundings, and even perform web searches, delivering results as audio through built-in speakers. The hardware is familiar smart-glasses fare: 2K-resolution camera sensors on the frames capture visual information, and the glasses connect to the Ally app on iOS and Android. Although designed explicitly for low-vision individuals, the glasses also offer features anyone could use, such as translation and camera-based document scanning and capture.