These smart glasses can read menus and 'see for you', thanks to AI


Envision, an accessibility-focused tech company, just released the Ally Solos smart glasses, which use multimodal AI to describe the wearer's surroundings, read text aloud, and even recognize people.

The glasses can read text, describe the environment, and even perform web searches for the user, all through audio delivered by built-in speakers. The hardware resembles that of other smart glasses on the market, with 2K-resolution camera sensors on the frames to capture visual information and connectivity handled through the Ally app on iOS and Android. Although explicitly designed for blind and low-vision users, the glasses also offer features anyone could use, such as translation and camera-based document scanning and capture.

Or read this on ZDNet

Related news:

I tried Meta's new Oakley smart glasses in my production studio - my verdict as a content creator

Solos is equipping its smart glasses with an AI for the blind and low-vision community

HTC takes on Meta with the Vive Eagle smart glasses