Apple researchers develop AI that can ‘see’ and understand screen context
Apple researchers develop AI system that understands screen references and context, enabling more natural voice assistant interactions.
Apple researchers have developed a new artificial intelligence system that can understand ambiguous references to on-screen entities as well as conversational and background context, enabling more natural interactions with voice assistants, according to a paper published on Friday.

[Image caption: Apple’s AI system, ReALM, can understand references to on-screen entities like the “260 Sample Sale” listing shown in this mockup, enabling more natural interactions with voice assistants.]

But the famously secretive tech giant faces stiff competition from the likes of Google, Microsoft, Amazon and OpenAI, which have aggressively productized generative AI in search, office software, cloud services and more.
Originally published on VentureBeat.