AI-powered ‘sonar’ on smartglasses tracks gaze and facial expressions
Cornell researchers have developed two technologies that track a person’s gaze and facial expressions through sonar-like sensing.
“In a VR environment, you want to recreate detailed facial expressions and gaze movements so that you can have better interactions with other users,” said Ke Li, a doctoral student in the field of information science who led the GazeTrak and EyeEcho development.

“The privacy concerns associated with systems that use video will become more and more important as VR/AR headsets become much smaller and, ultimately, similar to today’s smartglasses,” said co-author François Guimbretière, professor of information science in Cornell Bowers CIS and the multicollege Department of Design Tech.

The systems work like sonar, emitting inaudible sound waves toward the wearer’s face and capturing the returning echoes. The resulting sound signals are fed into a customized deep learning pipeline that uses artificial intelligence to continuously infer the direction of the person’s gaze.
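The article gives no implementation details, but the flow it describes, acoustic echoes in, a continuous stream of gaze estimates out, could be sketched roughly as follows. Everything here is hypothetical: the feature extraction, the calibration constants, and the stand-in for the learned model are illustrative placeholders, not the actual GazeTrak/EyeEcho pipeline.

```python
# Hypothetical sketch of an echoes-in, gaze-out loop. The real system
# uses a trained deep network; infer_gaze() below is only a stand-in.
import math

def echo_features(echo_frame):
    """Collapse a raw echo frame (a list of amplitudes) into two toy
    features: total reflected energy and the index of the strongest echo."""
    energy = sum(a * a for a in echo_frame)
    peak_delay = max(range(len(echo_frame)), key=lambda i: abs(echo_frame[i]))
    return energy, peak_delay

def infer_gaze(echo_frame):
    """Stand-in for the learned model: map echo features to a
    (yaw, pitch) gaze direction in degrees. Purely illustrative."""
    energy, peak_delay = echo_features(echo_frame)
    yaw = (peak_delay - len(echo_frame) / 2) * 0.5   # hypothetical calibration
    pitch = math.tanh(energy / 100.0) * 30.0         # hypothetical calibration
    return yaw, pitch

def stream_gaze(frames):
    """Continuous inference: one gaze estimate per incoming echo frame."""
    return [infer_gaze(f) for f in frames]
```

The point of the sketch is the shape of the pipeline, a per-frame transform from microphone signal to gaze direction running continuously, rather than any particular model.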