Future Apple Vision Pro could take commands by just reading your lips
Apple has been researching how a future Apple Vision Pro could detect mouth movements, allowing it to take commands or dictation purely through lip reading.
"Similarly, background noise in some environments can interfere with the ability of the head-mountable device to accurately and reliably recognize voice inputs from the user," the patent application continues. One proposed solution is "a vision sensor carried by the display frame and oriented externally in a downward direction" that could be "configured to detect mouth movement." And in case three options aren't sufficient redundancy, there could be yet another "sensor including an external-facing camera to detect a hand gesture indicating confirmation of the input selection."