Look And Tell Me This: Does This Feature Work for Users Who Have Combined Hearing and Vision Loss?
Meta Ray-Ban smart glasses have gained popularity among users with and without vision loss thanks to their ability to record video, make hands-free calls, and dictate messages. The latest generation of these smart glasses introduces a feature called "look and tell," which provides a spoken description of a photo the glasses have just taken. This feature has opened up new opportunities for blind users, who can identify objects, read signs, take notes, and more using AI-generated descriptions.
Accessibility Limitations and Challenges
While the "look and tell" feature enhances accessibility for blind users, it has real limitations for people with combined hearing and vision loss. Until recently, users could only access the descriptions Meta generates by issuing a voice command, which is a barrier for anyone who cannot hear the response or speak the command reliably. In addition, routing audio from the glasses to hearing aids or cochlear implants was possible on some platforms but has since been restricted, cutting off those users' access to the information.

Last week, Meta introduced the MetaAI app, moving the glasses into a new app environment for updates. Even so, the glasses still redirect phone audio to their internal speakers, so external devices cannot be used for audio output. This setup can be a problem in settings where audible speech is inappropriate, and it especially affects users who rely on hearing aids or cochlear implants.
Potential Improvements for Braille Users
The app update did bring an improvement for braille users: they can now ask questions about captured photos through MetaAI. However, the "look and tell" feature itself still requires a voice command to activate, which rules out fully silent interaction with images. One suggested enhancement is to let the Capture button automatically trigger "look and tell," so that photos can be taken and described without speaking at all.

While the current setup is far from ideal for users with hearing loss, there is reason for optimism that Meta will act on feedback in future updates. Users are encouraged to submit feedback through the Meta app to help improve the accessibility and usability of Meta Ray-Ban smart glasses. By addressing concerns such as audio routing and voice-only activation, Meta can serve a much wider range of users.

© 2025 Helen Keller Services (HKS). All rights reserved.