Meta adds live translation, AI video to Ray-Ban smart glasses
Meta Platforms has announced the latest updates to its Ray-Ban Meta smart glasses, introducing AI-powered video capabilities and real-time language translation. The announcement follows the initial reveal of these features at Meta's annual Connect conference in September.
The new features arrive as part of the v11 software update, which begins rolling out on Monday to members of Meta's Early Access Program. One of the key enhancements is the addition of video capabilities to Meta's AI chatbot assistant, allowing the glasses to analyze what the wearer sees and respond to visual cues in real time.
Real-time Language Translation
With the latest update, the Ray-Ban smart glasses support real-time translation between English and Spanish, French, or Italian. Meta says users will be able to converse in one of these languages and hear translations through the glasses' open-ear speakers, or view them as transcripts on their phone.
Additional Features
Meta has also integrated the popular music-recognition app Shazam into the smart glasses, letting users identify songs while wearing the device. The feature is initially available to users in the U.S. and Canada.
Back in September, Meta also revealed plans to bring several AI-driven features to the Ray-Ban smart glasses, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands.
These updates underscore Meta's continued investment in its wearable lineup, adding features designed to fit into users' daily routines.