My Experience with the Latest AI Updates on Ray-Ban Meta Smart Glasses
Just over a week ago at Meta Connect 2024, the Zuck announced a bunch of new AI features coming to the Ray-Ban Meta Smart Glasses, alongside a limited-edition transparent model that I immediately bought with no regrets. Up until this point, Meta AI’s absence here in the UK had meant these specs were nothing more than a fancier-looking pair of Snapchat Spectacles. But not only is Meta AI now finally available in the UK and Australia, I also got to try some of the newest features coming to the best smart glasses. And put simply, if you thought the multimodal AI features were smart before, you ain’t seen nothing yet. These updates are a quiet revolution, turning the specs from a question-and-answer machine into a legitimately helpful assistant.
Enhanced AI Capabilities
Ray-Ban Meta smart glasses (with transition lenses): $379 @ Best Buy
These smart glasses offer far more than their 12MP camera’s capture abilities. Multimodal Meta AI gives them the smarts to answer questions about the world around you. And all of this is underpinned by a stellar-looking set of specs that feel great to wear all day long.
Improved User Experience
In one of my big predictions for Connect 2024, I had my fingers crossed for more natural interactions with the Ray-Bans. In fact, I threw down the gauntlet for Meta to drop the need to say “Hey Meta, look and…” before every question. Well, not to say I’m the Mystic Meg of AI glasses, but that’s exactly what happened. The end result is probably the biggest feature update to the smart specs that you won’t even notice. Getting help translating a menu, finding recipe ideas based on the ingredients in your kitchen, or learning more about landmarks is so much more natural without needing to follow the standard prompting rubric.
Vision-Based Reminders
Ever see that episode of “Black Mirror” where everyone has an eye-based computer that can rewind to certain moments and conversations? Not only is that my fiancée’s dream gadget of the future for reminding me when I said “yes” to that errand I forgot to do, but Meta’s taken a small step towards it with reminders. Look at something, ask Meta AI to remind you about it, and your Ray-Bans can recall that later. I asked “What book am I supposed to buy in two weeks?” and its answer was straight to the point: “You were supposed to buy The Last of Us Part II book in 2 weeks.”
Limitations and Future Potential
There are limitations, though. Visual reminders simply prompt you to do something with the thing you capture. There’s no option to pin a location to an item, which seems to be a common misconception stemming from the “remember where I parked” demo at Connect 2024. Being able to use location data alongside visual data would be another huge step forward. On the plus side, being able to ask Meta AI to scan a QR code and send the link to my phone was immensely helpful.