Meta's Ray-Ban glasses are about to get a lot smarter [Video]
Meta CEO Mark Zuckerberg showcased impressive upgrades to the company's popular Ray-Ban Meta glasses during the Meta Connect event. The enhancements include live conversation translation, support for continuous back-and-forth conversations, and real-time information about the wearer's surroundings. At a starting price of $299, the glasses now pack technology comparable to smartphone-based AI features like Google's Circle to Search, Gemini, and Apple's upcoming Siri improvements.
Enhanced Features
With the latest updates, users can ask the glasses to remember where they parked, learn more about landmarks, get dinner suggestions based on what they see in the grocery store, and translate conversations in real time. Zuckerberg's live translation demo highlighted how naturally the glasses handle translation, compared with pulling out a smartphone for the same task.
New Transparent Design
Meta also introduced a limited-edition transparent design for the Ray-Ban Meta glasses, offering a glimpse of the internal components. The translucent look recalls late-90s/early-2000s gadgets and sets the glasses apart from other tech products on the market.
AI Vision Models
In addition to the glasses' updates, Meta unveiled its first AI vision models, Llama 3.2 11B and 90B. These open-source models are designed for image understanding, including tasks such as interpreting charts and graphs. Developers can build on them to create new apps, like giving hikers elevation information by analyzing a map of their route.
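As a rough illustration of the hiking example above, here is a minimal sketch of how an app might send a trail-map image and a question to a Llama 3.2 vision model. The model identifier and message schema here are assumptions: they follow the widely used OpenAI-style multimodal chat format that many Llama-serving stacks accept, not an official Meta API.

```python
# Sketch: packaging an image plus a question into a chat-completion
# payload for a multimodal Llama 3.2 model. Model name and schema are
# hypothetical; adapt them to whatever serving stack you actually use.
import base64

def build_vision_request(image_bytes: bytes, question: str) -> dict:
    """Bundle an image and a text question into one request payload."""
    encoded = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": "llama-3.2-11b-vision",  # hypothetical identifier
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": question},
                    {
                        "type": "image_url",
                        "image_url": {
                            "url": f"data:image/png;base64,{encoded}"
                        },
                    },
                ],
            }
        ],
    }

# Example: ask the model to read elevation off a route map.
payload = build_vision_request(
    image_bytes=b"\x89PNG...",  # placeholder; use a real map image
    question="What is the total elevation gain on this route?",
)
```

The payload would then be POSTed to whatever inference endpoint hosts the model; the interesting part is simply that image and text travel together in a single user message.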
Meta's dedication to AI innovation has positioned it as a prominent player in the open-source AI landscape. The company's continuous strides in technology may lead to significant advancements in the AI industry's future.