
Meta Adds AI, Translations, and Shazam to Ray-Ban Smart Glasses

Image: A person wearing Ray-Ban smart glasses interacts with holographic overlays for live language translation, song recognition, and AI-powered suggestions.

Image Source: ChatGPT-4o


Meta has introduced three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam integration. While Shazam is now available for all users in the U.S. and Canada, live AI and live translations are currently limited to members of Meta’s Early Access Program.

Key Features

  1. Live AI

What It Does: Enables users to interact naturally with Meta’s AI assistant while the glasses continuously analyze their surroundings.

How It Works:

  • For example, at a grocery store, you could ask the AI for recipe suggestions based on the ingredients you’re viewing (see the illustrative sketch after this list).

  • The feature works for approximately 30 minutes per charge.

  • Availability: Limited to Early Access Program members.
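
For readers curious about what a continuously watching assistant implies in software terms, here is a purely illustrative Python sketch of a rolling-context loop: camera-frame descriptions accumulate in a time-limited buffer, and questions are answered against that recent context. Every name here is hypothetical, and none of it reflects Meta’s actual implementation.

```python
# Illustrative sketch only -- hypothetical names, not Meta's software.
import time
from collections import deque

class LiveAISession:
    """Keeps a rolling window of scene descriptions so questions can be
    answered against what the wearer has recently looked at."""

    def __init__(self, window_seconds: int = 30 * 60):
        # ~30-minute window, mirroring the per-charge limit noted above.
        self.window_seconds = window_seconds
        self.context: deque[tuple[float, str]] = deque()

    def observe(self, scene_description: str) -> None:
        """Record the latest camera-frame description and drop stale entries."""
        now = time.time()
        self.context.append((now, scene_description))
        while self.context and now - self.context[0][0] > self.window_seconds:
            self.context.popleft()

    def ask(self, question: str) -> str:
        """Stand-in for querying an assistant with the recent visual context."""
        recent = "; ".join(desc for _, desc in self.context)
        return f"Answering {question!r} using context: {recent}"

session = LiveAISession()
session.observe("shelf of tomatoes, basil, and dried pasta")
print(session.ask("What could I cook with these ingredients?"))
```

A real system would involve on-device vision models and careful power budgeting; the sketch only captures the shape of the interaction: observe continuously, answer on demand.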

  2. Live Translations

What It Does: Translates speech in real time between English and Spanish, French, or Italian.

How It Works:

  • Users can hear translations through the glasses or view transcripts on their phones.

  • Language pairs must be downloaded in advance, and users must specify their own language and that of their conversation partner (see the sketch after this list).

  • Availability: Also limited to Early Access Program members.
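
As a rough illustration of the session flow described above (pre-downloaded language pairs, both languages declared up front, output as glasses audio plus a phone transcript), here is a minimal Python sketch. The helper names are invented for illustration and do not correspond to Meta’s software.

```python
# Illustrative sketch only -- hypothetical names, not Meta's software.
DOWNLOADED_PAIRS = {("en", "es"), ("en", "fr"), ("en", "it")}

def translate(text: str, source: str, target: str) -> str:
    """Stand-in for an on-device translation model."""
    return f"[{source}->{target}] {text}"

def run_session(my_lang: str, partner_lang: str, utterances) -> None:
    """Route each captured utterance into the other party's language.

    `utterances` is an iterable of (speaker_lang, text) tuples, standing in
    for speech picked up by the glasses' microphones.
    """
    pair_available = ((my_lang, partner_lang) in DOWNLOADED_PAIRS
                      or (partner_lang, my_lang) in DOWNLOADED_PAIRS)
    if not pair_available:
        raise RuntimeError("Language pair must be downloaded in advance")

    for speaker_lang, text in utterances:
        target = partner_lang if speaker_lang == my_lang else my_lang
        translated = translate(text, speaker_lang, target)
        print("glasses audio + phone transcript:", translated)

run_session("en", "es", [("es", "¿Dónde está la estación?"),
                         ("en", "Two blocks that way.")])
```

Note how the up-front language declaration does real work here: it lets each utterance be routed without guessing, which matches the requirement that users specify both languages before starting.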

  3. Shazam Integration

What It Does: Identifies songs playing in your environment.

How It Works:

  • Users simply prompt the Meta AI assistant upon hearing a song; Meta CEO Mark Zuckerberg showcased the feature in a demonstration on Instagram (a toy sketch of the general matching idea follows this list).

  • Availability: Available to all users in the U.S. and Canada.
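
Song identification generally works by matching compact audio “fingerprints” against a large database, the approach described in the published Shazam algorithm paper (Wang, 2003). The toy Python sketch below hashes raw sample chunks instead of spectrogram peaks, purely to show the lookup flow; real fingerprints are far more robust to noise and misalignment.

```python
# Toy illustration of fingerprint-and-lookup; not Shazam's real pipeline.
import hashlib

def fingerprints(samples: list[int], chunk: int = 4):
    """Yield a short hash for each consecutive chunk of audio samples."""
    for i in range(0, len(samples) - chunk + 1, chunk):
        window = bytes(s % 256 for s in samples[i:i + chunk])
        yield hashlib.sha1(window).hexdigest()[:8]

# Pretend database mapping fingerprint hashes to song titles.
database = {}
song = [12, 40, 7, 99, 23, 5, 61, 80]
for fp in fingerprints(song):
    database[fp] = "Example Song - Example Artist"

# A short snippet "heard" by the glasses is matched against the database.
snippet = [23, 5, 61, 80]
matches = {database[fp] for fp in fingerprints(snippet) if fp in database}
print(matches or "No match")
```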

How to Access These Features

To use these features, users must:

  • Ensure their glasses are running v11 software.

  • Update the Meta View app to v196.

  • For Early Access Program features, apply via Meta’s dedicated website.

A Growing Push for AI-Powered Smart Glasses

These updates come as tech giants position AI as the defining feature for smart glasses:

Google’s Android XR and Gemini Assistant: Google recently unveiled its new Android XR OS for smart glasses, focusing on its Gemini AI assistant as a key feature.

Meta’s Vision for AI Glasses: Meta CTO Andrew Bosworth has described 2024 as the year AI-native devices like smart glasses hit their stride.

In a recent blog post, Bosworth argued that smart glasses are the ideal form factor for AI, offering hands-free interaction and context-aware functionality.

What This Means

Meta’s enhancements to its Ray-Ban smart glasses reflect the growing trend of integrating AI assistants into wearable tech. Features like live translations and recipe suggestions give the glasses practical, everyday uses.

The inclusion of Shazam highlights Meta’s focus on making the technology fun and accessible for everyday consumers, even as the company pushes more advanced features for Early Access members.

As Meta, Google, and other tech giants continue to innovate, AI-native smart glasses are emerging as a major hardware category that could redefine how users interact with digital assistants and their environments.

Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.