Menlo Park, California – Meta has officially unveiled Ray-Ban Display, a new generation of AI glasses that combine a high-resolution, full-color microdisplay with the Meta Neural Band, an electromyography (EMG) wristband that translates muscle activity into digital commands.
The launch at Meta Connect 2025 highlights Meta’s push to redefine wearable computing by integrating display technology, cameras, microphones, audio systems, and AI processing into a single, lightweight, and familiar eyewear form factor.
A Contextual Display Built into the Lens
Ray-Ban Display introduces a microdisplay placed at the side of the lens to avoid obstructing the user’s field of vision. The screen activates only when needed, supporting short, contextual interactions such as reading messages, previewing photos, or checking translations.
This approach positions the glasses as a complement to smartphones, enabling quick and seamless digital access without disrupting real-world engagement.
Meta Neural Band: EMG-Powered Interaction
Each pair of Ray-Ban Display glasses comes with the Meta Neural Band, which uses EMG sensors to detect and interpret the body’s electrical muscle signals. With this system, subtle finger movements—sometimes even before they are visible—can be converted into digital commands.
The technology was refined through studies involving nearly 200,000 participants, allowing it to function reliably across diverse muscle types and conditions. From an accessibility perspective, the Neural Band also provides potential control options for individuals with spinal cord injuries, stroke-related impairments, tremors, or other mobility challenges.
Built from Vectran, the same high-strength material used in NASA’s Mars Rover landing systems, the wristband is both lightweight and durable. It is water-resistant with an IPX7 rating and supports up to 18 hours of continuous use.
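At a high level, systems like this sample electrical activity from the wrist muscles, reduce each short window of samples to a feature vector, and map that vector to a discrete command. Meta has not published the Neural Band's signal-processing pipeline, so the sketch below is only a minimal illustration of the general EMG-to-command idea, using assumed gesture names, channel counts, and a simple nearest-prototype classifier.

```python
import numpy as np

# Hypothetical gesture labels for illustration; Meta has not published the
# Neural Band's actual gesture set, channel layout, or classifier.
GESTURES = ["rest", "pinch", "swipe", "double_tap"]

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square energy per EMG channel over a short window."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(features: np.ndarray, prototypes: np.ndarray) -> str:
    """Nearest-prototype matching: choose the gesture whose stored feature
    vector lies closest to the current window's features."""
    distances = np.linalg.norm(prototypes - features, axis=1)
    return GESTURES[int(np.argmin(distances))]

# Simulated input: 4 EMG channels, 200 samples per window (e.g. 200 ms at 1 kHz).
rng = np.random.default_rng(0)
prototypes = rng.random((len(GESTURES), 4))    # stands in for a trained model
window = rng.normal(scale=0.3, size=(200, 4))  # stands in for raw wrist EMG

print(classify(rms_features(window), prototypes))
```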
AI Integration for Seamless Functionality
The integration of a display with EMG-based input expands the role of AI in everyday interaction. Key functions include:
- AI Assistant with Visual Output – delivering answers and step-by-step instructions directly in the lens.
- Live Transcription and Translation – converting speech into on-screen captions and translating select languages in real time.
- Gesture-Based Control – navigating apps, controlling music playback, and adjusting volume using subtle finger and wrist movements.
This interaction model demonstrates EMG’s potential as a next-generation interface, moving beyond touchscreens and physical buttons toward more natural, intuitive inputs.
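Once gestures are recognized, a thin mapping layer can route them to application commands such as play/pause or volume changes. The sketch below assumes hypothetical gesture names and a stand-in player object; it is not based on any published Meta SDK.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class MusicPlayer:
    """Stand-in for a media app; not a real Meta API."""
    volume: int = 50
    playing: bool = False

    def toggle_play(self) -> None:
        self.playing = not self.playing

    def change_volume(self, delta: int) -> None:
        self.volume = max(0, min(100, self.volume + delta))

def build_gesture_map(player: MusicPlayer) -> Dict[str, Callable[[], None]]:
    """Route recognized gestures (hypothetical names) to playback actions."""
    return {
        "pinch": player.toggle_play,
        "swipe_up": lambda: player.change_volume(+10),
        "swipe_down": lambda: player.change_volume(-10),
    }

player = MusicPlayer()
actions = build_gesture_map(player)
for gesture in ["pinch", "swipe_up", "swipe_up"]:  # simulated recognizer output
    actions[gesture]()
print(player.playing, player.volume)  # True 70
```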
Material Design and Wearability
Ray-Ban Display maintains the appearance of everyday eyewear while incorporating advanced hardware. The integration of cameras, microphones, speakers, and AI systems within a familiar form factor represents Meta’s approach to making wearable computing accessible without compromising style or comfort.
Three Tiers of AI Glasses
Meta now categorizes its AI eyewear portfolio into three distinct product types:
- Camera AI Glasses – eyewear focused on capturing photos and videos.
- Display AI Glasses – represented by Ray-Ban Display, with contextual visual capabilities.
- Augmented Reality Glasses – prototypes like Orion, designed with holographic AR displays.
A Step Toward the Future of Wearable Computing
The introduction of Ray-Ban Display underscores Meta’s long-term vision of wearable devices as the foundation of a new computing platform. The company positions this innovation as a milestone in creating technology that is both practical and immersive, while keeping people engaged with their surroundings.
“Today marks the beginning of a new chapter—not only for AI glasses but for the future of wearable technology,” said Mark Zuckerberg during the launch.