Apple is reportedly fast-tracking the development of its new AI-powered smart glasses, a wearable device poised to compete directly with Meta’s Ray-Bans. According to Bloomberg, the tech giant is aiming for a late 2026 release and has already begun preparations for large-scale prototype production of these innovative Apple smart glasses.
The report clarifies and updates earlier reporting, which had suggested Apple might unveil its smart glasses alongside custom chips based on Apple Watch architecture. While previous sources indicated chip production would commence in summer 2026, Bloomberg’s new report points to a more aggressive timeline: Apple intends to produce significant quantities of prototype glasses, not merely components, by the end of 2025, well ahead of the reported chip schedule.
Unveiling “Vision Light”: Features and Focus
While Apple’s more ambitious augmented reality glasses are still some years away, this new wearable, codenamed N401, aims to bring real-world contextual awareness to Siri in a sleeker form factor than a full headset. Envisioned as a “Vision Light,” these smart glasses will integrate onboard cameras, speakers, and microphones. Key functionalities are expected to include live translation, call management, and turn-by-turn navigation.
Crucially, the N401 glasses will not feature augmented reality displays like the Vision Pro headset. Instead, they will rely on audio and voice feedback. This design choice is a deliberate move to minimize complexity, reduce cost, and ensure a less bulky device. This aligns with earlier indications that Apple was exploring both AR and non-AR models, with N401 representing the latter.
Strategic Imperatives in the AI Wearable Market
Apple’s venture into AI-powered smart glasses is a significant part of its broader strategy to maintain a competitive edge in the burgeoning market for AI-enabled devices. Meta and Google are already active participants, and the landscape is set to become even more crowded with OpenAI’s newly announced hardware partnership with Jony Ive.
Notably, the new Bloomberg report does not mention the specialized chips named in a prior leak, Glennie for AirPods and Nevis for Apple Watch. It does, however, reiterate that Apple is grappling with the challenge of integrating camera and sensor data within a lightweight frame. The concept of offloading processing tasks to an iPhone, raised in earlier discussions, remains pertinent as Apple balances performance against power efficiency in its wearables.
Other Wearable Ventures and AI Challenges
Not all of Apple’s experimental wearable projects are advancing. Development of a smartwatch featuring an integrated camera and real-world analysis capabilities has reportedly been discontinued. This device, intended to bring environment-sensing features to the wrist, was shelved due to technical difficulties and privacy concerns.
Internally, the smart glasses project is reportedly encountering some of the same AI limitations Apple has faced elsewhere. While Meta’s Ray-Bans leverage Llama and Android XR glasses utilize Google’s Gemini, Apple has so far relied on third-party services such as OpenAI’s ChatGPT and Google Lens for visual understanding on the iPhone. Industry analysts anticipate that Apple will eventually introduce proprietary AI models of its own, possibly coinciding with the launch of the new smart glasses.
If Apple’s plans materialize as expected, these glasses will mark the company’s first significant entry into AI-first wearables, with the potential to reshape how users interact with technology day to day. Whether Apple can effectively challenge Meta in this arena remains to be seen, but the company is clearly unwilling to remain a bystander.