
While the world buzzes about generative AI and the race for smarter models, Apple is quietly doing what it does best: playing the long game. Behind the scenes, the company is doubling down on what might be its most important asset in the AI era: full-stack hardware control.
According to Bloomberg, Apple is deep in development on a new wave of custom processors tailored for everything from smart glasses and AirPods to future Macs and servers. This isn’t just about performance upgrades — it’s about reshaping Apple’s product strategy around the AI-infused future, one silicon slice at a time.
First up: smart glasses. But don’t expect full-blown AR spectacles just yet. Apple’s rumored initial version, powered by a new custom chip, is said to focus on more immediate use cases such as voice control, photo capture, and audio playback. Think of it as Siri on your face, with a camera. What’s notable here is the chip design: it reportedly builds on the energy-efficient architecture used in Apple Watch chips, but is optimized to support multiple cameras while drawing even less power. That’s a big deal for wearables, where battery life is everything.
Production of this smart glasses chip could begin as early as late 2026, with a product rollout expected around 2027. Meanwhile, Apple continues its low-key arms race with Meta, which has already released Ray-Ban smart glasses and plans to launch a fully AR-enabled version by 2027. Apple’s strategy seems clear: hold the AR, serve the AI now, and wait for the tech (and the use cases) to mature.
But that’s just one piece of the puzzle.
Elsewhere, Apple’s chip roadmap includes AI server processors, camera-enabled AirPods (codename “Glennie”), and a future Apple Watch with built-in imaging (codename “Nevis”) — all planned for around 2027. On the Mac front, new M6 (“Komodo”) and M7 (“Borneo”) chips are in the works, along with a higher-end AI-focused processor internally dubbed “Sotra.” Expect an M5 refresh for the iPad Pro and MacBook Pro to land later this year.
It’s all part of Apple’s grand design to own the full hardware-software-AI stack. The company’s internal silicon team, led by Johny Srouji, is building the infrastructure for AI-native devices — from your wrist to the cloud. Earlier this year, Apple also shipped its first in-house 5G modem (hello, iPhone 16e), with a premium variant expected in 2026.
So what’s the bigger picture here?
While companies like Nvidia dominate headlines with AI GPUs and Meta charges ahead with consumer smart glasses, Apple is quietly positioning itself for an AI-first future that doesn’t just live in the cloud — it lives in your pocket, your ears, your desk, and maybe, one day, your face. By controlling the chips, Apple controls the user experience — and, ultimately, its destiny in the AI era.
This isn’t just about devices. It’s about a company rewriting what AI hardware means — and making sure it never has to borrow anyone else’s blueprint to do it.