MegynKelly
Apple’s next big move in wearables is officially out of the rumor mill and barreling toward reality: smart glasses that actually look and feel like glasses, but with a hefty dose of AI and Apple polish. If all goes to plan—and Apple doesn’t change course at the last minute—these glasses are set to arrive by late 2026. Think of them as a cross between Meta’s Ray-Ban smart glasses and your iPhone, but with the kind of design finesse Apple is famous for. The idea is simple: you’ll be able to take calls, play music, get directions, translate languages live, and interact with Siri, all without pulling anything out of your pocket. It’s like having your own personal assistant perched on your nose, quietly judging your outfit and helping you through the day.
The glasses are expected to come loaded with cameras, microphones, and speakers, letting you interact with the world in new ways. Need to answer a call while your hands are full? No problem. Want to get walking directions whispered in your ear as you stroll through a new city? Done. The glasses will also be able to translate foreign languages in real time, manage your music, and probably snap photos or videos—though Apple’s history with privacy means you can expect a lot of safeguards and visual cues to avoid any “creepy” factor. The company is reportedly working on a custom chip for these glasses, building on the low-power processors used in Apple Watches, but optimized to juggle multiple cameras and AI tasks while sipping battery.
Apple’s play here is clear: they want to leapfrog Meta’s Ray-Ban smart glasses, which have already sold over a million pairs, by offering a “classier” product with better build quality and a more seamless experience. The company is fast-tracking the project, with engineers racing to hit the 2026 deadline. Mass production of prototypes is set to begin later this year, and Apple’s supply chain partners are already gearing up for what could be the next must-have gadget. The Vision Products Group—the team behind the Vision Pro headset—is leading the charge, and while future AR features are in the pipeline, the first-generation glasses will focus on AI-powered smarts rather than full-blown augmented reality.
But as Apple charges ahead with its smart glasses, it’s also quietly shelving some other wild ideas. After years of rumors, patents, and internal prototypes, Apple has officially scrapped plans to put cameras in the Apple Watch. There won’t be any secret wrist selfies or Inspector Gadget moments in the near future. The original concept was to add cameras for “Visual Intelligence” features, like world monitoring or even FaceTime calls from your wrist, but the project hit too many technical and privacy roadblocks. Instead, Apple is shifting its focus to other wearables, including AirPods with built-in cameras, which could arrive as soon as 2026.
Although the AirPods camera project is still in the early stages of development, the aim is to use infrared cameras, like those used in Face ID, to enable capabilities such as gesture controls and spatial audio upgrades while the earbuds are worn. Apple's larger goal is to make AI and spatial computing a seamless part of your everyday life by fusing hardware and software in ways that are nearly imperceptible. Imagine being able to change music or accept calls with a wave of your hand, no taps or buttons required.
Of course, there’s a catch: Apple’s AI still lags behind the likes of Google and Meta when it comes to generative smarts and real-time language processing. Siri has improved, but it’s still not as conversational or context-aware as Google Assistant or Meta’s AI. That’s why Apple is pouring resources into building a new, more powerful chip for the glasses, hoping that on-device processing and tighter integration with the Apple ecosystem will make up for any gaps in AI wizardry. If they pull it off, the glasses could be a game-changer. If not, there’s a risk they end up as expensive Bluetooth sunglasses with a fancy voice assistant.
Apple’s privacy-first approach will be a big selling point, especially as the world grows more wary of always-on cameras and microphones. Expect features that process data locally, with clear indicators when the cameras or mics are active, and robust controls over what gets shared or stored. Apple knows it needs to balance convenience with trust if it wants people to wear these glasses all day, every day.
The competition is fierce. Meta’s Ray-Ban glasses are already popular, and Google, Samsung, and others are racing to launch their own smart eyewear. OpenAI, with ex-Apple designer Jony Ive, is also reportedly working on an AI wearable for 2026, aiming to set a new standard for the category. Apple’s edge will be its integration with the broader Apple ecosystem—think seamless handoff between your iPhone, Mac, and glasses, plus access to Apple Music, Maps, and more.
So, while your wrist won’t be getting a camera anytime soon, your face just might. Apple’s 2026 smart glasses are shaping up to be the company’s next big swing at the future of personal tech. Whether they’ll change the world or just become the latest status symbol remains to be seen, but one thing’s for sure: the race for your face is officially on.