[Image: Close-up of stylish acetate smart glasses featuring a small oval camera lens and a subtle LED indicator on the front frame.]

So, picture this. You’re walking down a busy street, your hands are full of groceries, and you want to snap a photo of something cool you just walked past. No fumbling for your phone. No unlocking, no opening the camera app. You just say “Hey Siri, take a photo” — and your glasses do it. That’s exactly the kind of everyday magic Apple is cooking up with its upcoming AI-powered smart glasses, codenamed “N50.”

Let me walk you through what we know, and why this might genuinely be one of the most exciting gadgets of 2027.

Apple’s playing a very specific game here

Forget the Vision Pro — that’s a whole different beast. These glasses aren’t trying to replace your reality with holograms. Think of them more like AirPods, but for your eyes. Apple is going straight after Meta’s Ray-Ban smart glasses, the ones you’ve probably seen people wearing that look almost completely normal. Same idea, but with Apple’s polish, privacy obsession, and ecosystem firepower behind them.

The internal team reportedly calls this device the anchor of a three-piece AI-wearable lineup: the glasses, a camera-equipped pendant, and next-gen AirPods. Together, they’re meant to give Siri a full set of senses — sight, sound, and context — all the time.

What can these glasses actually do?

Here’s where it gets genuinely interesting. The glasses pack two cameras. One is a high-resolution shooter for photos and videos. The other is a more discreet, low-power computer vision sensor that stays always on, constantly reading your environment — recognizing objects, tracking scenes, and building context.

Combine that with an upgraded Siri, and suddenly your glasses can answer questions like “What’s that plant?” or “What does that sign say?” in real time. They can guide you through pedestrian navigation by voice. They can remind you that you left your keys on the kitchen counter. They can translate street signs on the fly. All without you ever touching your phone.

And importantly — there’s no screen built into the lenses. All visual feedback goes to your iPhone or Apple Watch. The glasses are the sensor; your devices are the display.

It looks like actual eyewear, not a sci-fi prop

Apple apparently learned its lesson from the bulky, “obvious tech gadget” look that’s haunted smart glasses for years. These are being designed with premium acetate frames — the same material used in high-end optical eyewear — with four different frame styles to choose from. Think Wayfarer-style, slim rectangular, large oval, and a smaller circular option. Color choices include black, ocean blue, and a warm tortoiseshell brown.

The camera sits behind a vertical oval opening on the front frame, surrounded by a ring of subtle indicator lights. Those lights glow when the camera is recording or when Siri is listening. That’s not just a design choice — it’s Apple’s way of making sure people around you know when they’re being filmed. Smart, and probably very necessary in a world that’s increasingly skeptical of wearable cameras.

The privacy story is front and center

Apple is leaning hard into the “we respect your data” angle. Basic tasks — object recognition, voice commands — happen directly on the device using a custom chip inspired by Apple Watch silicon. More complex AI queries get sent to the cloud, but with encryption and user-controlled settings. You’ll be able to check logs of when the camera was active, limit what third-party apps can access, and even toggle a privacy mode that disables the camera entirely.
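That on-device-first split is a familiar edge-AI pattern: handle cheap, well-defined intents locally, and escalate everything open-ended to an encrypted cloud path. Here’s a minimal sketch of how such a router might look — purely illustrative, and none of these names correspond to real Apple APIs:

```python
from dataclasses import dataclass

# Hypothetical intents a wearable could resolve entirely on-device.
# Everything else gets escalated to an encrypted cloud endpoint.
ON_DEVICE_INTENTS = {"take_photo", "start_recording", "identify_object"}

@dataclass
class Request:
    intent: str   # e.g. "take_photo" or "translate_sign"
    payload: str  # raw query or sensor context

def route(request: Request) -> str:
    """Decide where a request is processed.

    Simple, well-defined intents stay local; open-ended queries
    (questions, translation, etc.) are encrypted and sent onward.
    """
    if request.intent in ON_DEVICE_INTENTS:
        return "on_device"
    return "cloud_encrypted"

print(route(Request("take_photo", "")))            # handled locally
print(route(Request("translate_sign", "街道标志")))  # escalated to cloud
```

The design choice worth noticing is that the routing decision itself runs locally, so the device never has to ask a server whether something is private enough to keep on-device.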

When can you actually buy them?

Production is expected to kick off in late 2026, with the glasses hitting store shelves sometime in 2027. Pricing will likely land somewhere in the range of high-end Ray-Ban Meta models — not cheap, but nowhere near Vision Pro territory either.

Analysts already predict Apple could become the second-biggest player in AI smart glasses by 2028, right behind Meta.