The pieces of Apple's smart glasses already exist



Smart glasses are having a moment. The newest Meta Ray-Ban specs are available now, the latest Oakley devices are coming soon, and Google and Samsung will likely jump into the face-worn tech race next year with Android XR. With all this momentum, it has always felt inevitable to me that Apple would introduce its own smart glasses sooner or later, and the latest signs point to sooner.

Reports claim Apple has paused its rumored Vision Air hardware, a smaller, lighter successor to the current Vision Pro headset, in favor of smart glasses. To me, this looks like a pivot to compete with the wave of AI-powered glasses that everyone from Meta and Samsung to Google, Snap, Amazon, Xreal, Rokid and even OpenAI is either selling, developing or rumored to be exploring.

A pair of Meta Ray-Ban glasses on top of Apple's Vision Pro headset

Apple doesn't have smart glasses to rival Meta's Ray-Bans yet, but the Vision Pro is already exploring the limits of XR on our faces.

Scott Stein/CNET

As I've tested several pairs of smart glasses this fall, I've seen the pieces coming together for Apple. It already has the product catalog and wearable technology to make a splash, and it's much further along than you might realize. Here's how Apple's existing earbuds, watches, phones and software could shape its first pair of smart glasses.

AirPods audio technology

Apple has been working on technology for our faces for more than a decade. When I wore my first AirPods back in 2016 and got ridiculed for their odd look, it seemed like Apple was testing out a wearable design for our faces. It worked: today, everyone wears AirPods and other wireless earbuds, and no one gets ridiculed for it.


Once upon a time, having these things in my ears was surprising. Look where we are now.

Scott Stein/CNET

Since then, Apple has rolled out computational audio features that could fit perfectly with smart glasses. Think live translation in the latest AirPods firmware, head gestures for quick replies, heart rate tracking, ambient noise filtering to boost focus or help with hearing loss, and 3D spatial audio. There's also new open-ear noise cancellation technology in the AirPods 4, plus approval from the Food and Drug Administration for a hearing aid feature, something that has already appeared in smart glasses from companies such as Nuance.

All of these technologies could apply to augmented reality smart glasses, which typically have small open-ear speakers built into the frames for audio. AirPods could be just the beginning.

Image of two hands, one wearing an Apple Watch, the other wearing a Meta Neural Band

My wrists last week: on my right, the Neural Band that controls the Meta Ray-Ban Display glasses; on my left, an Apple Watch that already has some gestures, but no glasses for them to control yet.

Scott Stein/CNET

Control technology via Apple Watch

Meta's latest display glasses come with a Neural Band, which controls the on-lens display by using electrodes to read tiny muscle impulses and turn them into midair gestures. Apple already has a foot in the door with its own wrist-based controls.

The Apple Watch already supports double-tap and wrist-flick gestures to quickly reply to messages, answer calls or stop timers. I was impressed by Double Tap when it first arrived on the watch and immediately thought of how easily it could carry over to VR and AR headsets.

Apple's glasses could also pair directly with the Watch for quick at-a-glance readouts, potentially bypassing a built-in display entirely. Think of it as a viewfinder for camera glasses, or a wearable touchscreen for controlling connected apps. Meta has already hinted that its own Neural Band could flex to become a watch, and Google has plans for watches and glasses to intersect, too.


iPhone cameras continue to pack more into a smaller frame. Next: Glasses?

Joseph Maldonado/CNET

Camera technology via iPhone Air (and Vision Pro)

Apple has a long history of shrinking high-performance cameras into small spaces. The ultrathin iPhone Air pulled off its most impressive act of compactness yet this fall, and glasses will need even smaller cameras.

Apple already has experience putting cameras and other sensors on headsets. The Vision Pro's camera array is likely more complex than anything Apple's glasses would need to include.

Apple can also borrow from its existing controls. The iPhone's Camera Control button already has a capacitive touch sensor, which could hint at how its glasses might handle navigation via touch controls on the arms of the frame.

Maybe Apple will add 3D recording, letting you capture spatial photos and videos on the glasses to relive later in the Vision Pro headset. It's the same fantasy of recording your memories that the Vision Pro tried to sell with in-headset capture.

Apple needs to up its visual AI game

Apple's glasses will need camera-aware AI services, like the iPhone's Visual Intelligence. There's still a lot of work to do to catch up with Google Gemini and Meta AI, but glasses could be the perfect place to push that technology forward, and perhaps even train AI models on what they pick up over time.

As with Meta, solving the AI problem on glasses could lead to better AI in Apple's other future projects, such as cars.

Apple Stores are a natural fit for eyewear demos

Meta is building out retail experiences to showcase its new display glasses, but Apple already has a global fleet of stores, the same stores it used for elaborate tech demos during the Vision Pro launch. Apple Stores would make perfect sense for eyewear fittings and prescription orders, much as the Vision Pro already handles prescription lenses with partner Zeiss.

Xreal One glasses connected by cable to iPhone 15

Connecting a pair of Xreal One display glasses to an iPhone. Smart glasses and display glasses work with phones now, but they need to connect better.

Scott Stein/CNET

Better connectivity with phones is one of Apple’s specialties

Current smart glasses fall short when it comes to connectivity with phones and app stores, and Apple can solve that problem as well as anyone. Since Google and Apple control the pipelines for phone operating systems, Android and iOS, glasses makers are at their mercy to build the connections that let phones, smartwatches and other devices work together seamlessly.

Meta's glasses have to be run through a phone app, forgoing Siri and Gemini integration. Google's Android XR should help deepen glasses' connections on Android, and Apple could do the same on iOS. Apple making its own glasses could also pave the way for better support for other brands, or at least encourage iOS app developers to start thinking about glasses at all.

We probably won't know anything for sure about the debut of Apple's glasses until at least next year, so for now it's all speculation. But put the pieces together, and you can imagine some pretty special specs. Now Apple just needs to put them on my face.




