Nvidia’s CEO explains how new AI models could power future smart glasses



Technical tools, whether phones, robots or self-driving vehicles, are getting better at understanding the world around us, thanks to artificial intelligence. That message echoed loud and clear throughout 2024, and it is getting louder at CES 2025, where chipmaker Nvidia unveiled a new AI model for understanding the physical world along with a set of large language models designed to power future AI agents.

Nvidia CEO Jensen Huang positioned these world foundation models as ideal for robotics and self-driving vehicles. But there's another category of device that could benefit from a better understanding of the real world: smart glasses. Tech-enabled eyewear like Meta's Ray-Bans has quickly become the hot new AI gadget, with shipments of Meta's glasses surpassing the 1 million mark in November, according to Counterpoint Research.

Such devices seem like the perfect vessel for AI agents: assistants that use cameras, speech processing and visual input to understand the world around you and help you get things done, rather than just answering questions.

Huang did not say whether Nvidia-powered smart glasses are on the horizon. But he explained how the company's new models could power future smart glasses if partners adopt the technology for that purpose.

"Using AI as it connects to wearables and virtual presence technology like glasses, that's all very exciting," Huang said during a press Q&A at CES, in response to a question about whether Nvidia's models could be used in smart glasses.

Read more: Smart glasses will work this time, Google's Android chief tells CNET


Huang pointed to cloud processing as one option, meaning queries that use Nvidia's Cosmos model would be handled in the cloud rather than on the device itself. Small devices such as smartphones often take this approach to reduce the processing burden of running demanding AI models. If a hardware maker wants to build glasses that run Nvidia's AI models on the device instead of relying on the cloud, Huang said, Cosmos can distill its knowledge into a smaller model that is less generalized and optimized for specific tasks.

Nvidia's new Cosmos model is being promoted as a platform for gathering data about the physical world to train models for robots and self-driving cars, much the way a large language model learns to generate text responses after being trained on written media.

"The ChatGPT moment for robotics is coming," Huang said in a press release.

Nvidia also announced a family of new AI models built on Meta's Llama technology, called Llama Nemotron, which are designed to accelerate the development of AI agents. It's intriguing to consider how these models and tools could be applied to smart glasses as well.

Nvidia's latest patent filing has sparked speculation about upcoming smart glasses, although the chipmaker has made no announcements about future products in this area. Still, Nvidia's new models and Huang's comments come after Google, Samsung and Qualcomm said last month that they are building a new mixed reality platform for smart glasses and headsets called Android XR, which suggests smart glasses may soon become more prominent.

Several new smart glasses were also on display at CES 2025, such as the RayNeo X3 Pro and the Halliday smart glasses. The International Data Corporation also predicted in September that smart glasses shipments would grow 73.1% in 2024. Nvidia's moves are another area to watch.

