The Meta Smart Glasses will soon provide detailed information on visual stimuli




Meta's Ray-Ban glasses are getting a new feature to better help the blind and low-vision community. The AI assistant will now provide "detailed responses" about what is in front of users "when people ask about their environment," according to Meta. To get started, users simply opt in via the device settings section of the Meta AI app.

The company shared a video clip of the tool in action, in which a blind user asks Meta AI to describe a grassy area in a park. The assistant quickly gets to work, pointing out a path, trees, and a body of water in the distance. The AI assistant was also shown describing the contents of a kitchen.

This looks like a fun addition even for people without visual impairments. The feature rolls out to all users in the United States and Canada in the coming weeks, and Meta plans to expand to additional markets in the near future.

It's Global Accessibility Awareness Day (GAAD), so this isn't the only tool Meta announced today. There's also a nifty Call a Volunteer feature, which automatically connects blind or low-vision people with a "network of sighted volunteers" to help complete daily tasks. The volunteers come from Be My Eyes, and the feature launches later this month in 18 countries.

The company also recently announced a more accurate live captions system for its line of VR headsets. This turns spoken words into real-time text, so users can "read the content as it's delivered." The feature is already available on Quest headsets and within Meta Horizon Worlds.


