iOS 26 screenshots may be an interesting preview of Apple's delayed Siri upgrade


By [email protected]


When it launched, Apple's Visual Intelligence feature let you point a compatible iPhone's camera at the things around you to run a Google image search or ask questions via ChatGPT. At WWDC 2025, the company announced updates that expand Visual Intelligence, largely by building it into the screenshot system. According to the company's press release, "Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and now users are able to do more with the content on their iPhone screen."

This reminded me of the "onscreen awareness" Apple described as one of Siri's capabilities when Apple Intelligence was announced last year. At the time, the company said, "With onscreen awareness, Siri will be able to understand and take action with users' content in more apps over time." While it's not exactly the same thing, the new screenshot-based Visual Intelligence lets your iPhone take contextual actions on the content on your screen, just not via Siri.

In a way, that's logical. Most people are already used to taking a screenshot when they want to share or save important information they've seen on a webpage or in an Instagram post. Putting Apple Intelligence actions there theoretically delivers the tools where you'd expect them, instead of making users talk to Siri (or wait for its update).

Basically, in iOS 26 (on Apple Intelligence-capable devices), pressing the power and volume-down buttons to capture a screenshot will pull up a new page. Instead of a thumbnail of your saved image appearing at the bottom left, you'll see the image you captured fill nearly the entire screen, with options around it to edit, share or save the file, plus Apple Intelligence-powered answers and actions below. In the bottom left and right corners sit buttons to ask ChatGPT and to run a Google image search, respectively.

Depending on what's in the screenshot, Apple Intelligence can suggest different actions below your image. That could be asking where to buy a similar item, adding an event to your calendar or identifying types of plants, animals or food, for example. If there's a lot going on in your screenshot, you can draw over an element to highlight it (similar to how you select an object to erase in Photos) and get information about just that part of the image.

Third-party apps and services that opt in, such as Google, Etsy and Pinterest, can appear here too, so you can take actions within that area as well. For example, if you've found a bookend you want, or captured a screenshot and selected it, you can shop for it on Etsy or pin it to Pinterest.

One aspect of this update to Visual Intelligence that gives me pause: for people like me who screenshot without thinking and don't want to do anything beyond saving receipts, this could add a frustrating step between taking a screenshot and getting it into Photos. It appears you may be able to turn off this interface and stick with the existing screenshot flow.

The examples Apple gave of Siri's ability to understand what's on your screen were somewhat similar. In a press release last year, Apple said, "For example, if a friend texts a user their new address in Messages, the receiver can say, 'Add this address to his contact card.'"

Like Visual Intelligence in screenshots, this involves scanning onscreen content for relevant information and helping you put it somewhere (like Contacts or Calendar) where it's more useful. The promise of the new Siri era, however, was more about interacting with every part of your phone, across both first- and third-party apps. So you could ask the assistant to open an article you added to your Reading List in Safari, or to send photos from a specific event to a contact.

Apple clearly hasn't delivered those Siri improvements yet, and as Craig Federighi said at the WWDC 2025 keynote, that capability will only be discussed later this year. But while we wait for that status update, the changes to screenshots may be a preview of what's coming.

If you buy something through a link in this article, we may win the commission.


