Google’s AI Mode gets a new “Canvas” feature, real-time help in Search Live, and more


By [email protected]


Google announced on Tuesday that it’s adding new capabilities to AI Mode, its experimental feature that lets users ask complex questions and follow-ups in order to dig deeper on a topic directly in Search.

One of the new features, Canvas, helps you build study plans and organize information across multiple sessions in a side panel. For example, if you want to create a study plan for an upcoming test, you can click the new “Create Canvas” button to get started. From there, AI Mode will start compiling material in the Canvas side panel, and you can keep refining it with follow-up prompts to fit what you’re looking for.

Soon, you’ll also be able to upload files such as class notes or a syllabus to customize your study guide. Users enrolled in the AI Mode experiment in Labs in the U.S. will see Canvas in the coming weeks.

Image credits: Google

Google is also bringing Project Astra’s capabilities directly to AI Mode via Search Live, which has been combined with Google Lens, the tech giant’s visual search tool.

“Going live with Search is like having a quick video call with an expert who can see what you see and talk you through difficult concepts in real time, all with easy access to helpful links on the web.”

To use the feature, open Lens in the Google app, tap the Live icon, and ask a question while pointing your camera at something. Users can then have a back-and-forth conversation with Search in AI Mode using the visual context from their camera feed.

Image credits: Google

Search Live with video input is rolling out this week on mobile in the U.S. for users enrolled in the AI Mode Labs experiment.


In addition, Google announced that users will soon be able to use Lens in AI Mode to ask about what’s on their desktop screen.

“Maybe you’re looking at an engineering problem and want to better understand one of the graphs,” Stein said. “Click ‘ask Google about this page’ from the address bar and select the graph. You’ll get an AI Overview with a snapshot of the key information right in the side panel. Starting this week, you’ll be able to ask follow-up questions in AI Mode, by selecting AI Mode at the top of the Lens search results or by clicking the ‘dive deeper’ button at the bottom.”

Image credits: Google

Additionally, while you can already use AI Mode in the Google app to ask questions about images, you can now do so on desktop as well. Google is also adding support for PDF uploads on desktop, allowing you to ask detailed questions about your documents.

For example, you could upload PDF slides from a school lecture and ask follow-up questions to deepen your understanding beyond the class materials.

Google says AI Mode will support additional file types beyond PDFs and images later this year, including Google Drive files.

