Google I/O 2025: Google’s AI Search speeds up smart shopping and chart creation

The search engine is going full throttle into its AI Mode feature. Here’s a summary of the six new things that will make searching for stuff online easier and faster. #google #shopping #gemini #agenticai

Note: This article was first published on 21 May 2025.

Google I/O 2025 brings about massive changes to how people will be using search engines in the future. Image: Google.

Google I/O 2025 just dropped its key announcements, with Google expounding on its AI search capabilities across its many products and services. Here’s the quick lowdown on Google’s AI Mode and why it will change how you search on Google in the future.

What’s AI Mode?

AI Mode. Image: Google.

In a nutshell, AI Mode is Google AI Search with multimodal capabilities. It’s much more than the AI Overview summaries you already see in your search results, because you can hold a ChatGPT-style conversation, stringing searches together based on your previous query. We’ve previously covered what AI Mode does and its progress from beta testing to gradual rollout.

As of Google I/O 2025’s announcement, it’s still only available to U.S.-based users, but it no longer requires Google Labs access.

Also coming to U.S. users is a custom version of Gemini 2.5 in AI Mode.

Deep Search in minutes

Deep Search isn't only for academic research. You can save plenty of time with just a single query. Image: Google.

As an evolution of its “past searches” feature, AI Mode now offers Deep Search.

For the everyday user, AI Mode can now go beyond one-dimensional queries, pull together every relevant piece of information, and even create fully cited reports in minutes. Without this capability, the same search journey would have taken “hours,” per Google’s statement on this feature.

The technical explanation is that Deep Search uses the same query fan-out technique as AI Mode, but powered by AI to issue hundreds of searches on your behalf.
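Google hasn’t published implementation details, but the fan-out idea can be sketched in a few lines: decompose one broad question into narrower sub-queries, run them concurrently, and merge the results. Everything below is illustrative — the function names and the mock `search` backend are our assumptions, not Google’s actual system.

```python
# A rough sketch of a query fan-out: one broad question is split into
# narrower sub-queries that run in parallel, and the results are merged.
from concurrent.futures import ThreadPoolExecutor

def search(sub_query: str) -> list[str]:
    # Stand-in for a real search backend; returns mock results.
    return [f"result for '{sub_query}'"]

def fan_out(query: str, sub_queries: list[str]) -> dict[str, list[str]]:
    # Issue all sub-queries concurrently and collect results per sub-query.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(search, sub_queries))
    return dict(zip(sub_queries, results))

report = fan_out(
    "best laptops for travel",
    ["lightest laptops 2025", "laptop battery life rankings", "travel laptop reviews"],
)
```

In a real system each sub-query would hit a live index and an AI model would synthesise the merged results into a cited report; the parallelism is what turns “hours” of manual searching into minutes.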

Live visual search with Google Lens

Pointing and asking is now possible with Live Visual Search via Google Lens. Image: Google.

Using the camera, photos, and images for visual search is nothing new, but thanks to multimodal AI input, visual search now works in real time.

Live visual search appears as an icon in AI Mode. Point the camera at an object, ask questions, and Live visual search will answer. Google said it works on tricky concepts and can also pull in resources (websites, videos, forums, and more) to help the user out.

According to Google, this real-time aspect was made possible by its work on Project Astra, which is the same project that brought us Gemini Live.

Agentic AI in Search

Another of Google’s projects, Project Mariner, is responsible for agent-based web browser interactions. Google’s work on Project Mariner forms the underpinnings of agentic AI inside Google Search via AI Mode.

Not much was shared about this beyond how it works. Google gave the example of Search weighing your query to surface the right results: ask for affordable tickets to a specific sporting event, and the agentic side of Search presents results that match your affordability criteria.

All the user needs to do is pick their preferred ticketing website to complete the search journey.

Let’s go AI shopping

In our view, this is probably Google's most exciting AI tool for its search engine.

An example of AI Mode's search results when you're out shopping on the web. Image: Google.

As an evolution of its Shopping Graph tool, Google has taken it upon itself to improve the best kind of therapy (retail therapy, a.k.a. shopping).

The revamped AI Mode shopping experience combines Shopping Graph with Gemini to offer an online shopping experience that roughly looks like this:

If asked, AI Mode itself will help the user ideate on style: the AI understands you’re keeping your options open and browsing to see if anything you’re searching for tickles your fancy.

The user can refine the search by adding details to the original query, like the travel destination or occasion they are shopping for. AI Mode uses its query fan-out technique to run several searches simultaneously to achieve this.

The virtual try-on feature takes some of the guesswork out of style shopping. Image: Google.

During the shopping process, you can also upload a single image of yourself for a virtual try-on to determine the fit and style.

Once you’ve decided on an item, you can also use agentic AI to stay notified of price drops and buy when the price is right.

Google is giving nobody any excuses to look less than their best.

Personalisation in AI Mode

From the examples above, it’s clear that AI Mode is only meaningful if it can remember your personal preferences.

AI Mode will work across Google’s app ecosystem and can draw on past searches to add context to search results. Personalisation can be disabled at any time via your settings.

Visualise data with auto-generated charts

AI Mode can help visualise certain search results to make data easier to read. Image: Google.

If you’re numbers-blind or simply unwilling to exert more brain cells than necessary, Google’s AI Mode can now generate charts for datasets, entirely from your queries. The charts can be interactive if you want further breakdowns or callouts.

For now, this feature is available for sports and finance search queries.

So, when will all this arrive?

At Google I/O 2025, Google said these features are rolling out to all U.S.-based Google users. There’s no news for other regions yet, but we’re confident they’ll come later. Stay tuned.



Source: Google Blog (1, 2)

