Google AI Mode Now Available: Enhanced Research & Shopping

May 20, 2025
Google's AI Mode: Expanded Rollout and New Features

AI Mode, the experimental Google Search feature that lets users pose intricate, multi-part questions through an AI-driven interface, will roll out to all users in the United States starting this week. Google announced the expansion on Tuesday at its annual developer conference, Google I/O 2025.

Building Upon Existing AI Capabilities

This new feature expands upon Google’s pre-existing AI-enhanced search experience, known as AI Overviews. AI Overviews present AI-generated summaries at the very top of standard search results pages.

Launched last year, AI Overviews initially received a mixed reception, with the AI producing questionable responses and advice, including a widely shared suggestion to use glue to keep cheese on pizza.

Despite these early challenges, Google reports substantial adoption of AI Overviews, with over 1.5 billion monthly users engaging with the feature. It is now transitioning out of its testing phase ("Labs") and will be extended to over 200 countries and territories, supporting more than 40 languages.

Introducing AI Mode and Deep Search

AI Mode allows users to formulate complex inquiries and subsequently request follow-up information. Initially available for testing within Google’s Search Labs, its arrival coincided with other AI companies, such as Perplexity and OpenAI, venturing into Google’s core search territory with their own web search functionalities.

With its search market share potentially at risk, Google positions AI Mode as its vision for the future of search.

A key component of the broader AI Mode rollout is Deep Search. While AI Mode dissects a question into subtopics to provide answers, Deep Search performs this analysis on a much larger scale.

It can initiate dozens, or even hundreds, of individual queries to comprehensively address a user’s request, and importantly, includes links for further independent research.

The outcome, according to Google, is a fully referenced report generated in minutes, potentially saving users significant research time.

Enhanced Shopping Features

Google suggests utilizing Deep Search for tasks like detailed comparison shopping, whether for major appliances or activities like selecting a summer camp.

An additional AI-driven shopping feature within AI Mode is a virtual “try-on” capability for clothing. This utilizes a user-uploaded photograph to generate an image of the user wearing the selected item.

The system accounts for 3D shapes, fabric characteristics, and material stretch, Google explains, and will begin its rollout through Search Labs today.

Looking ahead, Google plans to introduce a shopping tool for U.S. users that will autonomously purchase items when they reach a specified price point. However, users will still need to initiate the process by clicking a “buy for me” prompt.

Underlying Technology and Future Developments

Both AI Overviews and AI Mode will now leverage a customized version of Gemini 2.5. Google anticipates that the capabilities of AI Mode will gradually be integrated into AI Overviews over time.

AI Mode will also incorporate support for complex data analysis in sports and finance queries, becoming available through Labs in the near future. This will allow users to ask intricate questions, such as comparing the home game win percentages of the Phillies and White Sox over the past five seasons.

The AI will consolidate data from multiple sources, present a unified answer, and even generate dynamic visualizations to aid in understanding the information.

Leveraging Project Mariner and Search Live

Another new feature utilizes Project Mariner, Google’s agent designed to interact with the web and execute actions on a user’s behalf. Initially focused on local services such as restaurants and events, the capability will let AI Mode streamline the process of researching prices and availability across multiple websites.

Search Live, scheduled for release later this summer, will enable users to ask questions based on the real-time visual input from their smartphone’s camera. This extends beyond the capabilities of Google Lens, allowing for an interactive conversation with the AI using both video and audio, mirroring the functionality of Google’s multimodal AI system, Project Astra.

Personalized Search Results

Search results will also be tailored to individual user preferences based on their previous search history. This personalization will be further enhanced by the option to connect Google Apps, a feature launching this summer.

For example, connecting Gmail could allow Google to identify travel dates from booking confirmations and subsequently recommend relevant events in the destination city.

Google acknowledges potential privacy concerns and assures users that they can connect or disconnect their apps at any time.

Gmail is the first app to be supported with this personalized context feature.

Tags: google ai, ai mode, google search, research tools, comparison shopping, ai features