Search Engine Journal
Google's AI Overviews Cut Clicks Without Satisfaction Gain: Report
A randomised field experiment involving 1,065 US desktop Chrome users found that Google's AI Overviews reduce organic clicks by 38% on queries where they appear. Zero-click searches rose from 54% to 72% when AI Overviews were present. Researchers recruited participants via Prolific and deployed a Chrome extension that randomly assigned them to one of three groups: standard Google Search, AI Overviews hidden, or all searches redirected to AI Mode. The experiment ran for two weeks per participant between January and February 2026 and was pre-registered with the AEA RCT Registry, making it the first randomised field experiment on AI Overview impact run under real browsing conditions.
One finding closes off the user-benefit argument: self-reported satisfaction, perceived quality, and ease of finding information were nearly identical whether AI Overviews were present or removed. The study authors concluded that AI Overviews "divert traffic away from publishers without delivering measurable improvements in user experience." The effect was strongest when AI Overviews appeared at the top of the page, which happened in 85% of cases. For publishers and SEOs, this is the clearest evidence yet that the click reduction from AI Overviews is structural, not a temporary side effect of a new feature finding its footing.
Key points
- AI Overviews reduce organic clicks by 38% on triggered queries, based on a randomised field experiment
- Zero-click searches rose from 54% to 72% when AI Overviews appeared
- 1,065 US desktop Chrome users, randomised into three groups, January to February 2026
- User satisfaction, perceived quality, and ease of finding information were nearly identical with and without AI Overviews
- AI Overviews appeared at the top of the page in 85% of cases, where the click reduction effect was strongest
- First pre-registered randomised field experiment measuring AI Overview impact in real browsing conditions
Key takeaway
The 38% click reduction is now backed by experimental evidence, not correlation. If a meaningful share of your organic traffic comes from queries that trigger AI Overviews, plan for that decline as a structural feature rather than a phase. The most actionable response is to identify which of your pages rank on queries that consistently trigger AI Overviews, and assess whether your content and conversion strategies reflect the reduced click opportunity on those queries.
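As a rough back-of-envelope check, the study's headline number can be turned into a traffic-at-risk estimate. A minimal sketch: the 38% reduction is from the source, while the site's total clicks and its share of traffic on AI Overview-triggered queries are hypothetical inputs you would pull from your own analytics.

```python
# Estimate expected organic click decline from AI Overviews (AIO).
# The 38% click reduction on AIO-triggered queries comes from the study;
# the traffic figures below are hypothetical illustrations.

AIO_CLICK_REDUCTION = 0.38  # from the randomised field experiment


def expected_click_loss(total_clicks: int, share_on_aio_queries: float) -> float:
    """Clicks expected to disappear if AIO-triggered queries lose 38% of clicks."""
    return total_clicks * share_on_aio_queries * AIO_CLICK_REDUCTION


# Hypothetical site: 100,000 monthly organic clicks, 40% from AIO-triggered queries.
loss = expected_click_loss(100_000, 0.40)
print(f"Expected monthly click loss: {loss:,.0f}")  # → Expected monthly click loss: 15,200
```

This is deliberately crude: it assumes the 38% average applies uniformly to your queries, which the study's own finding about top-of-page placement suggests is not quite true, so treat it as an upper-level planning figure rather than a forecast.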
Also worth considering
The combination of 38% fewer clicks and no measurable user benefit is the data point that regulators and publishers will cite. It removes the argument that AI Overviews serve users better at the expense of publishers. They reduce publisher traffic and produce no improvement in user experience. That is a different kind of finding, and it changes the political and legal context around how AI Overviews are framed.
What I'm testing
Segmenting top organic pages against AI Overview trigger data to estimate what proportion of that traffic is structurally at risk. The goal is to build a traffic model that treats AI Overviews as a stable feature, not an experiment, and to identify which content categories have the highest AI Overview trigger rates so I know where the click exposure is concentrated.