For months, the SEO industry has debated how much traffic Google's AI Overviews actually steal from publishers. The numbers floated around ranged widely: 20% here, 40% there, 60% in edge cases. Most of those estimates came from observational studies, correlation analyses, or anecdotal reports from individual sites. Useful, but not conclusive.

That debate is now over.

On April 3, 2026, researchers Saharsh Agarwal and Ananya Sen published a working paper on SSRN titled "Google AI Overviews and Publisher Traffic: Evidence from a Field Experiment." On April 30, Search Engine Journal broke the story to the broader industry. What makes this study different from everything that came before is the methodology: this was a randomized field experiment, not an observational study. The researchers had a treatment group and a control group. They could isolate the effect of AI Overviews from every other variable.

The headline number is 38%. When AI Overviews appeared on a query, outbound organic clicks dropped by 38%. Zero-click search rose from 54% to 72%. And here is the detail that should make every publisher, brand, and SEO professional stop scrolling: removing AI Overviews did not reduce user satisfaction.

Google built a feature that keeps users on Google. It does not send them to better answers. It does not make them happier with the results they see. It simply absorbs the click that would have gone to a publisher.

What the experiment actually measured

Agarwal and Sen designed the study with the kind of rigor that academic peer review demands and that most industry reports lack. Participants were randomly assigned to either see AI Overviews or not see them when they performed Google searches. The researchers then measured three things: outbound clicks, zero-click rates, and user satisfaction scores.

The AI Overviews in the study appeared on approximately 42% of queries. That is a substantial share of search volume. When AI Overviews were present, the average number of outbound clicks per search dropped from 0.61 to 0.38, a 38% relative reduction. That is not a marginal dip. It is a structural shift in how search behaves.

The zero-click rate tells the same story from the other direction. Without AI Overviews, 54% of searches ended without the user clicking any external link. With AI Overviews, that figure jumped to 72%. Put differently: nearly three out of four searches ended with the user staying on Google's own page.
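The study's headline figures are internally consistent, and it is worth seeing how they relate. A quick sketch (not from the paper itself) re-deriving the 38% reduction and the zero-click shift from the reported averages:

```python
# Re-deriving the headline figures from the reported group averages.
# All inputs are the study's numbers as quoted in this article.

clicks_without_aio = 0.61  # avg outbound clicks per search, control group
clicks_with_aio = 0.38     # avg outbound clicks per search, treatment group

# Relative click reduction when an AI Overview is present.
drop = 1 - clicks_with_aio / clicks_without_aio
print(f"Click reduction with AIO present: {drop:.0%}")  # ~38%

# The same ratio read the other way: how much more often the
# control group (no AI Overviews) clicked.
uplift = clicks_without_aio / clicks_with_aio - 1
print(f"Control group clicks more often by: {uplift:.0%}")  # ~61%

# Zero-click shift, in percentage points.
zero_click_without = 0.54
zero_click_with = 0.72
delta_points = (zero_click_with - zero_click_without) * 100
print(f"Zero-click increase: {delta_points:.0f} percentage points")  # 18
```

The same two group averages yield both the 38% reduction quoted here and the "61% more clicks without AI Overviews" framing used later in the article; they are one finding expressed two ways.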


The satisfaction finding is the one that matters most for the policy debate. Google has consistently argued that AI Overviews improve the search experience. The company's VP of Search, Liz Reid, has publicly claimed that AI Overviews help users find what they need more efficiently. The Agarwal and Sen experiment directly tested this claim and found no measurable difference in user satisfaction between the treatment group (AI Overviews visible) and the control group (AI Overviews removed). Users were not happier. They were not more satisfied. They simply clicked less.

The 38% number in context

To understand what a 38% reduction means, consider what happens to a publisher who currently receives 100,000 organic visits per month from Google. If AI Overviews trigger on 42% of the queries that previously drove that traffic, and those queries lose 38% of their clicks, the publisher loses roughly 16,000 visits per month. That is not a rounding error. It is a headcount reduction, a content budget cut, or a revenue decline.
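The paragraph's arithmetic can be written as a tiny model. The 100,000-visit baseline is the article's hypothetical, and the sketch assumes AI Overview-eligible queries carry traffic in proportion to their 42% query share, which will vary by site:

```python
# Rough monthly traffic-loss model for a hypothetical publisher.
# aio_share and click_drop are the study's figures; the baseline is
# the article's illustrative example, not real data.

monthly_visits = 100_000   # hypothetical baseline organic visits
aio_share = 0.42           # share of queries where AI Overviews trigger
click_drop = 0.38          # click reduction on AIO queries

lost_visits = monthly_visits * aio_share * click_drop
print(f"Estimated visits lost per month: {lost_visits:,.0f}")  # ~15,960
```

Swapping in a site's own traffic volume and its actual share of AI Overview-eligible queries (measurable from rank-tracking tools) turns this from an illustration into a first-pass risk estimate.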

The effect compounds when you layer in the other data points that have emerged in recent months.

Seer Interactive's September 2025 analysis reported that paid click-through rates on informational queries dropped 68% when AI Overviews were present. Tyneside Marketing found that AI Overviews reduce organic CTR for position one by up to 58%. Position Digital's April 2026 update to its AI SEO statistics compendium puts the CTR on links inside AI Overviews themselves at just 2.4%, up from a floor of 1.3% in December 2025 but still dramatically below traditional organic CTR.

For brands that have built their acquisition strategy around ranking in Google's top three results, the math is unforgiving. You can rank first, appear in an AI Overview, and still see your traffic decline because the AI Overview answered the user's question before they ever needed to visit your page.

Why this study matters more than the others

The SEO industry has seen plenty of data about AI Overviews and traffic decline. The Seer Interactive analysis was widely cited. The Tyneside Marketing study on position one CTR got attention. Various tool providers have published before-and-after case studies from individual sites.

All of those studies were observational. They looked at what happened to traffic after AI Overviews appeared. The problem with observational data is confounding variables. Maybe traffic declined because of a Google algorithm update. Maybe the queries that trigger AI Overviews are structurally different from those that do not. Maybe seasonal trends distorted the picture.

Agarwal and Sen solved this problem with randomization. By randomly assigning users to see or not see AI Overviews, they eliminated confounding variables. The only difference between the two groups was the presence of AI Overviews. When the group without AI Overviews clicked 61% more often and reported the same satisfaction, the causal arrow was clear: AI Overviews cause the click reduction, and they do not cause a satisfaction improvement.

This is the difference between "we think AI Overviews reduce traffic" and "we can prove AI Overviews reduce traffic." The SEO industry now has proof. The burden of evidence has shifted to Google to demonstrate that AI Overviews provide a user benefit that justifies the publisher cost.

The zero-click trajectory: from feature to default

The Agarwal and Sen data does not exist in isolation. It is part of a trajectory that has been visible for months.

Search Engine Land published a strategic framework on April 30 called "From Paid Clicks to Answer Equity." The framing captures the shift precisely: the old search economy was built on clicks. Publishers created content, Google sent clicks, publishers monetized those clicks through ads, subscriptions, or commerce. The new search economy is built on answers. Google creates the answer using publisher content as raw material, presents it in an AI Overview, and the click never happens.

Position Digital's data adds another layer. Google's AI Mode, the deeper AI search experience the company launched in early 2026, has a zero-click rate of approximately 93%. That means more than nine out of ten AI Mode searches end without any external click. AI Overviews look almost generous by comparison at Position Digital's 43% zero-click rate, though the measurement bases differ: the field experiment recorded a 72% zero-click rate on queries where an Overview actually appeared.

The trajectory is clear. Google is moving from a search engine that sends traffic to an answer engine that keeps traffic. The question is no longer whether this is happening. The Agarwal and Sen study proves it is. The question is what brands and publishers do about it.

What the data means for brands

The strategic implications of the 38% figure are straightforward, even if the operational response is not.

First, measuring success by organic click volume is no longer sufficient. A brand could maintain its rankings and still lose traffic because AI Overviews intercept the click. The metric that matters now is visibility within AI answers, not just ranking position. Are you cited in the AI Overview? Are you the source that Google's model draws from when generating its summary? If you are not, your ranking is invisible to the user who reads the AI Overview and never scrolls down.

Second, the distinction between being cited and being summarized matters enormously. Google's AI Overviews sometimes link to sources and sometimes do not. The Agarwal and Sen study measures overall click reduction, which includes both scenarios. But a brand that is cited as a source within an AI Overview at least gets a small fraction of users who click through. A brand that is merely summarized without attribution gets nothing. Citation visibility, not just keyword ranking, is the metric that determines whether you lose 38% or 100% of your traffic on AI Overview queries.

Third, the conversion value of the remaining clicks is likely higher. This is the paradox of AI search traffic: fewer clicks, but higher intent. Users who click through an AI Overview to visit a source have already read the summary and decided they want more detail. They are further down the funnel than a user who clicks a blue link from a traditional SERP. Multiple independent studies, including data from Ahrefs and Digital Bloom, show AI referral traffic converting at 10-16%, compared to 2-5% for traditional organic. The 38% click reduction hurts volume, but the remaining clicks are worth more per visit.
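To make the volume-versus-intent trade-off concrete, here is a rough sketch using the article's ranges. Applying the AI-referral conversion range to surviving clicks is a simplification for illustration, not a claim from the study, and the midpoint rates are assumptions:

```python
# Break-even arithmetic for the volume-vs-intent trade-off.
# Conversion figures are midpoints of the ranges quoted in the article,
# treated as illustrative; actual rates vary widely by site and funnel.

baseline_clicks = 1_000
click_drop = 0.38                 # from the field experiment
traditional_cvr = 0.035           # midpoint of the 2-5% organic range
ai_referral_cvr = 0.13            # midpoint of the 10-16% AI range

remaining_clicks = baseline_clicks * (1 - click_drop)

# Conversion rate the remaining clicks would need just to match
# the baseline's total conversions.
breakeven_cvr = baseline_clicks * traditional_cvr / remaining_clicks
print(f"Break-even CVR on remaining clicks: {breakeven_cvr:.1%}")  # ~5.6%

print(f"Conversions before: {baseline_clicks * traditional_cvr:.0f}")
print(f"Conversions after:  {remaining_clicks * ai_referral_cvr:.0f}")
```

The point of the sketch is the break-even line: a 38% volume loss is offset once the surviving clicks convert roughly 1.6 times better, a bar the quoted AI-referral ranges clear with room to spare.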

Fourth, the satisfaction finding changes the competitive narrative. Google cannot credibly argue that AI Overviews are a net positive for users when a randomized experiment shows no satisfaction improvement. This matters for regulatory conversations, for publisher negotiations, and for the broader public understanding of what AI search actually does.

The evidence gap Google needs to close

Google's public position on AI Overviews has been consistent: they improve the search experience. The company points to internal data showing that users spend more time on pages they click from AI Overviews, suggesting higher engagement. It points to the breadth of queries where AI Overviews appear, suggesting utility.

The Agarwal and Sen study challenges both of these claims. The satisfaction data shows no improvement. The click data shows substantial traffic diversion. And the zero-click data shows that the vast majority of AI Overview queries never result in a click at all.

Google's strongest remaining defense is that AI Overviews serve queries that previously had no good answer. If a user asks a complex, multi-part question and the AI Overview synthesizes information from five sources into a coherent response, that might represent a genuine improvement over the pre-AI Overview experience of visiting five separate pages.

But the Agarwal and Sen study did not find evidence for this. The satisfaction scores were flat across the board, including for complex queries. If AI Overviews were genuinely helping users with difficult questions, satisfaction should have increased for those query types. It did not.

Where this fits in the broader AI search landscape

The 38% figure is not happening in isolation. The Datos Q1 2026 State of Search report, published the same week, shows AI tools at 1.72% of desktop visits, up from 0.98% a year ago. That is 76% year-over-year growth. Google still holds 94.3% of search market share, but the AI search category is expanding fast enough that total search volume grew 26% worldwide.

This means two things are true simultaneously. AI Overviews are reducing organic click-through rates on Google, and AI search engines like ChatGPT, Perplexity, and Claude are creating new referral traffic channels that convert at significantly higher rates. The brands that win in this transition will be the ones that stop treating AI search as a threat to their Google traffic and start treating it as a new distribution channel with its own economics.

The 38% study is the clearest evidence yet that the old playbook is obsolete. Ranking well in Google is still necessary, but it is no longer sufficient. Visibility inside AI answers, across both Google's own AI Overviews and standalone AI platforms, is now a separate and critical requirement.

What to do now

For brands and publishers trying to navigate this shift, the Agarwal and Sen study provides both a warning and a framework.

The warning is that traffic loss from AI Overviews is real, causal, and substantial. This is not speculation. It is experimental evidence. Brands that have not yet measured their AI visibility, meaning whether they are being cited or summarized in AI Overviews and AI search results, are operating blind.

The framework is that the brands mentioned in AI Overviews earn significantly more organic clicks than those outside them. Industry data suggests a 35% uplift in clicks for cited sources. This means that AI visibility is not just a defensive play. It is an offensive one. Being the source that AI engines cite is the new version of ranking first.

Start with an AI visibility audit to understand where your brand stands. The 38% figure tells you the scale of the traffic at risk. The audit tells you whether your brand is on the right side of that equation.

Sources

  1. Agarwal, S. & Sen, A. "Google AI Overviews and Publisher Traffic: Evidence from a Field Experiment." SSRN Working Paper, April 3, 2026.
  2. Search Engine Journal. "Randomized Field Experiment Shows AI Overviews Reduce Organic Clicks 38%." April 30, 2026.
  3. Position Digital. "AI SEO Statistics: 2026 Update." April 29, 2026.
  4. Search Engine Land. "From Paid Clicks to Answer Equity." April 30, 2026.
  5. Tyneside Marketing. "Position 1 CTR Drops 58% with AI Overviews." 2026.
  6. Datos. "Q1 2026 State of Search Report." April 2026.
  7. Digital Bloom. "Gen AI Traffic Report." February 2026.

Frequently Asked Questions

What did the Agarwal and Sen field experiment measure? The study randomly assigned users to either see or not see Google AI Overviews during search, then measured outbound clicks, zero-click rates, and user satisfaction scores across both groups. This randomized design allowed the researchers to isolate the causal effect of AI Overviews from other variables.

Does the 38% click reduction apply to all searches? No. The 38% reduction applies specifically to queries where AI Overviews appear. The study found AI Overviews triggered on approximately 42% of queries. Queries without AI Overviews were unaffected. The overall impact on a given website depends on what share of its Google traffic comes from AI Overview-eligible queries.

Why is user satisfaction flat if AI Overviews reduce clicks? This is the most important finding in the study. Google has argued that AI Overviews improve the search experience. The data shows they do not. Users are not more satisfied when they see AI Overviews. They simply click less. This suggests AI Overviews are substituting for outbound navigation rather than enhancing answer quality.

How does this compare to AI Mode's zero-click rate? AI Mode has a much higher zero-click rate, approximately 93% according to Position Digital, which puts AI Overviews at around 43% across queries generally; the field experiment's 72% figure applies only to queries where an Overview appeared. The 38% figure from the field experiment measures the incremental click reduction specifically attributable to AI Overviews on those queries. Both products reduce outbound traffic, but AI Mode does so far more aggressively.

What should brands do in response? Measure AI visibility first. Understand whether your brand is being cited inside AI Overviews and AI search results, or merely summarized without attribution. Then optimize for citation: structured data, clear definitions, original research, and answer-first content formatting. The brands that are cited in AI answers earn more clicks than those that are not, even in an AI Overview environment.

See the full AI search statistics dashboard for more data points on how AI search is reshaping traffic and conversion.