Query fan-out is the mechanism Google AI Mode uses to split a single search into 9 to 11 parallel sub-queries before deciding what to cite. Only 38% of AI Overview citations now pull from pages ranking in the top 10 for the same query, down from 76% seven months ago. If your brand ranks first on a primary keyword but doesn't appear in the sub-queries the AI generates from it, you are invisible in AI Overviews.
BrandCited is an AI visibility intelligence platform that monitors 9 AI platforms for brand citations and shows you exactly what to fix to appear in AI-generated answers.
The shift accelerated on January 27, 2026, when Google upgraded AI Overviews to run on Gemini 3. The new model performs more aggressive query expansion than its predecessors, generating broader and more varied sub-queries per search. An Ahrefs study of 863,000 keywords and 4 million AI Overview URLs found the citation overlap with top-10 rankings dropped from 76% in mid-2025 to 38% by February 2026. That 38-point drop is query fan-out in action.
What is query fan-out in Google AI search?#
Query fan-out is the process by which Google AI Mode breaks a single user query into multiple sub-queries before generating an answer. A user searching for "best CRM for B2B SaaS" triggers a fan-out into sub-queries like "B2B SaaS CRM comparison 2026," "CRM tools for sales teams," "Salesforce vs HubSpot for SaaS," and "CRM software features," according to upGrowth's query fan-out research.
Google AI Mode fires 9 to 11 sub-queries per search. ChatGPT's web search generates 2.3 to 2.8. The AI model uses a process called FastSearch to retrieve the top three organic results for each sub-query in parallel. Pages appearing across multiple sub-query SERPs get selected as citation sources. Your ranking on the original query is one data point among the 9 to 11 the AI evaluates.
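The selection logic described above can be sketched as a toy scoring model (illustrative only; Google does not publish FastSearch internals): a page earns one appearance for each sub-query SERP it ranks in, and pages with the most appearances across the cluster become the strongest citation candidates. The SERP data below is hypothetical.

```python
# Toy model of fan-out citation selection (illustrative only;
# FastSearch internals are not public).
from collections import Counter

def citation_candidates(sub_query_serps, top_n=3):
    """Count how often each page appears across the top-N results
    of every sub-query SERP, then rank pages by that coverage."""
    appearances = Counter()
    for serp in sub_query_serps:
        for page in serp[:top_n]:
            appearances[page] += 1
    return appearances.most_common()

# Hypothetical top-3 results for three fan-out sub-queries
serps = [
    ["brand-a.com/crm", "brand-b.com/crm", "brand-c.com"],  # "CRM comparison 2026"
    ["brand-b.com/crm", "brand-a.com/crm", "brand-d.com"],  # "CRM for sales teams"
    ["brand-a.com/crm", "brand-e.com", "brand-f.com"],      # "Salesforce vs HubSpot"
]
print(citation_candidates(serps))
# brand-a.com/crm appears in all three sub-query SERPs, so it
# tops the candidate list even if another page outranks it on
# any single query.
```

The point of the sketch: coverage breadth across sub-queries, not position on one query, drives the ordering.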
This mechanism replaced the approach AI Overviews used in mid-2025, when citation selection drew from the primary query's top-10 results. The Gemini 3 upgrade changed what the AI considers when building its answer, and the citation numbers shifted fast.
Why your #1 ranking stopped guaranteeing an AI Overview citation#
In mid-2025, ranking in the top 10 for a query produced consistent AI Overview citations. Ahrefs confirmed that 76% of cited pages also ranked in the top 10 for the same query. Organic SEO and AI visibility were roughly synonymous.
By February 2026, that overlap fell to 38%. Some datasets tracked by Search Engine Journal put the figure as low as 17%.
The reason is sub-query expansion. Google's AI no longer evaluates your content only in the context of the query you rank for. It evaluates your content in the context of every sub-query it generates from that query. A brand ranking #1 for "CRM software" but absent from "CRM for sales teams," "CRM comparison," and "CRM 2026" is eligible for citation on roughly one of nine sub-queries the AI evaluates.
From [ALM Corp's query fan-out impact analysis](https://almcorp.com/blog/the-query-fan-out-impact/): "Ranking for fan-out sub-queries only, without ranking for the main keyword at all, makes a page 49% more likely to earn AI Overview citations than ranking solely for the primary query."
That finding is counterintuitive but consistent. Sub-query coverage across a topic cluster generates more citation exposure than dominance on a single primary term.
Rand Fishkin, founder of SparkToro, put the underlying dynamic clearly: "The currency of large language models is not links. The currency is mentions — specifically, words that appear frequently near other words across training data." For AI Overviews, which use live retrieval, the same principle applies across sub-queries: frequency of presence across the topic cluster beats depth of presence on one term.
The citation math: coverage across the sub-query cluster#
ALM Corp's query fan-out research quantified the citation premium from different coverage patterns:
| Coverage type | Citation likelihood vs. primary-only ranking |
|---|---|
| Primary query only | Baseline |
| Fan-out sub-queries only | +49% |
| Primary query + fan-out sub-queries | +161% |
Combined coverage is not just additive. Pages ranking for both the main query and its sub-query cluster are 161% more likely to appear in AI Overview citations compared to pages ranking for the primary term alone.
Publishing one comprehensive page on a primary keyword while neglecting adjacent sub-queries leaves most of your potential AI Overview citation exposure unrealized.
Content freshness adds a second multiplier. [Qwairy's 2026 content freshness analysis](https://www.qwairy.co/blog/content-freshness-ai-citations-guide) found that content updated within 30 days receives 3.2 times more AI citations than older content on the same topic. AI systems automatically add the current year to 28.1% of sub-queries even when the user's original query contained no date, according to [position.digital's April 2026 AI SEO statistics](https://www.position.digital/blog/ai-seo-statistics/). Pages without a recent update signal lose citation eligibility on roughly one in three sub-queries before their content is evaluated.
How to map the sub-queries your brand needs to cover#
Google does not publish its fan-out sub-queries, but the structure follows five consistent patterns. Use this template for each of your primary topics:
```text
Primary keyword: [your topic]
Fan-out sub-query map:
1. Category query: "[broad topic] overview" or "[topic] explained"
2. Comparison query: "[your brand] vs [competitor]" or "best [topic] tools"
3. Use-case query: "[topic] for [specific job/industry/team]"
4. Feature query: "[topic] with [specific feature]"
5. Recency query: "[topic] 2026" or "best [topic] this year"
```
For each sub-query variant:
- What is your current ranking position?
- Does a specific page on your site answer it directly?
- Does that page open with a self-contained direct answer?
This map shows citation gaps immediately. Any sub-query variant where you have no ranking and no dedicated page is a gap preventing AI Overview citation for that topic cluster.
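The five-pattern template above can be turned into a small script that expands each primary keyword into its sub-query variants, ready to paste into a rank tracker. This is a minimal sketch; the phrasing patterns, placeholder slots, and default year are assumptions you should adjust per topic.

```python
def fan_out_variants(topic, brand=None, competitor=None, year=2026):
    """Expand a primary keyword into the five common fan-out
    sub-query patterns: category, comparison, use-case,
    feature, and recency."""
    return {
        "category":   f"{topic} explained",
        "comparison": (f"{brand} vs {competitor}"
                       if brand and competitor else f"best {topic} tools"),
        "use_case":   f"{topic} for [job/industry/team]",   # fill in per audience
        "feature":    f"{topic} with [specific feature]",   # fill in per product
        "recency":    f"best {topic} {year}",
    }

for kind, query in fan_out_variants("CRM software").items():
    print(f"{kind:>10}: {query}")
```

Run it once per primary keyword, then check each generated variant against your rank tracker to find the gaps.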
The comparison query variant deserves specific attention. AI Overviews generate "X vs Y" sub-queries from almost every category search. Brands without comparison pages miss those citation slots. Aleyda Solis's March 2026 core update analysis found the update rewarded destination brands with original perspectives, including comparison content, while penalizing intermediary and aggregator sites.
According to position.digital's 2026 data, 44.2% of all AI Overview citations come from the first 30% of a page's text. Each sub-query page needs its direct answer in the opening paragraph, not after setup.
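One quick way to audit the first-30% rule is to check where a target answer string starts relative to total page length. This is a rough sketch on plain text; real pages need HTML stripped first, and the sample text here is invented.

```python
def answer_position_ratio(page_text, answer_snippet):
    """Return where the answer starts as a fraction of total
    text length, or None if the snippet is absent."""
    idx = page_text.find(answer_snippet)
    if idx == -1:
        return None
    return idx / len(page_text)

# Invented sample: direct answer first, background after.
text = ("Query fan-out splits one search into parallel sub-queries. "
        + "Background and history follow. " * 20)
ratio = answer_position_ratio(text, "Query fan-out splits")
print(f"answer starts at {ratio:.0%} of the page")
```

A ratio under 0.30 means the answer sits inside the span where most citations originate.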
AI search updates from the last 24 hours#
- ChatGPT outage: OpenAI confirmed a partial service outage on April 20 affecting users globally for approximately 90 minutes. The OpenAI status page noted "monitoring recovery" through the incident. (Tom's Guide)
- GPT-5.4 mini rollout: OpenAI deployed GPT-5.4 mini as a fallback for Plus and paid users during high-demand periods for GPT-5.4 Thinking. (OpenAI release notes)
- Perplexity Comet enterprise: Perplexity's AI-native browser Comet is now available for enterprise organizations with MDM deployment across macOS and Windows. (Perplexity changelog)
- Google AI Mode growth: Google AI Mode has reached 75 million active users, with ads now appearing in 25% of AI-generated results. (Digital Applied)
How BrandCited audits query fan-out coverage#
BrandCited's visibility audit tests your brand across multiple prompt types per AI platform, including category-level prompts, comparison prompts, and use-case-specific prompts. The audit scores reflect coverage across the query cluster, not just a single branded or primary keyword search.
If your brand scores well on direct brand queries but below 60 on category and comparison prompts, query fan-out coverage is the structural gap. The audit flags this with specific content recommendations for the missing prompt types. Run a free AI visibility audit at brandcited.ai.
What to do right now#
1. Map the fan-out sub-queries for your five most important keywords. Use the template above to identify the category, comparison, use-case, feature, and recency variants for each topic. These are your citation coverage targets.
2. Check your ranking on each sub-query variant. A missing ranking is a missing citation slot. Use your rank tracker or a manual Google check for each variant.
3. Create comparison pages if you don't have them. "X vs Y" sub-queries appear in fan-out from almost every category search. Brands without comparison pages miss those citation opportunities entirely.
4. Update your highest-value pages within the next 30 days. Add one new data point, update a statistic, and change the `dateModified` in your schema markup. Content updated within 30 days gets 3.2 times more AI citations.
5. Write the opening paragraph of every sub-query page as a self-contained direct answer. 44.2% of AI citations come from the first 30% of text. The answer belongs in the first paragraph.
6. Run a BrandCited audit to see your coverage score across category, comparison, and use-case prompt types on all 9 AI platforms. The audit shows every gap ranked by impact.
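The `dateModified` refresh in step 4 can be scripted. This sketch rewrites the field in a page's JSON-LD Article block; it assumes the schema already exists in your HTML, and the sample schema string is hypothetical.

```python
import json
from datetime import date

def bump_date_modified(jsonld, new_date=None):
    """Update dateModified in a JSON-LD schema string to today
    (or a supplied ISO date) so crawlers see a fresh-update signal."""
    data = json.loads(jsonld)
    data["dateModified"] = new_date or date.today().isoformat()
    return json.dumps(data, indent=2)

# Hypothetical JSON-LD block from a page's <head>
schema = """{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Query fan-out explained",
  "dateModified": "2025-11-01"
}"""
print(bump_date_modified(schema, "2026-04-20"))
```

Pair the schema change with a visible content update; a date bump alone is not a substantive freshness signal.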
Run a free AI visibility audit at brandcited.ai. You'll see your citation coverage across 9 AI platforms in 30 seconds, with every gap ranked by impact.
Frequently asked questions#
What is query fan-out in Google AI search?
Query fan-out is Google AI Mode's process of splitting a single user search into 9 to 11 parallel sub-queries before generating an answer. The AI retrieves the top three organic results for each sub-query via FastSearch, then synthesizes them into the AI Overview response. Pages appearing across multiple sub-query results are more likely to be cited than pages ranked only for the primary query.
Why don't my top-10 Google rankings guarantee AI Overview citations?
An Ahrefs study of 863,000 keywords found that only 38% of AI Overview citations now come from top-10-ranked pages, down from 76% seven months ago. The drop followed Google's upgrade of AI Overviews to Gemini 3 in January 2026, which introduced broader query fan-out. Your ranking on the primary query is one of 9 to 11 sub-queries the AI evaluates, and missing coverage on the others removes you from most citation consideration.
How do I find the sub-queries Google generates for my keywords?
Google does not publish fan-out sub-queries, but they follow five consistent patterns: a category query, a comparison query, a use-case query, a feature query, and a recency query. The "People Also Ask" section in standard Google results is a close proxy for the sub-queries AI Mode generates. BrandCited tests your brand against multiple prompt types across 9 AI platforms and shows which categories have citation gaps.
How does BrandCited differ from a standard rank tracker for query fan-out coverage?
A standard rank tracker shows one position for one query. Query fan-out means your AI citation potential depends on coverage across 9 to 11 sub-queries per topic. BrandCited monitors brand citations using multiple prompt types per topic across 9 AI platforms, giving you coverage data that rank trackers cannot produce. The gap between rank tracker data and actual AI citation coverage is exactly the query fan-out gap.
How do I get my brand mentioned in Google AI Overviews?
Map the sub-query cluster for your primary keywords and build or update pages targeting the comparison, use-case, and recency variants. Write the opening paragraph of each page as a self-contained direct answer. Update that content every 30 days with new data, since content updated within 30 days receives 3.2 times more AI citations. Add FAQ schema to pages answering question-based queries, and submit your sitemap to Bing Webmaster Tools for ChatGPT and Copilot indexing.
How long until AI Overview citations change after fixing sub-query coverage?
Content updates typically get indexed within 7 to 14 days after a substantive change. Citation pattern shifts in AI Overviews appear within 2 to 4 weeks from indexing. Building rankings on new sub-query pages takes 4 to 12 weeks depending on domain authority and competition. BrandCited tracks citation changes weekly, so you can see movement as each fix takes effect.