Search isn’t ten blue links anymore.
Google AI Overviews, ChatGPT search, Perplexity, Claude, featured snippets, knowledge panels. The way people find and consume information is fragmenting across surfaces that summarize, extract, and repackage your content instead of linking to it.
The question isn’t whether AI search matters. It’s whether your content is structured to survive summarization. If an AI system can’t parse your page, extract the key information, and attribute it correctly, you’re invisible on the fastest-growing search surfaces.
Things are changing faster than anyone can keep up with right now. Google ships AI Overview changes monthly. ChatGPT search keeps expanding. Perplexity went from curiosity to real competitor in under a year. Claude has web browsing, Claude Code, and full computer-use agents. AI systems are already doing product research, vendor evaluation, and purchasing on behalf of users, and your content is either structured for that world or invisible in it. The barrier to “good enough” content just dropped to zero, which means the only moat left is original data, real expertise, and structure machines can actually parse. If you’re not building that, someone in your industry is.
Modern search lives primarily in the Get Chosen layer. Your content might be crawlable and well-structured, but the new question is whether an AI system chooses your content as the source for its answer. That selection depends on clarity, specificity, authority, and whether your content provides information gain over competing sources.
What’s Actually Changing
Not everything. But the things that are changing matter a lot.
Answers replace links
AI Overviews, featured snippets, and answer boxes give users the information without requiring a click. Your content still powers the answer, but the traffic pattern changes. Visibility increasingly means being the source of an answer, not just a result on the page.
Summarization compresses content
AI systems don’t show your whole page. They extract, compress, and repackage the parts they consider most relevant. If your key information is buried in paragraph 12 or locked behind a tab click, it won’t make the summary. Structure and clarity determine what survives extraction.
Multi-source synthesis is the norm
AI answers pull from multiple pages simultaneously. Your content competes not just for a ranking position but for inclusion in a synthesized answer alongside other sources. The content that provides the clearest, most specific answer to a component of the query wins that slot.
Query complexity is increasing
People are asking longer, more specific questions because AI can handle them. “Best excavator for residential demolition under 20 tons with thumb attachment” is a real query now. Pages that answer narrow, specific questions with genuine expertise win in this environment.
What hasn’t changed
Good technical SEO is still the foundation. AI crawlers need the same things Google’s crawler needs: rendered HTML, clean URLs, logical hierarchy, fast load times. If your technical foundation is broken, you’re invisible to AI systems too. Strong content strategy still wins. Pages that clearly answer a specific question with genuine expertise and unique information still outperform generic content. AI systems are better at evaluating quality than keyword-matching algorithms ever were. The fundamentals haven’t changed. The surfaces have.
The top of your funnel is getting eaten.
AI search is not disrupting every stage of the user journey equally. The stage getting hit hardest right now is top-of-funnel informational search. The “what is,” “how does,” “why should I” queries that used to drive awareness traffic to your site. AI Overviews answer them directly. ChatGPT answers them in a conversation. The user never clicks.
Top of funnel / Informational
“What is crawl budget” gets answered in the AI Overview. “How does hard water affect pipes” gets a Perplexity summary. These queries still generate impressions in Search Console, but the clicks are evaporating. Google’s own data shows impressions rising while clicks stay flat. That’s the top of funnel compressing. If your traffic strategy depends on informational queries driving awareness, you’re watching the foundation erode.
Mid-funnel / Commercial investigation
“Best water softener for hard water” is starting to get AI-synthesized answers that compare products without the user ever visiting a review site. “CAT 320 vs Deere 200G” gets a side-by-side summary. These comparison and evaluation queries still drive clicks today, but the window is narrowing. Content that earns citation in AI answers retains influence. Content that doesn’t gets bypassed.
Bottom of funnel / Transactional
“Buy CAT 320 excavator Portland” still drives a click. The user needs to actually transact, schedule, or contact someone. AI can’t close the deal for you. But even here, AI is starting to mediate: Google’s AI shopping experiences, Perplexity’s product cards, and ChatGPT’s browse-and-buy suggestions are inserting a layer between the query and your site. Local search is especially exposed. “Near me” queries are getting AI-filtered before anyone sees your map pack listing. Transactional is safest, but not untouchable.
So what do you do about it?
Stop depending on top-of-funnel traffic for business results.
Informational content still matters for authority and topical coverage. But if your strategy relies on “what is X” queries driving traffic that converts, you need to shift. Those clicks are disappearing. Informational content should build entity authority and earn AI citations, not be your primary traffic source.
Invest hard in mid-funnel content now, while it still clicks.
Comparisons, evaluations, “best for,” decision-support content. This is where the real opportunity is right now. Users doing commercial investigation still want to click through, compare options, and evaluate. But AI is coming for this layer next. The brands that build strong, specific, data-backed comparison content now will be the ones AI systems cite when this layer compresses. Build the intent bridge between awareness and purchase with content that’s too specific and too useful for AI to fully replace.
Make your bottom-of-funnel experience frictionless.
When someone does click through from an AI answer or a transactional search, the landing experience needs to convert immediately. No friction, no confusion, no “call for a quote” when the user expected pricing. The fewer clicks there are, the more each one matters. Transactional pages that fail are an even bigger problem when traffic is declining.
Optimize for citation, not just clicks.
When AI answers a top-of-funnel question using your content, your brand appears in the answer even if no one clicks. That’s still brand visibility. Structure your informational content to earn that citation: clear answers, original data, named sources. The metric shifts from clicks to presence in AI-generated answers. That presence compounds into brand familiarity that influences mid and bottom-funnel decisions later.
How I Approach Modern Search
Not as a separate discipline. As an extension of the same structural principles that make traditional SEO work.
Answer-first content structure
Put the answer at the top. Lead with the conclusion, then provide the evidence and context below. AI extraction favors content that frontloads the answer rather than building to it. This isn’t dumbing things down. It’s making content structurally parseable while keeping the depth for people who want it.
Explicit heading hierarchy
Every H2 should be a question someone actually asks or a clear topic label. Every paragraph under it should directly address that heading. AI systems use heading structure to segment content and extract relevant sections. Vague headings like “Our Approach” give them nothing to work with.
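As a sketch of what this looks like in markup (page topic and wording are invented for illustration), question-style headings give an extraction system clean segment boundaries:

```html
<!-- Hypothetical page outline: each H2 is a question a user actually asks,
     and the paragraph under it answers that question directly. -->
<article>
  <h1>Phoenix Water Quality Report</h1>

  <h2>Is Phoenix tap water safe to drink?</h2>
  <p>Lead with the direct answer here, then the evidence below it.</p>

  <h2>How hard is Phoenix water?</h2>
  <p>Specific numbers, units, and named sources.</p>
</article>
```

A heading like "Our Approach" gives a machine no query to match; "Is Phoenix tap water safe to drink?" maps one-to-one onto a real question.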
Structured data as machine context
Schema markup tells AI systems what type of content this is before they read it. FAQPage, HowTo, LocalBusiness, Product. It’s metadata that helps machines classify and extract accurately. Every page template should have appropriate schema built in, not bolted on.
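As a minimal sketch of what "built in, not bolted on" means, here is an illustrative JSON-LD block for a local business page (the business name, address, and URL are invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Water Testing Co.",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Portland",
    "addressRegion": "OR"
  }
}
</script>
```

The point is that this lives in the page template, so every page of that type ships with the correct `@type` automatically rather than relying on someone remembering to add it later.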
Information gain over information repetition
AI systems increasingly evaluate whether a page adds something new to the index. If your page says the same thing as 50 others, it won’t be selected as a source. Original data, unique analysis, specific examples, and genuine expertise are what earn citation in AI answers.
Entity clarity
AI systems understand entities, not keywords. They need to know that “CheckMyTap” is a water quality site, that “Portland” means Oregon not Maine in this context, that “E7018” is a welding electrode classification. Consistent naming, structured data, and contextual clarity help AI systems resolve entities correctly.
Multi-surface awareness
Content now appears in Google AI Overviews, ChatGPT, Perplexity, Bing Copilot, and featured snippets. Each surface has slightly different extraction behavior. The best approach isn’t optimizing for each one separately. It’s building content that’s structurally clear enough to work across all of them.
None of this makes sense without understanding how people search now.
AI is changing the surfaces, but search behavior was already shifting before AI Overviews existed. People don’t search in keywords anymore. They ask full questions. They refine conversationally. They expect answers, not links. Intent shifts depending on context, location, and where someone is in a decision. If you don’t understand the behavioral layer underneath all the AI hype, you’ll optimize for the wrong things.
I write about this extensively. How intent types actually work. Why misaligned intent transitions kill conversion paths. What happens when transactional pages fail ready-to-buy users. The difference between informational and transactional page design. How commercial intent sits in the messy middle where most purchase decisions actually happen.
Where You Can See This Working
These sites are built with AI-readable structure. The decisions are visible in the source.
CheckMyTap
When someone asks ChatGPT "is Phoenix water safe to drink," the answer needs to come from somewhere. CheckMyTap is structured to be that source.
Answer-first city pages
Every city page leads with the key data: hardness level, PFAS status, lead readings, overall safety assessment. The answer is in the first 100 words. Supporting context, treatment recommendations, and detailed contaminant tables follow below. An AI system extracting “Phoenix water quality” gets a clear, specific, citable answer immediately.
Structured data for extraction
Schema markup on every page declares what the content is and what location it covers. Consistent heading hierarchy means an AI system can pull the PFAS section without reading the whole page. The template enforces this structure across 1,000+ pages so extraction is reliable at scale.
Information gain through real data
The actual water quality numbers are information that doesn’t exist in this structured format anywhere else. That’s genuine information gain. When an AI system synthesizes an answer about water quality in a specific city, CheckMyTap provides data points that other sources can’t. That’s what earns citation.
How the system shows up here
Get Found: Server-side rendered HTML. AI crawlers see the same complete content as users on first request.
Get Understood: Structured data, consistent heading hierarchy, and explicit entity signals (city name, state, contaminant types) across every page.
Get Chosen: Answer-first layout and unique data points make it a high-value extraction source for AI answers about water quality.
WireRef
Reference queries like “6 AWG copper ampacity” are exactly what AI systems want to answer directly. WireRef is built to be the source they pull from.
Semantic table markup
Specification data lives in proper HTML tables, not div grids. AI systems can parse table structure, identify column headers, and extract specific values. When Perplexity answers "what is the ampacity of 6 AWG THHN copper at 75°C," it needs a table it can read. Semantic markup provides that.
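As a sketch of the difference, a semantic spec table uses `caption`, `thead`, and scoped header cells so a parser can tie every value to its row and column (the structure matters here; treat the numbers as illustrative rather than verified code values):

```html
<table>
  <caption>Copper conductor ampacity by temperature rating</caption>
  <thead>
    <tr>
      <th scope="col">Size (AWG)</th>
      <th scope="col">60°C</th>
      <th scope="col">75°C</th>
      <th scope="col">90°C</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <th scope="row">6</th>
      <td>55 A</td>
      <td>65 A</td>
      <td>75 A</td>
    </tr>
  </tbody>
</table>
```

A div grid with the same visual layout carries none of these relationships; the machine sees twelve unrelated strings.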
One URL per answer
Each ampacity page, each wire sizing page, each comparison page targets one specific reference query. No single page tries to cover everything about wire. This granularity means AI systems can cite the exact right page for the exact right question. Precision over comprehensiveness.
How the system shows up here
Get Found: Progressive enhancement means all data is in the initial HTML. No JS dependency for content visibility.
Get Understood: Semantic tables, explicit heading hierarchy per page type, structured data for technical specifications.
Get Chosen: NEC-sourced values with explicit edition references. Authoritative data beats generic calculators for AI citation.
Applied Work
Real work, not hypothetical.
Enterprise site launches with schema systems
Built structured data templates into five enterprise site launches from the start. Schema wasn’t an afterthought. It was part of the template system so every page type shipped with appropriate markup, making the content machine-readable at scale from day one.
Portfolio built for AI extraction
Designed haydenschuster.com with modern search in mind: FAQPage schema on pillar pages, answer-first content structure, explicit heading hierarchy, and topical hub architecture that helps AI systems understand relationships between concepts.
Common Questions
Is SEO dead because of AI search?
No. But the version of SEO where you chase keywords and build thin pages to rank is dying fast. AI search still runs on traditional search infrastructure. AI systems need crawlable, indexable, well-structured content to generate answers. Technical SEO and content strategy remain foundational. What’s dead is the idea that ranking #1 automatically means traffic. What’s alive is the work that makes your content the source AI systems choose to cite.
How do I get my content cited in AI Overviews, ChatGPT, and Perplexity?
All three pull from web content and evaluate specificity, authority, and structural clarity. The basics: lead with a direct answer under a clear heading, use structured data so machines know what your content is, provide original information that doesn’t exist elsewhere, and make sure your technical foundation is solid. A BrightEdge study found that 52% of AI Overview citations come from pages already ranking in the top 10 organically. Being well-structured and well-ranked gives you the best shot across all surfaces.
What is agentic commerce and how does it affect SEO?
Agentic commerce is when AI agents research, compare, and complete purchases on behalf of users without the user ever visiting your site. Google, OpenAI, Shopify, and Amazon are all building this. McKinsey estimates it could influence $3-5 trillion in global retail by 2030. For SEO, it means your product data needs to be machine-readable: clean structured data, accurate inventory, explicit pricing, and schema markup that agents can parse. If an AI agent can’t read your catalog, it can’t recommend your product.
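As a sketch of what "machine-readable product data" looks like in practice, here is an illustrative `Product` entry in JSON-LD (the product name, SKU, and price are invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example 20-Ton Excavator",
  "sku": "EX-20T-001",
  "offers": {
    "@type": "Offer",
    "price": "185000.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Explicit price, currency, and availability are exactly the fields an agent needs to compare your listing against a competitor's without guessing.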
My impressions are going up but clicks are flat or declining. What’s happening?
AI Overviews are answering the query before the user clicks. Your content is being shown (impression counted) but the user gets what they need from the AI summary. This is the single most visible signal that AI search is compressing your top-of-funnel traffic. It’s not a ranking problem. It’s a structural shift in how answers are delivered. I’ve written in detail about this pattern and what it actually means for your strategy.
What’s the difference between AEO, GEO, and traditional SEO?
SEO gets you ranked. AEO (Answer Engine Optimization) gets your content selected as the direct answer in featured snippets and AI responses. GEO (Generative Engine Optimization) gets you cited when AI systems synthesize new text from multiple sources. In practice, the tactics overlap heavily: structured content, clear authority, original information, schema markup. Think of them as layers. SEO is the foundation. AEO and GEO are what happens on top when the answer surfaces change.
How do I measure AI search visibility if clicks don’t tell the full story anymore?
This is the hardest measurement problem in SEO right now. Clicks alone miss AI-sourced brand visibility entirely. Start with GSC impression-to-click ratio trends to spot AI compression. Track referral traffic from chatgpt.com and perplexity.ai directly. Manually test your key queries across AI surfaces monthly. Tools like SE Visible and Semrush are adding AI citation tracking, but the space is immature. The metric shift is from “did they click” to “did we show up in the answer.”
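The impression-to-click check can be sketched in a few lines. This assumes nothing about Search Console's export format; the `(clicks, impressions)` pairs below are made-up numbers shaped like a weekly export:

```python
# Sketch: spotting AI compression by comparing click-through rate
# across two periods. All numbers are illustrative, not real data.

def ctr(rows):
    """Aggregate CTR for a list of (clicks, impressions) tuples."""
    clicks = sum(c for c, _ in rows)
    impressions = sum(i for _, i in rows)
    return clicks / impressions if impressions else 0.0

# Weekly (clicks, impressions) pairs -- invented for illustration
last_quarter = [(120, 4000), (115, 4100), (118, 4050)]
this_quarter = [(110, 5200), (105, 5500), (108, 5600)]

before, after = ctr(last_quarter), ctr(this_quarter)
change = (after - before) / before * 100
print(f"CTR {before:.2%} -> {after:.2%} ({change:+.1f}% relative)")
# -> CTR 2.91% -> 1.98% (-31.8% relative)
```

Impressions up, clicks flat, CTR falling: that combination is the signature of AI Overview compression rather than a ranking loss.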
Should I block AI crawlers like GPTBot and ClaudeBot?
Only if your business model depends on page visits for revenue (ad-supported publishers, mainly). For everyone else, being cited in AI answers is brand visibility you can’t buy. Blocking GPTBot means ChatGPT can’t reference your content. Blocking ClaudeBot means Claude can’t either. That’s forfeiting a growing discovery channel. You can selectively block training crawlers while allowing search crawlers if the distinction matters to you, but most businesses benefit from full AI visibility.
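As a sketch of that selective approach, a robots.txt can treat training crawlers and search crawlers differently. The user-agent tokens below are the ones OpenAI and Anthropic have published, but verify current names in each vendor's crawler documentation before deploying:

```txt
# Allow OpenAI's search crawler, block its training crawler
User-agent: OAI-SearchBot
Allow: /

User-agent: GPTBot
Disallow: /

# Anthropic's crawler
User-agent: ClaudeBot
Allow: /
```

Remember that robots.txt is a request, not an enforcement mechanism, and that blocking a search crawler forfeits citations on that surface entirely.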
How fast is AI search actually growing?
Faster than any channel shift since mobile. Adobe’s Digital Economy Index reported a 1,200% increase in retail traffic from AI sources. IBM found 45% of consumers already use AI for part of the buying journey. Google’s AI Overviews now appear for hundreds of millions of queries. ChatGPT and Perplexity are processing millions of search-intent queries daily. This isn’t a trend to watch anymore. It’s infrastructure you need to be building on.
What content should I prioritize if AI is eating my top-of-funnel traffic?
Mid-funnel commercial investigation content. Right now. Comparisons, evaluations, “best for” guides, decision-support pages. These queries still drive clicks because users want to evaluate options before committing. But AI is coming for this layer next, so the window to build strong mid-funnel content that earns AI citations is narrowing. Don’t abandon informational content entirely. Restructure it for citation value rather than click-through traffic. And make your transactional pages frictionless, because every click matters more when there are fewer of them.
Is this all going to change again in six months?
Probably. The specific surfaces and tools will keep evolving. Google will ship new AI features. New competitors will emerge. Agent capabilities will expand. But the underlying principles won’t change: content that is clearly structured, genuinely useful, backed by original data, and technically accessible to machines will win regardless of which AI system is doing the extracting. That’s the bet worth making. Build for structural clarity, not for any single platform’s quirks.
Go Deeper
Articles on AI search behavior, content structure, and measurement.