Your website ranks on page one. Your meta tags are dialed in, your Core Web Vitals are green, and your sitemap pings Google every time you publish. Congratulations. You have optimized for a world where humans type queries into search boxes and click blue links.
That world is shrinking.
Right now, AI agents — Claude, ChatGPT, Perplexity, Gemini, Copilot — are crawling, reading, and citing web content to answer questions for millions of users. They do not click your links. They do not see your carefully designed landing page. They extract, synthesize, and represent your information directly in their responses. If your site is not structured for these agents to understand, you are invisible in the fastest-growing channel for information discovery.
This is Agent Engine Optimization: the practice of making your website not just findable by search engines, but understandable and usable by AI agents.
SEO got you to page one. AEO gets you into the agent’s toolkit.
AEO Is Not Replacing SEO. It Is the Next Layer.
Let’s be clear about what AEO is and is not. It is not a replacement for search engine optimization. Google is not going away, and traditional SEO fundamentals — quality content, fast load times, mobile responsiveness, good link profiles — remain essential.
AEO is the recognition that a new class of “search engine” has arrived, and these engines work fundamentally differently:
| | SEO (Search Engines) | AEO (AI Agents) |
|---|---|---|
| Goal | Rank in a list of links | Get represented in agent responses |
| How they read | Crawl HTML, follow links, index keywords | Parse content for meaning, extract facts, understand relationships |
| What they value | Keywords, backlinks, authority signals | Structured data, semantic clarity, machine-readable formats |
| User interaction | Human clicks a link, visits your site | Agent cites your content without the user ever visiting |
| Success metric | Click-through rate | Citation and accurate representation |
The shift is profound. In SEO, you compete for attention. In AEO, you compete for representation. An agent that misunderstands your product, omits your pricing, or attributes your capabilities incorrectly is worse than one that ignores you entirely.
AEO is about making sure AI agents get you right.
The Five Pillars of AEO
After implementing AEO across production sites and API platforms, I have distilled the practice into five pillars. Each one is platform-agnostic — whether you run WordPress, Shopify, Next.js, a static site, or a custom-built application, these apply to you.
1. Discovery Files: llms.txt and llms-full.txt
The robots.txt file told search engine crawlers what they could access. The llms.txt file tells AI agents what they should understand.
Proposed by Jeremy Howard of Answer.AI, the /llms.txt standard is a Markdown file placed at your web root that provides a concise, structured map of your site’s most important content. Think of it as a curated briefing document for AI — not everything on your site, but the things that matter most for accurate representation.
Over 600 websites have adopted the standard, including Stripe, Anthropic, Cloudflare, Zapier, Perplexity, and Hugging Face. This is not theoretical. This is infrastructure.
How it works:
- `/llms.txt` — A concise overview: who you are, what you do, and links to your most important pages
- `/llms-full.txt` — An expanded version with deeper context, full descriptions, and detailed documentation
Create yours. Here is a practical template:
```markdown
# Your Company Name

> One-sentence description of what your company does and who it serves.

## Key Information

- Your core product or service description
- Pricing model (if applicable)
- Primary use cases

## Documentation

- [Getting Started](https://yoursite.com/docs/getting-started): Quick setup guide
- [API Reference](https://yoursite.com/docs/api): Complete API documentation
- [Pricing](https://yoursite.com/pricing): Plans and pricing details

## Products

- [Product A](https://yoursite.com/products/a): Description of product A
- [Product B](https://yoursite.com/products/b): Description of product B

## Optional

- [Blog](https://yoursite.com/blog): Latest articles and announcements
- [Changelog](https://yoursite.com/changelog): Recent updates
```
Where to put it: Your web root, served at https://yoursite.com/llms.txt. On WordPress, drop it in your theme’s root directory or use a plugin. On Shopify, add it as a file in your theme’s assets. On Next.js or static sites, place it in your public/ folder. On any server, just make sure it is accessible at the root URL path.
Key principles:
- Write in plain, factual language. No marketing fluff — agents do not respond to superlatives
- Be specific about what your product does, not what it is like
- Include pricing, capabilities, and limitations
- Update it when your product changes
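To catch regressions before deploy, a small structural check can run in CI. This is a hypothetical sketch, not part of the llms.txt proposal itself; it only verifies the Markdown shape described above (an H1 title, a blockquote summary, H2 sections, and links to key pages):

```python
# Illustrative llms.txt sanity check: confirms the file has the expected
# Markdown structure. Not an official validator for the /llms.txt proposal.
import re

def check_llms_txt(text: str) -> list[str]:
    """Return a list of structural problems found in an llms.txt document."""
    problems = []
    lines = text.splitlines()
    if not any(line.startswith("# ") for line in lines):
        problems.append("missing top-level '# Title' heading")
    if not any(line.startswith("> ") for line in lines):
        problems.append("missing '> summary' blockquote")
    if not any(line.startswith("## ") for line in lines):
        problems.append("no '## Section' headings")
    if not re.search(r"\[[^\]]+\]\(https?://[^)]+\)", text):
        problems.append("no markdown links to key pages")
    return problems

sample = """# Your Company Name
> One-sentence description.

## Documentation
- [Getting Started](https://yoursite.com/docs/getting-started): Quick setup
"""
print(check_llms_txt(sample))  # []
```

An empty list means the file has the basic shape agents expect; anything else is worth fixing before it ships.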
2. Structured Data: JSON-LD and Schema.org
If llms.txt is your briefing document, JSON-LD structured data is your machine-readable fact sheet. It is the most direct way to tell AI agents exactly what your pages contain in a format they can parse without ambiguity.
Schema.org vocabulary, encoded as JSON-LD in your page’s `<head>`, provides typed, structured information about your content. Search engines already use this for rich snippets. AI agents use it for accurate fact extraction.
For an e-commerce product page:
```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Cloud Monitoring Pro",
  "description": "Real-time infrastructure monitoring with AI-powered anomaly detection",
  "brand": { "@type": "Brand", "name": "Acme Cloud" },
  "offers": {
    "@type": "Offer",
    "price": "49.00",
    "priceCurrency": "USD",
    "priceValidUntil": "2026-12-31",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "312"
  }
}
</script>
```
For a SaaS product page:
```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Acme Analytics",
  "applicationCategory": "BusinessApplication",
  "operatingSystem": "Web",
  "offers": {
    "@type": "AggregateOffer",
    "lowPrice": "29",
    "highPrice": "299",
    "priceCurrency": "USD",
    "offerCount": "3"
  },
  "featureList": "Real-time dashboards, Custom reports, API access, Team collaboration"
}
</script>
```
For a blog article:
```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Understanding Agent Engine Optimization",
  "author": {
    "@type": "Person",
    "name": "Mike Rahel"
  },
  "datePublished": "2026-03-11",
  "publisher": {
    "@type": "Organization",
    "name": "Refined Element"
  }
}
</script>
```
For a local business or service:
```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Downtown Coffee Roasters",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Portland",
    "addressRegion": "OR"
  },
  "openingHours": ["Mo-Fr 06:00-18:00", "Sa-Su 07:00-16:00"],
  "priceRange": "$$",
  "servesCuisine": "Coffee, Pastries"
}
</script>
```
The pattern is the same regardless of platform: add a `<script type="application/ld+json">` block to your page’s head or body. WordPress has plugins like Yoast and RankMath that generate this automatically. Shopify themes support it natively. For custom sites, you write it by hand or generate it at build time.
What to prioritize: Start with your homepage (Organization), your product or service pages (Product, SoftwareApplication, Service), and your articles (Article). These are the pages agents are most likely to reference.
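As a sanity check, you can confirm at build time that every JSON-LD block on a page is valid JSON. A minimal standard-library sketch; a real pipeline should use an HTML parser rather than a regex, which here assumes the exact attribute formatting shown above:

```python
# Sketch: extract and parse JSON-LD blocks from rendered HTML so broken
# structured data fails your build instead of confusing an agent.
import json
import re

def extract_json_ld(html: str) -> list[dict]:
    """Return every parsed object found in application/ld+json script tags."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    # json.loads raises ValueError on malformed blocks, surfacing the error early
    return [json.loads(block) for block in re.findall(pattern, html, re.DOTALL)]

page = '''<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product", "name": "Cloud Monitoring Pro"}
</script>
</head></html>'''

for obj in extract_json_ld(page):
    print(obj["@type"], "-", obj["name"])  # Product - Cloud Monitoring Pro
```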
3. Content Architecture: Semantic HTML and Clean Hierarchy
AI agents parse your HTML. The cleaner and more semantic your markup, the better they understand your content’s structure and meaning.
This is not exotic. It is HTML done properly — something many sites still get wrong.
Heading hierarchy matters. An agent parsing your page builds a mental model from your heading structure. A page with a clear h1 > h2 > h3 hierarchy communicates its information architecture far more effectively than one with headings chosen for font size.
```html
<!-- Good: Clear hierarchy, scannable by agents -->
<article>
  <h1>Cloud Monitoring Pro</h1>
  <p>Real-time infrastructure monitoring for teams of any size.</p>

  <h2>Features</h2>
  <h3>Anomaly Detection</h3>
  <p>ML-powered alerts that reduce false positives by 73%.</p>
  <h3>Custom Dashboards</h3>
  <p>Build monitoring views with drag-and-drop widgets.</p>

  <h2>Pricing</h2>
  <p>Plans start at $49/month. Enterprise pricing available.</p>
</article>

<!-- Bad: Divs and spans with no semantic meaning -->
<div class="hero-section">
  <span class="big-text">Cloud Monitoring Pro</span>
  <div class="features-carousel" id="js-carousel">
    <!-- Content only rendered by JavaScript -->
  </div>
</div>
```
The JavaScript rendering problem. If your critical content is only available after JavaScript executes — hidden behind client-side rendering, carousels, tabs, or infinite scroll — many AI crawlers will not see it. Agents do not always execute JavaScript the way a browser does.
Practical fixes:
- Use server-side rendering (SSR) or static site generation (SSG) for important content
- Ensure key product information, pricing, and descriptions are in the initial HTML response
- If you use a JavaScript framework (React, Vue, Svelte), make sure your critical pages are pre-rendered
- Test by viewing your page source (not the inspected DOM) — if the content is not there, agents probably cannot see it either
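The view-source test can be automated. A minimal sketch with placeholder phrases: feed it the raw HTML body of a response (for example from curl or urllib, with JavaScript never executed) and the phrases agents must be able to see:

```python
# Sketch: check whether critical content survives in the initial HTML
# response, i.e. what a non-JS-executing agent actually receives.
def missing_from_initial_html(html: str, phrases: list[str]) -> list[str]:
    """Return the critical phrases NOT present in the raw HTML response."""
    return [phrase for phrase in phrases if phrase not in html]

# Server-rendered page: the content is in the initial response.
ssr_page = "<h1>Cloud Monitoring Pro</h1><p>Plans start at $49/month.</p>"
# Client-rendered page: the same content only appears after JavaScript runs.
csr_page = '<div id="root"></div><script src="/app.js"></script>'

critical = ["Cloud Monitoring Pro", "$49/month"]
print(missing_from_initial_html(ssr_page, critical))  # []
print(missing_from_initial_html(csr_page, critical))  # both phrases missing
```

An empty result for your real pages means agents that skip JavaScript still see what matters.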
Use semantic elements. `<article>`, `<section>`, `<nav>`, `<aside>`, `<main>`, `<header>`, `<footer>` — these are not just for accessibility. They signal content purpose to any parser, human or machine.
4. Feeds and API Exposure: Sitemaps, RSS, and OpenAPI
AI agents are voracious readers. The more structured entry points you provide, the more completely they can understand your content.
XML Sitemap. You probably already have one for SEO. Make sure it is current and complete. Agents use sitemaps as discovery indexes just like search engines do.
```xml
<!-- /sitemap.xml -->
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2026-03-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://yoursite.com/products/monitoring-pro</loc>
    <lastmod>2026-02-15</lastmod>
    <priority>0.8</priority>
  </url>
</urlset>
```
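If you generate pages at build time, you can emit the sitemap the same way. A sketch using only the Python standard library; the URLs and dates are placeholders:

```python
# Sketch: build a sitemap.xml programmatically with the standard library,
# so the sitemap regenerates on every deploy and never goes stale.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages: list[dict]) -> str:
    """Serialize a list of {loc, lastmod, priority} dicts into sitemap XML."""
    ET.register_namespace("", NS)  # emit the sitemap namespace as the default
    urlset = ET.Element(f"{{{NS}}}urlset")
    for page in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = page["loc"]
        ET.SubElement(url, f"{{{NS}}}lastmod").text = page["lastmod"]
        ET.SubElement(url, f"{{{NS}}}priority").text = page["priority"]
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

sitemap = build_sitemap([
    {"loc": "https://yoursite.com/", "lastmod": "2026-03-01", "priority": "1.0"},
])
print(sitemap)
```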
RSS/Atom Feeds. If you publish content — blog posts, documentation updates, changelogs — expose an RSS or Atom feed. These are structured, chronological, and trivially parseable. WordPress and most blogging platforms generate these automatically. For static sites, tools like Eleventy and Hugo include feed generation.
OpenAPI Specifications. If you have an API, an OpenAPI (Swagger) spec is the gold standard of machine-readable documentation. It tells agents not just what your API does, but exactly how to use it — endpoints, parameters, authentication, response schemas.
```yaml
# /openapi.yaml (abbreviated)
openapi: 3.0.0
info:
  title: Your API
  version: 1.0.0
  description: Brief description of what your API does
paths:
  /api/products:
    get:
      summary: List all products
      parameters:
        - name: category
          in: query
          schema:
            type: string
      responses:
        '200':
          description: A list of products
```
Even if your primary audience is human developers, an OpenAPI spec makes your API discoverable by the growing ecosystem of AI agents that can autonomously find and call APIs.
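Before publishing a spec, it is worth checking the fields the OpenAPI 3.0 specification marks as required: openapi, info (with title and version), and paths. A minimal, illustrative check on a spec loaded as a dict:

```python
# Sketch: verify the required top-level fields of an OpenAPI 3.0 document.
# Field names come from the OpenAPI 3.0 specification; this is not a full
# validator, just a cheap pre-publish guardrail.
def check_openapi(spec: dict) -> list[str]:
    """Return a list of missing required OpenAPI 3.0 fields."""
    problems = [f"missing required field: {field}"
                for field in ("openapi", "info", "paths") if field not in spec]
    info = spec.get("info", {})
    for field in ("title", "version"):
        if field not in info:
            problems.append(f"info is missing required field: {field}")
    return problems

spec = {
    "openapi": "3.0.0",
    "info": {"title": "Your API", "version": "1.0.0"},
    "paths": {"/api/products": {"get": {"summary": "List all products"}}},
}
print(check_openapi(spec))  # []
```

In practice you would load the dict from your YAML or JSON spec file and fail the build if the list is non-empty.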
5. Machine-Payable Content: The L402 Protocol
Here is where AEO moves from discovery to commerce.
The four pillars above make your content visible and understandable to AI agents. But what if an agent does not just want to read your content — it wants to buy it?
The L402 protocol (formerly LSAT) enables exactly this. It extends HTTP with a native payment flow: when an agent requests a paid resource, the server returns HTTP 402 (Payment Required) along with a Lightning invoice. The agent pays the invoice — a Bitcoin micropayment settling in milliseconds — and receives access. No accounts, no API keys, no credit cards, no human in the loop.
The flow looks like this:
```
Agent requests resource
    --> Server returns 402 + Lightning invoice + macaroon
Agent pays invoice (fractions of a cent to dollars)
    --> Agent receives preimage (proof of payment)
Agent retries request with Authorization: L402 <macaroon>:<preimage>
    --> Server grants access
```
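In code, the client side of that exchange reduces to parsing the challenge and building the retry header. A sketch with placeholder values; the header layout follows the L402 convention above, but exact wire formats vary by server implementation, so treat this as illustrative:

```python
# Sketch of an L402 client's header handling. The macaroon and invoice
# values below are placeholders, not real credentials.
import re

def parse_l402_challenge(www_authenticate: str) -> tuple[str, str]:
    """Extract (macaroon, invoice) from an L402 WWW-Authenticate challenge."""
    match = re.match(r'L402 macaroon="([^"]+)", invoice="([^"]+)"',
                     www_authenticate)
    if not match:
        raise ValueError("not an L402 challenge")
    return match.group(1), match.group(2)

def l402_authorization(macaroon: str, preimage: str) -> str:
    """Build the Authorization header sent on the retried request."""
    return f"L402 {macaroon}:{preimage}"

# After a 402 response, parse the challenge, pay the invoice over Lightning
# (not shown), then retry with proof of payment:
challenge = 'L402 macaroon="mac123", invoice="lnbc1-placeholder"'
macaroon, invoice = parse_l402_challenge(challenge)
print(l402_authorization(macaroon, "preimage456"))  # L402 mac123:preimage456
```

The payment step itself is handled by a Lightning wallet or node; once it returns the preimage, the retry is a plain HTTP request with the header above.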
This matters because the agent economy is emerging now. AI agents are already being deployed to research, compare, purchase, and act on behalf of users. An agent that can autonomously pay $0.002 for a weather API call, $0.05 for a premium data feed, or $5.00 for a research report creates an entirely new revenue channel — one where your content earns money without a human ever visiting your site.
At Refined Element, we build Lightning Enable, which provides the infrastructure for L402 machine-payable APIs. We have seen this firsthand: AI agents purchasing premium guides, paying for API access, and settling invoices without any human intervention. It is not a concept. It is running in production.
For most sites today, L402 is forward-looking rather than immediately actionable. But if you run an API, publish premium content, or offer data services, it is worth understanding now. The websites that are machine-payable when agents come looking will have a significant first-mover advantage over those scrambling to add payment infrastructure later.
Getting Started: A Practical AEO Checklist
You do not need to implement everything at once. Here is a prioritized action list:
This week (30 minutes):
- Create an `/llms.txt` file and deploy it to your web root
- Verify your sitemap is current and accessible at `/sitemap.xml`
- Check that your homepage has JSON-LD `Organization` markup
This month:
- Add JSON-LD structured data to your top 10 pages by traffic
- Audit your heading hierarchy — ensure every page has exactly one `h1` and a logical `h2`/`h3` structure
- Confirm that critical content is in the initial HTML, not JavaScript-rendered only
- Create `/llms-full.txt` with detailed information about your products and services
This quarter:
- Expose RSS or Atom feeds for any regularly updated content
- If you have an API, publish an OpenAPI spec
- Review how AI agents currently represent your brand (ask Claude, ChatGPT, and Perplexity about your company — the answers may surprise you)
- Evaluate whether any of your content or APIs could be monetized via L402
The Window Is Open
We are in the early days of AEO the same way we were in the early days of SEO in the mid-2000s. Back then, the sites that took search optimization seriously early reaped compound advantages for years. The same dynamic is playing out now with AI agents.
The difference is speed. SEO evolved over a decade. AEO is moving in months. Claude, ChatGPT, Perplexity, and Gemini are already the primary research tools for millions of users. When those users ask “what is the best monitoring tool for Kubernetes?” or “which payment API has the lowest fees?”, the agent’s answer is shaped by the content it can find, parse, and trust.
If your site speaks the language of agents — structured data, semantic markup, clean discovery files, machine-readable APIs — you are in the conversation.
If it does not, you are not.
The tools are straightforward. The standards are emerging but clear. The time to start is now.