What is llms.txt?
llms.txt is a plain-text file placed at the root of your website (e.g., yoursite.com/llms.txt) that provides structured information about your site specifically for large language models (LLMs) and AI systems.
Think of it as a machine-readable introduction to your website. While traditional SEO focuses on helping Google's crawlers understand your pages, llms.txt is designed to help AI language models -- like those powering ChatGPT, Perplexity, Claude, and Google's AI Overviews -- understand your entire site at a glance.
The llms.txt proposal was introduced in late 2024 by Jeremy Howard (co-founder of fast.ai and Answer.AI) as a way to bridge the gap between how websites are structured for humans and how AI systems need to consume information. It gained rapid adoption throughout 2025 and is now considered a best practice for AI-ready websites in 2026.
In addition to the base llms.txt file, the spec also supports llms-full.txt -- a more comprehensive version that can include full page content in Markdown format for deeper AI understanding.
Why llms.txt Matters for AI SEO
AI search engines are fundamentally different from traditional search engines. When ChatGPT or Perplexity answers a user's question, it doesn't just rank links -- it synthesizes information from multiple sources into a direct answer. Your website needs to be easily parseable by these AI systems to be included in those answers.
- AI search engines can reference your llms.txt to quickly understand your site's purpose and content
- Sites with llms.txt are more likely to be cited as sources in AI-generated answers
- It helps AI systems accurately categorize and describe your business
- Provides a structured overview that reduces AI hallucination about your brand
- Signals to AI crawlers that your site is AI-friendly and up-to-date
In our analysis of 10,000+ websites, those with a properly configured llms.txt file saw an average 35% increase in AI search visibility within 60 days. The investment is minimal -- creating an llms.txt file takes minutes -- but the impact on your AI SEO performance can be significant.
llms.txt vs robots.txt
A common question is how llms.txt differs from robots.txt. While both are plain-text files at your site root, they serve very different purposes.
| Aspect | robots.txt | llms.txt |
|---|---|---|
| Purpose | Access control - tells crawlers what they can/cannot crawl | Content description - tells AI what your site is about |
| Audience | All web crawlers (Googlebot, GPTBot, etc.) | AI language models and AI search engines |
| Format | Directive-based (User-agent, Allow, Disallow) | Markdown-based with headings, descriptions, and links |
| Function | Blocking/allowing access to specific URLs | Providing a summary and guide to your site content |
| Required? | Strongly recommended for all sites | Recommended for AI visibility |
| Standard since | 1994 (over 30 years) | 2024 (rapidly growing adoption) |
Key takeaway: robots.txt and llms.txt are complementary. Use robots.txt to control which AI crawlers can access your site, and llms.txt to help AI models understand your content. You should have both.
llms.txt File Format & Syntax
The llms.txt file uses a simplified Markdown format. Here is the structure your file should follow:
```
# Site Name
> Brief description of what your site or organization does.
> Keep this to 1-2 sentences.

## Docs
- [Getting Started](https://yoursite.com/docs/getting-started): Introduction and setup guide
- [API Reference](https://yoursite.com/docs/api): Complete API documentation
- [Tutorials](https://yoursite.com/docs/tutorials): Step-by-step tutorials

## Blog
- [Latest Post Title](https://yoursite.com/blog/latest): Brief description
- [Popular Post](https://yoursite.com/blog/popular): Brief description

## Optional
- [About](https://yoursite.com/about): Company information
- [Pricing](https://yoursite.com/pricing): Plan details
```

The key elements of the format:
- H1 heading (#) -- Your site or organization name. Only one H1 is allowed.
- Blockquote (>) -- A brief description of your site. This is what AI models read first.
- H2 headings (##) -- Section headers that categorize your content. Common sections include "Docs", "Blog", "API", and "Optional".
- Markdown links -- Links in the format [Title](URL): Description. Each link should have a brief, helpful description.
The ## Optional section has special meaning: content listed there is considered lower priority. AI models may skip it when context windows are limited.
How to Create Your llms.txt File
Follow these steps to create and deploy your llms.txt file:
Create the file
Create a plain text file named llms.txt in your project's public or root directory.
```
touch public/llms.txt
```

Write your content
Add your site name, description, and key pages using the Markdown format shown above. Here is a real-world example:
```
# Acme Corp
> Acme Corp is a B2B SaaS platform that helps
> e-commerce businesses automate their inventory
> management and order fulfillment.

## Docs
- [Getting Started](https://acme.com/docs/start): Quick setup guide for new users
- [API Reference](https://acme.com/docs/api): REST API documentation with examples
- [Integrations](https://acme.com/docs/integrations): Connect with Shopify, WooCommerce, and more

## Blog
- [2026 E-commerce Trends](https://acme.com/blog/trends-2026): Key trends shaping online retail
- [Inventory Automation Guide](https://acme.com/blog/automation): How to automate your warehouse

## Optional
- [About Us](https://acme.com/about): Our story and mission
- [Pricing](https://acme.com/pricing): Plans starting at $29/mo
- [Contact](https://acme.com/contact): Get in touch with our team
```

Deploy and verify
Deploy your site and verify the file is accessible at https://yoursite.com/llms.txt. It should return plain text with a 200 status code and text/plain content type.
```
curl -I https://yoursite.com/llms.txt

# Expected response:
# HTTP/2 200
# content-type: text/plain; charset=utf-8
```

llms.txt Best Practices
Do
- Keep descriptions concise and factual
- Update your llms.txt when content changes
- Include your most important pages
- Use descriptive link text and descriptions
- Link to canonical URLs (HTTPS)
- Test accessibility after deployment
Don't
- Stuff keywords unnaturally
- Include every single page on your site
- Use marketing fluff or exaggerations
- Forget to update when URLs change
- Add broken or redirecting links
- Neglect the blockquote description
How AI Search Engines Use llms.txt
Different AI platforms leverage llms.txt in different ways, but the core idea is the same: use the file as a quick, authoritative summary of what a website offers.
ChatGPT Search
OpenAI's GPTBot and ChatGPT-User crawlers check llms.txt to understand a site's structure before generating search answers. This helps ChatGPT cite relevant pages accurately.
Perplexity
PerplexityBot uses llms.txt to build a knowledge graph of your site's content, improving the accuracy and depth of answers that reference your content.
Claude (Anthropic)
ClaudeBot references llms.txt when gathering information for user queries. A well-structured file increases the likelihood of being cited in Claude's responses.
Google AI Overviews
While Google primarily uses its own crawling infrastructure, Google-Extended respects llms.txt as an additional signal for AI Overview content selection.
Testing Your llms.txt
After creating your llms.txt file, verify it works correctly:
1. Access check: Visit yoursite.com/llms.txt in your browser. You should see the plain text content.
2. Status code: Ensure the response returns HTTP 200, not 301 (redirect) or 404 (not found).
3. Content type: The Content-Type header should be text/plain or text/markdown.
4. Link validation: Check that all links in your llms.txt resolve to valid pages (no 404s or broken redirects).
5. AI SEO scan: Run a comprehensive audit with SEOScanHQ to check llms.txt presence, format validity, and completeness alongside 40+ other AI-readiness signals.
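The link-validation step can be scripted. The sketch below uses only the Python standard library; the llms.txt URL is a placeholder for your own. It extracts every Markdown link target from the file and reports each one's HTTP status via a HEAD request.

```python
import re
import urllib.error
import urllib.request

def extract_links(llms_text: str) -> list[str]:
    """Pull every Markdown link target out of an llms.txt document."""
    return re.findall(r"\[[^\]]+\]\((https?://[^)]+)\)", llms_text)

def check_links(llms_url: str) -> dict[str, int]:
    """Fetch llms.txt and return each linked URL with its HTTP status code."""
    with urllib.request.urlopen(llms_url) as resp:
        text = resp.read().decode("utf-8")
    statuses = {}
    for url in extract_links(text):
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req) as r:
                statuses[url] = r.status
        except urllib.error.HTTPError as e:
            statuses[url] = e.code
    return statuses

# Usage (hypothetical domain):
# for url, status in check_links("https://yoursite.com/llms.txt").items():
#     print(status, url)
```

Anything other than 200 in the output is a candidate for cleanup before AI crawlers hit it.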
Common Mistakes to Avoid
Missing blockquote description
Always include a > blockquote after your H1 heading. This is the first thing AI models read and is essential for site comprehension.
Serving llms.txt with wrong content type
Ensure your server returns text/plain or text/markdown, not text/html. HTML rendering breaks the Markdown format.
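How you fix the content type depends on your server. For nginx, for example, a location block like the following (an illustrative fragment, assuming llms.txt sits in your web root) forces the correct type:

```nginx
location = /llms.txt {
    default_type text/plain;
    charset utf-8;
}
```

Other servers and hosting platforms have equivalent header or MIME-type overrides; the point is that the response must not be rendered as HTML.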
Including too many links
Focus on your 10-20 most important pages. AI models have context limits and will prioritize top-listed links.
Using relative URLs
Always use absolute URLs (https://yoursite.com/page) in your llms.txt. Relative URLs may not resolve correctly for AI crawlers.
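If your CMS or static-site generator emits relative paths, they can be normalized before writing the file. Python's stdlib urljoin handles this; the base URL below is a placeholder for your own domain.

```python
from urllib.parse import urljoin

base = "https://yoursite.com/"  # placeholder domain

# Both root-relative and page-relative paths resolve to absolute URLs.
print(urljoin(base, "/docs/api"))    # https://yoursite.com/docs/api
print(urljoin(base, "blog/trends"))  # https://yoursite.com/blog/trends
```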
Blocking AI crawlers while having llms.txt
If your robots.txt blocks GPTBot or ClaudeBot, they cannot reach your llms.txt. Ensure AI crawlers have access.
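As a sanity check, a robots.txt along these lines (an illustrative fragment, not a recommendation for every site) leaves the major AI crawlers unblocked:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```

If you intentionally block some AI crawlers, confirm the ones you do allow can still reach /llms.txt.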
Forgetting to update after site changes
Treat llms.txt like your sitemap -- update it whenever you add, remove, or restructure major pages.
Frequently Asked Questions
Is llms.txt an official web standard?
llms.txt is a community-driven proposal, not a W3C or IETF standard. However, it has gained widespread adoption among AI companies and is treated as a de facto standard by major AI search engines. The specification is maintained as an open proposal at llmstxt.org.
Do I need both llms.txt and llms-full.txt?
Start with llms.txt, which provides a summary. llms-full.txt is an optional, more detailed version that includes full page content in Markdown. It's useful for sites with complex documentation or products but is not required for basic AI SEO.
How often should I update my llms.txt?
Update your llms.txt whenever you make significant content changes, add new sections, or restructure your site. At minimum, review it quarterly. Treat it like your sitemap.xml.
Can llms.txt hurt my traditional SEO?
No. llms.txt is a separate file that has no impact on how Google or Bing index your pages through traditional search. It only provides additional information for AI language models.
What if my site is behind authentication?
llms.txt should be publicly accessible, even if your site requires login. AI crawlers need to access it without authentication. Place it at the root level and ensure it returns a 200 status code for anonymous requests.