Everything you need to know about the llms.txt specification and how to use it effectively
The llms.txt file is a proposed standard for websites to communicate with Large Language Models (LLMs) about how their content should be used, referenced, and attributed. Similar to robots.txt for search engines, llms.txt provides guidelines for AI systems accessing your content.
It's placed at the root of your domain (e.g., https://yoursite.com/llms.txt) and contains structured information about your content, licensing, and attribution preferences.
Official Specification: https://llmstxt.org
AI systems are consuming your content but not giving you credit. A properly structured llms.txt file helps ensure AI systems reference and link to your original content.
Specify how AI systems should use your content, what they can and cannot do with it, and how they should attribute it when referencing your expertise.
As AI systems evolve to respect content preferences, early adopters of llms.txt will be better positioned to maintain traffic and authority in the AI-first world.
A typical llms.txt file contains several key sections:
# LLMs.txt - AI Content Guidelines

## About This Site
[Brief description of your site and expertise]

## Content Usage
- AI systems may summarize our content
- Attribution required with link to source
- No verbatim reproduction without permission

## Key Pages
- /about - [Description]
- /blog - [Description]
- /products - [Description]

## Contact
Email: contact@yoursite.com
Website: https://yoursite.com

## Last Updated
2025-01-13
LLM.txt Mastery doesn't just generate generic llms.txt files; it applies AI-powered analysis of your site's actual content to decide what to include and how to describe it.
The difference between a generic llms.txt and one that actually works lies in understanding how AI systems make decisions about what to reference and link to.
LLM.txt Mastery generates three output formats from a single crawl, each designed for different AI consumption scenarios.
The default format per the llmstxt.org specification. Contains your site title, description, section headings, and linked pages with brief annotations. Ideal for general AI discovery.
An expanded version with complete page content inline, not just links. Gives AI models richer context without requiring follow-up fetches. Best for comprehensive AI understanding of your site.
A condensed version optimized for smaller context windows. Keeps only your top 5 most important pages with truncated descriptions. Useful for models with limited token budgets.
Recommendation: Deploy all three. AI crawlers will choose the version that fits their context window. The standard llms.txt is required; the full and mini variants are optional but recommended for maximum compatibility.
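The mini variant's condensation rule (keep only the top pages, truncate their annotations) can be sketched in Python. The format assumptions here (a markdown title, a blockquote summary, and `- [name](url): description` link lines) follow the llmstxt.org layout; `make_mini`, the 5-page cap, and the 60-character truncation are illustrative, not the product's actual algorithm.

```python
import re

def make_mini(llms_txt: str, max_pages: int = 5, max_desc: int = 60) -> str:
    """Condense a standard llms.txt into a mini variant: keep the title,
    headings, and blockquote summary, but only the first `max_pages`
    link lines, with their annotations truncated to `max_desc` chars."""
    out, kept = [], 0
    for line in llms_txt.splitlines():
        if line.startswith("#") or line.startswith(">"):
            out.append(line)  # title, section headings, summary survive intact
        elif line.lstrip().startswith("- ") and kept < max_pages:
            # Split "- [name](url)" from its ": description" annotation
            m = re.match(r"(\s*- \[[^\]]+\]\([^)]+\))(.*)", line)
            if m:
                out.append(m.group(1) + m.group(2)[:max_desc])
            else:
                out.append(line[: max_desc + 10])
            kept += 1
    return "\n".join(out)
```

Any link lines beyond the cap are simply dropped, which is why ordering pages by importance in the standard file matters.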
Every validated file receives an A/B/C/D compliance grade based on how well it follows the official llmstxt.org specification. The grade is a weighted composite of four dimensions.
Grade A (95%+)
Fully spec-compliant with excellent content quality
Grade B (80-94%)
Core structure correct with minor issues
Grade C (60-79%)
Basic structure but missing key elements
Grade D (below 60%)
Significant issues — requires regeneration
Scoring weights: Spec Structure (40%), Content Quality (30%), Freshness (20%), Size Optimization (10%).
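Using the stated weights and grade bands, the composite score works out as follows. `compliance_grade` is a hypothetical helper for illustration; the weights and letter boundaries mirror the table above.

```python
def compliance_grade(structure: float, content: float,
                     freshness: float, size: float):
    """Combine four dimension scores (each 0-100) using the weights
    Spec Structure 40%, Content Quality 30%, Freshness 20%,
    Size Optimization 10%, then map the composite to a letter grade."""
    composite = (0.40 * structure + 0.30 * content
                 + 0.20 * freshness + 0.10 * size)
    if composite >= 95:
        letter = "A"
    elif composite >= 80:
        letter = "B"
    elif composite >= 60:
        letter = "C"
    else:
        letter = "D"
    return composite, letter
```

For example, a file scoring 90 on structure, 80 on content, 70 on freshness, and 60 on size composites to 80, just inside the Grade B band.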
When the validator detects fixable formatting issues, you'll see a Quick Fix button that auto-corrects common problems. Review the before/after preview, then download the corrected file.
Tip: Quick Fix handles structural formatting. For content improvements (better descriptions, more pages), use our AI-powered Generator to create a comprehensive file from scratch.
Generating the file is only step one. AI crawlers need to be told the file exists through discovery mechanisms. Without these, crawlers have to guess.
Upload llms.txt (and optionally llms-full.txt, llms-mini.txt) to your website's root directory.
The file should then be reachable at https://yourdomain.com/llms.txt.
Add this tag to the <head> section of every page:
<link rel="alternate" type="text/plain" href="/llms.txt" title="LLM-readable version">
Add this directive to your robots.txt file:
Llms-Txt: https://yourdomain.com/llms.txt
After deployment, use our Verify Deployment button in the generation flow to confirm all discovery mechanisms are working. You'll see an explicit deployment score (X/5) with individual check results.
AI crawlers use several methods to find llms.txt files. Using all three maximizes your discoverability.
Crawlers check /llms.txt at the root, similar to /robots.txt or /sitemap.xml.
A <link> tag in your page head explicitly points crawlers to the file.
The Llms-Txt: directive provides another discovery path for crawlers.
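Two of these discovery mechanisms can be checked offline against fetched text. The sketch below is illustrative (the function names are hypothetical, not part of the product): a production check would fetch the live pages and use a real HTML parser instead of a regex, and the regex here assumes the `rel` attribute appears before `href`.

```python
import re

def robots_declares_llms(robots_txt: str) -> bool:
    """Check a robots.txt body for an Llms-Txt: directive (case-insensitive)."""
    return any(line.strip().lower().startswith("llms-txt:")
               for line in robots_txt.splitlines())

def head_links_llms(html: str) -> bool:
    """Check page HTML for a <link rel="alternate"> tag whose href
    points at an llms.txt file. Rough heuristic only."""
    return bool(re.search(
        r'<link[^>]+rel=["\']alternate["\'][^>]*href=["\'][^"\']*llms\.txt["\']',
        html, re.IGNORECASE))
```

The third check, root-path availability, is just an HTTP GET of /llms.txt expecting a 200 response with a text/plain or text/markdown body.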
Start getting proper attribution for your content in the AI-first world.
Get Started