Validate your existing llms.txt or auto-generate one from your sitemap. Based on the llmstxt.org specification.
A Markdown file placed at the root of your website (/llms.txt) that helps AI assistants and LLMs understand your site content without parsing raw HTML.
Plain Markdown that ChatGPT, Claude, Perplexity and other LLMs can parse in a single context window.
Just like robots.txt guides search crawlers, llms.txt guides AI systems to the right content on your site.
Defined at llmstxt.org — an open, evolving specification supported by the developer community.
Use the Generate tab above — we parse your sitemap and produce a ready llms.txt file automatically.
A valid file follows a simple structure:

- An H1 heading with the site or project name, e.g. # Project Name
- A blockquote summary directly after the H1, e.g. > Short summary after the H1
- At least one H2 section heading to group related links, e.g. ## Section headings
- Absolute URLs (starting with https://), not relative paths
- Descriptive link titles: [Getting Started Guide](url) is better than just [Page 1](url)
- A short note after each link, e.g. : Brief description for context
- Optionally, a line llms.txt: https://yourdomain.com/llms.txt added to your robots.txt so crawlers can discover the file
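Putting these pieces together, a complete minimal llms.txt might look like the following (the project name, URLs, and descriptions are placeholders):

```markdown
# Project Name

> Short summary of what the project does and who it is for.

## Documentation

- [Getting Started Guide](https://yourdomain.com/docs/start): Install and run a first example
- [API Reference](https://yourdomain.com/docs/api): Endpoint and parameter documentation

## Blog

- [Release Notes](https://yourdomain.com/blog/releases): What changed in each version
```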
llms.txt is a Markdown file at the root of your website that provides a structured, AI-friendly index of your content. As AI tools like ChatGPT and Claude become primary research tools, having llms.txt helps ensure AI can accurately represent your site — improving discoverability, reducing hallucinations, and giving you control over what AI knows about your project.
robots.txt tells crawlers which pages to skip. sitemap.xml lists all URLs for indexing. llms.txt is different — it's for AI systems reading your content directly. It provides human-readable titles, descriptions, and context that help LLMs understand what your site is about and navigate to its most important content.
llms.txt has no direct effect on traditional search engine rankings — Google and Bing do not use it for indexing. Its value is in AI search and conversational tools: as AI-powered search (Perplexity, ChatGPT Search, Claude.ai) becomes more common, having a well-structured llms.txt can improve how your site appears in AI-generated answers.
Aim for Grade A (85–100 points). The most important checks are: H1 heading present, blockquote description, at least one H2 section, and valid absolute URLs. A score of 70+ (Grade B) is perfectly functional. Below 50 means critical structure issues that may prevent AI tools from properly parsing your file.
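The four critical checks above can be sketched in a few lines of Python. This is a simplified illustration, not the validator's actual scoring code; the function name and the exact pass criteria are assumptions:

```python
import re

def check_llms_txt(text: str) -> dict:
    """Run the four critical structure checks on an llms.txt file."""
    lines = text.splitlines()
    return {
        # An H1 heading ("# Title") must be present.
        "h1_heading": any(line.startswith("# ") for line in lines),
        # A blockquote description ("> summary") must be present.
        "blockquote": any(line.startswith("> ") for line in lines),
        # At least one H2 section ("## Section") must be present.
        "h2_section": any(line.startswith("## ") for line in lines),
        # Links must use absolute URLs, not relative paths.
        "absolute_urls": bool(re.search(r"\]\(https?://", text)),
    }

sample = """# Project Name

> Short summary after the H1

## Docs

- [Getting Started Guide](https://example.com/start): Brief description
"""

results = check_llms_txt(sample)  # all four checks pass for this sample
```

A real validator would also weight each check to produce the 0–100 score and letter grade, but the structural tests themselves reduce to simple line and pattern matches like these.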
Update llms.txt whenever you add major new sections, change site structure, or update key documentation. For blogs and news sites, monthly updates are sufficient. Use our Generator — it rebuilds the file from your sitemap in seconds, so keeping it fresh takes minimal effort.
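The core idea behind regenerating from a sitemap, reading every URL out of sitemap.xml and emitting Markdown links, can be sketched like this. It is a simplified stand-in for the actual generator; deriving page titles from URL slugs is an assumption of this sketch:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Standard sitemap XML namespace (sitemaps.org protocol).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_to_llms_txt(sitemap_xml: str, site_name: str, summary: str) -> str:
    """Build a minimal llms.txt body from a sitemap.xml document."""
    root = ET.fromstring(sitemap_xml)
    urls = [el.text.strip() for el in root.findall("sm:url/sm:loc", NS) if el.text]
    lines = [f"# {site_name}", "", f"> {summary}", "", "## Pages", ""]
    for url in urls:
        # Derive a human-readable title from the last path segment.
        slug = urlparse(url).path.rstrip("/").rsplit("/", 1)[-1] or "Home"
        title = slug.replace("-", " ").title()
        lines.append(f"- [{title}]({url})")
    return "\n".join(lines) + "\n"

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourdomain.com/</loc></url>
  <url><loc>https://yourdomain.com/getting-started</loc></url>
</urlset>"""

print(sitemap_to_llms_txt(sitemap, "Your Project", "Short site summary."))
```

A production generator would additionally group URLs into sections, pull real page titles, and add per-link descriptions, but the sitemap-to-Markdown transformation is the heart of it.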
Place it at the root of your domain: https://yourdomain.com/llms.txt. Make sure it's publicly accessible (no auth required) and served with Content-Type text/plain or text/markdown. Also consider adding a line such as llms.txt: https://yourdomain.com/llms.txt to your robots.txt so AI crawlers can discover it automatically.
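That robots.txt hint might look like the fragment below. The llms.txt line is not part of the robots.txt standard, but compliant parsers ignore directives they don't recognize, so it is a harmless discovery hint; the domain is a placeholder:

```
User-agent: *
Allow: /

llms.txt: https://yourdomain.com/llms.txt
```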
llms.txt is a file describing website content specifically for language models. Similar to robots.txt, but for AI bots and LLMs.
Helps AI systems better understand your site. Can improve citations in ChatGPT, Perplexity, and other AI search engines.