TL;DR:
- Large Language Models like ChatGPT and Claude are changing how people discover products and services, bypassing traditional search.
- llms.txt is a new standard (inspired by robots.txt) that helps AI models understand your site by offering a clean, structured summary of your key content.
- It’s easy to implement and SEO-friendly.
- Adoption is growing among tech-forward brands like Stripe, Cursor, and Anthropic.
- It’s not about SEO traffic; it’s about being ready for AI-first discovery.
- Paired with ZeroClick.app, you can even track how LLMs mention your brand.
Learn how to implement it in this article.
Contents:
- What Is llms.txt and Why Does It Matter?
  - The origin
  - llms.txt vs robots.txt vs sitemap.xml
- How Does It Work? (Technical Overview)
  - The Structure — Think of It Like a Handcrafted Sitemap for AI
  - What About llms-full.txt?
  - Why Not Just Use HTML?
- How to implement it
  - Step 1: Generate Your llms.txt File Automatically
  - Step 2: Host the File
- Current adoption and Use cases
- Limitations and criticism
  - ⚠️ 1. LLMs Aren’t Reading It (Yet)
  - ⚠️ 2. It Doesn’t Block Anything
  - ⚠️ 3. No Enforcement, No Standardization
- Conclusion: Should you implement it?
We’re living through a revolution in the search era.
LLMs like ChatGPT and Claude are becoming go-to tools to benchmark products, discover new services, and guide consumer choices, often replacing traditional search engines in the process.
Because of this shift, website owners are starting to explore new ways to communicate their content and preferences to these models.
llms.txt is one of the latest proposals in that space: an idea inspired by robots.txt, the file search engines have long relied on, but built for the age of generative AI.
Its promise? To help LLMs better understand your website and highlight what really matters.
But how does it actually work? Is it worth implementing? And most importantly—is there any real ROI behind it?
Let’s dive in.
What Is llms.txt and Why Does It Matter?
The origin
llms.txt is a simple, human-readable file you can add to your website to help large language models (LLMs) understand your most important content. Think of it as a curated guide for AI: a way to highlight your key pages, summarize what your site is about, and make sure models don’t miss the information that matters most.
The concept was introduced in late 2024 by Jeremy Howard (co-founder of Fast.ai and Answer.AI) as a response to a growing challenge:
“Large language models increasingly rely on website information, but face a critical limitation: context windows are too small to handle most websites in their entirety.”

Unlike users or search engine crawlers, LLMs don’t navigate complex HTML well.
They struggle with sidebars, ads, buttons, and JavaScript. Worse: their context windows are limited, meaning they can’t “read” a full site the way a human can.
So even if your site has great content, there’s no guarantee an AI model will find or understand it properly.
llms.txt vs robots.txt vs sitemap.xml
| File | Purpose | Audience |
|------|---------|----------|
| robots.txt | Tells bots what not to crawl | Search engine crawlers |
| sitemap.xml | Lists all pages to help bots crawl everything | Search engine crawlers |
| llms.txt | Curates the most important content for LLMs | AI models at inference time |
So while robots.txt and sitemaps help Googlebot index your site for search, llms.txt helps ChatGPT and other LLMs understand your site when they answer user queries.
In a world where fewer people click through search results and more rely on AI summaries, this could be your best shot at staying relevant in the answers.
Don’t forget basic SEO methods either: implementing Schema Markup remains a proven way to structure your content for both search engines and LLMs.
How Does It Work? (Technical Overview)
Implementing llms.txt is surprisingly simple, which is one reason it’s gaining traction in developer and marketing communities alike.
At its core, llms.txt is a plain-text file written in Markdown, hosted at the root of your website (typically at https://yourdomain.com/llms.txt). Its job?
To offer a clean, minimal, and AI-friendly version of your site’s structure, designed for easy parsing by large language models.
The Structure — Think of It Like a Handcrafted Sitemap for AI
A typical llms.txt file follows a few simple rules:
- Start with a title (H1): The name of your product or website.
- Add a short summary (blockquote): A one-line description of what your site does.
- List your most important pages (bullets + links): Organize them into sections (e.g. “Core Docs”, “FAQs”) with a brief description for each.
Here’s what it might look like in practice:

# Acme Analytics
> Real-time analytics for modern SaaS teams.

## Core Documentation
- [Getting Started](https://acme.com/docs/getting-started): Learn the basics of setting up Acme.
- [API Reference](https://acme.com/docs/api): Full API documentation with examples.

## FAQs
- [Pricing](https://acme.com/pricing): Details on plans and billing.
- [Security](https://acme.com/security): Our approach to data protection.
The format is intentionally simple: no JSON, no schema.org, no tags to learn.
Just clean markdown that humans can write and LLMs can read. It’s designed to complement your existing SEO efforts, not replace them.
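To illustrate just how machine-friendly these conventions are, here’s a small sketch of a parser that turns an llms.txt file into structured data. The parsing rules follow the structure described above (H1 title, blockquote summary, H2 sections, bullet links), but the code itself is an illustration, not part of any spec:

```javascript
// Hypothetical parser: turns an llms.txt string into { title, summary, sections }.
// The rules mirror the conventions above; this is an illustrative sketch.
function parseLlmsTxt(text) {
  const result = { title: null, summary: null, sections: [] };
  let current = null;
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    if (line.startsWith('# ') && !result.title) {
      result.title = line.slice(2);
    } else if (line.startsWith('> ') && !result.summary) {
      result.summary = line.slice(2);
    } else if (line.startsWith('## ')) {
      current = { name: line.slice(3), links: [] };
      result.sections.push(current);
    } else if (line.startsWith('- ') && current) {
      // Match "- [Label](URL): optional description"
      const m = line.match(/^- \[(.+?)\]\((.+?)\)(?::\s*(.*))?$/);
      if (m) current.links.push({ label: m[1], url: m[2], description: m[3] || '' });
    }
  }
  return result;
}

const sample = [
  '# Acme Analytics',
  '> Real-time analytics for modern SaaS teams.',
  '## Core Documentation',
  '- [Getting Started](https://acme.com/docs/getting-started): Learn the basics.',
].join('\n');

console.log(parseLlmsTxt(sample).title); // "Acme Analytics"
```

A few lines of string handling recover the full structure, which is exactly the point of the format: no HTML parsing, no JavaScript rendering, no token-hungry clutter.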
What About llms-full.txt?
Some websites also include a companion file called llms-full.txt, which contains the full plain-text content of all core pages in one place.
This is useful for models with large context windows or retrieval-augmented generation (RAG) pipelines, where having all your content available in a single chunk can make answers more accurate and detailed.
Note: llms-full.txt can be huge (hundreds of thousands of tokens), so it’s not a must-have, but it is a nice-to-have for advanced use cases.
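Because of that size, a RAG pipeline typically wouldn’t feed llms-full.txt to a model whole; it would split it into chunks first. Here’s a minimal sketch of one way to do that, splitting on Markdown headings under a rough character budget (the budget and splitting strategy are assumptions, not part of any spec):

```javascript
// Illustrative chunker for an llms-full.txt file: splits the document before
// each Markdown heading, then packs sections into chunks under maxChars.
function chunkLlmsFullTxt(text, maxChars = 2000) {
  const sections = text.split(/\n(?=#{1,6} )/); // split before each heading line
  const chunks = [];
  let buffer = '';
  for (const section of sections) {
    // Flush the buffer when adding this section would exceed the budget.
    if (buffer && buffer.length + section.length > maxChars) {
      chunks.push(buffer);
      buffer = '';
    }
    buffer += (buffer ? '\n' : '') + section;
  }
  if (buffer) chunks.push(buffer);
  return chunks;
}
```

Real pipelines would count tokens rather than characters and often add overlap between chunks, but the idea is the same: clean, heading-delimited plain text is trivially easy to slice up for retrieval.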
Want to know if ChatGPT already brings traffic? → Check out How to track your rankings in ChatGPT
Why Not Just Use HTML?
You might wonder: “Why can’t LLMs just read my website like a human or a bot?”
In theory, they can. But in practice, they struggle. Here’s why:
- Most websites are filled with UI noise: headers, menus, footers, popups, scripts.
- Parsing this clutter is expensive in terms of tokens.
- LLMs have limited context windows and can’t afford to waste it.
- And worse: they often miss what you consider the important stuff.
llms.txt solves this by stripping your content down to the essentials—making it faster and cheaper for LLMs to use your site as a reliable knowledge source.

How to implement it
Step 1: Generate Your llms.txt File Automatically
Creating an llms.txt file doesn’t have to be a manual process. You can use a free online tool to generate it based on your website’s content.
Use the LLMs.txt Generator by llmstxt-generator.org
This tool allows you to generate an llms.txt file by simply entering your website URL. It will crawl your site and suggest a base structure for your llms.txt file.
How it works:
- Enter your website URL: Provide the URL of your website to the tool.
- Generate the file: The tool will process your site and generate a basic llms.txt structure.
- Customize as needed: Review and edit the generated content to accurately reflect your site’s structure and key pages.
- Save the file: Once satisfied, save the content as a plain text file named llms.txt.
Access the tool here: LLMs.txt Generator
After generating and customizing your llms.txt file, proceed to host it on your website as outlined in the next steps.
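If you’d rather script the file than use a web tool, the format is simple enough to assemble yourself. Here’s a minimal generator sketch; the page names, URLs, and descriptions are placeholders you would replace with your own:

```javascript
// Minimal llms.txt generator: builds the file from a hand-curated page list.
// All titles, URLs, and descriptions below are placeholders.
function generateLlmsTxt({ title, summary, sections }) {
  const lines = [`# ${title}`, `> ${summary}`];
  for (const [name, links] of Object.entries(sections)) {
    lines.push('', `## ${name}`);
    for (const { label, url, description } of links) {
      lines.push(`- [${label}](${url}): ${description}`);
    }
  }
  return lines.join('\n') + '\n';
}

const txt = generateLlmsTxt({
  title: 'Acme Analytics',
  summary: 'Real-time analytics for modern SaaS teams.',
  sections: {
    'Core Documentation': [
      { label: 'Getting Started', url: 'https://acme.com/docs/getting-started', description: 'Setup basics.' },
    ],
  },
});
```

Write the result to a file named llms.txt and you’re ready for the hosting step below.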
Step 2: Host the File
Choose your platform and let’s go!
If you use Webflow
Webflow doesn’t natively support uploading files to the root domain, but there’s a simple workaround using an asset upload plus a page redirect.
- Host the file on Webflow
→ Upload the .txt file in the Assets panel, publish your site, and copy the public URL of the file.
- Redirect /llms.txt to your hosted file
→ Go to Publishing > Add redirect, and set up a redirect from /llms.txt to the hosted file URL. Publish again to activate it.
If you use Shopify
Shopify doesn’t natively support uploading custom .txt files to the root directory of your domain. However, you can achieve similar functionality through a workaround:
- Upload the llms.txt File to Shopify’s Files Section:
- From your Shopify admin, navigate to Settings > Files.
- Click Upload files and select your llms.txt file.
- After uploading, copy the URL of the file. It will look something like https://cdn.shopify.com/s/files/1/XXXX/XXXX/files/llms.txt?v=123456.
- Create a URL Redirect:
- Go to Online Store > Navigation.
- Click on URL Redirects.
- Click Add URL redirect.
- In the Redirect from field, enter /llms.txt.
- In the Redirect to field, paste the URL of the uploaded llms.txt file.
- Click Save redirect.
With this setup, when someone accesses https://yourstore.com/llms.txt, they’ll be redirected to the hosted file, effectively mimicking a root-level llms.txt file.
If you use Cloudflare
Cloudflare Workers enable you to serve custom content at specific routes without modifying your origin server. Here’s a step-by-step guide to set this up:
1. Create a New Worker
- Log in to your Cloudflare Dashboard.
- Navigate to Workers & Pages > Create Application.
- Select Create Worker.
- Name your Worker (e.g., llms-txt-handler).
2. Edit the Worker Script
Replace the default code with the following script, which serves your llms.txt content:
export default {
  async fetch(request) {
    const url = new URL(request.url);
    if (url.pathname === '/llms.txt') {
      const llmsTxtContent = `# Your Site Name
> Brief description of your site.

## Key Pages
- [Home](https://yourdomain.com): Overview of our services.
- [Documentation](https://yourdomain.com/docs): Comprehensive guides and API references.
- [Pricing](https://yourdomain.com/pricing): Details about our pricing plans.
- [FAQ](https://yourdomain.com/faq): Frequently asked questions.

## Additional Resources
- [Blog](https://yourdomain.com/blog): Latest news and updates.
- [Contact](https://yourdomain.com/contact): Get in touch with us.`;
      return new Response(llmsTxtContent.trim(), {
        headers: { 'Content-Type': 'text/plain' },
      });
    }
    // For all other requests, proceed as usual
    return fetch(request);
  },
};
Note: Replace the placeholder content with information relevant to your website.
3. Deploy the Worker
- Click Save and Deploy to publish your Worker.
4. Set Up a Route for the Worker
- In the Cloudflare Dashboard, go to Workers & Pages > Manage Workers.
- Find your deployed Worker and click on it.
- Navigate to the Triggers tab.
- Click Add Route.
- In the Route field, enter your domain followed by /llms.txt (e.g., yourdomain.com/llms.txt).
- In the Zone dropdown, select your domain.
- Click Save.
Now, when someone accesses https://yourdomain.com/llms.txt, Cloudflare will serve the content defined in your Worker script.
Other Platforms (WordPress, Vercel, etc.)
- WordPress: Use an FTP client or a plugin like File Manager to upload llms.txt directly to your root directory.
- Netlify: Place the file in your public/ folder; it will be available at the root automatically after deployment.
- Vercel: Same as Netlify—drop the file into public/ and deploy.
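Whichever platform you use, it’s worth sanity-checking the result once it’s live. Here’s a small sketch of such a check; the heuristics (HTTP 200, a text content type, a Markdown H1 present) are assumptions about what a well-formed file should look like, not a formal validation:

```javascript
// Quick sanity check for a hosted llms.txt response.
// Returns a list of problems; an empty list means the basics look right.
function checkLlmsTxtResponse(status, contentType, body) {
  const problems = [];
  if (status !== 200) problems.push(`expected HTTP 200, got ${status}`);
  if (!/^text\/(plain|markdown)/.test(contentType || '')) {
    problems.push(`unexpected Content-Type: ${contentType}`);
  }
  if (!/^# /m.test(body)) problems.push('no H1 title found');
  return problems;
}

// In a real check you would fetch the live URL, e.g.:
// const res = await fetch('https://yourdomain.com/llms.txt');
// console.log(checkLlmsTxtResponse(res.status, res.headers.get('content-type'), await res.text()));
```

For the Shopify and Webflow workarounds above, remember the request is redirected to a CDN URL, so make sure whatever client you test with follows redirects.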
Current adoption and Use cases
While llms.txt is still in its early stages, it’s already gaining traction, especially among developer-centric platforms and documentation-heavy SaaS tools.
The concept quickly caught the attention of the developer and AI tooling communities. Within weeks of its introduction in late 2024, early adopters started to emerge—notably the documentation platform Mintlify, which integrated llms.txt support on November 14, 2024.
This move was a turning point: Mintlify powers docs for thousands of popular dev tools. With a single update, it made entire ecosystems, including companies like Anthropic (makers of Claude) and Cursor (an AI coding assistant), instantly LLM-friendly.
Shortly after, these same companies began sharing their new llms.txt files on social media, positioning themselves as AI-ready and encouraging others to follow. The signal was clear: if your documentation is important, make it easy for AI to understand.
This sparked a minor trend:
- 🌐 Public directories were created to catalog sites using llms.txt.
- 🛠️ Generators and open-source tools popped up to help anyone create their file from a URL.
- 🤖 Developers integrated it into their workflows, particularly in LLM applications using tools like LangChain or RAG-based bots.
💡 Use Case: Developer Docs
LLMs are increasingly used by developers to answer technical questions, explain SDKs, or troubleshoot APIs. But HTML-heavy documentation often breaks in AI tools—especially when it contains tabs, sidebars, or interactive code blocks.
By offering a clean, structured version of the content through llms.txt, these companies reduce friction and ensure the AI gives accurate, up-to-date answers.
Instead of hallucinating outdated methods or skipping over important instructions, the LLM now has a shortcut to the relevant information.
💡 Use Case: Support Deflection
Companies like Anthropic and Cursor see another major benefit: support deflection.
If LLMs can reliably answer user questions using your docs (because of llms.txt or llms-full.txt), you save costs on support tickets—and improve the user experience.
Some platforms are even integrating llms.txt into custom GPTs, AI chat widgets, or IDE copilots, feeding it directly as a source of truth. The clearer and cleaner the input, the better the AI performs and the fewer “Sorry, I’m not sure” moments for your users.
💡 Use Case: Strategic Positioning
In a broader sense, llms.txt is becoming a signal of AI readiness—particularly for tech-forward brands.
Companies that add it today aren’t necessarily doing it for SEO traffic. They’re doing it because:
- They want LLMs to reflect their messaging accurately.
- They want to be cited when users query tools like ChatGPT or Perplexity.
- They want to future-proof their docs for a world where LLMs are primary information gateways.
And in some cases, they’re already seeing results.
Platforms like Vercel, which heavily optimize for generative search, report that up to 10% of new signups now come via ChatGPT recommendations. That’s not because of llms.txt specifically, but it shows the growing potential of being “LLM-visible.”
In short:
- 🧠 llms.txt has moved from obscure idea to small-but-serious movement, particularly in tech.
- 📈 The early use cases show promise in improving AI answers, boosting product visibility, and reducing user friction.
- And while mainstream adoption is still limited, the momentum is building — driven by practical needs, not hype.
Limitations and criticism
Despite its simplicity and growing buzz, llms.txt comes with serious limitations, both technical and strategic. While it’s easy to implement, it’s far from a silver bullet.
Here’s what you need to know before adopting it blindly:
⚠️ 1. LLMs Aren’t Reading It (Yet)
The biggest issue?
No major AI model is automatically consuming llms.txt today.
- ChatGPT doesn’t check it.
- Claude doesn’t load it at inference time.
- Google’s Gemini has no announced support.
Even though some of these companies (like Anthropic) have published their own llms.txt, their models still don’t systematically read it when generating answers.
Unlike robots.txt or sitemap.xml, which are standard parts of the web crawling stack, llms.txt is not yet integrated into LLMs’ inference pipelines. It requires explicit fetching, custom integrations, or RAG (retrieval-augmented generation) setups.
👉 Translation: You can publish it, but there’s no guarantee any AI will look at it.
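To make “explicit fetching” concrete, here’s a sketch of what an application has to do today: pull the file itself and paste it into the prompt as context. The prompt wording below is an illustration, not any model’s required format, and the fetch step is shown only as a comment:

```javascript
// Sketch of explicit llms.txt integration: the application (not the model)
// fetches the file and injects it into the prompt as context.
function buildPromptWithLlmsTxt(llmsTxt, question) {
  return [
    'Use the following site summary (from its llms.txt) as context:',
    '---',
    llmsTxt.trim(),
    '---',
    `Question: ${question}`,
  ].join('\n');
}

// The llms.txt text itself would come from something like:
// const llmsTxt = await (await fetch('https://yourdomain.com/llms.txt')).text();
```

Until models do this themselves, publishing the file only pays off for tools and pipelines that are wired up this way.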
⚠️ 2. It Doesn’t Block Anything
There’s often confusion around llms.txt being a kind of AI robots.txt.
It’s not.
- It doesn’t block scrapers.
- It doesn’t opt you out of training.
- It doesn’t control what LLMs can or can’t use.
It’s an “offer,” not a “command.”
You’re giving AI models a cleaner version of your site, but there’s nothing forcing them to use it. And there’s definitely nothing stopping them from training on your raw HTML anyway (unless you’ve taken separate steps via robots.txt or a legal notice).
⚠️ 3. No Enforcement, No Standardization
Right now, llms.txt is not an official spec.
There’s:
- No RFC.
- No industry adoption guidelines.
- No W3C backing.
- No penalty for ignoring it.
It’s an informal convention, not a protocol. AI companies can simply choose to ignore it — and most do.
Compare this to robots.txt, which is widely respected across the web due to decades of established norms and SEO implications. llms.txt has none of that built-in weight… yet.

Finally, there’s no guaranteed upside.
You won’t suddenly gain more traffic or better rankings because you added an llms.txt.
If no AI reads it and users aren’t referred through LLMs, the return might be zero. For now, it’s a strategic hedge, not a traffic play.
That’s not a reason to avoid it, but it is a reason to be realistic.
Conclusion: Should you implement it?
Yes, but not for clicks.
For control.
llms.txt won’t boost your SEO overnight. Most LLMs don’t read it yet. But that’s not the point.
It’s about positioning your brand as AI-ready, making sure when ChatGPT, Claude or Perplexity talk about you, they get it right.
And with tools like ZeroClick.app, you can track how LLMs mention your brand, even when there’s zero Google traffic involved. It’s a new kind of visibility — invisible in search, but dominant in conversations.
Adding llms.txt today is like publishing a press kit for AI:
Clean, structured, ready-to-quote. It costs nothing. Takes 10 minutes. And positions your brand where attention is shifting.
So yes, implement it.
Not for SEO.
But to be ready for the future of search.
Author: Antoine Payre - Co-founder of Zeroclick.app