6 LLMs  ·  5 languages  ·  Quarterly index  ·  Independent research  ·  Updated Q2 2026
CEAVERS
Centre for European AI Visibility Evaluation & Research Standards

Glossary

llms.txt

Last reviewed: 2026-05-12

llms.txt is a proposed convention for a markdown file placed at a site's root that summarises what AI crawlers should read. It complements robots.txt: robots.txt grants or denies access; llms.txt prioritises content.

Frequently asked

What is llms.txt?
llms.txt is a proposed convention for a markdown file at a site's root that summarises the site's most important pages for AI crawlers in a compact, parseable format.
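A minimal sketch of the shape the proposal describes: an H1 title, a blockquote summary, then H2 sections containing link lists. The site name, URLs, and section contents below are illustrative, not taken from any real deployment.

```markdown
# Example Site

> One-paragraph summary of what the site covers and who it is for.

## Docs

- [Getting started](https://example.com/docs/start): installation and first steps
- [API reference](https://example.com/docs/api): endpoints and parameters

## Optional

- [Changelog](https://example.com/changelog): release notes, lower priority
```

The "Optional" section marks content a crawler may skip when context is limited.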
How is llms.txt different from robots.txt?
robots.txt controls crawler access; llms.txt suggests what crawlers should focus on. robots.txt is a directive; llms.txt is a hint.
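The contrast is visible in the syntax itself. robots.txt uses machine-enforced access rules (User-agent, Disallow, Allow), whereas llms.txt is ordinary markdown with no enforcement semantics. The paths below are illustrative.

```txt
# robots.txt — access control, which compliant crawlers must obey
User-agent: *
Disallow: /private/
Allow: /docs/
```

An llms.txt file at the same site root would instead list and describe the pages under /docs/ that the publisher considers most worth reading.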
Do LLMs actually read llms.txt?
Adoption is uneven and informal. The file is cheap to maintain and signals editorial intent; whether a given engine actually acts on it is an empirical question.
