> Welcome to 2026!
Indeed. "Here's a text file so the most advanced computer in the world can read it, rather than structured text in HTML, which it can read even better."
> I have it on very qualified authority from a leading UK AI programmer that this is where LLMs are heading.
So if it's someone in the UK and they have inside knowledge, they probably work for Google DeepMind? All the other big LLMs are very much based in the US or China.
> I've been reading reports that schema isn't as effective as structured content, and above-the-fold content is even better. Schema is so often badly implemented as to be worthless. Automated creation tools are often to blame.
Interesting. Is that only because it's often badly structured, or for other reasons as well?
> Interesting. Is that only because it's often badly structured, or for other reasons as well?
It's the automated tools. They just don't provide the granularity, and because they are so generic the schema is often almost identical on every page. If you look at the huge range of Things and itemprops available, schema rarely scratches the surface.
Are we talking about the same thing? Schema.org semantic markup is often added to the body.
> Itemprop can be attached to an item, but it's not part of the page schema unless you also have itemscope and define the thing, etc.
Which you can also do in the body. It just has to go in an outer element to all the elements with itemprop for that item.
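A minimal sketch of what's being described: the outer element carries `itemscope` and `itemtype`, and the `itemprop` attributes on the visible body elements inside it become properties of that item. The product and brand names here are invented for illustration:

```html
<!-- The outer element declares the item; every itemprop inside it
     belongs to that item. -->
<article itemscope itemtype="https://schema.org/Product">
  <h1 itemprop="name">Acme Widget</h1>
  <p itemprop="description">A sturdy widget for everyday use.</p>
  <!-- A nested itemscope starts a new item as the value of "brand". -->
  <span itemprop="brand" itemscope itemtype="https://schema.org/Brand">
    <span itemprop="name">Acme</span>
  </span>
</article>
```

Without the outer `itemscope`, the `itemprop` attributes are orphaned and parsers have no item to attach them to.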
> Which you can also do in the body. It just has to go in an outer element to all the elements with itemprop for that item.
Yes you can. But the chances of anyone managing to do this are virtually zero. Apart from you and maybe one or two others, everyone uses a CMS, and most of those will be using some sort of automation to generate their schema.
> Yes you can. But the chances of anyone managing to do this are virtually zero. …
I do not understand what you are saying here.
Customising a theme template isn't easy if you plan to include schema data. For starters, you would need to input all the itemprop information, and your average content editor doesn't have this as an option. Which is why most people use the automated generators.
And as far as I can tell, llms.txt doesn't accept schema markup.
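For reference, llms.txt is plain Markdown rather than any schema vocabulary: per the llmstxt.org proposal, roughly an H1 title, a blockquote summary, and H2 sections of annotated links. The business name and URLs below are invented for illustration:

```markdown
# Example Plumbing Co

> Family-run plumbing business serving Manchester: emergency repairs,
> boiler servicing and bathroom installations.

## Services

- [Emergency repairs](https://example.com/emergency.md): 24/7 call-out details and pricing
- [Boiler servicing](https://example.com/boilers.md): annual service plans

## Optional

- [Company history](https://example.com/about.md): background on the business
```

There is simply no slot in that structure for itemscope/itemprop or JSON-LD; any structured-data work has to live in the HTML pages themselves.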
> A comment I've seen a number of times is: if schema/structured data is important for AI, all you need to do is spam your schema and ignore the on-page content.
That would be an argument for expecting LLMs to be set up to prefer microdata markup, where you add scopes and properties to visible body text, over JSON-LD, which is not visible, or over markdown, which may differ from the page.
> Schema is great for normal searches (if done properly).
So you might as well do it anyway. This is something Google has confirmed.
> Also: schema can use microdata, but you don't need schema to add microdata to your HTML.
Good point, but using a known vocabulary probably makes it more likely to be understood. Maybe other vocabularies will work well too. Google seems to encourage schema.org, and everyone supports it, so it's probably the best for SEO.
> Takes about an hour to set up properly if your site structure is already clean.
This is the key part. You need a well-structured site with properly organised content that the AI bots can easily index/scrape and tokenise. This is just good SEO. The llms.txt can then do its thing.
> Results are difficult to attribute directly, since AI citation is not tracked the same way as organic clicks, but since implementing it we have seen an increase in the site appearing in AI Overview results for relevant local search queries.
Well done on the forward thinking.
> Well done on the forward thinking.
Manually created, and I keep updating it every 3 months.
Two questions:
(a) Was the file AI generated?
(b) Did you manually tweak it?
> So not purely manual, nor purely AI-generated. More of a collaborative process where the business knowledge and verification are mine, and the AI helps with structure and wording.
Can you share how you first heard about llms.txt files? Interested to know how the message is getting out.
> This is the key part. You need a well-structured site with properly organised content that the AI bots can easily index/scrape and tokenise. …
If you have a clear and clean URL structure, it's much easier to write a hand-crafted manual llms.txt.
Many moons ago the code-to-content ratio was an important SEO signal. It has long been forgotten, but it may now be worth looking at again. Less data means faster pages, and everyone wants a fast-loading site.
> Can you share how you first heard about llms.txt files? Interested to know how the message is getting out.
I first came across it through a video by Imran from Web Squadron on YouTube. He does very practical, no-nonsense content on technical SEO and GEO topics. The video walked through what llms.txt is, why it matters for AI visibility, and the exact steps to create and upload the file. Worth watching if anyone wants a straightforward walkthrough rather than just reading about it.