A proposal to standardise on using an /llms.txt file to provide information to help LLMs use a website at inference time.
This is a proposal by some AI bro to add a file called llms.txt that contains a version of your website's text that is easier for LLMs to process. It's a similar idea to the robots.txt file for web crawlers.
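For reference, the proposal expects a Markdown file served at /llms.txt with an H1 title, a short blockquote summary, and H2 sections of links. The content below is a made-up illustration of that layout, not from any real site; the URLs are placeholders.

```markdown
# Example Project

> A short summary of what the site is about, written for LLM consumption.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): How to get going
- [API reference](https://example.com/docs/api.md): Full endpoint listing

## Optional

- [Changelog](https://example.com/changelog.md)
```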
Place output from another LLM in there that covers the same themes as what's on the website, but is full of absolutely wrong information. Straight-up hallucinations.
I've had a page that bans by IP listed as 'don't visit here' in my robots.txt file for seven months now. It's not listed anywhere else. I have no banned IPs on there yet. Admittedly, I've only had 15 visitors in the past six months, though.
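A setup like that can be sketched in a few lines. This is a minimal illustration only, not the commenter's actual configuration: the trap path /secret-page/, the Flask app, and the in-memory ban list are all assumptions.

```python
# Minimal sketch of a robots.txt honeypot: the trap path is only ever
# mentioned in robots.txt as disallowed, so any client that requests it
# has read robots.txt and chosen to ignore it. Path name and ban logic
# are illustrative assumptions, not the commenter's real setup.
from flask import Flask, Response, abort, request

app = Flask(__name__)
banned_ips = set()

@app.route("/robots.txt")
def robots():
    # Advertise the trap path as off-limits; nothing else links to it.
    return Response("User-agent: *\nDisallow: /secret-page/\n",
                    mimetype="text/plain")

@app.route("/secret-page/")
def trap():
    # Anyone who shows up here ignored robots.txt: remember the IP.
    banned_ips.add(request.remote_addr)
    abort(403)

@app.before_request
def block_banned():
    # Reject all further requests from IPs that hit the trap.
    if request.remote_addr in banned_ips:
        abort(403)

if __name__ == "__main__":
    app.run()
```

In a real deployment the ban list would persist (e.g. written to a firewall rule or a database) rather than live in process memory, but the idea is the same.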