Is there any size limit for llms.txt indexed as Docs?

I’ve been using a 4 MB llms.txt file indexed as Docs to provide full context for an API. The file contains the documentation for the API’s DLL, including all the properties, classes, methods…

Is this ok?

Any best practices for that?

Any limits for both free and paid tier?

Hey, thanks for the request.

There’s no documented file size limit for @Docs in the official docs, but from community experience, files larger than roughly 50–60k tokens can be unstable during indexing.

Your 4 MB file, depending on what’s inside, is roughly 800k to 1M tokens (at ~4–5 characters per token). That’s a lot.

Based on this thread: Tutorial: Adding full repo context, pdfs and other docs, it’s recommended to split large files into multiple parts. This helps the system give the AI only the most relevant sections, which usually improves answer quality.

You can try:

  • Split llms.txt into several logical files (by API sections, classes, etc.); see the rough sketch after this list
  • Create a separate Gist for each part
  • Add them as multiple @Docs with clear names
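
If it helps, here’s a rough sketch of the splitting step in Python. It assumes your llms.txt uses markdown “## ” headings as section boundaries and targets ~50k tokens per part at ~4 characters per token; the file names and the character budget are placeholders, not anything Cursor requires.

```python
# Rough sketch: split a large llms.txt into parts of whole "## " sections,
# each part kept under a character budget that approximates ~50k tokens.
# The names and the budget below are assumptions for illustration only.
from pathlib import Path

SOURCE = Path("llms.txt")       # the big file (assumed name)
OUT_DIR = Path("llms_split")    # output folder for the parts (assumed name)
CHAR_BUDGET = 200_000           # ~50k tokens at ~4 characters per token


def split_by_headings(text: str) -> list[str]:
    """Group the file into chunks of whole '## ' sections, each under CHAR_BUDGET chars."""
    # First, cut the text into sections at every "## " heading.
    sections, current = [], []
    for line in text.splitlines(keepends=True):
        if line.startswith("## ") and current:
            sections.append("".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("".join(current))

    # Then pack whole sections into chunks without exceeding the budget.
    chunks, buf, size = [], [], 0
    for section in sections:
        if buf and size + len(section) > CHAR_BUDGET:
            chunks.append("".join(buf))
            buf, size = [], 0
        buf.append(section)
        size += len(section)
    if buf:
        chunks.append("".join(buf))
    return chunks


if __name__ == "__main__":
    OUT_DIR.mkdir(exist_ok=True)
    for i, chunk in enumerate(split_by_headings(SOURCE.read_text(encoding="utf-8")), start=1):
        (OUT_DIR / f"llms-part-{i:02d}.txt").write_text(chunk, encoding="utf-8")
        print(f"llms-part-{i:02d}.txt: ~{len(chunk) // 4:,} tokens")
```

Note that a single section bigger than the budget still ends up as its own oversized part, so for very large classes you might need to split on a finer heading level (e.g. “### ”) or whatever delimiter your docs generator emits.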

Is indexing working right now, or are you seeing issues? If it’s indexing successfully, you can keep it as-is, but splitting it usually gives better results in practice.

On free vs paid, the file size limits don’t depend on your plan. They’re technical indexing limits.

Thank you, Dean

I’ve tested the 4 MB llms.txt and it has been working great (single-file indexing). Now I’m worried and curious, since it shouldn’t work, right? :thinking:
Anyway, I’ll follow this guidance and find a way to split it, or maybe add a search mechanism on top of it.
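
For the search idea, I’m picturing something as simple as this rough sketch over the split files (the folder name, the scoring, and the example query are just placeholders):

```python
# Minimal sketch of a "search mechanism on top": rank the split doc files
# by how often the query terms appear and print the best matches.
from pathlib import Path
import re
import sys

DOCS_DIR = Path("llms_split")  # the split files from the previous step (assumed)


def score(text: str, terms: list[str]) -> int:
    """Count case-insensitive occurrences of each query term in the text."""
    return sum(len(re.findall(re.escape(t), text, re.IGNORECASE)) for t in terms)


def search(query: str, top_n: int = 3) -> list[tuple[str, int]]:
    """Return the top_n files with the most hits for the query terms."""
    terms = query.split()
    ranked = [
        (path.name, score(path.read_text(encoding="utf-8"), terms))
        for path in sorted(DOCS_DIR.glob("*.txt"))
    ]
    ranked = [(name, hits) for name, hits in ranked if hits > 0]
    return sorted(ranked, key=lambda item: item[1], reverse=True)[:top_n]


if __name__ == "__main__":
    # "HttpClient" is just a hypothetical default query for demonstration.
    for name, hits in search(" ".join(sys.argv[1:]) or "HttpClient"):
        print(f"{name}: {hits} hits")
```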