I’m familiar with the “Cursor Settings → Docs” feature for indexing documentation hosted at publicly available URLs. Unfortunately, it seems limited to public URLs. Our team has many gated documentation sources for private APIs, along with various Confluence docs describing the standards for the different types of code we write.
How are other teams indexing this private information and making it available to Cursor?
We’ve considered trying to reverse engineer what IPs the Cursor indexing spider uses and making those docs available only to those IPs, but that’s generally not a good idea from a security perspective.
Should we scrape these docs ourselves, curate them into Markdown, and make them available in each repo our engineers work on? One giant .cursorrules file?
This seems like a common business use case, and I’ve been surprised not to come across any documentation or discussion of others doing this in my own searching.
Thanks!
Only public URLs can be indexed currently. What you can do is just what you describe: convert these docs to Markdown, then store them in a folder somewhere in your codebase.
Then it’s very easy to include them in the context by writing @docs/your-private-doc.md in the prompt.
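If you want to automate the conversion step, here’s a minimal sketch of the idea, assuming you can export your private pages as HTML (e.g. from Confluence). The `html_to_markdown` helper and the tag handling are my own illustration, not part of Cursor; a real setup would likely use a proper converter library.

```python
from html.parser import HTMLParser


class DocTextExtractor(HTMLParser):
    """Tiny HTML -> Markdown-ish converter: handles headings and paragraphs only."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self._heading_prefix = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            # Map <h1>..<h3> to the matching number of '#' characters
            self._heading_prefix = "#" * int(tag[1]) + " "
        elif tag == "p":
            # Blank line before each paragraph
            self.parts.append("\n")

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self._heading_prefix:
            self.parts.append(self._heading_prefix + text + "\n")
            self._heading_prefix = None
        else:
            self.parts.append(text + "\n")


def html_to_markdown(html: str) -> str:
    """Convert a simple HTML export into Markdown suitable for a docs/ folder."""
    parser = DocTextExtractor()
    parser.feed(html)
    return "".join(parser.parts).strip() + "\n"
```

You could run something like this in a scheduled job that writes the output into a `docs/` folder in each repo, so every engineer gets the same files to reference with `@docs/...`.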
Would that work for you?
It works, but it’s slow and cumbersome when trying to manage that across a team of dozens of people.
Being able to index private docs once and share them across the team from a central point would be ideal. I stumbled on this thread from another user asking about indexing Confluence docs: Let Cursor crawl our Confluence documentation? - #6 by wm9
Ultimately it boils down to whether we treat each individual engineer using Cursor as independent, requiring and hoping that they each set up their environment in the most efficient way, versus setting sane defaults for our environment with the tools they need already at hand. There’s significant business value in streamlining that process and getting the docs that are currently locked away into LLM tools, to make them as useful as possible.