How can I add an entire project's code to an AI for reference all at once?

In a new conversation using Agent mode, I want the AI to help me generate a new script, but this script requires the AI to search the existing code throughout the entire project. My project has over 40 .py scripts, and using @Files requires more than 40 clicks to add all the files to the conversation.

I need to add the entire project’s code so the AI can understand the context and help me generate a new script. In the old version, I could use @Codebase, but now I don’t know how to add the whole project to the AI at once.

When using Agent, the AI should go through the entire Codebase by default. I always tell it to go through the entire codebase and look at all the files.

4 Likes

You can use the jinni MCP. Just ask the model to read your context.

2 Likes

I use yek (search for it on GitHub) to serialize my codebase. Pro tip: use AI to write a yek.yaml config file for you. Ask it to configure it for your stack, eliminating the boilerplate and generic library files and prioritizing the files that are core to your app’s custom logic. Works great for me.

3 Likes

Many thanks to @TP511, @kaktusss123, and @rustem for introducing me to such a powerful and practical tool.
I have a question: when I manually add all the files one by one to the chat as context, I find that Claude 3.7 works well and meets my expectations.
However, when I use the jinni MCP or yek together with Cursor, are they really compatible with Cursor? Is the context retrieval of these tools worse than Cursor’s own system? Cursor should have its own retrieval mechanism (which is used when I manually add all the files to the chat).

You can also use @[folder_name].

3 Likes

Can it pull content from a different repo into a new repo, so that I can ask it to replicate something complex? Is there an MCP for that?

1 Like

Cursor works great with yek. I added a link to the GitHub README to my custom docs. Normally I tell the Cursor agent something like: “use yek to make a snapshot of my project codebase and save the output to ‘docs/codebase-snapshot.txt’ (replace if it exists)”.

2 Likes

Cool that both can be run locally!

Regarding @[folder_name]: is folder_name the name of my project’s root directory? The @ in Cursor doesn’t recognize my project root directory’s folder name, and when I manually enter @[folder_name], Cursor shows “Files Folders No available options.”

You can also use RepoMix as an alternative option. You could ask your agent to code a file consolidator that combines your codebase into one .txt file (or your preferred file type) and then feed that to your agent; however, this is essentially what RepoMix does.
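A file consolidator like the one described is only a short script. Here is a minimal sketch of what the agent might generate; the root path, output filename, and skip list are all illustrative assumptions you would adjust for your project:

```python
"""Minimal file-consolidator sketch: concatenate every .py file in a
project into one text file that an agent can read as context.
PROJECT_ROOT, OUTPUT_FILE, and SKIP_DIRS are illustrative defaults."""
from pathlib import Path

PROJECT_ROOT = Path(".")                      # adjust to your project root
OUTPUT_FILE = Path("codebase-snapshot.txt")   # file the agent will read
SKIP_DIRS = {".git", ".venv", "__pycache__", "node_modules"}


def consolidate(root: Path, out: Path) -> int:
    """Write each .py file under root, prefixed with its path, into one
    text file. Returns the number of files written."""
    count = 0
    with out.open("w", encoding="utf-8") as sink:
        for path in sorted(root.rglob("*.py")):
            # skip virtualenvs, VCS metadata, caches, etc.
            if any(part in SKIP_DIRS for part in path.parts):
                continue
            sink.write(f"\n# ===== {path} =====\n")
            sink.write(path.read_text(encoding="utf-8", errors="replace"))
            count += 1
    return count


if __name__ == "__main__":
    n = consolidate(PROJECT_ROOT, OUTPUT_FILE)
    print(f"Wrote {n} files to {OUTPUT_FILE}")
```

You could then reference the single snapshot file with @Files instead of clicking through 40+ scripts; tools like RepoMix and yek do the same thing with smarter filtering and token budgeting.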

2 Likes

If it’s a public repository, you can also use gitingest. Simply replace ‘github’ with ‘gitingest’ in the repository URL, and you’ll get the entire repo as a text output. You can also adjust options to ignore certain files or limit based on file size.

For referencing library documentation, I found an MCP like Context7 much more reliable than Cursor’s built-in docs referencing. The Context7 MCP pulls up-to-date, version-specific documentation and code examples straight from the source and places them directly into your prompt.

3 Likes

Dude, write more, great tips, TY!

gitingest has issues if the public repo is too large. Context7 seems good for docs, but not for repos. Sometimes it is necessary to refer to local/online repositories, and in my case I also need to refer to ideas in a PDF.

If it’s a public repo, I use uithub (replace the letter g in the GitHub link with a u). This gives you control over which files to include, how many tokens to use, etc.

Just drag and drop the folder into the Agent chat window and it will add the folder with the files you need for reference, or just use @codebase, then add your prompt, and it will reference the files.

Try to be specific with your prompt too, e.g. “I need to create X script to do Y function based on these .py files as reference; please review the files and generate the new python.py script.”

Thank you very much for your answer. I think this is a great tool and easy to use.

Of course! Glad to be of service.

Happy hunting