I used to be able to give Cursor a React project and ask it a question; it would find the answer by looking into files within the project, including files in subdirectories of the folder I referenced.
That is no longer the case. When I give Cursor a folder, it only sees the files directly in that folder; it no longer indexes files within subdirectories of the folder I referenced. Its lack of awareness can be confirmed by the "Long-file details" message, which does not include any files from any subdirectories.
In the chat, while you can @ specific files and folders, you can also use the Ctrl/Cmd + Enter shortcut to have the chat find the relevant files itself.
In Composer, you have the two modes:
Normal - In this mode, you can @codebase to get Cursor to pick its own context. Also, for more granular control, when you type @, there is a folder called "Recommended" that suggests files to add to context.
Agent - In agent mode, Composer will find and look at whatever files it thinks it needs. Agent mode is still being worked on, so it may produce imperfect results for now.
@danperks I believe I was forgetting to hit Command + Enter; I was just hitting Enter. I'm getting better results after using Command + Enter.
I think there's a UX question here of why the product wouldn't assume an end user wants to index code when they explicitly mention a code artifact, but my problem is solved - thank you.
Agreed. I want to have the AI review an entire subdirectory tree, but without worrying that it's going to wander the codebase picking other files to look at willy-nilly. Cursor is so effective in part because of the granular control you have over the context you feed to the AI. Please give us that control. Perhaps a separate syntax such as @/path/to/dir/**/*?
ETA: I did just give it a try, granting it permission to choose its own context, and it did indeed pull in all sorts of context unrelated to the target directory.
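For what it's worth, the `**/*` pattern in the suggested syntax follows standard recursive glob semantics, where `**` matches directories at any depth. A minimal Python sketch of what such a pattern would match (the `files_under` helper is illustrative only, not anything Cursor actually exposes):

```python
from pathlib import Path

def files_under(root: str) -> list[Path]:
    """Collect every file under `root`, recursively.

    This mirrors what a hypothetical @/path/to/dir/**/* mention
    would be expected to cover: all files in the directory tree,
    and nothing outside it.
    """
    return [p for p in Path(root).glob("**/*") if p.is_file()]
```

The appeal of a syntax like this is that the context boundary is explicit: everything under the given directory is in scope, and nothing else gets pulled in.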
I feel like this is particularly noticeable in chat using o1: I simply mention one or two files, and several more get pulled into the context, which probably contributes to how slow that model is to respond.