Adding files/folders to context

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Two things:

  1. When I create a new folder, say /Users/ygoldberg/dir, and then try to add it to the context of a prompt with @dir, it doesn’t show up. I have to refresh the IDE before it appears.

  2. When a file is too large and I try to reference it with @large_file, the suggestion shows up, but when I hit Enter the @large_file text disappears instead of being highlighted with the blue background. I get that a file that large is a problem for the model’s context, but maybe there’s a better way to handle this in the UI? Maybe highlight @large_file in red to indicate that the model is just going to grep it? IDK.

Steps to Reproduce

  1. Create a new directory and try to add it to context; it doesn’t work. Refresh the IDE and it does.

  2. Create a really large file and try to add to context.
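The two repro steps can be sketched from the terminal. This is a minimal sketch with assumed paths and sizes (the `repro` directory and the log-line text are made up; the actual size cutoff Cursor uses isn’t stated anywhere in this thread):

```shell
# 1. Brand-new (empty) folder -- then try @empty_dir in a Cursor prompt
mkdir -p repro/empty_dir

# 2. Generate a large log file (~300k lines, well over 10 MiB) -- then try @big.log
yes '2024-01-01 00:00:00 INFO sample log line for context testing' \
  | head -n 300000 > repro/big.log
```

After running this, @empty_dir should demonstrate issue 1 (until a file is added or the IDE is refreshed) and @big.log should demonstrate issue 2.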

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.2.44 (Universal)
VSCode Version: 1.105.1

Does this stop you from using Cursor

No - Cursor works, but with this issue

Hey there!

I can reproduce this when there’s no file in the new directory; as soon as I add one, it’s fine. I wonder if this is more or less the same problem as having an empty folder in Git.

For me, the behavior is the same whether I’ve restarted or not. Are you sure you haven’t added a new file to that folder in the meantime?

How big are we talking?

  1. For me this even happens when I create a new file, not just a folder

  2. IDK what your cutoff is, but the one I just tried is 9.9 MiB, ~300k lines. I know you’re not going to fit the whole thing in context; I’m just asking for cleaner UI. This is something I do relatively often: I have a massive logfile and I ask Cursor to grep through it to find something.

Thanks!

Honestly, the UI for adding files with @ is just really buggy in general; lots of weird things. I’ll take a screen recording sometime today to show you.

Thanks Yonah!

I also got an update on this today – the large-file issue should be fixed soon, if not now then in the next few weeks.