Debugging - Is there a limit to the file size that can be uploaded to cursor

To debug an issue, I need to upload a log file. Is there a limit on the size of the file that can be uploaded? Some of my log files are pretty large (a few GB). What is the best way to do this kind of debugging?

Hey, good question.

Cursor isn’t meant to load and analyze multi-GB files all at once. Here are the main limits:

  1. Indexing: Cursor automatically skips files larger than 1 MB during indexing, and it ignores *.log files by default.
  2. Context window: Even the biggest models have a max context window of about 1M tokens. A few GB of text is way more than any LLM can handle in one go.

What works in practice for debugging big logs:

  • Pre-filter first: use grep, awk, sed, etc. to pull out only the relevant lines from the log, like error patterns, a timestamp range, or a specific request ID. Save the filtered output to a separate file, ideally under roughly 100 to 200 KB.
  • Share chunks in chat: copy the relevant part of the log and paste it straight into Agent or Chat. Or save the filtered log as a file in your project and reference it with @file.
  • Iterate: start by describing the issue and ask Cursor to help you write grep or awk commands to find the right patterns in the log. Then paste the results back in.
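As a rough sketch of the pre-filter step, something like this works from a terminal. The file name `app.log`, the `ERROR` pattern, and the timestamps here are all hypothetical placeholders; the snippet generates a tiny sample log so the commands run as-is:

```shell
# Hypothetical sample log so the filtering commands below are runnable.
# Replace "app.log" and the patterns with your real file and error markers.
printf '%s\n' \
  '2024-05-01T12:00:01 INFO  starting up' \
  '2024-05-01T12:00:02 ERROR connection refused (req-42)' \
  '2024-05-01T12:00:03 INFO  retrying' > app.log

# Keep only error lines, with one line of context on each side:
grep -C 1 'ERROR' app.log > errors.txt

# Or narrow to a timestamp window; assumes an ISO-8601 timestamp in field 1,
# so plain string comparison sorts chronologically:
awk '$1 >= "2024-05-01T12:00:02" && $1 <= "2024-05-01T12:00:03"' app.log > window.txt

# Confirm the filtered files are small enough to paste or @file-reference:
wc -c errors.txt window.txt
```

The filtered `errors.txt` or `window.txt` is what you'd then paste into chat or reference with @file, not the original log.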

So the key is not uploading the whole log, but narrowing it down to what actually fits in context.