Message Too Long?

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Your message is too long. Please try again with a shorter message and fewer/smaller attached items.

Steps to Reproduce

Press the “Fix” button on one of the Agent Review issues

Operating System

Linux

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.1.46
VSCode Version: 1.105.1
Commit: ab326d0767c02fb9847b342c43ea58275c4b1680
Date: 2025-12-02T03:59:29.283Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Linux x64 6.17.8-300.fc43.x86_64

For AI issues: which model did you use?

Auto, Grok Code

Does this stop you from using Cursor?

Sometimes - I can sometimes use Cursor

The “message is too long” error appears on all models. I’ve found a way to get around it: tell Cursor to check a folder, e.g. Temp, which contains a file called 123blah.whatever, and it will read the file regardless of its length.


Hey, thanks for the report. The issue is that clicking the “Fix” button in Agent Review triggers the error “Your message is too long. Please try again with a shorter message and fewer/smaller attached items.” It is most likely related to a large context.

This happens when Cursor tries to send too much context (files, code, problem description) in one request and exceeds the model’s context window limit.
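If you want a rough sense of whether a single file could already blow past the limit on its own, a quick token estimate can help. Here’s a minimal sketch of my own (not anything built into Cursor), assuming Python with the tiktoken package installed; cl100k_base is only a proxy encoding, since the exact tokenizer depends on the model:

```python
# Rough token-count estimate for a file, to gauge whether it alone
# could exceed a model's context window. Assumes the `tiktoken`
# package is installed; cl100k_base is a proxy encoding only, the
# actual tokenizer depends on the model Cursor is using.
import sys
import tiktoken

def estimate_tokens(path: str, encoding_name: str = "cl100k_base") -> int:
    enc = tiktoken.get_encoding(encoding_name)
    with open(path, "r", encoding="utf-8", errors="replace") as f:
        text = f.read()
    return len(enc.encode(text))

if __name__ == "__main__":
    path = sys.argv[1]
    print(f"{path}: ~{estimate_tokens(path)} tokens")
```

Run it as `python estimate_tokens.py path/to/file`; if a single file is already tens of thousands of tokens, the combined request that Agent Review builds (file, code, problem description) can easily exceed a typical context window.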

Temporary solutions:

  • As @jokerfool already suggested - instead of the Fix button, write in the chat: “check file X in folder Y and fix the issue” (Cursor will read the file itself via the @ symbol)
  • Try enabling Max Mode before clicking Fix (model picker > toggle Max Mode) - this will increase the context window
  • If the file is very large - break down the fix: open the file, select the problematic section, use Cmd+K (Ctrl+K on Linux) and describe the issue manually
  • Check if MCP servers with large output are connected - temporarily disable MCP via Settings > Features > Model Context Protocol, restart Cursor and try again

Could you please share:

  • the size of the file that Agent Review is trying to fix (number of lines)
  • Request ID, if available (chat menu > Copy Request ID)
  • if MCP servers are enabled (Settings > Features > Model Context Protocol)

Please provide this data and I’ll pass it on to the team.


Thank you. Also, one question: does Max Mode (I use Azure OpenAI keys) give the same output quality as Cursor’s built-in models? For example, is Cursor’s Opus 4.5 the same quality as Opus 4.5 from the Anthropic API, in everything like tool calling etc.?

Glad the problem is resolved. Regarding the question about quality: with the same model and version, the quality shouldn’t change. Max Mode just provides a larger context window without changing the model itself; tool calling and the like behave the same.


Thanks for the support.

