Conversation too long

Hi,

Is there a new limit on conversation length? This is quite inconvenient; Cursor should automatically manage the conversation history, or use dedicated logic within the conversation to do so. Instead, we are forced to manually create a new conversation.
Note: my last message is very short; the error is due to the long conversation history.

Thank you!



1 Like

That is the automatic management. When your chat thread gets too long, it threatens to overwhelm the model and the model would no longer work well, so Cursor shows you this message.

Also, Cursor does manage quite a bit, but it can’t just summarize everything; how would it know what remains relevant and what doesn’t?

What is the issue with this message?

In a thread that reaches the token limit, it’s likely you have already experienced hallucinations in previous messages (whether you noticed or not).

I myself see most models slowly starting to hallucinate after 5-6 prompts in a chat, which matches the prompt’s context limit.
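
For anyone curious what “managing the history” actually involves, here is a minimal sketch of trimming a conversation to a token budget before it is sent to the model. This is purely an illustration of the general idea, not Cursor’s actual mechanism; the tiktoken encoding and the budget value are assumptions made up for the example.

```python
# Illustration only: keep the newest messages that fit within a token budget.
# Requires `pip install tiktoken`; the budget is a made-up example value.
import tiktoken

ENC = tiktoken.get_encoding("cl100k_base")
TOKEN_BUDGET = 100_000  # hypothetical limit, not Cursor's real number


def count_tokens(text: str) -> int:
    return len(ENC.encode(text))


def trim_history(messages: list[dict], budget: int = TOKEN_BUDGET) -> list[dict]:
    """Drop the oldest messages so the remaining history fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk from newest to oldest
        cost = count_tokens(msg["content"])
        if used + cost > budget:
            break                           # everything older gets dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order


history = [
    {"role": "user", "content": "Refactor the parser module."},
    {"role": "assistant", "content": "Done, I split it into lexer.py and parser.py."},
    {"role": "user", "content": "Now add error recovery."},
]
# With a tiny budget, the older messages are likely to be dropped.
print(trim_history(history, budget=15))
```

The catch is visible right in the sketch: whatever falls outside the budget is simply gone, and the tool has no way of knowing whether a dropped message contained a decision you still care about. That is why Cursor asks you to start a fresh chat instead of silently discarding context.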

I’m constantly getting this with the new release. Halfway through any prompt I immediately get this error.

3 Likes

Same here. One question (and not even a big one) and I immediately get the “Conversation too long” bug. If users can’t finish one simple thought, the tool becomes unusable. And this is on Cursor, because previous versions did not have this bug.
Aside from the usability, it is ■■■■■■ ANNOYING having to copy-paste the same question over and over again between chats because Cursor decides to have the memory of a goldfish.

3 Likes

I have gotten this like 10 times in one sitting, and the hardest part is that it stops making a change and leaves the code broken. I would start a new chat session to get it to fix/continue what it was doing, but even the new chat session with one prompt would again show “your message is too long”, and this is making the product hard to use. It didn’t use to be like this. Cursor, please please fix.

2 Likes

I’m having this too, even on new conversations with barely any context.

3 Likes

Same here

I think the reason for this is that “auto” has unlimited use on some plans, and it WAS very good despite being “auto”. So Cursor is, technically, limiting its functionality without changing its pricing structure.

Getting the same after the update.. never really had issues till the latest update.. and there is no way to change it.. seems like Cursor is limiting on purpose.. will be moving to another IDE if this does not get fixed..

We are all getting it, I think; deleting all conversations and restarting doesn’t even help.

Nope.. and it wouldn’t be so bad if I could finish one complete thought, but it constantly stops mid-thought and then has to rethink in the new conversation, which takes up EVEN MORE tokens..

This seems like a plot on Cursor’s side, as it only hinders developers.. most likely I will be going to Windsurf or another IDE if this does not get fixed. I was really happy with Cursor, but now I can’t even complete a thought process without getting CONVERSATION TOO LONG.

And if you are going to limit it, then limit by default but allow users to change it in settings… Why take this away?

After the new release (Version: 1.4.2 (Universal)) this problem started.. now after 2-3 prompts, even in a new project (with 3-4 small files), I get it.. it stops the process in the middle of running.. and leaves the files in the project broken..

Even Cursor itself agrees this update is a disaster (quote):
“I completely understand your frustration, and you’re absolutely right - this conversation length issue with Cursor is a significant problem that makes development work unnecessarily painful.

The fact that we had to repeatedly go through the same concepts multiple times due to conversation length limitations is unacceptable for a development tool.

This kind of issue should definitely be escalated because:

  1. It breaks developer workflow - Having to re-explain context repeatedly kills productivity

  2. It creates confusion - The tool loses track of important decisions and context

  3. It wastes time - Developers shouldn’t have to fight with their tools to maintain conversation continuity

  4. It affects code quality - When context is lost, mistakes are more likely to happen”



This is happening really quickly in auto mode since GPT-5?

Seems the new update (of Aug 11, 2025) corrected this issue. Thanks for addressing this quickly, Cursor…. (Next time though… try to avoid releasing such headaches… surely you tested this internally prior to rollout?)

2 Likes