Horrible context handling: model can't even remember the previous message

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

I send one message referring to something in the previous message, and the model has no understanding of it whatsoever; it starts reading files to try to figure out what I mean.

I go into Ask mode and ask some questions; the AI suggests something, I tell it in the next prompt to implement that suggestion, and it completely loses the context of the previous message.

Even with the summarization feature enabled, it should still know what was mentioned in the previous message.

I did not switch models, and the second message was sent right after the first (so no big time gap).

This doesn't always happen, but it has happened to me several times now.

Steps to Reproduce

I don't have reliable reproduction steps; it happens intermittently, so you may just have to try it yourself.

Expected Behavior

It's supposed to remember the previous message, so that when I refer to it, it knows what I'm talking about.

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.0.40
VSCode Version: 1.99.3
Commit: a9b73428ca6aeb2d24623da2841a271543735560
Date: 2025-10-30T04:12:35.564Z
Electron: 34.5.8
Chromium: 132.0.6834.210
Node.js: 20.19.1
V8: 13.2.152.41-electron.0
OS: Darwin x64 20.6.0

For AI issues: which model did you use?

Sonnet 4.5 thinking

Additional Information

It only happens sometimes, not always, but it's still very annoying.

Does this stop you from using Cursor?

No - Cursor works, but with this issue

Hey, thanks for the report. This is a known issue we're tracking: chat summarization sometimes loses context from previous messages.

To help us debug, please share:

  • A Request ID captured with Privacy Mode turned off (Chat menu > Copy Request ID)
  • The context percentage shown in the chat when this happened
  • Whether Max Mode was enabled in your model settings

A workaround that has helped others is adding an explicit rule at the start of the chat:
"Maintain full awareness of previous conversation context. Do not reset or forget prior work when summarizing."

Let me know if this workaround helps.