Show Current Context Window Size - Feature Request

Hey guys!
We know that different LLM models have different context windows within Cursor.
However, right now it’s very tricky to see how many tokens the model has used and when you need to start a fresh chat. It would be great if there were some sort of indicator of context window usage somewhere near the bottom of the Cursor interface (maybe something similar to what Google uses in Google AI Studio).

Also, it’d be great to know what happens to the context window if we switch models mid-chat, say from Claude to Gemini or something else.

Thanks!

19 Likes

Agreed, that would really help us see the limit.

The context window shown would most likely be Cursor’s own integration context window.

1 Like

Yes please, that would be great.

1 Like

That would be great!


Something like Roo Code would be great!

Hey, thanks for the feature request. We’ll consider it.

5 Likes

I created an account to just like this post

1 Like

Having a progress bar showing context usage plus the current window’s cost would be super nice; the current “view context” option is too hidden to be convenient.

2 Likes


I am not sure, but for now there is a “tokens” entry in the three-dot menu after the response is finished.

As you can see in the picture: 8,957 tokens. But I’d love to see token consumption tracking, and even the context window size, in the current chat too!

Edit: clarification
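Until something like this lands in the UI, you can get a ballpark figure yourself. Here is a minimal sketch in Python, assuming a rough ~4 characters per token for English text and an illustrative 200k-token limit (both numbers are assumptions, not Cursor’s actual tokenizer or limits):

```python
# Rough context-usage estimator (a sketch; the ~4 chars/token ratio
# and the default limit are assumptions, not Cursor's real numbers).

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def context_usage(messages: list[str], context_limit: int = 200_000) -> float:
    """Fraction of a hypothetical context window consumed by the chat so far."""
    used = sum(estimate_tokens(m) for m in messages)
    return used / context_limit

chat = ["Explain Python decorators.", "A decorator wraps a function..." * 50]
print(f"{context_usage(chat):.2%} of context used")
```

For an accurate count you’d need the model’s actual tokenizer (e.g. a library like `tiktoken` for OpenAI models); this heuristic is only meant to show the kind of running total a status-bar indicator could display.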