Show Current Context Window Size - Feature Request

Hey guys!
We know that different LLMs have different context windows within Cursor.
However, right now it's tricky to see how many tokens the model has used and when you need to start a fresh chat. It would be great if there were some sort of indication of the context window size somewhere near the bottom of the Cursor interface (maybe something similar to what Google uses in Google AI Studio).
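For illustration, here's roughly how a chat's token usage could be estimated locally with tiktoken; the cl100k_base encoding and the way Cursor actually assembles its prompt are assumptions on my part, so the real number would differ:

```python
# Rough estimate of how many tokens a chat occupies.
# Assumption: cl100k_base is only an approximation of whatever tokenizer the
# selected model really uses, and Cursor adds system/tool text we can't see.
import tiktoken

def estimate_chat_tokens(messages: list[str]) -> int:
    enc = tiktoken.get_encoding("cl100k_base")
    return sum(len(enc.encode(m)) for m in messages)

chat = [
    "Refactor this function to use async/await.",
    "Here is the updated version with error handling...",
]
print(estimate_chat_tokens(chat))  # a few dozen tokens for this toy chat
```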

Also, it'd be great to know what happens to the context window if we switch from Claude to Gemini (or something else) mid-chat.

Thanks!

21 Likes

Agreed, that would really help us see the limit.

The context window shown would most likely be Cursor's own integration context window.

1 Like

Yes please, that would be great.

1 Like

That would be great.


Something like Roo Code would be great!

Hey, thanks for the feature request. We’ll consider it.

5 Likes

I created an account just to like this post.

1 Like

Having a progress bar showing context usage + current window cost would be super nice; the current "view context" option is too hidden to be convenient.
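Even a plain text bar would do. A quick sketch of the idea; the numbers are made up, not real Cursor values:

```python
# Minimal sketch of a context-usage bar like the one requested above.
# used_tokens and window_size are placeholder numbers, not real Cursor values.
def context_bar(used_tokens: int, window_size: int, width: int = 20) -> str:
    ratio = min(used_tokens / window_size, 1.0)
    filled = int(ratio * width)
    return f"[{'#' * filled}{'-' * (width - filled)}] {used_tokens:,}/{window_size:,} ({ratio:.0%})"

print(context_bar(8_957, 128_000))
# [#-------------------] 8,957/128,000 (7%)
```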

2 Likes


I am not sure, but for now there is a "Tokens" entry in the three-dot menu after a response is finished.

As you can see in the picture, it shows 8,957 tokens. But I'd love to see token consumption tracking, and even the context window size for the current chat, too!

Edit: clarification

Just created an account to say this would be the perfect feature to implement. Context is king; it's foundational to working with LLMs, and knowing you are within bounds can save some headache. Though you can be proactive and create a new chat, you're giving up your conversational context and its understanding of how and what it was trying to solve. Trying to summarize it and make extra files to recap seems paradoxical, as you're now dedicating your tokens to recapping rather than to the project itself. Even a simple count of tokens in the context window vs. the current context window size (based on the currently selected model plus whether Max mode is on or off) would suffice.
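To make that last point concrete, here is a sketch of the lookup I have in mind; the per-model window sizes and the Max-mode values below are illustrative guesses, not Cursor's real configuration:

```python
# Sketch: pick a context limit from the selected model + Max mode,
# then report how much headroom is left. All limits below are assumed
# placeholder values, not Cursor's actual configuration.
ASSUMED_WINDOWS = {
    ("claude-sonnet", False): 120_000,
    ("claude-sonnet", True): 200_000,   # guess at a larger Max-mode window
    ("gemini", False): 120_000,
    ("gemini", True): 1_000_000,        # guess at a larger Max-mode window
}

def remaining_tokens(model: str, max_mode: bool, used_tokens: int) -> int:
    limit = ASSUMED_WINDOWS[(model, max_mode)]
    return max(limit - used_tokens, 0)

print(remaining_tokens("claude-sonnet", max_mode=False, used_tokens=8_957))
# 111043 tokens of headroom under these assumed limits
```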