We’d like to see an approximate token count or context size, or even just a rough estimate of the number of ‘requests’ a MAX query will consume, based on how much context we’re attaching.
I realize that cost estimation is hard because output tokens and tool calls can’t be predicted well, but if we could at least see the current total context size, we’d have some control over input token cost. Other AI IDEs like Cline and RooCode show a detailed breakdown of token counts and context management; I’d love to see this implemented in Cursor too.
In my opinion it would help to see how many tokens the provided context will take for each request; when I attach a few files, I don’t know whether they take up 1K or 10K tokens. It might be difficult to implement, though, since there are options such as providing the full file content when attaching a folder to the context, or just the file structure, and so on. Still, I’m all for anything that gives us more control over context, tokens, and request count.
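In the meantime, a rough workaround is to run the files through a tokenizer locally before attaching them. Here’s a minimal sketch using OpenAI’s tiktoken library with the cl100k_base encoding; the count won’t exactly match whatever tokenizer Cursor’s models actually use, and the file paths are just placeholders, but it at least tells you whether you’re in 1K or 10K territory:

```python
# Rough local token estimate for files you plan to attach as context.
# Assumes tiktoken's cl100k_base encoding as a stand-in tokenizer;
# counts will only be approximate for other models.
import tiktoken
from pathlib import Path

def estimate_tokens(paths, encoding_name="cl100k_base"):
    """Print and return an approximate token count for a list of files."""
    enc = tiktoken.get_encoding(encoding_name)
    total = 0
    for path in paths:
        text = Path(path).read_text(encoding="utf-8", errors="replace")
        count = len(enc.encode(text))
        print(f"{path}: ~{count:,} tokens")
        total += count
    print(f"total: ~{total:,} tokens")
    return total

# Hypothetical file names, just for illustration.
estimate_tokens(["src/main.py", "README.md"])
```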
Due to both caching and output tokens, this is basically impossible to estimate up front. However, as people have found, you can see how many tokens were used after each query completes, which may help you estimate for yourself!
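If you go that route, even a crude running average of the counts Cursor reports after each query gives a usable forecast. A trivial sketch (the numbers are made-up examples, not real measurements):

```python
# Made-up per-query input token counts, as reported after each
# completed query; averaging them projects future context cost.
observed = [12_400, 9_800, 15_200]
avg = sum(observed) / len(observed)
print(f"average input per query: ~{avg:,.0f} tokens")
```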