Reduce AI Costs by 50%: Control the Context Sent to the Model

As a heavy user of Cursor, I’m currently paying $200, yet I find myself hitting usage limits far too quickly because of how the system manages context. I understand the shift to a token-based model is financially necessary on your end, but as a paying customer, I need more transparency and control over what is being sent to the model with each call.

The Problem:
Right now, the context window is populated automatically and opaquely. It includes entire conversations and file contents — many of which I have no interest in sending again and again. For example:

  • Old or irrelevant messages are sent repeatedly, even if they’re no longer needed.
  • Files I didn’t explicitly add are included in the context without notice.
  • There’s no way to trim, edit, or exclude portions of content that are inflating the token count.

This leads to massive inefficiency in token usage and a lack of control over the experience.

What I’m Requesting:

  1. Message-level Control

    • For every message in the conversation history, I want to toggle whether it should be included in the next call to the model.
    • I also want the option to edit any message, cutting unnecessary sections, so that only the relevant parts are sent.
  2. File Context Visibility

    • Show me a full list of all files being sent to the model each time.
    • Allow me to deselect or exclude individual files or file ranges from the next context window.
  3. Live Preview of Context Payload

    • Before each model request is sent, I want to be able to see exactly what is being sent — messages, files, system instructions, etc.
    • This should include both human-readable and token-based views to understand how much each element contributes to cost.
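To make the requested preview concrete, here is a minimal sketch of how a context payload with per-item toggles and a token-based view could work. Everything here is hypothetical (the `ContextItem` structure, the `include` flag, and the helper names are illustrative, not Cursor's actual internals), and the token count uses a rough ~4 characters/token heuristic rather than a real tokenizer.

```python
# Hypothetical sketch of a context-payload preview: each item carries an
# "include" toggle, and we estimate its token contribution before sending.
# None of this reflects Cursor's real implementation.

from dataclasses import dataclass

@dataclass
class ContextItem:
    kind: str             # "message", "file", or "system"
    label: str            # e.g. message index or file path
    text: str
    include: bool = True  # user-controlled toggle


def estimate_tokens(text: str) -> int:
    # Very rough heuristic: ~4 characters per token for English text.
    return max(1, len(text) // 4)


def preview(items: list[ContextItem]) -> list[tuple[str, str, int, bool]]:
    """Human-readable view: (kind, label, estimated tokens, included?)."""
    return [(i.kind, i.label, estimate_tokens(i.text), i.include) for i in items]


def build_payload(items: list[ContextItem]) -> str:
    """Only items the user left toggled on make it into the request."""
    return "\n\n".join(i.text for i in items if i.include)


items = [
    ContextItem("system", "instructions", "You are a helpful assistant."),
    ContextItem("message", "#1", "Refactor the auth module. " * 10),
    ContextItem("file", "utils.py", "def helper(): ...\n" * 50, include=False),
]

total = sum(estimate_tokens(i.text) for i in items if i.include)
print(preview(items))
print("estimated tokens for included items:", total)
```

With a view like this shown before each request, the user could see at a glance which items dominate the token count and deselect them before the call is made.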

By giving me the ability to choose, edit, and manage the context window, you empower me to control my usage and reduce cost without reducing productivity. I’m confident that with these tools, I could reduce my token usage — and my overall spend — by more than 50%.


But also have the ability to turn this feature off or on… as it also seems like something that could potentially “get in the way” for certain folks if it were on all the time.


More control and visibility would definitely be great to have, including over model parameters such as temperature.
