Content Suggestion:
Issue Description:
When using Cursor for development, the context token limit of the LLM (Large Language Model) can affect code generation and the continuity of conversations. As the context approaches the model’s upper limit, developers may not notice in time, which can lead to incomplete code generation or interrupted conversations.
Proposal:
It is suggested to add a feature in Cursor that notifies developers when the context tokens are close to the LLM’s limit (e.g., reaching 80% or 90%). The notification could include the following information:
- The current number of tokens used.
- The remaining number of tokens available.
- Suggested actions (e.g., clearing the context, shortening the input, or starting a new conversation).
Implementation Details:
- Display the current token usage in the editor’s status bar (e.g., “Tokens Used: 1200/1500”).
- When token usage approaches the limit, show a lightweight notification or highlight the status bar.
- Provide a one-click button to clear the context, allowing developers to quickly free up tokens.
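The status-bar and threshold logic above could be sketched roughly as follows. This is only an illustrative sketch, not Cursor’s actual implementation; the function name, the return fields, and the 80% default threshold are all assumptions for illustration.

```python
# Hypothetical sketch of the proposed token-usage check (names are illustrative).
def token_usage_status(used: int, limit: int, warn_at: float = 0.8) -> dict:
    """Summarize token usage for a status-bar display.

    Returns a label like "Tokens Used: 1200/1500", the remaining budget,
    and a warn flag that becomes True once usage reaches the threshold
    (which the proposal suggests making configurable, e.g. 70%/80%/90%).
    """
    remaining = max(limit - used, 0)
    ratio = used / limit if limit > 0 else 0.0
    return {
        "label": f"Tokens Used: {used}/{limit}",
        "remaining": remaining,
        "warn": ratio >= warn_at,
    }
```

For example, with the values from the status-bar example above, `token_usage_status(1200, 1500)` would set `warn` to `True`, since 1200/1500 is exactly 80% of the limit.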
Expected Outcomes:
- Help developers better manage context tokens, avoiding interruptions caused by exceeding the limit.
- Enhance the development experience by reducing the friction caused by token limits.
Additional Suggestions:
- Provide a settings option to allow developers to customize the notification threshold (e.g., 70%, 80%, 90%).
- Include relevant documentation links in the notification to help developers optimize token usage.
We hope this feature can be implemented to further improve the user experience of Cursor! Thank you to the development team for your hard work!