Is Context size still around 20k?

From docs: Cursor - Build Software Faster
“In chat, we limit to around 20,000 tokens at the moment (or less if the model does not support that much context). For cmd-K, we limit to around 10,000 tokens, to balance TTFT and quality. Long-context chat uses the model’s maximum context window.”

From the GitHub Copilot update:

Since the difference has become huge (20k vs 128k) and Copilot doesn't have a 500-message limit, maybe it's a good time to change something? Are the Cursor docs up to date?
