Using Own LLM API with Cursor Fast Request

Hello there,

Can I use my own LLM model (e.g., Claude 3.7 Sonnet via OpenRouter) instead of Cursor's default models while still using fast requests efficiently? If I use my own API key, do requests still count against my fast-request quota, or does that only apply to Cursor's provided models?

Hey, if you connect your own API key, all usage is billed directly by your provider, and the fast requests included in your Pro subscription are not consumed. However, note that custom API keys don't work in Agent mode; only Chat is available to you.
