I’ve been using my own OpenRouter API key for the last few months without any problem. Suddenly, since last week, I’m getting the error “the model does not work with your current plan or api key.” It happens with some models (Gemini, Llama, DeepSeek). I’ve tried different model names (google/gemini-2.5-pro-preview, gemini-2.5-pro-preview), but it still doesn’t work.
Weirdly, gpt-4o works but openai/gpt-4o fails.
I’m on the free plan but using custom API keys. Without this feature, it doesn’t make sense for me to use Cursor.
With the OpenRouter key, it still shows errors like:
The model anthropic/claude-3.5-sonnet does not work with your current plan or api key
But gpt-4o works fine. I also tried downgrading to an older version of Cursor and it shows the same error, so the error is probably triggered by Cursor’s API rather than by OpenRouter.
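One way to check that the key itself is fine is to call OpenRouter directly, outside Cursor, with the same model slug that Cursor rejects. Here’s a minimal sketch in Python using the requests library (the OPENROUTER_API_KEY environment variable name is just my placeholder):

```python
import os
import requests

# Direct call to OpenRouter's OpenAI-compatible chat completions endpoint,
# using the same model slug that Cursor rejects.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "anthropic/claude-3.5-sonnet",
        "messages": [{"role": "user", "content": "Say hi"}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

If a request like this succeeds with the same key and model slug, the “does not work with your current plan or api key” error is coming from Cursor’s side, not from OpenRouter.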
I think they might be doing this on purpose, to push you toward the Pro version instead of using your own API keys. For the moment I’ve moved to Cline, which lets you use your OpenRouter API key without any problem.
I’m now paying for the Pro version and custom API keys with OpenRouter still don’t work… I don’t understand why they offer this option if it doesn’t work…