Custom API Keys fail with "The model does not work with your current plan or api key"

I’ve been using my own OpenRouter API key for the last few months without any problem. Suddenly, since last week, I’m getting the error "The model does not work with your current plan or api key". It happens with some models (Gemini, Llama, DeepSeek). I’ve tried different model names (google/gemini-2.5-pro-preview, gemini-2.5-pro-preview), but it still doesn’t work.

Weirdly, gpt-4o works but openai/gpt-4o fails.

I’m on a free plan but using custom API keys. Without this feature, it doesn’t make sense for me to use Cursor.
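
In case it helps anyone debug: a quick way to rule out the key itself is to call OpenRouter directly with the same model slug, bypassing Cursor. A minimal sketch, assuming the key is in an OPENROUTER_API_KEY environment variable and the requests package is installed:

```python
# Send one test completion straight to OpenRouter, bypassing Cursor.
# OPENROUTER_API_KEY is assumed to hold the same key configured in Cursor;
# swap the model slug for whichever one Cursor is rejecting.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "google/gemini-2.5-pro-preview",
        "messages": [{"role": "user", "content": "ping"}],
    },
    timeout=60,
)
print(resp.status_code)
print(resp.json())
```

If that call succeeds, the key and model slug are fine on OpenRouter’s side and the rejection is happening inside Cursor.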

8 Likes

me too

Hey, thanks for the report. We’ll try to fix it.

1 Like

Yeah, same issue. Very annoying. I’ve tried a bunch of options/workarounds at this point; none of them work. I basically end up back at the same message.

2 Likes

This is still occurring with 0.51.1, BTW.

2 Likes

With the OpenRouter key, it still shows errors like:
The model anthropic/claude-3.5-sonnet does not work with your current plan or api key

But gpt-4o works well. I also tried downgrading to an older version of Cursor and it shows the same error, so the error is probably triggered by the Cursor API.
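
A rough way to double-check that conclusion is to ask OpenRouter which model IDs it actually serves; if the slug Cursor rejects shows up in that list, the problem isn’t the key or the model name. Sketch below, assuming the key is in an OPENROUTER_API_KEY environment variable:

```python
# List the model IDs OpenRouter advertises and check the rejected slug.
# OPENROUTER_API_KEY is assumed to hold the same key configured in Cursor.
import os
import requests

resp = requests.get(
    "https://openrouter.ai/api/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    timeout=30,
)
model_ids = [m["id"] for m in resp.json()["data"]]
print("anthropic/claude-3.5-sonnet" in model_ids)
```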

1 Like

Hi @deanrie, is there any news about this issue?

Thanks!

1 Like

I’ve got the same issue. Any progress? It’s still occurring with Version 1.0.0.

2 Likes

I think they might be doing this on purpose, to force you to use the Pro version instead of your own API keys. For the moment, I’ve moved to Cline, which lets you use your OpenRouter API key without any problem.

I think you’re right; they force users to pay to use their own service.

1 Like

I’m now paying for the Pro version and custom API keys with OpenRouter still don’t work… I don’t understand why they offer this option if it doesn’t work…

Any news on this? I’m still having the same issue.