OpenRouter Gemini models no longer work

I’ve been using my own OpenRouter API key for the last few months without any problems. Suddenly, since last week, I’m getting the error “the model does not work with your current plan or API key.” It happens with some models (Gemini, Llama, DeepSeek). I’ve tried different namings (google/gemini-2.5-pro-preview, gemini-2.5-pro-preview), but it still doesn’t work.

Weirdly, gpt-4o works but openai/gpt-4o fails.

Maybe something changed in the way you parse model names?
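In case it helps anyone narrow this down: here’s a minimal sketch for calling OpenRouter directly with the same key and the same model IDs, to check whether the problem is the key/model itself or the way the app parses the names. It assumes the key is in the OPENROUTER_API_KEY environment variable; the model IDs are just the ones from my tests above.

```python
# Minimal sketch: call OpenRouter's chat completions endpoint directly with
# the same key and model IDs used in the app, to isolate where the error comes from.
import os
import requests

API_KEY = os.environ["OPENROUTER_API_KEY"]  # the same key configured in the app

def try_model(model_id: str) -> None:
    """Send a one-message chat completion and print the outcome."""
    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model_id,
            "messages": [{"role": "user", "content": "ping"}],
        },
        timeout=30,
    )
    print(model_id, "->", resp.status_code, resp.json().get("error", "OK"))

# Both namings from the report above.
for model in ("google/gemini-2.5-pro-preview", "gemini-2.5-pro-preview",
              "gpt-4o", "openai/gpt-4o"):
    try_model(model)
```

If all of these succeed outside the app, the key and models are fine and the problem is in how the app handles the names.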


Also, the o3 model can’t be used via OpenRouter. It requires enabling MAX mode, which is only available if you pay.


Yes, I get the same error message. In my case, none of the models can be used.


Getting the same error here, but the API key and URL verification passes.
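Since the key itself verifies fine, one more thing worth ruling out is a spelling or prefix mismatch in the model IDs. A quick sketch using OpenRouter’s public /api/v1/models listing (no key needed), with a couple of example IDs from this thread:

```python
# Minimal sketch: list the model IDs OpenRouter itself advertises, to rule out
# a naming mismatch. The /models endpoint is public, so this checks the IDs,
# not the key.
import requests

resp = requests.get("https://openrouter.ai/api/v1/models", timeout=30)
resp.raise_for_status()
ids = {m["id"] for m in resp.json()["data"]}

for model in ("google/gemini-2.5-pro-preview", "openai/gpt-4o"):
    print(model, "is listed" if model in ids else "is NOT listed")
```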


Yes, the same happens to me… Maybe it has something to do with the new version they released.

Any clue why this is happening?