I’ve been using my own OpenRouter API key for the past few months without any problem. Since last week, though, I’ve suddenly been getting the error “the model does not work with your current plan or api key”. It happens with some models (Gemini, Llama, DeepSeek). I’ve tried different names (`google/gemini-2.5-pro-preview`, `gemini-2.5-pro-preview`), but it still doesn’t work.

Weirdly, `gpt-4o` works but `openai/gpt-4o` fails.
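In case it helps narrow things down, here’s a minimal sketch of a direct request to OpenRouter’s chat completions endpoint with one of the failing model IDs. The environment variable name and prompt are just placeholders; I’m only showing the request shape I’d expect the prefixed model name to map to.

```python
# Minimal sketch: call OpenRouter directly with the fully-qualified model ID
# to check whether the key/model combination is rejected on OpenRouter's side.
# OPENROUTER_API_KEY is a placeholder env var name for my key.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "google/gemini-2.5-pro-preview",  # one of the IDs that fails for me
        "messages": [{"role": "user", "content": "hello"}],
    },
    timeout=30,
)
print(resp.status_code, resp.json())
```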
Maybe something changed in the way you parse model names?