If you’re struggling to use Gemini 2.5 or OpenAI models via OpenRouter in Cursor, there’s a simple workaround: run a LiteLLM proxy on a low-cost Linux VPS (about $1–2/month).
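On the VPS, install LiteLLM with its proxy extras first (a minimal sketch, assuming Python 3 and pip are already on the box):

```bash
pip install 'litellm[proxy]'
```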
Then configure LiteLLM with a config.yaml along these lines:
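This is a minimal sketch: the two model entries are just examples (any slugs from OpenRouter’s model list work), and OPENROUTER_API_KEY is assumed to be exported in the proxy’s environment:

```yaml
model_list:
  - model_name: gemini-2.5-pro                  # the name you'll type into Cursor
    litellm_params:
      model: openrouter/google/gemini-2.5-pro   # openrouter/ prefix tells LiteLLM to forward there
      api_key: os.environ/OPENROUTER_API_KEY    # read from the proxy's environment
  - model_name: gpt-4o
    litellm_params:
      model: openrouter/openai/gpt-4o
      api_key: os.environ/OPENROUTER_API_KEY

general_settings:
  master_key: sk-mymasterkey                    # the key you'll paste into Cursor later
```

Start it with `litellm --config config.yaml --port 4000` (4000 is also LiteLLM’s default port).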
Now LiteLLM will listen on http://your_vps_ip_address:4000 and forward your requests to OpenRouter.
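Before pointing Cursor at it, a quick sanity check from your own machine is worth doing; the model name and key below come from the example config above:

```bash
curl http://your_vps_ip_address:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-mymasterkey" \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-pro", "messages": [{"role": "user", "content": "ping"}]}'
```

If that returns a completion, the proxy-to-OpenRouter leg works, and any remaining trouble is on Cursor’s side.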
After this, you can add models in Cursor and override the OpenAI base URL with http://your_vps_ip_address:4000. Use sk-mymasterkey (the master_key from the config) as the API key; surprisingly, Cursor accepts it, and you can use all OpenRouter models this way.
Cursor ships a new version almost every day, yet this hasn’t been fixed for about a year. It’s obviously a money-making move, which is why such obscure workarounds are required. If anything changes here, it won’t be that OpenRouter becomes fully supported; it’ll be that the LiteLLM workaround stops working.
Name your OpenRouter preset anything other than the default vendor/model name: e.g., ant-claude-opus works, whereas anthropic/claude-opus-4.5, OpenRouter’s default name, won’t (see the config sketch below).

I suspect a Cursor issue involving model-name hashing. Make a preset for each conflicting model; the conflict happens even if the default Cursor models are disabled, and you can also make presets involving model sequences, etc.

For example, GLM is fine since Cursor doesn’t support it out of the box, whereas Opus, GPT 5.2, and other models Cursor supports directly will fail without presets.
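If you’re routing through the LiteLLM proxy described above, the same trick applies to the model_name alias in config.yaml: pick anything that doesn’t collide with a model name Cursor ships built in (a sketch; the alias itself is arbitrary):

```yaml
model_list:
  - model_name: ant-claude-opus                     # arbitrary alias, avoids Cursor's built-in name
    litellm_params:
      model: openrouter/anthropic/claude-opus-4.5   # using this slug as model_name would trip the conflict
      api_key: os.environ/OPENROUTER_API_KEY
```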
I just realized nothing works via OpenRouter in Cursor! There are so many open issues here and on GitHub; some members blamed the new models on OpenRouter, but even the older ones don’t work! I see provider_error for so many models on OpenRouter.

We can’t get API keys directly from the providers or pay for your MAX; we only have OpenRouter access. So please, stop this, and at least support three models from OpenAI, Anthropic, and Google!