Workaround to get OpenRouter models working in Cursor

If you’re struggling to use Gemini 2.5 or OpenAI models via OpenRouter in Cursor, there’s a simple workaround: set up a LiteLLM proxy on a low-cost Linux VPS (about $1–2/month).

First, configure LiteLLM with a config.yaml like this:

model_list:
  - model_name: gemini-2.5-pro                    # the name you will reference from Cursor
    litellm_params:
      model: openrouter/google/gemini-2.5-pro     # OpenRouter route for the model
      api_key: sk-or-v1-yourRealOpenRouterApiKey  # your OpenRouter API key
      rpm: 60                                     # requests-per-minute limit
general_settings:
  disable_database: true                          # run without a database
  master_key: sk-mymasterkey                      # the key clients must send to this proxy

Then launch the proxy with the following command:

LITELLM_MASTER_KEY=sk-mymasterkey docker run -d --name=litellm-proxy \
  -v /root/litellm/config.yaml:/app/config.yaml \
  -e LITELLM_MASTER_KEY=$LITELLM_MASTER_KEY \
  -p 4000:4000 --restart unless-stopped \
  ghcr.io/berriai/litellm:main-latest --config /app/config.yaml
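
Before pointing Cursor at it, it’s worth sanity-checking that the proxy came up cleanly. A minimal check, assuming the container name and master key from above (/v1/models is LiteLLM’s standard OpenAI-compatible model listing route):

# Watch container startup for config errors
docker logs -f litellm-proxy

# List the models the proxy is serving, authenticated with the master key
curl http://localhost:4000/v1/models \
  -H "Authorization: Bearer sk-mymasterkey"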

Now LiteLLM will listen on http://your_vps_ip_address:4000 and forward your requests to OpenRouter.
After this, you can add models in Cursor and override the OpenAI base URL with your http://your_vps_ip_address:4000. Use sk-mymasterkey as the API key (the LiteLLM master key, not a real OpenAI key); surprisingly, this works, and Cursor will let you use all OpenRouter models this way.
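
To verify the whole chain end to end (OpenAI-style request → LiteLLM → OpenRouter), you can imitate what Cursor will send. A quick sketch, assuming the config above and that port 4000 is reachable from your machine:

curl http://your_vps_ip_address:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-mymasterkey" \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-pro", "messages": [{"role": "user", "content": "Say hello"}]}'

If this returns a normal chat completion, Cursor’s requests will be forwarded the same way.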

Cheers!

5 Likes

While we are working on an update that improves this, note:

  • Disable all models that are not available in your custom API.
  • If a model available on your API key account is not listed, you can add it manually in Cursor Settings > Models.

This should make it possible to verify your own API URL & key.
(This has been confirmed by a user in their own thread.)

2 Likes

Cursor ships a new version every day, but this hasn’t been fixed for about a year. It’s obviously a money-making move, which is why such obscure workarounds are required. If there is a change here, it won’t be that OpenRouter is fully supported; it’ll be that the LiteLLM workaround doesn’t work anymore.

5 Likes

Sorry, but that is an inaccurate assumption. We are working on improving the system and API URL usage.

2 Likes

Sure you do.

4 Likes

Is there a fix for this issue? I am unable to use LiteLLM models or any local/remote proxy with Cursor.

2 Likes

Is there any update on this? Even if I set Override OpenAI Base URL, requests still go to the OpenAI API instead of the custom endpoint.

Same here!


Try OpenRouter presets.

Name the OpenRouter preset anything other than the default vendor/model name; e.g., ant-clause-opus works, whereas anthropic/claude-opus-4.5, OpenRouter’s default name, won’t.

I suspect a Cursor issue involving model-name hashing. Make a preset for each conflicting model; the conflict happens even if the default Cursor models are disabled. You can also make presets that involve model sequences, etc.

For example, GLM is fine since Cursor doesn’t support it out of the box, whereas Opus, GPT 5.2, and other models Cursor supports directly will fail without presets.

I just realized nothing works via OpenRouter in Cursor! There are so many open issues here and on GitHub. Some members blamed new models on OpenRouter, but even the older ones don’t work! I see provider_error for so many models on OpenRouter.

We can’t get API keys directly from the providers or pay for your MAX; we only have OpenRouter access. So please, stop this, and at least support three models from OpenAI, Anthropic, and Google!
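
To rule Cursor out, you can hit OpenRouter directly with the same model slug; a quick sketch (substitute a real key):

curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer sk-or-v1-yourRealOpenRouterApiKey" \
  -H "Content-Type: application/json" \
  -d '{"model": "google/gemini-2.5-pro", "messages": [{"role": "user", "content": "ping"}]}'

When this succeeds but the same model fails in Cursor, that points at Cursor rather than OpenRouter.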

3 Likes