Workaround to get OpenRouter models working in Cursor

If you’re struggling to use Gemini 2.5 or OpenAI models via OpenRouter in Cursor, there’s a simple workaround: set up a LiteLLM proxy on a low-cost Linux VPS (about $1–2/month).

First, configure LiteLLM with a config.yaml like this:

model_list:
  - model_name: gemini-2.5-pro                      # the name clients (and Cursor) will request
    litellm_params:
      model: openrouter/google/gemini-2.5-pro       # the upstream OpenRouter model
      api_key: sk-or-v1-yourRealOpenRouterApiKey    # your real OpenRouter API key
      rpm: 60                                       # requests-per-minute limit for this model
general_settings:
  disable_database: true
  master_key: sk-mymasterkey                        # the key clients must send; you'll paste this into Cursor

Then launch the proxy (export the master key first so the $LITELLM_MASTER_KEY reference actually expands in the docker command):

export LITELLM_MASTER_KEY=sk-mymasterkey

docker run -d \
  --name=litellm-proxy \
  -v /root/litellm/config.yaml:/app/config.yaml \
  -e LITELLM_MASTER_KEY=$LITELLM_MASTER_KEY \
  -p 4000:4000 \
  --restart unless-stopped \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
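Once the container is running, a quick sanity check is to list the models the proxy exposes; the response should include gemini-2.5-pro from the config (the IP is a placeholder, the key is the master key from config.yaml):

# Ask the proxy which models it knows about (OpenAI-compatible endpoint)
curl http://your_vps_ip_address:4000/v1/models \
  -H "Authorization: Bearer sk-mymasterkey"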

With the proxy listening on http://your_vps_ip_address:4000 and forwarding requests to OpenRouter, you can add your models in Cursor and set the Override OpenAI Base URL to http://your_vps_ip_address:4000. Use sk-mymasterkey (the LiteLLM master key) as the API key; perhaps surprisingly, verification passes, and Cursor will let you use all OpenRouter models this way.
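If Cursor refuses to verify the key or requests seem to go nowhere, test the whole chain from your own machine with a plain OpenAI-style request first (same placeholder address and keys as above):

# Send one test prompt through the proxy; LiteLLM forwards it to OpenRouter's Gemini 2.5 Pro
curl http://your_vps_ip_address:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-mymasterkey" \
  -d '{"model": "gemini-2.5-pro", "messages": [{"role": "user", "content": "Say hello"}]}'

If this returns a normal completion, the proxy and your OpenRouter key are fine, and whatever is left to debug is on the Cursor side.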

Cheers!

4 Likes

While we are working on an update that improves this, note:

  • Disable all models that are not available in your custom API.
  • If a model available on your API key account is not listed, you can add it manually in Cursor Settings > Models.

This should make it possible to verify your own API URL & key (a user has confirmed this works in their own thread).
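If you are not sure which model ids your account actually offers, you can list them before touching the Cursor settings. A minimal sketch, assuming OpenRouter is the custom API behind your key and that jq is installed (if you route through the LiteLLM proxy from the first post, send the same request to http://your_vps_ip_address:4000/v1/models with the master key instead):

# Print just the model ids so you know what to disable or add in Cursor
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer sk-or-v1-yourRealOpenRouterApiKey" \
  | jq -r '.data[].id'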

1 Like

Cursor ships a new version every day, but this hasn’t been fixed for about a year. It’s obviously a money-making move, which is why such obscure workarounds are required. If there is a change here, it won’t be that OpenRouter becomes fully supported; it’ll be that the LiteLLM workaround stops working.

2 Likes

Sorry, but that is an inaccurate assumption. We are working on improving the system and API URL usage.

Sure you do

2 Likes

Is there a fix for this issue? I am unable to use LiteLLM models or any local/remote proxy with Cursor.

2 Likes

Is there any update on this? Even when I set the Override OpenAI Base URL, requests still go to the OpenAI API instead of my custom endpoint.