If you’re struggling to use Gemini 2.5 or OpenAI models via OpenRouter in Cursor, there’s a simple workaround: set up a LiteLLM proxy on a low-cost Linux VPS (about $1–2/month).
First, configure LiteLLM with a config.yaml like this:

```yaml
model_list:
  - model_name: gemini-2.5-pro
    litellm_params:
      model: openrouter/google/gemini-2.5-pro
      api_key: sk-or-v1-yourRealOpenRouterApiKey
      rpm: 60

general_settings:
  disable_database: true
  master_key: sk-mymasterkey
```
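If you want more than one model behind the same proxy, you can extend model_list with additional entries — a sketch, assuming the OpenRouter model ID openai/gpt-4o and reusing the same OpenRouter key:

```yaml
model_list:
  - model_name: gemini-2.5-pro
    litellm_params:
      model: openrouter/google/gemini-2.5-pro
      api_key: sk-or-v1-yourRealOpenRouterApiKey
      rpm: 60
  - model_name: gpt-4o
    litellm_params:
      model: openrouter/openai/gpt-4o
      api_key: sk-or-v1-yourRealOpenRouterApiKey
```

Each model_name is what you’ll type into Cursor later; LiteLLM routes it to the matching OpenRouter model.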
Then launch the proxy with the following command:

```shell
LITELLM_MASTER_KEY=sk-mymasterkey docker run -d \
  --name=litellm-proxy \
  -v /root/litellm/config.yaml:/app/config.yaml \
  -e LITELLM_MASTER_KEY=$LITELLM_MASTER_KEY \
  -p 4000:4000 \
  --restart unless-stopped \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```
Now LiteLLM will listen on http://your_vps_ip_address:4000 and forward your requests to OpenRouter.
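Before pointing Cursor at the proxy, it’s worth verifying it answers on the OpenAI-compatible /v1/chat/completions endpoint. A minimal stdlib sketch — the IP 203.0.113.10 is a placeholder for your VPS, and the key is the master_key from config.yaml:

```python
import json
import urllib.request

# Placeholders — substitute your VPS IP and the master_key from config.yaml.
BASE_URL = "http://203.0.113.10:4000"
API_KEY = "sk-mymasterkey"


def build_chat_request(base_url: str, api_key: str, model: str, messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request for the LiteLLM proxy."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def chat(base_url: str, api_key: str, model: str, prompt: str) -> str:
    """Send one user message through the proxy and return the reply text."""
    req = build_chat_request(base_url, api_key, model, [{"role": "user", "content": prompt}])
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Calling `chat(BASE_URL, API_KEY, "gemini-2.5-pro", "Hello")` should return a reply routed through OpenRouter; if it errors with 401, double-check that the key matches the master_key in your config.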
After this, you can add models in Cursor and override the OpenAI base URL with your http://your_vps_ip_address:4000. Use the master key sk-mymasterkey as the API key — surprisingly, this works, and Cursor will let you use all OpenRouter models this way.
Cheers!