Provider Error after 10–20 seconds when using OpenRouter in Cursor

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

When I use OpenRouter as the provider in Cursor, requests start normally and the model responds for about 10–20 seconds. Then Cursor shows a “Provider Error” message and stops responding. This happens repeatedly, so I cannot complete longer prompts or multi-step tasks.

Steps to Reproduce

  1. Open Cursor IDE
  2. Configure AI provider to OpenRouter
  3. Start an AI Chat/Agent request (any prompt)
  4. Wait 10–20 seconds while it is generating a response
  5. Cursor shows “Provider Error” and the response stops
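To help isolate whether the drop happens inside Cursor or upstream, here is a minimal sketch that builds the same kind of request directly against OpenRouter's OpenAI-compatible endpoint, outside Cursor. The endpoint URL is OpenRouter's documented base; the API key string and prompt are placeholders, and `build_request` is just an illustrative helper, not anything from Cursor:

```python
# Hypothetical diagnostic: construct a chat-completions request to OpenRouter
# directly, so the stream can be tested outside Cursor. Only builds the
# request; sending it requires a real API key.
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a streaming chat-completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # stream so a mid-generation drop is visible
    }).encode("utf-8")
    return urllib.request.Request(
        OPENROUTER_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("sk-or-...", "moonshotai/kimi-k2.5", "long multi-step prompt")
print(req.full_url)
# urllib.request.urlopen(req) would send it; if the direct stream also dies
# after ~10-20 seconds, the problem is upstream of Cursor.
```

If the direct stream completes fine, that points at Cursor's handling of the provider rather than OpenRouter itself.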

Expected Behavior

The response should continue until completion without disconnecting or failing mid-generation.

Screenshots / Screen Recordings

Operating System

Windows 10/11

Version Information

Version: 2.4.21 (system setup)
VSCode Version: 1.105.1
Commit: dc8361355d709f306d5159635a677a571b277bc0
Date: 2026-01-22T16:57:59.675Z
Build Type: Stable
Release Track: Default
Electron: 39.2.7
Chromium: 142.0.7444.235
Node.js: 22.21.1
V8: 14.2.231.21-electron.0
OS: Windows_NT x64 10.0.26100

For AI issues: which model did you use?

moonshotai/kimi-k2.5
z-ai/glm-4.7

For AI issues: add Request ID with privacy disabled

Request ID: 8e9b9bc0-a944-4645-a5a6-d7127689ab5f
[resource_exhausted] Error
LTe: [resource_exhausted] Error
at kmf (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9095:38337)
at Cmf (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9095:37240)
at $mf (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9096:4395)
at ova.run (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9096:8170)
at async qyt.resume (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34190:60450)
at async Wpc.streamFromAgentBackend (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34239:7695)
at async Wpc.getAgentStreamResponse (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34239:8436)
at async FTe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9170:14575)

Additional Information

  • Issue is consistent and reproducible.
  • Happens across different prompts.
  • If there are logs I can provide (Cursor logs / console), tell me where to export them and I will attach them.

Does this stop you from using Cursor?


Sometimes - I can sometimes use Cursor

Hey, thanks for the report.

OpenRouter is not officially supported by Cursor. We only support OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock.

When you use Override Base URL for OpenRouter, it can be unstable and affect all models, including the built-in ones.

Try switching to one of the officially supported providers - they will work more reliably.

I’m having the exact same problem, and I’m not using OpenRouter — I’m using the Claude 4.5 that comes with Cursor.

I can’t send a single message…

Version 2.4.21 worked better than the 2.4.22 update.

Provider Error
Request ID: a0141157-2bd5-4cb6-a472-633712489902
{"error":57,"details":{"title":"Provider Error","detail":"We're having trouble connecting to the model provider. This might be temporary - please try again in a moment.","isRetryable":true,"additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}
[resource_exhausted] Error
LTe: [resource_exhausted] Error
at Gmf (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9095:38348)
at Hmf (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9095:37251)
at rpf (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9096:4395)
at fva.run (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9096:8170)
at async Hyt.runAgentLoop (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34196:57047)
at async Zpc.streamFromAgentBackend (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34245:7695)
at async Zpc.getAgentStreamResponse (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34245:8436)
at async FTe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/c:/Users/jello/AppData/Local/Programs/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9170:14575)
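The payload above marks the error as retryable (`"isRetryable": true`). A minimal sketch of what a client-side backoff-and-retry loop around such a call could look like — the `call` function and the error shape are stand-ins inferred from the JSON in this report, not Cursor internals:

```python
import time

def retry_on_provider_error(call, max_attempts=3, base_delay=1.0):
    """Retry a provider call while it reports a retryable error.

    `call` returns a dict shaped like the payload above; that shape is an
    assumption based on this report, not a documented API.
    """
    result = {}
    for attempt in range(max_attempts):
        result = call()
        details = result.get("details", {})
        if "error" not in result or not details.get("isRetryable", False):
            return result  # success, or a non-retryable failure
        # Exponential backoff: base_delay, 2x, 4x, ...
        time.sleep(base_delay * (2 ** attempt))
    return result  # still failing after max_attempts

# Fake provider that fails twice with the retryable payload, then succeeds.
attempts = {"n": 0}
def fake_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        return {"error": 57, "details": {"title": "Provider Error", "isRetryable": True}}
    return {"text": "done"}

print(retry_on_provider_error(fake_call, base_delay=0.01))  # {'text': 'done'}
```

Cursor presumably does something like this internally; the point of the sketch is that a retryable error surfacing to the user after one attempt suggests the retries themselves are exhausted or skipped.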

Version: 2.4.22 (user setup)
VSCode Version: 1.105.1
Commit: 618c607a249dd7fd2ffc662c6531143833bebd40
Date: 2026-01-26T22:51:47.692Z
Build Type: Stable
Release Track: Default
Electron: 39.2.7
Chromium: 142.0.7444.235
Node.js: 22.21.1
V8: 14.2.231.21-electron.0
OS: Windows_NT x64 10.0.26200

When do you plan to support openrouter?

We don’t have a public timeline for OpenRouter support. The team is aware of the requests, but for now we’re focused on official providers like OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock.

With respect, this is not a convincing stance for a product positioned as a “next-generation AI IDE.”

OpenRouter is not a fringe or experimental service. It is a production-grade routing layer that provides immediate access to thousands of models, including the most up-to-date versions from OpenAI, Anthropic, Google, Meta, and others. Many developers already rely on it precisely because it abstracts provider volatility and accelerates adoption of new models.

Saying “just use official providers” fundamentally limits Cursor’s flexibility and future-proofing. The AI ecosystem is changing extremely fast. Today’s official provider list may be outdated in months, not years.

I am paying $20/month for Cursor because I expect choice, control, and speed of adaptation. Allowing a stable, officially supported custom Base URL or OpenRouter-style integration is not a risky experiment; it is a strategic necessity.

If Cursor does not move in this direction, another IDE will. And the one that enables broad model access first will gain mindshare very quickly.

I’m not asking for special treatment, only for a clear commitment:
official support for external API routers or a public roadmap toward it.

Please consider this feedback seriously. Many paying users depend on this flexibility.
