Trying to use models from openrouter.ai - error

1. I assigned a token in “OpenAI API Key”.
2. Set “Override OpenAI Base URL” to OpenRouter.
3. Added the models “qwen/qwen3.6-plus:free”, “arcee-ai/trinity-large-preview:free”, and “anthropic/claude-opus-4.6”.

I can select the models, but as soon as I try to use them I get this error:

Provider Error - We’re having trouble finding the resource you requested. If the problem persists, please contact support.
Request ID: e025d29a-9278-4fca-b192-82d3610d9af0
{"error":"ERROR_PROVIDER_ERROR","details":{"title":"Provider Error","detail":"We're having trouble finding the resource you requested. If the problem persists, please contact support.","isRetryable":false,"additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}
Provider Error We’re having trouble finding the resource you requested. If the problem persists, please contact support.
aBi: Provider Error We’re having trouble finding the resource you requested. If the problem persists, please contact support.
at Pnw (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:43963:24394)
at Bnw (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:43963:23385)
at $nw (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:43964:6487)
at v9u.run (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:43964:11286)
at async JIn.runAgentLoop (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:56307:11753)
at async C0d.streamFromAgentBackend (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:56377:11057)
at async C0d.getAgentStreamResponse (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:56377:17161)
at async ZOe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:44075:19892)
at async Object.Gl [as onSubmit] (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:55360:4891)
at async vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:55334:116485


Any updates on this?

This seems to be like my issue with Agent timeouts, and I use “Auto” mode.

Version: 3.0.13 (system setup)
VSCode Version: 1.105.1
Commit: 48a15759f53cd5fc9b5c20936ad7d79847d914b0
Date: 2026-04-07T03:05:17.114Z
Layout: editor
Build Type: Stable
Release Track: Default
Electron: 39.8.1
Chromium: 142.0.7444.265
Node.js: 22.22.1
V8: 14.2.231.22-electron.0
OS: Windows_NT x64 10.0.26200
Request ID: fd97a1af-5149-4cf3-a63e-7fd9a0a73940
{"error":"ERROR_EXTENSION_HOST_TIMEOUT","details":{"title":"Agent Stream Start Timeout","detail":"The extension host did not respond in time. Please reload the window to continue.","isRetryable":false,"shouldShowImmediateError":true,"additionalInfo":{},"buttons":[{"label":"Reload Window","reloadWindow":{}}],"planChoices":[]}}
Agent Stream Start Timeout [deadline_exceeded]
ConnectError: [deadline_exceeded] Agent Stream Start Timeout
    at N (vscode-file://vscode-app/d:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:53347:18592)
    at vscode-file://vscode-app/d:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:53347:18862

nope

Hey, a couple of things:

  1. There’s a typo in the URL. You have https://openrouter.ai/api/V1 with a capital V; the correct one is https://openrouter.ai/api/v1 with a lowercase v. URL paths are case-sensitive, so that’s likely what’s causing the “Provider Error - trouble finding the resource” message: the server can’t find the endpoint.

  2. One important note. OpenRouter isn’t an officially supported BYOK provider in Cursor. Supported providers are OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock. Some OpenRouter models might work, but it’s not guaranteed. Models that overlap with ones built into Cursor, like anthropic/claude-opus-4.6, will almost certainly error.
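To illustrate why the capital V matters: hostnames are case-insensitive, but URL paths are not, so the server treats api/V1 and api/v1 as different resources. A small sketch (the check_base_url helper is hypothetical, just to show the check):

```python
# Hypothetical helper: flags the common capital-V typo in a base URL.
# Hostnames are case-insensitive, but URL paths are not, so
# "api/V1" and "api/v1" name different resources to the server.
from urllib.parse import urlparse

def check_base_url(url: str) -> str:
    """Raise if the URL path contains uppercase characters."""
    parsed = urlparse(url)
    if parsed.path != parsed.path.lower():
        raise ValueError(
            f"path {parsed.path!r} contains uppercase; "
            f"did you mean {parsed.path.lower()!r}?"
        )
    return url

check_base_url("https://openrouter.ai/api/v1")    # passes
# check_base_url("https://openrouter.ai/api/V1")  # raises ValueError
```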

Try fixing the URL to lowercase v1 first, then test with models that don’t overlap with the built-in ones. qwen/qwen3.6-plus:free is a good test candidate.

If you need Claude Opus, it’s easier to use it directly via an Anthropic API key in Cursor Settings. That’s usually stable.

Let me know if changing the URL helped.
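If the error persists after the URL fix, it can also help to verify the key and endpoint outside Cursor. A minimal standard-library sketch (completion_request is a hypothetical helper; it builds, but does not send, the same OpenAI-compatible request Cursor would make, and assumes the key is in an OPENROUTER_API_KEY environment variable):

```python
# Sketch: build an OpenAI-compatible chat request against OpenRouter
# so the key and endpoint can be tested independently of Cursor.
import json
import os
import urllib.request

BASE_URL = "https://openrouter.ai/api/v1"  # note the lowercase v1

def completion_request(model: str, prompt: str) -> urllib.request.Request:
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,  # presence of data makes this a POST
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )

req = completion_request("qwen/qwen3.6-plus:free", "Say hi")
# urllib.request.urlopen(req) would send it: a 404 here points to a bad
# path (e.g. the V1 typo), a 401 to a bad or missing key.
```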

Thanks for your answer.
1. I’m back on the all-lowercase OpenRouter URL again - I get the “Provider Error” with both variants…
2. Several models have been tested, with all standard models deactivated - “qwen/qwen3.6-plus:free” was one of them.

Status: the URL for “Override OpenAI Base URL” has been changed to OpenRouter.
Additional tests with different models did not produce any new behaviour - still the same errors.