Cursor + AWS Bedrock: can we use both without constantly switching settings?

Hey Cursor team :waving_hand:

Is there a way to use AWS Bedrock and still keep using the Cursor catalog models at the same time?

Right now, whenever I want to switch back and forth, I have to dig into:
Settings → Models → API Keys → AWS Bedrock and toggle it on/off manually.
That gets tedious fast and slows down the workflow a lot.

Ideally I’d like both enabled so I can just pick the model directly without toggling Bedrock each time.

Is this possible today? If not, is there a plan for it?

Thanks!


Hey, thanks for the question.

Unfortunately, this isn’t possible right now. When AWS Bedrock is enabled, all requests, including requests to Cursor catalog models, get routed through Bedrock. This is a known issue and the team is already tracking it.

As a workaround, turn off Bedrock before using other models:
Settings → Models → API Keys → AWS Bedrock → toggle off

I’ve added your report to help increase visibility.

Also, if you’re interested, here’s a similar report with more details: Bedrock enabled reroutes Gemini API requests to Bedrock (“Selected model is not supported by bedrock”) — cannot use Gemini + Bedrock simultaneously (possibly when quota exhausted)


Thanks Dean, appreciate the update :raising_hands:


@deanrie Are there any plans to support this soon?

Is there a fix for this in the backlog?

This issue was fixed in version 2.5. Bedrock shouldn’t intercept requests to other models anymore. Update Cursor to the latest version and check again.

If the issue still happens after updating, let me know and we’ll look into it.

Hi @deanrie, I’m trying to use Cursor’s subagent feature with our Bedrock models, but it doesn’t seem to work. Can anyone else try to confirm, or let me know if there’s any config I’m missing?

Could you please check the subagent feature with the Bedrock integration?

Hey, this is a known issue. Subagents aren’t inheriting Bedrock credentials correctly right now. The team is aware.

A couple things we need for debugging:

  1. What Cursor version are you on? (Help > About)
  2. What exactly happens? An error, a fallback to another model, or the subagent just doesn’t start?
  3. Which model are you using via Bedrock?

There’s also a similar bug for custom OpenAI URLs: Sub-agents are not using custom OpenAI base URLs

Let me know the details and I’ll pass them to the team.

> 1. Cursor version: 2.5.25 (Universal), Stable, macOS arm64

> 2. What happens: Subagent immediately fails with error: AI Model Not Found — Model name is not valid: “us.anthropic.claude-opus-4-6-v1”. The Task tool never starts; it returns an error inline.

> 3. Model via Bedrock: us.anthropic.claude-opus-4-6-v1 (Claude Opus 4.6, US cross-region Bedrock endpoint)
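To rule out an account-side problem, I also checked that the same model responds when called directly, outside Cursor. A minimal sketch with boto3 (assumptions: boto3 is installed, AWS credentials are configured locally, and the model is enabled in your region; `build_converse_request` and `ping_bedrock` are my own illustrative helpers, not Cursor code):

```python
MODEL_ID = "us.anthropic.claude-opus-4-6-v1"  # same ID the subagent fails on

def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build kwargs for the bedrock-runtime Converse API (pure, no network)."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 64},
    }

def ping_bedrock(region: str = "us-east-1") -> str:
    """Send a one-word prompt straight to Bedrock to verify access."""
    import boto3  # assumption: boto3 installed, AWS credentials configured

    client = boto3.client("bedrock-runtime", region_name=region)
    resp = client.converse(**build_converse_request(MODEL_ID, "ping"))
    return resp["output"]["message"]["content"][0]["text"]

# ping_bedrock()  # uncomment to run the live check against your account
```

The direct call works, so the credentials and the model ID are fine on the AWS side.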

The core issue is that Cursor’s subagent spawning doesn’t properly resolve Bedrock model IDs. The parent agent works because the initial routing is set up, but child subagents fail because they can’t find the model by its Bedrock ARN/name. This is likely a Cursor-side bug where subagent invocations need to inherit the parent’s model provider configuration rather than doing a fresh model lookup by name.


It’s also not working for me: when the Bedrock token is set, I can’t use other models (Codex, Gemini, …) in either agent or quick edit.

The version I’m using:

Version: 2.5.26
VSCode Version: 1.105.1
Commit: 7d96c2a03bb088ad367615e9da1a3fe20fbbc6a0
Date: 2026-02-26T04:57:56.825Z
Build Type: Stable
Release Track: Nightly
Electron: 39.4.0
Chromium: 142.0.7444.265
Node.js: 22.22.0
V8: 14.2.231.22-electron.0
OS: Darwin arm64 25.3.0

The error shown in the agent panel:

Request ID: 29f589ab-b6de-449a-b924-b4993b82ebeb
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Selected model is not supported by bedrock, please use a different model\n\nAPI Error:\n\n```\nSelected model is not supported by bedrock, please use a different model\n```","additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}
[invalid_argument] Error
_he: [invalid_argument] Error
    at jpA (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32115:44883)
    at WpA (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32115:43786)
    at ZpA (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32116:5088)
    at Ool.run (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32116:9098)
    at async s$o.runAgentLoop (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:44360:8423)
    at async zOl.streamFromAgentBackend (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:44408:8884)
    at async zOl.getAgentStreamResponse (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:44408:9837)
    at async yLe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32182:15752)
    at async Object.Gs [as onSubmit] (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:43415:4781)
    at async vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:43389:57281

The log I get from the terminal:

workbench.desktop.main.js:44241 [transport] Stream error reported from extension host ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', error: {…}, …}
error @ workbench.desktop.main.js:44241
$endAiConnectTransportReportError @ workbench.desktop.main.js:43352
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [transport] [AGENT_ERROR_DIAGNOSTICS] requestId=29f589ab-b6de-449a-b924-b4993b82ebeb decision=RETRY (countAsServerError=true, countAsTransportError=false) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', decision: 'RETRY (countAsServerError=true, countAsTransportError=false)', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
log @ workbench.desktop.main.js:32116
e @ workbench.desktop.main.js:464
warn @ workbench.desktop.main.js:464
run @ workbench.desktop.main.js:32116
await in run
run @ workbench.desktop.main.js:44360
runAgentLoop @ workbench.desktop.main.js:44360
streamFromAgentBackend @ workbench.desktop.main.js:44408
await in streamFromAgentBackend
getAgentStreamResponse @ workbench.desktop.main.js:44408
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 2s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 4s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [transport] Stream error reported from extension host ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', error: {…}, …}
error @ workbench.desktop.main.js:44241
$endAiConnectTransportReportError @ workbench.desktop.main.js:43352
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [transport] [AGENT_ERROR_DIAGNOSTICS] requestId=29f589ab-b6de-449a-b924-b4993b82ebeb decision=RETRY (countAsServerError=true, countAsTransportError=false) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', decision: 'RETRY (countAsServerError=true, countAsTransportError=false)', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
log @ workbench.desktop.main.js:32116
e @ workbench.desktop.main.js:464
warn @ workbench.desktop.main.js:464
run @ workbench.desktop.main.js:32116
await in run
run @ workbench.desktop.main.js:44360
runAgentLoop @ workbench.desktop.main.js:44360
streamFromAgentBackend @ workbench.desktop.main.js:44408
await in streamFromAgentBackend
getAgentStreamResponse @ workbench.desktop.main.js:44408
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 6s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 8s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [transport] Stream error reported from extension host ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', error: {…}, …}
error @ workbench.desktop.main.js:44241
$endAiConnectTransportReportError @ workbench.desktop.main.js:43352
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [transport] [AGENT_ERROR_DIAGNOSTICS] requestId=29f589ab-b6de-449a-b924-b4993b82ebeb decision=RETRY (countAsServerError=true, countAsTransportError=false) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', decision: 'RETRY (countAsServerError=true, countAsTransportError=false)', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
log @ workbench.desktop.main.js:32116
e @ workbench.desktop.main.js:464
warn @ workbench.desktop.main.js:464
run @ workbench.desktop.main.js:32116
await in run
run @ workbench.desktop.main.js:44360
runAgentLoop @ workbench.desktop.main.js:44360
streamFromAgentBackend @ workbench.desktop.main.js:44408
await in streamFromAgentBackend
getAgentStreamResponse @ workbench.desktop.main.js:44408
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:64   ERR [Extension Host] [otel.error] {"stack":"OTLPExporterError: Bad Request\n\tat IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)\n\tat IncomingMessage.emit (node:events:531:35)\n\tat endReadableNT (node:internal/streams/readable:1698:12)\n\tat process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:46198
xlv @ workbench.desktop.main.js:43319
$logExtensionHostMessage @ workbench.desktop.main.js:43319
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:43319 [Extension Host] [otel.error] {"stack":"OTLPExporterError: Bad Request\n\tat IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)\n\tat IncomingMessage.emit (node:events:531:35)\n\tat endReadableNT (node:internal/streams/readable:1698:12)\n\tat process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
klv @ workbench.desktop.main.js:43319
$logExtensionHostMessage @ workbench.desktop.main.js:43319
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [composer] No first token received within 10s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 12s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] No first token received within 14s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:64   ERR [Extension Host] [otel.error] {"stack":"OTLPExporterError: Bad Request\n\tat IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)\n\tat IncomingMessage.emit (node:events:531:35)\n\tat endReadableNT (node:internal/streams/readable:1698:12)\n\tat process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:46198
xlv @ workbench.desktop.main.js:43319
$logExtensionHostMessage @ workbench.desktop.main.js:43319
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:43319 [Extension Host] [otel.error] {"stack":"OTLPExporterError: Bad Request\n\tat IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)\n\tat IncomingMessage.emit (node:events:531:35)\n\tat endReadableNT (node:internal/streams/readable:1698:12)\n\tat process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
klv @ workbench.desktop.main.js:43319
$logExtensionHostMessage @ workbench.desktop.main.js:43319
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [composer] No first token received within 16s {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', requestId: '29f589ab-b6de-449a-b924-b4993b82ebeb', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
(anonymous) @ workbench.desktop.main.js:32181
setTimeout
e.setTimeout @ workbench.desktop.main.js:43632
(anonymous) @ workbench.desktop.main.js:32181
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32181
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [transport] Stream error reported from extension host ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', error: {…}, …}
error @ workbench.desktop.main.js:44241
$endAiConnectTransportReportError @ workbench.desktop.main.js:43352
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217
workbench.desktop.main.js:44241 [transport] [AGENT_ERROR_DIAGNOSTICS] requestId=29f589ab-b6de-449a-b924-b4993b82ebeb decision=THROW _he {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', decision: 'THROW _he', …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
log @ workbench.desktop.main.js:32116
e @ workbench.desktop.main.js:464
warn @ workbench.desktop.main.js:464
run @ workbench.desktop.main.js:32116
await in run
run @ workbench.desktop.main.js:44360
runAgentLoop @ workbench.desktop.main.js:44360
streamFromAgentBackend @ workbench.desktop.main.js:44408
await in streamFromAgentBackend
getAgentStreamResponse @ workbench.desktop.main.js:44408
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [transport] [nal_agent_retries] Error not retryable {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', attempt: 3, …}
_log @ workbench.desktop.main.js:44241
warn @ workbench.desktop.main.js:44241
log @ workbench.desktop.main.js:32116
e @ workbench.desktop.main.js:464
warn @ workbench.desktop.main.js:464
run @ workbench.desktop.main.js:32116
await in run
run @ workbench.desktop.main.js:44360
runAgentLoop @ workbench.desktop.main.js:44360
streamFromAgentBackend @ workbench.desktop.main.js:44408
await in streamFromAgentBackend
getAgentStreamResponse @ workbench.desktop.main.js:44408
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:32182 [composer] Error in AI response: undefined _he: [invalid_argument] Error
    at jpA (workbench.desktop.main.js:32115:44883)
    at WpA (workbench.desktop.main.js:32115:43786)
    at ZpA (workbench.desktop.main.js:32116:5088)
    at Ool.run (workbench.desktop.main.js:32116:9098)
    at async s$o.runAgentLoop (workbench.desktop.main.js:44360:8423)
    at async zOl.streamFromAgentBackend (workbench.desktop.main.js:44408:8884)
    at async zOl.getAgentStreamResponse (workbench.desktop.main.js:44408:9837)
    at async yLe.submitChatMaybeAbortCurrent (workbench.desktop.main.js:32182:15752)
    at async Object.Gs [as onSubmit] (workbench.desktop.main.js:43415:4781)
    at async workbench.desktop.main.js:43389:57281Caused by: ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406)
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:44241 [composer] Error in AI response _he: [invalid_argument] Error
    at jpA (workbench.desktop.main.js:32115:44883)
    at WpA (workbench.desktop.main.js:32115:43786)
    at ZpA (workbench.desktop.main.js:32116:5088)
    at Ool.run (workbench.desktop.main.js:32116:9098)
    at async s$o.runAgentLoop (workbench.desktop.main.js:44360:8423)
    at async zOl.streamFromAgentBackend (workbench.desktop.main.js:44408:8884)
    at async zOl.getAgentStreamResponse (workbench.desktop.main.js:44408:9837)
    at async yLe.submitChatMaybeAbortCurrent (workbench.desktop.main.js:32182:15752)
    at async Object.Gs [as onSubmit] (workbench.desktop.main.js:43415:4781)
    at async workbench.desktop.main.js:43389:57281Caused by: ConnectError: [invalid_argument] Error
    at o6g.$endAiConnectTransportReportError (workbench.desktop.main.js:43352:34726)
    at Nxt._doInvokeHandler (workbench.desktop.main.js:44181:23171)
    at Nxt._invokeHandler (workbench.desktop.main.js:44181:22913)
    at Nxt._receiveRequest (workbench.desktop.main.js:44181:21545)
    at Nxt._receiveOneMessage (workbench.desktop.main.js:44181:20362)
    at UCn.value (workbench.desktop.main.js:44181:18389)
    at He._deliver (workbench.desktop.main.js:55:2962)
    at He.fire (workbench.desktop.main.js:55:3283)
    at osn.fire (workbench.desktop.main.js:43336:12156)
    at MessagePort.<anonymous> (workbench.desktop.main.js:47217:18406) {arch: 'arm64', platform: 'darwin', channel: 'stable', client_version: '2.5.26', error: {…}, …}
error @ workbench.desktop.main.js:44241
submitChatMaybeAbortCurrent @ workbench.desktop.main.js:32182
await in submitChatMaybeAbortCurrent
Gs @ workbench.desktop.main.js:43415
await in Gs
(anonymous) @ workbench.desktop.main.js:43389
At @ workbench.desktop.main.js:43389
onClick @ workbench.desktop.main.js:43389
(anonymous) @ workbench.desktop.main.js:40611
workbench.desktop.main.js:64   ERR [Extension Host] [otel.error] {"stack":"OTLPExporterError: Bad Request\n\tat IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)\n\tat IncomingMessage.emit (node:events:531:35)\n\tat endReadableNT (node:internal/streams/readable:1698:12)\n\tat process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:64
error @ workbench.desktop.main.js:46198
xlv @ workbench.desktop.main.js:43319
$logExtensionHostMessage @ workbench.desktop.main.js:43319
_doInvokeHandler @ workbench.desktop.main.js:44181
_invokeHandler @ workbench.desktop.main.js:44181
_receiveRequest @ workbench.desktop.main.js:44181
_receiveOneMessage @ workbench.desktop.main.js:44181
(anonymous) @ workbench.desktop.main.js:44181
_deliver @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:55
fire @ workbench.desktop.main.js:43336
(anonymous) @ workbench.desktop.main.js:47217


When I try to use Codex while the Bedrock integration is enabled, it doesn’t work. AWS Bedrock is the only integration that’s turned on. I thought I might still have an old OpenAI API key configured, but that’s not the case.

Version: 2.5.26 (user setup)
VSCode Version: 1.105.1
Commit: 7d96c2a03bb088ad367615e9da1a3fe20fbbc6a0
Date: 2026-02-26T04:57:56.825Z
Build Type: Stable
Release Track: Default
Electron: 39.4.0
Chromium: 142.0.7444.265
Node.js: 22.22.0
V8: 14.2.231.22-electron.0
OS: Windows_NT x64 10.0.26200
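
In case it helps anyone debugging on their end: a minimal boto3 sketch along these lines can confirm that the Bedrock credentials and model access themselves work outside Cursor, which narrows the problem down to Cursor's routing. The model ID below is a placeholder; substitute one that's actually enabled in your AWS account.

```python
def build_converse_request(model_id: str, prompt: str) -> dict:
    """Build the kwargs for the bedrock-runtime Converse API."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }


def check_bedrock_access(model_id: str) -> str:
    """Send a one-line prompt through Bedrock and return the reply text.

    Requires AWS credentials and a region configured (the same ones
    you pasted into Cursor's Bedrock settings).
    """
    import boto3  # imported here so the request builder stays usable offline

    client = boto3.client("bedrock-runtime")
    resp = client.converse(**build_converse_request(model_id, "ping"))
    return resp["output"]["message"]["content"][0]["text"]
```

Call `check_bedrock_access("anthropic.claude-3-5-sonnet-20240620-v1:0")` (or whatever model ID you use) from a shell with your AWS profile active. If that works but the same model fails inside Cursor, the issue is on Cursor's side rather than with your Bedrock setup.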

Hey, I can see the screenshot with the error. It looks like the fix in 2.5 didn’t cover all cases: Bedrock is still intercepting requests to catalog models like GPT-5.3 Codex.

I’ve passed this to the team. I’ll update the thread when there’s news.

2 Likes

Any updates on this?

1 Like

@deanrie Any chance there’s an ETA for a fix for this bug? It’s really hurting my team’s productivity. We use Claude through AWS Bedrock and OpenAI models through Cursor, and we basically can’t use both at the same time.

Hey, thanks for your patience and for the ping. I get that the wait has been long.

Unfortunately, there are still no updates on the fix. The bug is being tracked, but there’s no ETA yet. The workaround is still the same: turn off Bedrock in Settings > Models > API Keys > AWS Bedrock before using catalog models like Codex, Gemini, etc.

I’ll update this thread when there’s news.

@deanrie Hi, is there any update on this?

Many versions have been released since this bug was reported, but the issue persists. Our team’s annual contract renewal is approaching, and I’m getting a lot of complaints from my team.

This feels like a very basic capability, and competitors already support it. It’s quite frustrating that this has been broken in Cursor since day one, with still no way to use Cursor’s model catalog and third-party providers like AWS Bedrock in parallel.

In practice, my team is working only with Bedrock and not using any of the budget allocated for the other models.

I’d really appreciate it if you could prioritize this issue and share any updates.

Thanks

Hey, thanks for pinging, and sorry for the long silence. I get that this is really hurting productivity, especially with a contract coming up.

Status update: we’re tracking the bug, but I can’t share a firm ETA for the fix yet. As soon as I have an update, I’ll reply in the thread right away.

The workaround is still the same: turn off Bedrock in Settings > Models > API Keys > AWS Bedrock when switching to catalog models. I know that’s not a real solution for a team your size.

Since you have an enterprise contract and this is a blocker for renewal, I’d also recommend reaching out to your account manager or enterprise support in parallel so the feedback comes through that channel too. That really helps us prioritize on our side.

1 Like