All GPT models have been unreachable for the past few days

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

When I try to work with GPT models now, they never get past the first few tokens of thinking before reporting that the model is unreachable. I have already tried disabling HTTP/2 (per a thread I read here about a similar issue). This is not intermittent per se: I simply cannot use these models and have not been able to since this past weekend. Before that, everything worked fine, and I can still use all other models without any issue. The only intermittent part is that sometimes I get a few tokens (plan mode just asked two questions, then nothing happened after I responded); other times it just spins on “planning next moves” and then drops out. I have tried Codex and regular GPT at various thinking levels with the same result for all. This happens in new sessions, existing sessions, with or without attachments, etc.

I haven’t made any changes to my network config at home or on my PC since they last worked.
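For reference, below is a minimal connectivity-check sketch that can be run outside the IDE to confirm DNS, TLS, and plain HTTPS work from this machine. The api2.cursor.sh host is an assumption taken from Cursor's network/allowlist documentation rather than a confirmed transport endpoint, and api.openai.com is included only as a generic internet-reachability data point (the IDE talks to Cursor's backend, not OpenAI directly).

```ts
// connectivity-check.ts — run with: npx tsx connectivity-check.ts
// Host names are assumptions (see note above); any HTTP status, even 401/404,
// still proves DNS + TLS + HTTP succeed from this machine.
const hosts = ["https://api2.cursor.sh", "https://api.openai.com"];

async function probe(url: string): Promise<void> {
  const started = Date.now();
  try {
    const res = await fetch(url, { method: "GET", signal: AbortSignal.timeout(10_000) });
    console.log(`${url} -> HTTP ${res.status} in ${Date.now() - started} ms`);
  } catch (err) {
    console.log(`${url} -> FAILED after ${Date.now() - started} ms: ${(err as Error).message}`);
  }
}

async function main(): Promise<void> {
  for (const host of hosts) {
    await probe(host);
  }
}

main().catch(console.error);
```

If both hosts respond here but the IDE still reports the provider as unreachable, that points at something specific to the editor's transport (HTTP/2, proxy handling, or the build itself) rather than basic network reachability.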

Request ID: 9f048658-7b9c-4bd9-b718-5fef9314b16d
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We're having trouble connecting to the model provider. This might be temporary - please try again in a moment.","additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":false}
ConnectError: [unavailable] Error
at tBc.$endAiConnectTransportReportError (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:6625:446993)
at fPo._doInvokeHandler (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:7298:22831)
at fPo._invokeHandler (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:7298:22573)
at fPo._receiveRequest (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:7298:21335)
at fPo._receiveOneMessage (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:7298:20152)
at nRt.value (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:7298:18244)
at Ee._deliver (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:49:2962)
at Ee.fire (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:49:3283)
at Hmt.fire (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:6610:12156)
at MessagePort. (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:9346:18433)

Steps to Reproduce

Open the latest Cursor build (I'm on nightly builds) on Windows 11.

Start a new session.

Choose a GPT model and send a prompt.

Expected Behavior

Model actually responds.

Operating System

Windows 10/11

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.2.0-pre.22.patch.0 (system setup)
VSCode Version: 1.105.1
Commit: 1459faeb76c672cfa094b4b4d028112ec5f8bed0
Date: 2025-12-02T08:24:16.317Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Windows_NT x64 10.0.22631

For AI issues: which model did you use?

GPT-5.1-Codex (medium, high)
GPT-5.1 (medium, high)

Does this stop you from using Cursor

No - Cursor works, but with this issue

Hey, thanks for the report. I see the ERROR_OPENAI “Unable to reach the model provider” error and ConnectError [unavailable] on GPT-5.1 / gpt-5.1-codex. It looks like a connectivity problem reaching the GPT provider from the client side.

Could you please share:

  • A screenshot of Settings > Network > Run Diagnostics (full result)
  • Errors from Help > Toggle Developer Tools > Console at the same moment (Ctrl+Shift+P → “Developer: Toggle Developer Tools”)
  • Does it reproduce on the stable build (not nightly) on the same machine/network?
  • Does it reproduce on another network (mobile hotspot) + your region/ISP?
  • Are you using the built-in models or your own OpenAI API key? If your own, please specify the base URL/provider and whether a proxy/VPN is enabled (a rough standalone check for proxy settings and HTTP/2 is sketched at the end of this reply)

Share the info, and if needed, I’ll pass it to the team.
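
If it helps, here is a rough sketch (the api2.cursor.sh host is the same assumption as above; swap in whatever your diagnostics report) that checks for inherited proxy settings and whether HTTP/2 is actually negotiated on your network from outside Cursor. A proxy or middlebox that only speaks HTTP/1.1 is one common reason the “disable HTTP/2” workaround gets suggested.

```ts
// alpn-and-proxy-check.ts — run with: npx tsx alpn-and-proxy-check.ts
// The host name is an assumption, not a confirmed Cursor transport endpoint.
import * as tls from "node:tls";

// 1) Proxy hints: Electron apps can inherit these variables, and a proxy that
//    cannot pass HTTP/2 traffic will break an h2-based transport.
for (const name of ["HTTP_PROXY", "HTTPS_PROXY", "NO_PROXY", "http_proxy", "https_proxy"]) {
  if (process.env[name]) {
    console.log(`${name}=${process.env[name]}`);
  }
}

// 2) ALPN check: connect directly (no proxy) and report the negotiated protocol.
const host = "api2.cursor.sh";
const socket = tls.connect(
  { host, port: 443, servername: host, ALPNProtocols: ["h2", "http/1.1"] },
  () => {
    console.log("negotiated ALPN protocol:", socket.alpnProtocol); // "h2" or "http/1.1"
    socket.end();
  },
);
socket.setTimeout(10_000, () => {
  console.error("TLS connect timed out");
  socket.destroy();
});
socket.on("error", (err) => console.error("TLS connect failed:", err.message));
```

If this negotiates "h2" directly but the IDE only works with HTTP/2 disabled, that narrows it down to whatever sits between Cursor and the network (proxy settings, VPN, or the transport in that particular build).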

  1. workbench.desktop.main.js:7345 [transport] Stream error reported from extension host ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433) {arch: 'x64', platform: 'win32', channel: 'stable', client_version: '2.2.0-pre.22.patch.0', streamId: '6540c10f-efc6-4bc0-89b8-ef8cdfbaf10f', …}
    error @ workbench.desktop.main.js:7345
    $endAiConnectTransportReportError @ workbench.desktop.main.js:6625
    _doInvokeHandler @ workbench.desktop.main.js:7298
    _invokeHandler @ workbench.desktop.main.js:7298
    _receiveRequest @ workbench.desktop.main.js:7298
    _receiveOneMessage @ workbench.desktop.main.js:7298
    (anonymous) @ workbench.desktop.main.js:7298
    _deliver @ workbench.desktop.main.js:49
    fire @ workbench.desktop.main.js:49
    fire @ workbench.desktop.main.js:6610
    (anonymous) @ workbench.desktop.main.js:9346
    workbench.desktop.main.js:7546 [AgentService] Error running agent: ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433)
    (anonymous) @ workbench.desktop.main.js:7546
    Promise.then
    runAgentLoop @ workbench.desktop.main.js:7546
    await in runAgentLoop
    streamFromAgentBackend @ workbench.desktop.main.js:7563
    await in streamFromAgentBackend
    handleStreamComposer @ workbench.desktop.main.js:3022
    streamResponse @ workbench.desktop.main.js:7346
    getAgentStreamResponse @ workbench.desktop.main.js:7563
    (anonymous) @ workbench.desktop.main.js:7494
    (anonymous) @ workbench.desktop.main.js:7496
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:3136
    (anonymous) @ workbench.desktop.main.js:7494
    processCodeBlocks @ workbench.desktop.main.js:3066
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4226
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:7346 [AiService] streamResponse ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433)
    streamResponse @ workbench.desktop.main.js:7346
    await in streamResponse
    getAgentStreamResponse @ workbench.desktop.main.js:7563
    (anonymous) @ workbench.desktop.main.js:7494
    (anonymous) @ workbench.desktop.main.js:7496
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:3136
    (anonymous) @ workbench.desktop.main.js:7494
    processCodeBlocks @ workbench.desktop.main.js:3066
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4226
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:7345 [transport] Stream response error ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433) {arch: 'x64', platform: 'win32', channel: 'stable', client_version: '2.2.0-pre.22.patch.0', generationUUID: '0d8cc78d-1e8d-466b-89ab-6fb799c0e092', …}
    error @ workbench.desktop.main.js:7345
    streamResponse @ workbench.desktop.main.js:7346
    await in streamResponse
    getAgentStreamResponse @ workbench.desktop.main.js:7563
    (anonymous) @ workbench.desktop.main.js:7494
    (anonymous) @ workbench.desktop.main.js:7496
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:3136
    (anonymous) @ workbench.desktop.main.js:7494
    processCodeBlocks @ workbench.desktop.main.js:3066
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4226
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:7345 [transport] Automatic bug report submitted for unexpected connection error {arch: 'x64', platform: 'win32', channel: 'stable', client_version: '2.2.0-pre.22.patch.0', requestId: '0d8cc78d-1e8d-466b-89ab-6fb799c0e092', …}
    _log @ workbench.desktop.main.js:7345
    warn @ workbench.desktop.main.js:7345
    showImmediateErrorMessage @ workbench.desktop.main.js:2863
    handleError @ workbench.desktop.main.js:2863
    streamResponse @ workbench.desktop.main.js:7346
    await in streamResponse
    getAgentStreamResponse @ workbench.desktop.main.js:7563
    (anonymous) @ workbench.desktop.main.js:7494
    (anonymous) @ workbench.desktop.main.js:7496
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:7490
    (anonymous) @ workbench.desktop.main.js:3136
    (anonymous) @ workbench.desktop.main.js:7494
    processCodeBlocks @ workbench.desktop.main.js:3066
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4226
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:4227 [composer] Error in AI response: {"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We're having trouble connecting to the model provider. This might be temporary - please try again in a moment.","additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":false} ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433)
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4227
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:7345 [composer] Error in AI response ConnectError: [unavailable] Error
    at tBc.$endAiConnectTransportReportError (workbench.desktop.main.js:6625:446993)
    at fPo._doInvokeHandler (workbench.desktop.main.js:7298:22831)
    at fPo._invokeHandler (workbench.desktop.main.js:7298:22573)
    at fPo._receiveRequest (workbench.desktop.main.js:7298:21335)
    at fPo._receiveOneMessage (workbench.desktop.main.js:7298:20152)
    at nRt.value (workbench.desktop.main.js:7298:18244)
    at Ee._deliver (workbench.desktop.main.js:49:2962)
    at Ee.fire (workbench.desktop.main.js:49:3283)
    at Hmt.fire (workbench.desktop.main.js:6610:12156)
    at MessagePort. (workbench.desktop.main.js:9346:18433) {arch: 'x64', platform: 'win32', channel: 'stable', client_version: '2.2.0-pre.22.patch.0', requestId: '0d8cc78d-1e8d-466b-89ab-6fb799c0e092', …}
    error @ workbench.desktop.main.js:7345
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4227
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:593 [composer] Large diff detected for c:\Users\Tyler\git\cursor\gh-sos-app\src\components\services\shared\ShareServiceDialog.tsx (578/580 lines). This may be due to diff timeout or whitespace issues.
    getCodeBlockDiffStats @ workbench.desktop.main.js:593
    await in getCodeBlockDiffStats
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    gq @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    (anonymous) @ workbench.desktop.main.js:6343
    zbs @ workbench.desktop.main.js:6343
    r @ workbench.desktop.main.js:7506
    await in r
    (anonymous) @ workbench.desktop.main.js:7507
    (anonymous) @ workbench.desktop.main.js:50
    Promise.then
    trigger @ workbench.desktop.main.js:50
    o @ workbench.desktop.main.js:7507
    (anonymous) @ workbench.desktop.main.js:7507
    _deliver @ workbench.desktop.main.js:49
    _deliverQueue @ workbench.desktop.main.js:49
    fire @ workbench.desktop.main.js:49
    fireDidFinishStreamChat @ workbench.desktop.main.js:564
    submitChatMaybeAbortCurrent @ workbench.desktop.main.js:4227
    await in submitChatMaybeAbortCurrent
    (anonymous) @ workbench.desktop.main.js:52
    Lrr @ workbench.desktop.main.js:52
    BCd @ workbench.desktop.main.js:52
    n.value @ workbench.desktop.main.js:52
    Ue @ workbench.desktop.main.js:4216
    (anonymous) @ workbench.desktop.main.js:4261
    workbench.desktop.main.js:7345 [transport] Connect error in unary AI connect ConnectError: [not_found] Error
    at t (workbench.desktop.main.js:6625:448537)
    at async Object.getBackgroundComposerChangesHash (workbench.desktop.main.js:565:72397)
    at async workbench.desktop.main.js:9364:27933
    at async Sxo._fetchOptimizedDiffDetailsAndUpdateImpl (workbench.desktop.main.js:9364:28988) {arch: 'x64', platform: 'win32', channel: 'stable', client_version: '2.2.0-pre.22.patch.0', service: 'aiserver.v1.BackgroundComposerService', …}
    error @ workbench.desktop.main.js:7345
    t @ workbench.desktop.main.js:6625
    await in t
    (anonymous) @ workbench.desktop.main.js:6625
    Lrr @ workbench.desktop.main.js:52
    gq @ workbench.desktop.main.js:52
    unary @ workbench.desktop.main.js:6625
    unary @ workbench.desktop.main.js:565
    await in unary
    (anonymous) @ workbench.desktop.main.js:565
    (anonymous) @ workbench.desktop.main.js:9364
    await in (anonymous)
    _fetchOptimizedDiffDetailsAndUpdateImpl @ workbench.desktop.main.js:9364
    fetchOptimizedDiffDetailsAndUpdate @ workbench.desktop.main.js:9364
    e.length._attemptUpdateOptimizedDiffDetailsPromise.q8t.max @ workbench.desktop.main.js:9364
    (anonymous) @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:2835
    _trySubscribe @ workbench.desktop.main.js:2835
    subscribe @ workbench.desktop.main.js:2835
    w @ workbench.desktop.main.js:2835
    v @ workbench.desktop.main.js:2835
    pXd @ workbench.desktop.main.js:2833
    next @ workbench.desktop.main.js:2835
    next @ workbench.desktop.main.js:2835
    pXd @ workbench.desktop.main.js:2833
    next @ workbench.desktop.main.js:2835
    TXd @ workbench.desktop.main.js:2833
    (anonymous) @ workbench.desktop.main.js:2833
    _trySubscribe @ workbench.desktop.main.js:2835
    subscribe @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:2835
    _trySubscribe @ workbench.desktop.main.js:2835
    subscribe @ workbench.desktop.main.js:2835
    nQd @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:2835
    _trySubscribe @ workbench.desktop.main.js:2835
    subscribe @ workbench.desktop.main.js:2835
    hQd @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:2835
    _trySubscribe @ workbench.desktop.main.js:2835
    subscribe @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:2835
    H8t @ workbench.desktop.main.js:2835
    q8t @ workbench.desktop.main.js:2835
    (anonymous) @ workbench.desktop.main.js:9364
    attemptUpdateOptimizedDiffDetailsTracker @ workbench.desktop.main.js:9364
    n @ workbench.desktop.main.js:9364
    (anonymous) @ workbench.desktop.main.js:9364
    workbench.desktop.main.js:9364 Optimized diff batch update failed: ConnectError: [not_found] Error
    at t (workbench.desktop.main.js:6625:448537)
    at async Object.getBackgroundComposerChangesHash (workbench.desktop.main.js:565:72397)
    at async workbench.desktop.main.js:9364:27933
    at async Sxo._fetchOptimizedDiffDetailsAndUpdateImpl (workbench.desktop.main.js:9364:28988)
    (anonymous) @ workbench.desktop.main.js:9364
    Promise.catch
    attemptUpdateOptimizedDiffDetailsTracker @ workbench.desktop.main.js:9364
    n @ workbench.desktop.main.js:9364
    (anonymous) @ workbench.desktop.main.js:9364
  2. I reverted to the default stable build, and the models seem to be responding now, so it must be an issue introduced in one of the nightly releases from the past few days.
  3. The same issue occurs on the nightly build even over a VPN and a mobile hotspot.
  4. These are the built-in models.
