Planning next moves stuck

I have experienced this as well - but only on my personal subscription. My company’s team plan has experienced none of these issues. Granted, I don’t use them at the same time of day, but I’d think evenings would be quieter.

I have the same problem.
It started on the 7th of December during active work.
At about 12 PM all chat requests started failing.
Disabling HTTP/2 fixed the failures, BUT everything has remained extremely slow since then.

I mostly use Composer 1, but it’s slow with all models.

Same since yesterday

Now I have the same issues. Responses are very slow, and I’m using HTTP/2.

Version: 2.1.50 (user setup)
VSCode Version: 1.105.1
Commit: 56f0a83df8e9eb48585fcc4858a9440db4cc7770
Date: 2025-12-06T23:39:52.834Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Windows_NT x64 10.0.22000


Same for me. Using HTTP/1.1, I have to wait 1-2 minutes for a single round.

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Whenever the models go into “planning next moves” they get stuck for about 5-10 minutes at a time.

Steps to Reproduce

Use Gemini or Anthropic models.

Expected Behavior

“Planning next moves” should take 1 minute at most.

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.1.50
VSCode Version: 1.105.1
Commit: 56f0a83df8e9eb48585fcc4858a9440db4cc7770
Date: 2025-12-06T23:39:52.834Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Darwin arm64 25.1.0

For AI issues: which model did you use?

All of the Anthropic and Google models. Composer 1 is fine.

Additional Information

It will implement the code quickly, but the “planning next moves” part between code implementations is incredibly slow.

I’ve tried disabling all my MCPs, creating a new project, and disabling all my rules. My network diagnostics look completely fine. There’s something going on between Cursor and the model providers.

Does this stop you from using Cursor

Yes - Cursor is unusable

Thank you all for the reports, team is looking into this. Would be super helpful if you can share request ids (Chat message context menu > Copy Request ID) and any details from Settings > Network > Run Diagnostics.

Yep, same here. It started on Sunday, 100%.

Hey, I’m getting the same issues. In the last 2 days Cursor has been extremely slow.


This is request ID 8fdb3a37-cbc6-430d-a649-332920ee67f1. The model is stuck on Planning next steps.

Ping
[2025-12-09T11:06:16.225Z] Sending ping 1
[2025-12-09T11:06:18.300Z] Response: ‘ping’ in 2075ms
[2025-12-09T11:06:18.300Z] Sending ping 2
[2025-12-09T11:06:20.357Z] Response: ‘ping’ in 2057ms
[2025-12-09T11:06:20.357Z] Sending ping 3
[2025-12-09T11:06:22.892Z] Response: ‘ping’ in 2535ms
[2025-12-09T11:06:22.892Z] Sending ping 4
[2025-12-09T11:06:24.674Z] Response: ‘ping’ in 1782ms
[2025-12-09T11:06:24.674Z] Sending ping 5
[2025-12-09T11:06:26.456Z] Response: ‘ping’ in 1782ms
[2025-12-09T11:06:26.456Z] Result: Error: Response took 1782ms

AGENT
[2025-12-09T11:06:16.229Z] Starting stream
[2025-12-09T11:06:16.229Z] Pushing first message
[2025-12-09T11:06:19.231Z] Response: ‘foo’ in 3002ms
[2025-12-09T11:06:19.733Z] Pushing next message
[2025-12-09T11:06:22.669Z] Response: ‘foo’ in 3437ms
[2025-12-09T11:06:23.169Z] Pushing next message
[2025-12-09T11:06:26.077Z] Response: ‘foo’ in 3408ms
[2025-12-09T11:06:26.579Z] Pushing next message
[2025-12-09T11:06:27.681Z] Response: ‘foo’ in 1604ms
[2025-12-09T11:06:28.182Z] Pushing next message
[2025-12-09T11:06:31.098Z] Response: ‘foo’ in 3417ms
[2025-12-09T11:06:31.098Z] Result: Error: HTTP/1.1 SSE responses are being buffered by a proxy in your network environment
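For anyone curious what that final error means: a proxy between the client and the server can hold Server-Sent Events in a buffer and flush them in bursts instead of forwarding each event as it arrives, which is exactly what makes a streaming chat look "stuck". This is an illustrative sketch (not Cursor's actual diagnostic code) of how such buffering can be inferred from event timing, assuming events are sent at a known fixed interval:

```python
# Illustrative sketch, NOT Cursor's diagnostic: infer SSE proxy buffering
# from event arrival times, given a known fixed send interval.

def looks_buffered(arrival_times, send_interval, tolerance=0.25):
    """Return True if inter-arrival gaps suggest events were held in a
    buffer and flushed together, rather than delivered as sent.

    arrival_times: seconds at which each event was received
    send_interval: seconds between events on the sending side
    """
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    if not gaps:
        return False
    # Events arriving almost simultaneously were flushed from a buffer.
    burst = sum(1 for g in gaps if g < send_interval * tolerance)
    return burst >= len(gaps) / 2

# A healthy stream: one event per second, delivered on time.
print(looks_buffered([0.0, 1.0, 2.0, 3.0, 4.0], 1.0))      # False
# A buffered stream: five events held back, then flushed at once.
print(looks_buffered([5.0, 5.01, 5.02, 5.03, 5.04], 1.0))  # True
```

If you suspect a corporate proxy or VPN, testing from a different network (e.g. a phone hotspot) is a quick way to confirm whether buffering is coming from your environment.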

For all requests the model response is extremely slow. Cursor has become unusable. I might need to discontinue the pro plan.

Stuck on “Planning next moves”

Request ID: e83589ba-b21e-4efa-a85f-8e1903541e7b

Running Diagnostics Script:
Chat

[2025-12-09T12:07:01.216Z] Starting streamSSE
[2025-12-09T12:07:01.712Z] Response: ‘foo’ in 496ms
[2025-12-09T12:07:02.711Z] Response: ‘foo’ in 999ms
[2025-12-09T12:07:03.711Z] Response: ‘foo’ in 1000ms
[2025-12-09T12:07:04.711Z] Response: ‘foo’ in 1000ms
[2025-12-09T12:07:05.711Z] Response: ‘foo’ in 1000ms
[2025-12-09T12:07:06.715Z] Result: true

Agent
[2025-12-09T12:07:01.216Z] Starting stream
[2025-12-09T12:07:01.216Z] Pushing first message
[2025-12-09T12:07:01.829Z] Response: ‘foo’ in 613ms
[2025-12-09T12:07:02.329Z] Pushing next message
[2025-12-09T12:07:02.875Z] Response: ‘foo’ in 1046ms
[2025-12-09T12:07:03.375Z] Pushing next message
[2025-12-09T12:07:03.875Z] Response: ‘foo’ in 1000ms
[2025-12-09T12:07:04.376Z] Pushing next message
[2025-12-09T12:07:04.875Z] Response: ‘foo’ in 1000ms
[2025-12-09T12:07:05.376Z] Pushing next message
[2025-12-09T12:07:05.876Z] Response: ‘foo’ in 1001ms
[2025-12-09T12:07:05.876Z] Result: true

Ping
[2025-12-09T12:07:01.215Z] Sending ping 1
[2025-12-09T12:07:01.711Z] Response: ‘ping’ in 496ms
[2025-12-09T12:07:01.711Z] Sending ping 2
[2025-12-09T12:07:02.150Z] Response: ‘ping’ in 439ms
[2025-12-09T12:07:02.150Z] Sending ping 3
[2025-12-09T12:07:02.591Z] Response: ‘ping’ in 441ms
[2025-12-09T12:07:02.591Z] Sending ping 4
[2025-12-09T12:07:03.056Z] Response: ‘ping’ in 465ms
[2025-12-09T12:07:03.056Z] Sending ping 5
[2025-12-09T12:07:03.500Z] Response: ‘ping’ in 444ms
[2025-12-09T12:07:03.500Z] Result: true

Also this request ID: e83589ba-b21e-4efa-a85f-8e1903541e7b

Some issues detected in the console:


workbench.desktop.main.js:6377 [Extension Host] [otel.error]
OTLPExporterError: Bad Request (code 400)
    at IncomingMessage.<anonymous> (/Applications/Cursor.app/Contents/Resources/app/node_modules/@opentelemetry/otlp-exporter-base/build/src/transport/http-transport-utils.js:52:31)
    at IncomingMessage.emit (node:events:531:35)
    at endReadableNT (node:internal/streams/readable:1698:12)
    at process.processTicksAndRejections (node:internal/process/task_queues:90:21)
data: {"error":"Trace spans collection is not enabled for this user"}
(the same error is repeated several times in the console)

workbench.desktop.main.js:2782 [composer] Large diff detected (66/66 lines). This may be due to diff timeout or whitespace issues.
workbench.desktop.main.js:2782 [composer] Large diff detected (13/13 lines). This may be due to diff timeout or whitespace issues.


For me, the only model that works fast is Composer 1.


For me, the only models that hang at the “planning next moves” phase for 5-10 minutes are the Anthropic and Google models.

Grok, ChatGPT, Composer 1 are all unaffected.

That feels like a big clue.

I’m also based in London, England, if that helps.

Is there any news of any fixes or updates?
Cursor is completely unusable for me.

The team is already working on a fix. Sorry for the inconvenience.


Even Composer is failing. Request ID: 073b9200-4e29-4e89-9d8a-6205f59f4e88

Looks like it’s fixed. HTTP/2 started responding.

Did it get permanently stuck in “planning next moves” for you, or was it just stuck for 5-10 minutes?

“Planning next moves” taking 5-10 minutes at a time is still happening for me.

It worked for one job, and now it’s failing again. :(