Cursor keeps hanging after saying "Now I" during chat when referencing CSV or MD files with tabular data

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Given a prompt like:

now i want to merge in migrated-tests.txt to PROGRESS.csv consult PROGRESS.md to understand the mapping from legacy to migrated, but ONLY add information from migrated-tests.txt. DO NOT copy any new information not already present in PROGRESS.csv or migrated-tests.txt.

The agent will say “Now I” and then pause for multiple minutes. It later continues with “understand the structure” and pauses again for multiple minutes.

Sometimes I get a Network Error saying “We’re having trouble connecting to the model provider. This might be temporary - please try again in a moment.”

Steps to Reproduce

Type a prompt that references a CSV containing about 200 rows of raw data (file paths and the like, not prose).

Operating System

Windows 10/11

Version Information

IDE Version: 2.6.21
VSCode Version: 1.105.1
Commit: fea2f54

For AI issues: which model did you use?

claude-4.5-opus-high

For AI issues: add Request ID with privacy disabled

Request ID: aea4a706-ad5c-45aa-ac4a-92758d9805c0
{"error":"ERROR_NETWORK_ERROR","details":{"title":"Network Error","detail":"We're having trouble connecting to the model provider. This might be temporary - please try again in a moment.","isRetryable":true,"additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}
[resource_exhausted] Error
gie: [resource_exhausted] Error
at p9A (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34267:23755)
at h9A (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34267:22658)
at w9A (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34268:6285)
at rau.run (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34268:10400)
at async GOa.runAgentLoop (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:46834:10206)
at async _5u.streamFromAgentBackend (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:46888:9277)
at async _5u.getAgentStreamResponse (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:46888:13663)
at async vMe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:34329:17597)
at async Ea (vscode-file://vscode-app/c:/Program%20Files/cursor/resources/app/out/vs/workbench/workbench.desktop.main.js:45813:4826)

Additional Information

I don’t know exactly what causes it. Other prompts seem to work OK, but every conversation has at least some latency, with “Now I” (pause…) or “You’re absolutely” (pause…).

Does this stop you from using Cursor?

Sometimes - I can sometimes use Cursor

Hi @Steven_Padfield1,

Thanks for the detailed report and the request ID. I looked into what happened with that specific request and found a clear picture of the issue.

What happened with your request:

Your request to Claude Opus 4.5 timed out at the model provider’s backend. The system automatically retried 4 times across different servers over about 6.5 minutes, but each attempt timed out as well. This is why you experienced “multiple minutes” of pausing before getting the final network error. The partial output you saw (“Now I”, “understand the structure”) was likely from the initial streaming attempts before each connection dropped.
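To make the timing clearer, here is a minimal sketch of the retry pattern described above. This is a hypothetical illustration, not Cursor's actual client code: the function name, attempt count, and per-attempt timeout are assumptions chosen to match the "4 retries over about 6.5 minutes" behavior observed for this request.

```python
import time

def call_with_retries(request_fn, max_attempts=4, timeout_s=90.0):
    """Hypothetical sketch of a retry loop for a model-provider call.

    Each timed-out attempt consumes up to `timeout_s` seconds before the
    next retry starts, so four failed ~90s attempts add up to several
    minutes of apparent "hanging" before the final network error surfaces.
    """
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            # A real client would route each attempt to a different server.
            return request_fn(timeout=timeout_s)
        except TimeoutError as err:
            last_error = err  # retryable: try again on the next iteration
    # All attempts timed out: surface the error to the user.
    raise last_error
```

If an attempt succeeds partway through (after a few streamed tokens), the user sees a fragment like “Now I” followed by silence while the next attempt spins up, which matches the behavior in the report.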

Why larger files make it worse:

You’re right to notice a correlation with CSV and MD files containing tabular data. A 200-row CSV adds significantly to the request size, which means each attempt takes longer to process at the provider and is more susceptible to timeouts. With smaller prompts, the model responds faster and stays under the timeout threshold, so you mostly just see the baseline thinking pauses. With a larger context, the longer processing time increases the chance of hitting a timeout, triggering the retry cycle.

The baseline “Now I” pauses:

You also mentioned “always there is at least some latency” with phrases like “Now I” or “You’re absolutely.” This is separate from the timeout issue. Claude 4.5 Opus is a thinking model, which means it does internal reasoning between visible output tokens. These brief pauses (usually a few seconds) are normal behavior for thinking models. A user in this related thread reported the same pattern, and our team has acknowledged the elevated latency.

What you can do:

  1. Retry the request. The provider timeouts are intermittent, and a retry often succeeds.

  2. Try a non-thinking model. Switching to Sonnet 4.5 or Sonnet 4.6 gives smoother, more fluid streaming without the internal reasoning pauses, and these models are also faster (less likely to hit timeouts).

  3. Reduce context size where possible. If you can reference specific sections of the CSV rather than the full file, the smaller payload processes faster and is less likely to time out.
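For point 3, one way to shrink the payload is to pre-filter the CSV down to just the rows the prompt actually needs before attaching it. A minimal sketch using Python's standard `csv` module; the column layout and filter predicate here are illustrative, not taken from the actual PROGRESS.csv:

```python
import csv
import io

def extract_rows(csv_text, keep_first_col):
    """Return the header plus only the rows whose first column is in
    `keep_first_col`. Illustrative only: adapt the predicate to whatever
    key column your CSV actually uses."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, body = rows[0], rows[1:]
    kept = [row for row in body if row[0] in keep_first_col]
    out = io.StringIO()
    csv.writer(out).writerows([header] + kept)
    return out.getvalue()
```

Referencing a filtered excerpt like this instead of the full 200-row file keeps each request small enough to stay well under the provider's timeout threshold.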

Thanks @mohitjain, that is helpful context.

I’ve gotten somewhat better results with CSV rather than tabular data in MD.

Now I am having a similar issue even without the large tabular data. Even attempting to create a skill using Anthropic’s skill-creator skill, it is constantly hanging. I retried using Sonnet, and the text is faster but still very slow: it laboriously types out a file word by word at only about 20 words per second, so typing a whole file takes multiple minutes, and then there’s another minute waiting for it to summarize what it typed, etc.

This is becoming unusable. Is there something wrong with my setup?

Request ID: 45ec52ff-dbe8-4a22-86e3-218c1d568262

Thanks for the follow-up and the new request ID.

No, there’s nothing wrong with your setup. We’ve been experiencing some service-side performance issues over the past couple of days that are causing the slowness and hanging you’re seeing. Our team is actively working on it.

A couple of things to try in the meantime:

  1. Try Auto mode — it may perform better right now.

  2. Run Cursor’s network diagnostics — Ctrl+Shift+P → “Cursor: Diagnostics”. If it flags any proxy buffering, let me know and I can help troubleshoot that separately.

This should improve soon. If things are still slow by early next week, reply here and I’ll check in.