Write tool with JSON is very slow

I have a use case where JSON comes back from an MCP server. If the return exceeds a certain size, Cursor writes it to its storage/filesystem; this is very quick. If the JSON is small, it seems to go directly into the context/chat inline. I then need to call a local Python diff tool that takes the two separate JSON returns from MCP as parameters.

Passing the JSON in as a string causes problems (it is too long, or has escaping issues), and the AI's recommendation is to write both JSON returns to files and just pass in the two file paths. But since Cursor decides on the fly whether to write to a file or keep the result inline in chat, I can't control that directly, so I ask Cursor to use the Write tool to write to a file. For some reason this is extremely slow: around 3 to 5 minutes to write. Has anyone else run into this? Any workaround? It was only an 11 KB file.
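For context, here is roughly what the path-based approach looks like on the diff side. This is a minimal sketch only; it assumes the diff tool can accept two file paths, and the `diff_json` comparison is a hypothetical placeholder for whatever the actual tool does:

```python
import json
import sys

def diff_json(a: dict, b: dict) -> dict:
    """Naive top-level diff: report keys whose values differ between the two
    documents. A stand-in for the real comparison logic."""
    keys = set(a) | set(b)
    return {k: {"a": a.get(k), "b": b.get(k)} for k in keys if a.get(k) != b.get(k)}

if __name__ == "__main__":
    # Called with two file paths instead of two inline JSON strings,
    # which avoids length limits and shell-escaping issues entirely.
    path_a, path_b = sys.argv[1], sys.argv[2]
    with open(path_a) as fa, open(path_b) as fb:
        print(json.dumps(diff_json(json.load(fa), json.load(fb)), indent=2))
```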

Hey there!

That slowness is expected: when the agent uses the Write tool, the entire file must be generated token by token by the LLM. The model is literally "typing out" every character of your 11 KB JSON.

Are you in control of the MCP server? The ideal fix is to have your MCP server write the JSON to a file directly and just return the file path as the tool result. That way, the file I/O happens at native speed inside your MCP server process, and only a short path string flows through the model. Your diff workflow becomes:

  • Call MCP tool A → returns /tmp/result_a.json
  • Call MCP tool B → returns /tmp/result_b.json
  • Pass both paths to your Python diff tool
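A sketch of what the tool handler in your MCP server might look like. This is illustrative only: it uses plain stdlib Python rather than any particular MCP SDK, and `run_query` is a hypothetical stand-in for whatever actually produces the large JSON result:

```python
import json
import tempfile

def run_query(params: dict) -> dict:
    """Hypothetical placeholder for whatever produces the large JSON result."""
    return {"params": params, "rows": [{"id": i, "value": i * 2} for i in range(100)]}

def query_tool(params: dict) -> str:
    """MCP tool handler: write the full result to disk at native speed and
    return only the file path, so just a short string enters the model's context."""
    result = run_query(params)
    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".json", prefix="result_", delete=False
    ) as f:
        json.dump(result, f)
        return f.name
```

The agent then only has to relay the two returned paths to your diff tool, instead of regenerating kilobytes of JSON token by token.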