Cursor-agent returns no response, OTLPExporterError

Where does the bug appear (feature/product)?

Cursor CLI

Describe the Bug

When using any model other than Grok or “auto”, the CLI doesn’t respond. After a short “generating…” state, the UI returns to the input step (prompt in the text input box) and nothing else happens.

Running with the --debug flag, it seems to be a problem with OpenTelemetry on the backend side.

Terminal:

cursor-agent --model gpt-5 --debug

  Cursor Agent
  ~/src/userlytics-service · main

  Debug info: http://127.0.0.1:43111

┌───────────────────────────────────────────────────┐
│ Hello there GPT!                                  │
└───────────────────────────────────────────────────┘

 GPT-5
  / commands · @ files · ! shell

From the debug page:

--- Cursor Agent Debug Session 2025-09-21T19:46:28.481Z ---
2025-09-21T19:46:28.486
debug-session-start
{
  "event": "debug-session-start",
  "port": 43111,
  "basePort": 43111,
  "attempts": 1,
  "ephemeralFallback": false,
  "directory": "/tmp/cursor-agent-debug-KyQjVI",
  "logFile": "/tmp/cursor-agent-debug-KyQjVI/session.log",
  "pid": 4008175,
  "startTime": "2025-09-21T19:46:28.486Z"
}
2025-09-21T19:46:29.060
privacy.refresh.start {"isStale":false,"sampleEvery":10}
2025-09-21T19:46:29.122
Loading existing conversation
2025-09-21T19:46:29.214
protoPrivacyToGhostMode {"privacyMode":2}
2025-09-21T19:46:29.264
privacy.refresh.updated {"ghost":true}
2025-09-21T19:46:37.816
logger {"message":"Running agent","metadata":{"action":{"userMessageAction":{"userMessage":{"text":"Hello there GPT!","messageId":"db75dae3-69a4-4634-bdf6-b443c06ee3d1","selectedContext":{"selectedImages":[]}}}},"modelDetails":{"modelId":"gpt-5","displayModelId":"gpt-5","displayName":"GPT-5","displayNameShort":"GPT-5","aliases":[]}}}
2025-09-21T19:46:46.882
otel.error {"stack":"OTLPExporterError: Bad Request\n at IncomingMessage.<anonymous> (file:///home/mwb/.local/share/cursor-agent/versions/2025.09.18-7ae6800/index.js:3766:25916)\n at IncomingMessage.emit (node:events:530:35)\n at endReadableNT (node:internal/streams/readable:1698:12)\n at process.processTicksAndRejections (node:internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExporterError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}

Steps to Reproduce

Any time I try cursor-agent with any model other than Grok, for any prompt.

Expected Behavior

I expect any model that I am able to select to work, or a clear error message to be displayed.

Operating System

Linux

Current Cursor Version (Menu → About Cursor → Copy)

2025.09.18-7ae6800 (cursor-agent cli)

Additional Information

I’m a user on a Team Plan.

Does this stop you from using Cursor

Yes - Cursor is unusable

Hey, thanks for the report. This looks like a backend telemetry configuration issue; the error “Trace spans collection is not enabled for this user” points to a mismatch between Team Plan permissions and telemetry settings.

It’s interesting that Grok works while other models don’t, which suggests different code paths for different providers.

Things to try:

  • Make sure you’re logged into the correct account: cursor-agent status
  • Log out and back in: cursor-agent logout, then cursor-agent login
  • Ask your team admin if any model restrictions are enabled

If that doesn’t help, please attach your debug logs; they’ll help a lot with diagnosing the telemetry issue.

OpenTelemetry errors typically shouldn’t block model responses, so there may be a deeper model‑routing issue for Team Plan users.

Hello, thanks for the response, I was becoming worried that this would be ignored.

I can confirm that the problem happens even after logging out and in and that I am connected to the correct account (I only have one).

I can also confirm that there are no restrictions put on me by the team admin (I have the very same config as other team members, for whom the CLI tool works well, albeit they all use Mac and only I use Linux). But! I was using this account as “personal” before, paying for it myself; then after a few months I was added to the Team, as our company offered to pay for it. So I’m just guessing, but could something relevant have changed then?

One interesting piece of the puzzle, though!

The problem is not present in version 2025.08.22-82fb571. I only managed to revert to this version because its name was provided to me by a colleague, and I manually put it into the sh install script (instead of the newest 2025.09.18-7ae6800).

I don’t know in which version the bug was introduced, as I possibly missed some versions between these two, but I can confirm that it must be a combination of a client and backend problem, as with the different client version all the models respond correctly.

On the other hand, as you’ve pointed out, the error from the debug logs (in the original post) looks like some kind of back-end configuration problem.

I have no idea what to fix or what further debug info I can provide. Any ideas?

The CLI now works for me, but I don’t like the idea of being frozen on the old client version :slight_smile:

Many thanks again!

Oh my, sorry for multiple replies, but I just learned something even more crucial…

As mentioned above, the cursor-agent cli client of version 2025.08.22-82fb571 works with all the models.

And here’s the debug log for a working session like that:

───────┬─────────────────────────────────────────────────────────────────────
       │ File: /tmp/cursor-agent-debug-yHNPKo/session.log
───────┼─────────────────────────────────────────────────────────────────────
   1   │ --- Cursor Agent Debug Session 2025-09-23T17:36:07.655Z ---
   2   │ {"event":"debug-session-start","port":43112,"basePort":43111,"attempts":2,"ephemeralFallback
       │ ":false,"directory":"/tmp/cursor-agent-debug-yHNPKo","logFile":"/tmp/cursor-agent-debug-yHNP
       │ Ko/session.log","pid":22294,"startTime":"2025-09-23T17:36:07.658Z"}
   3   │ [2025-09-23T17:36:07.659Z] ink:resolutionError Error [ERR_MODULE_NOT_FOUND]: Cannot find pac
       │ kage '@anysphere/ink' imported from /home/mwb/.local/share/cursor-agent/versions/2025.09.18-
       │ 7ae6800/index.js
   4   │ [2025-09-23T17:36:07.660Z] privacy.refresh.start {"isStale":true,"sampleEvery":10}
   5   │ [2025-09-23T17:36:07.661Z] privacy.refresh.start {"isStale":true,"sampleEvery":10}
   6   │ [2025-09-23T17:36:07.661Z] privacy.refresh.start {"isStale":true,"sampleEvery":10}
   7   │ [2025-09-23T17:36:08.198Z] protoPrivacyToGhostMode {"privacyMode":2}
   8   │ [2025-09-23T17:36:08.200Z] protoPrivacyToGhostMode {"privacyMode":2}
   9   │ [2025-09-23T17:36:08.200Z] protoPrivacyToGhostMode {"privacyMode":2}
  10   │ [2025-09-23T17:36:08.201Z] privacy.refresh.updated {"ghost":true}
  11   │ [2025-09-23T17:36:08.209Z] privacy.refresh.updated {"ghost":true}
  12   │ [2025-09-23T17:36:08.210Z] privacy.refresh.updated {"ghost":true}
  13   │ [2025-09-23T17:36:08.312Z] No .git directory found starting from: /home/mwb/src
  14   │ [2025-09-23T17:36:08.338Z] Loading existing conversation
  15   │ [2025-09-23T17:36:09.643Z] global.error {"message":"Running agent"}
  16   │ [2025-09-23T17:36:09.663Z] privacy.refresh.start {"isStale":false,"sampleEvery":10}
  17   │ [2025-09-23T17:36:10.000Z] protoPrivacyToGhostMode {"privacyMode":2}
  18   │ [2025-09-23T17:36:10.001Z] privacy.refresh.updated {"ghost":true}
  19   │ [2025-09-23T17:36:15.081Z] otel.error {"stack":"OTLPExporterError: Bad Request\n    at Incom
       │ ingMessage.<anonymous> (file:///home/mwb/.local/share/cursor-agent/versions/2025.09.18-7ae68
       │ 00/index.js:220409:26)\n    at IncomingMessage.emit (node:events:530:35)\n    at endReadable
       │ NT (node:internal/streams/readable:1698:12)\n    at process.processTicksAndRejections (node:
       │ internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExport
       │ erError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
  20   │ [2025-09-23T17:36:24.386Z] otel.error {"stack":"OTLPExporterError: Bad Request\n    at Incom
       │ ingMessage.<anonymous> (file:///home/mwb/.local/share/cursor-agent/versions/2025.09.18-7ae68
       │ 00/index.js:220409:26)\n    at IncomingMessage.emit (node:events:530:35)\n    at endReadable
       │ NT (node:internal/streams/readable:1698:12)\n    at process.processTicksAndRejections (node:
       │ internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExport
       │ erError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
  21   │ [2025-09-23T17:36:57.257Z] No .git directory found starting from: /home/mwb/src
  22   │ [2025-09-23T17:36:57.258Z] colorMode truecolor
  23   │ [2025-09-23T17:36:57.258Z] isLightTheme false
  24   │ [2025-09-23T17:37:02.397Z] otel.error {"stack":"OTLPExporterError: Bad Request\n    at Incom
       │ ingMessage.<anonymous> (file:///home/mwb/.local/share/cursor-agent/versions/2025.09.18-7ae68
       │ 00/index.js:220409:26)\n    at IncomingMessage.emit (node:events:530:35)\n    at endReadable
       │ NT (node:internal/streams/readable:1698:12)\n    at process.processTicksAndRejections (node:
       │ internal/process/task_queues:90:21)","message":"Bad Request","code":"400","name":"OTLPExport
       │ erError","data":"{\"error\":\"Trace spans collection is not enabled for this user\"}"}
───────┴─────────────────────────────────────────────────────────────────────

So the OpenTelemetry 400 Bad Request is logged even in the old version when it works.

I should have investigated better before creating the thread :person_facepalming:

Anyway, OpenTelemetry is probably not the real problem, and now I have absolutely no clue why version 2025.08.22-82fb571 works fully, while only Grok or “auto” works on 2025.09.18-7ae6800.

Here’s a screen capture of how the CLI behaves on the newest version: simply no response, no error:


I’m facing a similar issue to yours. It only works fine when using the auto or Grok model. When I use GPT-5 or claude-sonnet-4.5, it somehow shows no response, but the request is still counted in the usage billing.

I’m clueless for now.


This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.