I use curl to request my custom API endpoint and it works fine.
However, Cursor reports an error: "We're having trouble connecting to the model provider. This might be temporary - please try again in a moment."
Hey, this is a known issue with the Override OpenAI Base URL feature. A few things are going on here:
Responses API vs Chat Completions format
When you use Agent mode with a custom base URL, Cursor currently sends requests in OpenAI's Responses API format (an "input" array, flat tool definitions, etc.) instead of the standard Chat Completions format (a "messages" array, nested tool definitions). If your endpoint only supports /v1/chat/completions, the request will fail. Related thread with more details: Cursor Agent sends Responses API format to /chat/completions endpoint.
Workaround: Try switching from Agent mode to Ask mode. It’s more likely to use the standard Chat Completions format.
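For reference, here is a minimal sketch of the two payload shapes, per OpenAI's public API docs. The model name and tool are placeholders, not anything Cursor actually sends:

```python
# Chat Completions format: a "messages" array, with each tool nested
# under a "function" key.
chat_completions_payload = {
    "model": "my-model",  # placeholder
    "messages": [{"role": "user", "content": "Hello"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # placeholder tool
            "parameters": {"type": "object", "properties": {}},
        },
    }],
}

# Responses API format: an "input" array instead of "messages", with
# flat tool definitions ("name" at the top level of each tool).
responses_payload = {
    "model": "my-model",  # placeholder
    "input": [{"role": "user", "content": "Hello"}],
    "tools": [{
        "type": "function",
        "name": "get_weather",  # placeholder tool
        "parameters": {"type": "object", "properties": {}},
    }],
}

# An endpoint that only implements /v1/chat/completions expects the
# first shape and will reject a body containing "input".
print("messages" in chat_completions_payload)  # True
print("input" in responses_payload)            # True
```

So even if both payloads reach your server, a Chat-Completions-only backend has no way to interpret the second one, which is consistent with the generic "trouble connecting" error.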
HTTP/2 compatibility
Go to Cursor Settings > Network > HTTP Compatibility Mode and switch to HTTP/1.1. TLS and connection errors often happen when HTTP/2 doesn’t work well with custom endpoints.
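To rule out the endpoint itself, you can force each protocol from curl and compare the results. This is just a sketch: the URL, model name, and API key variable are placeholders for your own values:

```shell
# Force HTTP/1.1 (mirrors Cursor's HTTP Compatibility Mode setting)
curl --http1.1 -sS -v https://my-endpoint.example.com/v1/chat/completions \
  -H "Authorization: Bearer $MY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "my-model", "messages": [{"role": "user", "content": "ping"}]}'

# Then try the same request over HTTP/2
curl --http2 -sS -v https://my-endpoint.example.com/v1/chat/completions \
  -H "Authorization: Bearer $MY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "my-model", "messages": [{"role": "user", "content": "ping"}]}'
```

If the first succeeds and the second fails (look for TLS or stream errors in the -v output), the HTTP/1.1 compatibility setting in Cursor should help.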
Version update
You’re on 2.4.22 (from January). The latest stable is 2.5.x. It’s worth updating since there have been BYOK-related fixes since then.
A couple of questions:
Does your endpoint support the OpenAI Responses API format, or only the Chat Completions format?
Can you try Ask mode instead of Agent and let me know if that works?
The team is aware of the Responses API format issue with custom URLs. There’s no timeline yet, but reports like yours help with prioritization.
I have upgraded to 2.6.13 and set the network mode to HTTP/1.1, and I have confirmed that my endpoint supports the OpenAI Responses API format, but I still cannot connect to my URL correctly in either Ask or Agent mode.