The custom override of the OpenAI base URL is unusable

I have a custom URL that works fine on the public internet and with plugins like Cline, but after configuring it the same way in Cursor's Models settings, it keeps reporting a network error. Could you please help me figure out what the problem is?

Hey, thanks for the report. I can see the error logs. There are two different errors happening:

  1. TLS disconnect: “Client network socket disconnected before secure TLS connection was established”
  2. resource_exhausted: likely the provider rejecting the connection

A couple of things to try:

First, disable HTTP/2. Go to Cursor Settings > Network > HTTP Compatibility Mode and switch it to HTTP/1.1. The TLS error often shows up when HTTP/2 doesn't negotiate cleanly with certain endpoints or intermediate proxies.
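If the toggle doesn't resolve it, it's worth confirming outside Cursor whether your endpoint behaves differently per HTTP version. A rough diagnostic with curl (the base URL and key below are placeholders; substitute your own, and note `/models` is just a cheap endpoint to hit):

```shell
# Placeholders — replace with your actual custom base URL and key.
BASE_URL="https://your-proxy.example.com/v1"
API_KEY="sk-..."

# Force HTTP/1.1. If this succeeds while the HTTP/2 attempt fails,
# the compatibility-mode toggle should fix Cursor as well.
curl --http1.1 -sS -o /dev/null -w "HTTP/1.1 -> %{http_code}\n" \
  -H "Authorization: Bearer $API_KEY" "$BASE_URL/models"

# Try HTTP/2 (curl falls back to 1.1 if the server doesn't offer h2 via ALPN).
curl --http2 -sS -o /dev/null -w "HTTP/2   -> %{http_code}\n" \
  -H "Authorization: Bearer $API_KEY" "$BASE_URL/models"
```

If the second command hangs or errors during the TLS handshake while the first returns a status code, that matches the "socket disconnected before secure TLS connection" error you're seeing.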

Second, the “Override OpenAI Base URL” setting currently applies to all models, not just the ones using your custom key. This is a known limitation. So if you’re trying to use both Cursor Pro models and your custom endpoint, you’ll need to toggle the override on and off depending on which model you’re using.

A few questions to narrow this down:

  • Which provider or URL are you pointing to? For example, a local proxy or a cloud provider.
  • Which model are you selecting in the model picker?
  • Does your endpoint support the OpenAI Chat Completions API format specifically? Cursor currently sends Responses API payloads in some cases, which can break if the endpoint only supports /v1/chat/completions.
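On that last point: the two formats differ in both path and body shape, so a proxy that only implements Chat Completions will reject Responses-style requests. A minimal sketch of the difference, using the field names from OpenAI's public API (your proxy may only accept the first shape):

```python
def chat_completions_payload(model: str, prompt: str) -> dict:
    """Body for POST {base_url}/chat/completions (the older, widely supported format)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def responses_payload(model: str, prompt: str) -> dict:
    """Body for POST {base_url}/responses (the newer Responses API format)."""
    return {
        "model": model,
        "input": prompt,
    }

chat = chat_completions_payload("gpt-4o", "hello")
resp = responses_payload("gpt-4o", "hello")

# The key structural difference: "messages" vs "input".
print("messages" in chat, "input" in resp)  # -> True True
```

If you can tell us whether your endpoint returns a 404 or a validation error when hit at `/v1/responses`, that would confirm this is the mismatch.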

Let me know how it goes with the HTTP/2 toggle.