Cursor Models Fail When Using BYOK OpenAI Key with Overridden Base URL (GLM-4.7)

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

When using a custom BYOK OpenAI API key with an overridden Base URL (for GLM-4.7), the custom model works correctly. However, after this configuration, Cursor-provided models (Cursor Pro) stop working. Cursor appears to route its own models to the same overridden OpenAI Base URL instead of using Cursor’s internal model provider, which causes model resolution failures.

Steps to Reproduce

1. Open Cursor and go to Settings → Models
2. Add a custom BYOK OpenAI API key
3. Set an overridden Base URL pointing to a GLM OpenAI-compatible endpoint
4. Configure and use the GLM-4.7 model (this works correctly)
5. Switch to any Cursor-provided model (available under the Cursor Pro plan)
6. Observe that the Cursor model request fails with a provider error
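To rule out the endpoint itself, it can help to confirm the overridden Base URL answers outside the IDE before testing inside Cursor. A minimal sketch of building such a request, assuming an OpenAI-compatible `/chat/completions` route; the base URL, key, and model name below are placeholders, not real credentials:

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (url, headers, body) for an OpenAI-compatible
    /chat/completions call against an overridden base URL.

    The endpoint path and payload shape follow the OpenAI API
    convention that GLM-compatible gateways advertise.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    })
    return url, headers, body
```

The resulting tuple can be sent with any HTTP client (e.g. `urllib.request`); a 400 "Unknown Model" from a direct call like this would point at the endpoint/model code rather than Cursor's routing.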

Expected Behavior

- The GLM-4.7 model should use the overridden OpenAI-compatible Base URL.

- Cursor-provided models (Cursor Pro) should continue to work normally and route through Cursor’s internal model provider, unaffected by the BYOK Base URL override.

- Both custom BYOK models and Cursor models should be usable in the same workspace without conflicts.

Screenshots / Screen Recordings

Operating System

Windows 10/11

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.2.43 (system setup)
VSCode Version: 1.105.1
Commit: 32cfbe848b35d9eb320980195985450f244b3030
Date: 2025-12-19T06:06:44.644Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Windows_NT x64 10.0.26100

For AI issues: which model did you use?

Gemini 3 Flash, but it happens for all models

For AI issues: add Request ID with privacy disabled

Request ID: 90986360-2d83-4053-9223-ccd6d21643f2
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Streaming error\n\nAPI Error:\n\n\n{\"error\":{\"type\":\"provider\",\"reason\":\"provider_error\",\"message\":\"Provider returned 400\",\"retryable\":false,\"provider\":{\"status\":400,\"body\":\"{\\\"error\\\":{\\\"code\\\":\\\"1211\\\",\\\"message\\\":\\\"Unknown Model, please check the model code.\\\"}}\"}}}\n","additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}

Does this stop you from using Cursor

No - Cursor works, but with this issue


Hey, thanks for the report.

This is a known issue. When you set “Override OpenAI Base URL”, it affects all API keys and models, including Cursor Pro models.

Workaround:

  • Turn off “Override OpenAI Base URL” when you want to use Cursor Pro models
  • Turn it back on only when you need to use GLM-4.7
  • Switch it manually depending on which model you’re using

A similar issue was discussed here: Anthropic models break when Override OpenAI BaseUrl is set

I’ll pass your details to the team for escalation.


It’s also worth noting that the plain GLM API is a bad fit for Cursor: the model’s thinking is discarded both within a turn and between your messages, which is not how the model is supposed to be used. You have probably noticed that all reasoning/thinking is missing from the Cursor GUI; under the hood it is discarded too. If you want to use GLM, you have to wrap it with a small proxy that unwraps `reasoning_content` deltas into `content` deltas and inserts opening/closing think tags (plus newlines) around them. If you don’t do that, the model will misbehave and perform much weaker than advertised.
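The rewriting step that proxy has to perform can be sketched as a small stateful transformer over streaming chunks. This is a sketch under assumptions, not Cursor’s or GLM’s actual code: it assumes the GLM-style convention of putting chain-of-thought in `delta.reasoning_content` and moves that text into `delta.content` wrapped in `<think>` tags, so clients that only read `content` still see the reasoning:

```python
class ReasoningUnwrapper:
    """Rewrite OpenAI-style streaming deltas so that `reasoning_content`
    is surfaced as ordinary `content` wrapped in <think>...</think> tags.

    Hypothetical sketch: the `reasoning_content` field name follows the
    convention used by GLM-style OpenAI-compatible endpoints.
    """

    def __init__(self):
        self.in_think = False  # are we currently inside an open <think> block?

    def transform(self, chunk: dict) -> dict:
        """Mutate one streamed chunk in place and return it."""
        for choice in chunk.get("choices", []):
            delta = choice.get("delta", {})
            reasoning = delta.pop("reasoning_content", None)
            content = delta.get("content") or ""
            out = ""
            if reasoning:
                if not self.in_think:
                    out += "<think>\n"   # open the think block once
                    self.in_think = True
                out += reasoning
            if content:
                if self.in_think:
                    out += "\n</think>\n"  # close it before real content
                    self.in_think = False
                out += content
            if out:
                delta["content"] = out
        return chunk
```

A real proxy would sit between Cursor and the GLM endpoint, apply this transform to each SSE `data:` chunk, and forward everything else unchanged.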


Many people have pointed this out, but this is not an issue that Cursor has any interest in fixing. They have had months. I have been using Cursor for months now, and the only models that are usable are their own and Anthropic’s.


This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.