Fresh bugs with custom model

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Hi!
I’m using the GLM-4.7 model with the OpenAI endpoint and API key replaced. I understand this is something of a hack, but you promised support for custom models.
First, why can’t I use other models (Claude, Gemini, Composer, Grok, Kimi, etc.) when I’ve changed only the OpenAI endpoint? It’s fine if ChatGPT isn’t available in that case, but the other models should still work. Just as strange is that I can’t add any model I want: OpenRouter, Qwen, or anything else.
But today the situation has become much worse. I can’t use a custom model at all: I get the error “Invalid model. The model GLM-4.7 does not work with your current plan or api key.”
I have a Pro Plus subscription, and now I’m seriously considering changing IDEs because of these issues. I have no idea why you limit paying customers to built-in models only. Is it just to resell tokens?

Steps to Reproduce

Use a custom model with the API key and endpoint entered in the OpenAI settings
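For context, the setup above amounts to Cursor sending a standard OpenAI-compatible chat-completions request to the custom base URL. A minimal sketch of that request shape (the base URL, key, and prompt below are placeholders for illustration, not values confirmed by Cursor):

```python
# Sketch of an OpenAI-compatible chat-completions request as sent to a
# custom base URL. json/urllib are only needed for the optional send below.
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build the URL, headers, and JSON body of an OpenAI-compatible request."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body


# Hypothetical values; replace with your provider's real base URL and key.
url, headers, body = build_chat_request(
    "https://api.example.com/v1", "sk-...", "GLM-4.7", "Hello"
)

# To actually send it (requires network access and a valid key):
# req = urllib.request.Request(url, data=json.dumps(body).encode(),
#                              headers=headers, method="POST")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

If a request in this shape succeeds against the provider directly but fails inside Cursor, the problem is on Cursor's side rather than with the key or endpoint.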

Operating System

Windows 10/11

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.3.35 (user setup)
VSCode Version: 1.105.1
Commit: cf8353edc265f5e46b798bfb276861d0bf3bf120
Date: 2026-01-13T07:39:18.564Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Windows_NT x64 10.0.26200

Does this stop you from using Cursor

Yes - Cursor is unusable

Hey, thanks for the report.

This is a known issue. When “Override OpenAI Base URL” is enabled, it affects all API keys and models, including Cursor’s built-in models (Claude, Gemini, etc.). The team is working on a fix, but for now here’s a workaround:

  • Turn off “Override OpenAI Base URL” when you want to use Cursor’s standard models
  • Turn it back on only when you need GLM-4.7
  • Switch it manually depending on which model you’re using

A similar issue was discussed here for Anthropic: Anthropic models break when Override OpenAI BaseUrl is set

And specifically for GLM-4.7: Cursor Models Fail When Using BYOK OpenAI Key with Overridden Base URL (GLM-4.7)

I’ll pass your details to the team.

Hi! Thank you for the answer.
But we have two issues here. The first is that other models get overridden; the second is “Invalid model. The model GLM-4.7 does not work with your current plan or api key”. The second issue makes Cursor totally unusable with an external model.

Also, if I follow the workaround for the first issue, each time I need to enable the custom model I have to enter the endpoint manually again - it doesn’t save.

Yes, the endpoint not being saved when you toggle the switch is also a known bug. The team is aware and is working on fixing the whole base URL override system.

Unfortunately, the current workaround is to manually enter the endpoint every time. The only alternative is to save the endpoint in a note or text file and copy it in when you turn the toggle on.

I know that’s inconvenient, which is why we’re planning to add the option to set a separate base URL for each custom model. That should fix both issues.

I have this same issue. GLM-4.7 works with my Ultra subscription, but changing the switch is a pain. Cursor should use GLM-4.7 to replace the lousy Composer1, by the way…

Hi Dean,
Do you have any ETA for these fixes?

Hey, unfortunately I don’t have an exact ETA. The issue is in our backlog, and the team is working on it, but I can’t share a specific date yet.

I get that the workaround is inconvenient. I’ll let you know here once the fix is out.

This is an issue on your side. GLM should work unless you’re on the free plan (BYOK doesn’t work on the free plan) or unless you pasted the key/endpoint incorrectly.

As for pasting the endpoint each time: disable the “OpenAI API key” toggle to use Cursor models, not the “Override…” toggle. That way your endpoint stays saved.

Also facing the same issue today. Oddly enough, GLM-4.7 was working just fine until this morning. Then I restarted Cursor for an update, and now it doesn’t work. I’m on a Pro plan. Cursor Version: 2.4.22 (Universal)

Also facing this issue; I can’t connect any GLM models to Cursor. I checked the API, and it is working fine.
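For anyone who wants to run the same check, one way to verify a BYOK key/endpoint independently of Cursor is to query the provider's OpenAI-compatible `/models` endpoint and confirm your model id is listed. A minimal sketch (the base URL and key are placeholders, not values from this thread):

```python
# Check a BYOK endpoint outside Cursor via the OpenAI-compatible /models route.
import json
import urllib.request


def parse_model_ids(models_response: dict) -> list:
    """Extract model ids from an OpenAI-style /models response."""
    return [m.get("id") for m in models_response.get("data", [])]


def list_models(base_url: str, api_key: str) -> list:
    """Fetch and parse the provider's model list (needs network access)."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))


# Example against a hypothetical provider (won't run without a real key):
# print("GLM-4.7" in list_models("https://api.example.com/v1", "sk-..."))
```

If the model shows up here but Cursor still rejects it, that points at Cursor's model validation rather than the provider.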

Is there any news on an ETA for when this bug will be fixed?

Receiving this error message now:

Request ID: 5600d4cf-f01b-4d5b-8861-5eefe3107f76
AI Model Not Found Model name is not valid: “GLM-4.7”
F4t: AI Model Not Found Model name is not valid: “GLM-4.7”
at Gmf (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:9095:38263)
at Hmf (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:9095:37251)
at rpf (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:9096:4395)
at fva.run (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:9096:8170)
at async Hyt.runAgentLoop (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:34196:57047)
at async Zpc.streamFromAgentBackend (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:34245:7695)
at async Zpc.getAgentStreamResponse (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:34245:8436)
at async FTe.submitChatMaybeAbortCurrent (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:9170:14575)
at async Ei (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:32994:3808)

Same problem

Same problem. Using v2.4.27. Unable to use openai/openrouter models

Update: Still broken in v2.4.28

Wow, I cancelled my Cursor sub a month or two ago and came back to try it with the z.ai keys… wow…

I had the same problem on an older version, 2.4.28. Now I’ve upgraded to 2.4.31 and get a new error message: “Free plans can only use Auto. Switch to Auto or upgrade plans to continue.” Is this still a bug, or do I need to upgrade my plan to use custom models?

Don’t bother. I have a subscription, and my OpenRouter API key is still broken in Version 2.4.31.

I ended up installing the Roo Code extension to use my OpenRouter key :frowning:. I also started using Antigravity. It’s still in preview, so it has higher limits on the free plan.

Update: still broken in Version: 2.4.36