Custom Model problems

Hi there,

Not sure if this is a bug or if someone has already found a solution and I just can't find the post, but I'm having a lot of trouble adding GLM-4.7 to Cursor.

I followed the guide to the letter: I have a Coding Plan on z.ai, I'm on Cursor Pro, I'm using https://api.z.ai/api/coding/paas/v4, and I've created several new API keys, but I keep getting “model not found” errors and I don't know how to fix them.

Does anyone know how I can get GLM-4.7 working?


Hey, thanks for the report.

Since you’re following the instructions exactly and still getting “model not found,” the first step is to isolate the issue: is it Cursor or the z.ai API?

Test the API directly with curl:

curl https://api.z.ai/api/coding/paas/v4/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "GLM-4.7", "messages": [{"role": "user", "content": "Hello"}]}'

If curl also returns “model not found”:

  • The issue is on the z.ai side. Your Coding Plan might not include GLM-4.7, or your API key doesn’t have access.
  • Reach out to z.ai support.

If curl works but Cursor doesn’t:

  • Send a screenshot of the Models section in Cursor Settings, where you set the OpenAI API Key and Override Base URL.
  • Also send the full error text from the chat.

There might also be other model names available; if curl with GLM-4.7 doesn’t work, try glm-4-0520 or GLM-4.5-air.
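If you want to try several candidate names quickly, a small loop like the one below prints a ready-to-paste curl command per model ID. The IDs in the list are just the ones mentioned in this thread (they may not all exist on your plan), and `ZAI_API_KEY` is a placeholder environment variable you'd export with your own key before running the printed commands:

```shell
# Candidate model IDs to try against the same endpoint. These are the
# names mentioned in this thread -- check z.ai's docs for the exact IDs
# your Coding Plan actually includes.
models="GLM-4.7 glm-4.7 glm-4-0520 GLM-4.5-air"

# Print a ready-to-paste curl command for each candidate, so you can run
# them one at a time and compare the responses. $ZAI_API_KEY is left
# unexpanded here; export it in your shell before running the commands.
for m in $models; do
  printf 'curl https://api.z.ai/api/coding/paas/v4/chat/completions -H "Authorization: Bearer $ZAI_API_KEY" -H "Content-Type: application/json" -d '\''{"model": "%s", "messages": [{"role": "user", "content": "Hello"}]}'\''\n' "$m"
done
```

Whichever ID returns a normal chat completion instead of “model not found” is the one to use in Cursor.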

Curl request worked fine; response below:

{
  "choices": [{
    "finish_reason": "stop",
    "index": 0,
    "message": {
      "content": "Hello! I'm GLM, trained by Z.ai. How can I assist you today?..",
      "reasoning_content": "…(internal reasoning)…",
      "role": "assistant"
    }
  }],
  "model": "GLM-4.7",
  "object": "chat.completion",
  "usage": {
    "completion_tokens": 226,
    "prompt_tokens": 6,
    "total_tokens": 232
  }
}

Thanks for the info. The error “Model name is not valid: GLM-4.7” means Cursor is validating the model name on its side before sending the request and rejecting unknown names. It looks like this behavior changed in the latest Cursor version.

Similar reports: Fresh bugs with custom model

Since curl works, the issue isn’t with z.ai. It’s that Cursor won’t allow the custom model name. For now, this is a Cursor limitation.

You can try:

  • Using GLM-4.5-air or glm-4-0520 instead; different names might pass validation.

Also, open DevTools (Help > Toggle Developer Tools) and check the Network and Console tabs while sending the request. There may be more details on what exactly is being blocked.

It’s the same for all GLM models. I was using glm-4.7 just yesterday; have I missed something? Is GLM suddenly not allowed?

Ahhh, so it is a bug, my bad; I should’ve made a report rather than a discussion. It seems like a selective bug though? One of the users from the bug report post was on a free plan, so GLM-4.7 wouldn’t work for them anyway, and it doesn’t look like the other user who commented had an issue using GLM-4.7.

I can’t use any GLM models though, which again confuses me because 24 hours ago I could. The curl tests all still respond and Cursor hasn’t updated; this has just come about randomly.

Recently, Cursor added client-side validation for the model name, which blocks custom models like GLM.

A similar issue is described here: Fresh bugs with custom model. GLM-4.7 also stopped working there. I’ve already passed this on to the team.

Same issue here. I tried other custom models and got the same error.


How is this issue still not fixed? It’s been 11 days and I’m still facing this when trying to add a custom model.


Same problem here; I’m trying to use GLM-4.5-Flash but it doesn’t work.

I’m having the exact same problem with GLM-4.7

Has anyone found a solution or workaround to this problem?

I have opened a support ticket with Z.ai and pointed them to this forum thread.


Same issue, version 2.4.28

I might have found a silly workaround that seems to be working for me, at least today. GLM-4.7 wasn’t working, but glm-4.7 (lowercase) does.
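That would be consistent with the model ID being matched case-sensitively somewhere along the way. If you want to try the same trick with another model, a one-liner like this (just a sketch using the ID from this thread) gives you the lowercase form to paste into Cursor's model field:

```shell
# Model IDs are sometimes matched case-sensitively; derive the lowercase
# form of the ID and try that in Cursor's model list instead.
model="GLM-4.7"
lower=$(printf '%s' "$model" | tr '[:upper:]' '[:lower:]')
echo "$lower"   # prints: glm-4.7
```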
