GLM-4.7 Streaming Error

Hello. Bit of a noob here.

I have set up GLM-4.7 as per the instructions on the Z.ai documentation website, and I keep getting the error "Unable to reach the model provider."

Hey, thanks for the report.

Important: GLM isn’t on Cursor’s list of officially supported API providers. The only officially supported providers are OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock. Third-party providers used via Override OpenAI Base URL may work, but unreliably.

Another known issue with GLM:

  • The GLM API isn’t fully compatible with Cursor: the model’s reasoning ("thinking") content gets dropped between messages, which seriously hurts performance. To work around this, you need a proxy server that moves reasoning_content into content.
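For anyone curious what that workaround actually does, here is a minimal sketch of the transformation such a proxy would apply to each OpenAI-style response body. This is an illustration, not a supported tool: the exact field names follow the common OpenAI-compatible schema (choices / message / delta), the <think> wrapping is my own convention, and a real proxy would also need to handle HTTP forwarding and SSE streaming, which are omitted here.

```python
def merge_reasoning(response: dict) -> dict:
    """Fold each choice's reasoning_content into content, so a client
    that ignores reasoning_content (like Cursor) still sees the text.

    Works on both non-streaming responses (choice["message"]) and
    streaming chunks (choice["delta"]).
    """
    for choice in response.get("choices", []):
        msg = choice.get("message") or choice.get("delta")
        if not msg:
            continue
        reasoning = msg.pop("reasoning_content", None)
        if reasoning:
            # Prepend the reasoning, wrapped in <think> tags so the
            # client can still tell it apart from the final answer.
            # (The tag choice is an assumption, not part of any spec.)
            msg["content"] = f"<think>{reasoning}</think>" + (msg.get("content") or "")
    return response


# Example: a non-streaming response with separate reasoning
resp = {"choices": [{"message": {
    "reasoning_content": "First, check the units.",
    "content": "The answer is 42.",
}}]}
merged = merge_reasoning(resp)
```

A real deployment would run this inside a small HTTP server that forwards requests to the GLM endpoint and rewrites every response (and every streamed chunk) on the way back before handing it to Cursor.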

Similar issues were discussed here: Cursor Models Fail When Using BYOK OpenAI Key with Overridden Base URL (GLM-4.7)

I recommend using officially supported providers for stable performance.