Hey, thanks for the report.
Important: GLM isn’t on Cursor’s list of officially supported API providers. The only officially supported providers are OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock. Third-party providers via Override OpenAI Base URL may work unreliably.
There is also a known compatibility issue with GLM:
- The GLM API isn't compatible with Cursor out of the box: the model's reasoning (thinking) content gets dropped between messages, which seriously hurts performance. To work around this, you need a proxy server that moves `reasoning_content` into `content`.
Similar issues were discussed here: Cursor Models Fail When Using BYOK OpenAI Key with Overridden Base URL (GLM-4.7)
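For reference, here's a minimal sketch of the transformation such a proxy would apply to each assistant message. It assumes GLM returns an OpenAI-style message dict with a separate `reasoning_content` field (as described above); the `<think>` tag wrapping is just one common convention, not something Cursor specifically requires:

```python
def merge_reasoning(message: dict) -> dict:
    """Move a separate `reasoning_content` field into `content`
    so clients that drop unknown fields keep the reasoning
    between turns. The <think> wrapper is illustrative."""
    reasoning = message.pop("reasoning_content", None)
    if reasoning:
        content = message.get("content") or ""
        message["content"] = f"<think>{reasoning}</think>\n{content}"
    return message


# Illustrative example of an assistant message as GLM might return it:
msg = {
    "role": "assistant",
    "reasoning_content": "User wants X, so do Y.",
    "content": "Here is Y.",
}
merged = merge_reasoning(msg)
```

A real proxy would also need to apply the same merge to streaming deltas, which is where most of the complexity lives.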
I recommend using officially supported providers for stable performance.