Hey, thanks for the report.
This is a known issue. When the Override OpenAI Base URL setting is enabled for GLM-4.7, it affects all models, including Cursor’s built-in ones (Sonnet 4.5 and others). Cursor then routes its own models through your custom Base URL instead of their native providers, which produces the ERROR_OPENAI error.
Workaround:
- Turn off Override OpenAI Base URL when you want to use Cursor’s built-in models (Sonnet 4.5, etc.)
- Turn it back on only when you need GLM-4.7
- In short, toggle the setting manually depending on which model you’re using
Important: GLM isn’t on Cursor’s list of officially supported providers (only OpenAI, Anthropic, Google AI, Azure OpenAI, and AWS Bedrock are). Using it via Override Base URL can be unstable, and the GLM API can drop reasoning content between messages, which degrades answer quality.
Similar case: Cursor Models Fail When Using BYOK OpenAI Key with Overridden Base URL (GLM-4.7)
Let me know if this helps.