With a base URL override and an OpenAI API key set, requests are formulated as Responses API requests but are in fact sent to /v1/chat/completions, resulting in:
Request failed with status code 400: {"error":{"message":"openai error: Missing required parameter: 'messages'.","type":"invalid_request_error","param":"messages","code":"missing_required_parameter"},"provider":"openai"}
Steps to Reproduce
Open Cursor Settings
Set “OpenAI API Key”
Set “Override OpenAI Base URL” to a proxy
Observe the request arriving on the /v1/chat/completions path with a payload formulated for /v1/responses (an input field rather than a messages field)
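For illustration, the two payload shapes differ roughly as follows (a minimal sketch showing only the relevant fields; model name is a placeholder):

```python
# Sketch of the two OpenAI payload shapes (relevant fields only).

# /v1/responses expects an "input" field:
responses_payload = {
    "model": "gpt-5",
    "input": [
        {"role": "user", "content": "Hello"},
    ],
}

# /v1/chat/completions expects a "messages" field:
chat_payload = {
    "model": "gpt-5",
    "messages": [
        {"role": "user", "content": "Hello"},
    ],
}

# Sending the first shape to /v1/chat/completions fails validation
# because "messages" is absent -- exactly the 400 seen in this report.
assert "messages" not in responses_payload
assert "messages" in chat_payload
```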
Expected Behavior
Requests are formulated using the correct payload for the target endpoint.
Operating System
macOS
Current Cursor Version (Menu → About Cursor → Copy)
Hey, thanks for the report. This is a known limitation: BYOK currently only supports /v1/chat/completions, while GPT-5/5.1 models use the /v1/responses format, so the proxy sees an input payload on the chat/completions endpoint and returns the “missing messages” error.
Workarounds:
Turn off your OpenAI API key and Base URL override and use Cursor’s built-in API for GPT-5/5.1 (Settings > Models > API Keys, or Cmd+Shift+0). This should unblock you.
Thanks for the screenshot; that’s the exact BYOK case: with the Base URL override, the request is formed for /v1/responses but goes to /v1/chat/completions, which causes the “missing_required_parameter: ‘messages’” error.
Please send:
The exact Base URL proxy
Request ID from the error (with privacy mode disabled)
We’re tracking this and working on a fix.
So you’re saying this is intentional behaviour? Why?
Surely the base URL override should ask the user which endpoint they’d like to hit, and then build the request according to that endpoint?
This behaviour changed recently and it’s honestly pretty wild that you folks didn’t mention this in your changelog (as far as I could see). This breaks a tonne of integrations.
The base URL I used is an internal URL so I can’t provide it.
Request ID: 04433689-472c-427c-8009-eb378616edba
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Provider was unable to process your request\n\nAPI Error:\n\n```\nRequest failed with status code 400: {\"error\":{\"code\":\"missing_required_parameter\",\"message\":\"Missing required parameter: 'messages'.\",\"param\":\"\",\"type\":\"invalid_request_error\"}}data: [DONE]\n\n\n```","additionalInfo":{},"buttons":[],"planChoices":[]},"isExpected":true}
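For what it’s worth, until the client builds the request for the right endpoint, a proxy could shim the mismatch itself. A minimal sketch (hypothetical helper, handles only the simple cases; real Responses payloads have richer content types this does not cover):

```python
def responses_to_chat(payload: dict) -> dict:
    """Hypothetical shim: rewrite a Responses-style payload into
    chat/completions form by renaming 'input' to 'messages'.
    Covers only the plain string and list-of-messages cases."""
    out = dict(payload)  # shallow copy; leave the original untouched
    if "input" in out and "messages" not in out:
        value = out.pop("input")
        # The Responses API also accepts a bare string as input; wrap it
        # as a single user message for chat/completions.
        if isinstance(value, str):
            value = [{"role": "user", "content": value}]
        out["messages"] = value
    return out

# Example: the payload shape from this report now validates as chat form.
fixed = responses_to_chat({"model": "gpt-5", "input": "Hello"})
assert fixed["messages"] == [{"role": "user", "content": "Hello"}]
```

This is a workaround sketch for a proxy operator, not a substitute for the client fix: other Responses-only fields (tools, reasoning options, structured content parts) would still need mapping.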
Hey, thanks for the ping. Confirmed: this is a bug. It’s a known BYOK limitation with Override OpenAI Base URL, where the payload for /v1/responses is sent to /v1/chat/completions and fails with missing_required_parameter: 'messages'.
The team is working on a fix. Thanks for your patience.