OpenAI-compatible provider problems with GPT models

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

In Cursor, when I send a request to a Gemini model, Cursor sends it in this format:
{
  "model": "gemini-3-flash-preview",
  "temperature": 1,
  "top_p": 0.95,
  "messages": [
    {
      "role": "system",
      "content": "You are an AI coding assistant…"
    },
    {
      "role": "user",
      "content": "<user_query>\nhi\n</user_query>\n"
    }
  ],

But once I switch to a GPT-series model, the requests Cursor sends look like this:
{
  "user": "9388450ab22b70b9",
  "model": "gpt-5-mini",
  "input": [
    {
      "role": "system",
      "content": "You are an AI coding assistant,…"
    },

This causes a problem with my OpenAI-compatible service provider. I received this error from Bltcy:

{
  "error": {
    "type": "provider",
    "reason": "provider_error",
    "message": "Provider returned 500",
    "retryable": true,
    "provider": {
      "status": 500,
      "body": "{\"error\":{\"message\":\"field messages is required (request id: B20260101123954989515729fsvwyqOF)\",\"type\":\"new_api_error\",\"param\":\"\",\"code\":\"invalid_request\"}}"
    }
  }
}

The standard OpenAI format should be like this:
{
  "model": "gpt-5-mini",
  "temperature": 1,
  "top_p": 0.95,
  "messages": [
    {
      "role": "system",
      "content": "You are an AI coding assistant…"
    },
    {
      "role": "user",
      "content": "<user_query>\nhi\n</user_query>\n"
    }
  ]
}

Is there a way to send requests in this format?

Steps to Reproduce

Use an OpenAI-compatible service provider, such as OpenRouter or Bltcy. I am using Bltcy (https://api.bltcy.ai/), but its site is Chinese-only; you can use OpenRouter or another provider to reproduce the same result.
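The failure can be reproduced without a real provider by checking the two payload shapes the way a Chat Completions endpoint does. This is a minimal sketch; `validate_chat_completions_request` is a hypothetical helper imitating the provider-side check that produced the 500, not code from any actual provider:

```python
# Sketch of why OpenAI-compatible providers reject the GPT requests:
# a Chat Completions endpoint requires a "messages" field, but for GPT
# models Cursor puts the conversation under "input" (Responses API style).

def validate_chat_completions_request(payload):
    """Imitate the provider-side check; return an error string or None."""
    if "messages" not in payload:
        return "field messages is required"
    return None

# Shape Cursor sends for Gemini/Claude/Grok (Chat Completions) -- accepted
gemini_style = {
    "model": "gemini-3-flash-preview",
    "messages": [{"role": "user", "content": "hi"}],
}

# Shape Cursor sends for GPT models (Responses API) -- rejected
gpt_style = {
    "model": "gpt-5-mini",
    "input": [{"role": "user", "content": "hi"}],
}

print(validate_chat_completions_request(gemini_style))  # None
print(validate_chat_completions_request(gpt_style))     # field messages is required
```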

Expected Behavior

I expect Cursor to send a request in this format, containing the "messages" field:

{
  "model": "gpt-5-mini",
  "temperature": 1,
  "top_p": 0.95,
  "messages": [
    {
      "role": "system",
      "content": "You are an AI coding assistant…"
    },
    {
      "role": "user",
      "content": "<user_query>\nhi\n</user_query>\n"
    }
  ]
}

Operating System

Windows 10/11

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.3.15 (system setup)
VSCode Version: 1.105.1
Commit: bb2dbaacf30bb7eb9fd48a37812a8f326defa530
Date: 2025-12-30T20:30:37.151Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Windows_NT x64 10.0.26200

For AI issues: which model did you use?

All GPT models, such as gpt-5-mini, gpt-5.1, etc.

Additional Information

This is not a problem with the service provider: Cursor works fine with other models (such as Gemini, Claude, or Grok) through the same provider, but it won't work with any GPT model.

Does this stop you from using Cursor

Sometimes - I can sometimes use Cursor

Hey, thanks for the report.

This is a known issue: when using BYOK (Override OpenAI Base URL) with GPT models, Cursor sends requests in the Responses API format (with the input field) instead of the Chat Completions API format (with the messages field). OpenAI-compatible providers expect the standard format and return an error.

The team is working on a fix, but there’s no ETA yet.

Workarounds:

  1. Turn off BYOK for GPT models and use Cursor’s built-in API (Settings > Models, remove the OpenAI API Key and Override Base URL)
  2. Use a proxy that can convert the Responses API payload into the Chat Completions format (for example, LiteLLM proxy)
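Workaround 2 mostly comes down to renaming the conversation field before forwarding the request. A minimal sketch of that translation is below; the function name `responses_to_chat_completions` is my own, and a real proxy such as LiteLLM also handles streaming, tool calls, and other Responses-only fields:

```python
def responses_to_chat_completions(payload):
    """Translate a Responses API request body into Chat Completions shape.

    Moves "input" -> "messages". Fields like "model", "temperature",
    "top_p", and "user" are accepted by both APIs and are left in place.
    """
    out = dict(payload)
    if "input" in out and "messages" not in out:
        out["messages"] = out.pop("input")
    return out

# The GPT-model request shape from the bug report above
cursor_request = {
    "user": "9388450ab22b70b9",
    "model": "gpt-5-mini",
    "input": [
        {"role": "system", "content": "You are an AI coding assistant,…"},
        {"role": "user", "content": "hi"},
    ],
}

fixed = responses_to_chat_completions(cursor_request)
print("messages" in fixed and "input" not in fixed)  # True
```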

Related thread with the same issue: Requests are sent to incorrect endpoint when using base URL override