While investigating how Cursor calls Azure OpenAI, I discovered at least the following three issues.
The first issue is the API version. When Azure OpenAI settings are changed, a connection test is performed, but during that test the specified API version is “2023-03-15-preview”, which is very old. When using models like o1 or o3, the API version must be “2024-12-01-preview” or later.
The second issue is the request parameters. In the connection test, “max_tokens” is specified, but for reasoning models such as o1 or o3, the “max_completion_tokens” parameter must be used instead of “max_tokens”.
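To illustrate the first two fixes, here is a minimal sketch of a connection-test payload builder that switches parameters by model. The model-name prefixes, function names, and version cutoff are my own assumptions based on this thread, not Cursor’s actual code:

```python
# Model-name prefixes that identify reasoning models (assumption for
# illustration; a real implementation would need a maintained list).
REASONING_MODELS = ("o1", "o3")

def api_version(model: str) -> str:
    """Pick an api-version string new enough for the model under test."""
    # Reasoning models require at least 2024-12-01-preview.
    return "2024-12-01-preview" if model.startswith(REASONING_MODELS) else "2023-03-15-preview"

def build_test_payload(model: str, limit: int = 16) -> dict:
    """Build a minimal chat-completions body for a connection test."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }
    if model.startswith(REASONING_MODELS):
        # Reasoning models reject "max_tokens"; they require
        # "max_completion_tokens" instead.
        payload["max_completion_tokens"] = limit
    else:
        payload["max_tokens"] = limit
    return payload
```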
The third issue concerns the error messages. When an error occurs during the connection test with Azure OpenAI, all error messages seem to be condensed into “Invalid credentials. Please try again.” The first two issues I mentioned are likely causing Azure OpenAI to return more specific error messages instead of just “Invalid credentials. Please try again.” I believe it would be more user-friendly to display the error messages returned directly from the Azure OpenAI endpoint on the screen.
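As a sketch of that third fix, the error body Azure returns could be parsed and shown to the user, falling back to the generic string only when the body is unreadable. The function name and fallback behavior here are hypothetical:

```python
import json

def extract_azure_error(body: str) -> str:
    """Pull the human-readable message out of an Azure OpenAI error body,
    falling back to a generic string if the shape is unexpected."""
    try:
        err = json.loads(body).get("error", {})
        msg = err.get("message")
        if msg:
            code = err.get("code")
            return f"{msg} (code: {code})" if code else msg
    except (ValueError, AttributeError):
        pass  # not JSON, or not the expected {"error": {...}} shape
    return "Invalid credentials. Please try again."
```

With this, a response like `{"error": {"message": "Unsupported parameter: max_tokens", "code": "unsupported_parameter"}}` would surface the actual message instead of the generic one.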
I think these issues can be fixed easily.
I would appreciate it if you could share this information with the development team.
I’m glad I could be of help.
Now o3 has come out, but you still don’t support even Azure’s o1.
I see this error in the network window:
{
  "error": {
    "message": "Unsupported value: 'messages[0].role' does not support 'system' with this model.",
    "type": "invalid_request_error",
    "param": "messages[0].role",
    "code": "unsupported_value"
  }
}
That’s because I think o1 doesn’t support 'system' as a role. It should be 'developer' or 'assistant'.
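A minimal sketch of the per-request change this would need, assuming a simple role substitution is all it takes (the function name and default target role are my assumptions):

```python
def remap_roles(messages: list, target_role: str = "developer") -> list:
    """Rewrite 'system' turns for models that reject that role,
    leaving all other messages untouched."""
    return [
        {**m, "role": target_role} if m.get("role") == "system" else m
        for m in messages
    ]
```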
If only there was a way to change this for every request, it would have worked.
Hey, not currently a way around this, but thanks for flagging - I’ll throw this to the team to see if we can solve it!
People report that ‘o3-mini agent mode is insane’. So can we PLEASE have access to Azure’s o3?
You haven’t fixed this for half a year.
Would it be possible for the team to briefly describe the source of your hatred toward Azure users?
No hatred at all towards Azure users!
We absolutely support using Azure OpenAI through custom API keys. While Azure integration isn’t currently our top priority given other features we’re working on, you can still use O1 within Cursor via an Azure subscription.
I totally get the frustration around O1/O3 support - the technical details shared by others in this thread are super helpful, and I’ve made sure the team has this info to help improve Azure support down the line.
Let’s clarify please:
you can still use O1 within Cursor via an Azure subscription
This is not so. We can’t use o1 within Cursor via an Azure subscription. Cursor gives ‘Invalid credentials’ for o1 models, this includes o1-preview as well as a regular o1 (I tried it just now). It does work for older models though, such as gpt-4o. This mirrors this thread’s title and initial message.
You are right. I apologize; I mistyped it.
That was meant to read ‘via a Cursor Pro subscription’.
I am a Pro user on the latest stable. It appears only gpt-3.5-turbo and gpt-4o work for me, but I believe the model_version for those is hardcoded and old. This seems silly to me: the Cursor development cycle will not be able to keep up with model development. You should open up your model configs so that users can tweak them to work with new models while your tooling catches up. The community could even inform your dev efforts. I like Cursor so far, but its lack of configurability and black-box feel is frustrating. I’m looking at other VS Code extensions because of this.
I also got the same kind of error with the o3-mini model.
{
  "error": {
    "message": "Unsupported value: 'messages[0].role' does not support 'developer' with this model.",
    "type": "invalid_request_error",
    "param": "messages[0].role",
    "code": "unsupported_value"
  }
}
Current version of Cursor:
Version: 0.46.9 (user setup)
VSCode Version: 1.96.2
Commit: 3395357a4ee2975d5d03595e7607ee84e3db0f20
Date: 2025-03-05T08:37:15.270Z
Electron: 32.2.6
Chromium: 128.0.6613.186
Node.js: 20.18.1
V8: 12.8.374.38-electron.0
OS: Windows_NT x64 10.0.22621
Hi. When will it be fixed?
It still only works up to gpt-4o, which makes it inferior to other products.
Unfortunately no updates to share here yet, but will ask the team if this is something we can look at soon.
Thank you for your reply. It is said that o4-mini will also be released soon. I’m looking forward to being able to use it in Cursor.
o4 has been released, and you still don’t support even o1.
Hey, following up on this with the team!
The issue is that the API for these models is not in the standard OpenAI format (it uses the role 'developer', whereas 'system' is the standard), and therefore we have no routing enabled for this format under the hood.
This has been going on for the best part of a year now, and not a single Azure OpenAI model connection is working. Incredibly frustrating!