Not able to use Azure API key

Hey

Can anyone please tell me what I'm doing wrong here?

I have added my base URL as "https://<base_url>.openai.azure.com"
and the deployment name is "gpt-4o-mini".

It's still giving me the message "Upgrade for more usage. Get more credits on higher plans."

I am on the Pro plan as of now!

What should I do?

Can anyone please help? What am I doing wrong here?

Hey, thanks for the report. Your Azure configuration looks correct, and the deployment in Azure is active.

The issue is most likely which model you’re selecting in chat. Azure BYOK doesn’t automatically replace all models. You need to explicitly pick the Azure model from the dropdown.

Please check:

  1. In the model dropdown (where it currently shows Auto), your Azure model gpt-4o-mini should appear. Select that specific one.

  2. Try adding the model manually: Cursor Settings → Models → Add Model → enter gpt-4o-mini (exactly the same name as your Azure deployment).

  3. In the Azure Portal, use Open in playground to confirm the deployment works.

If the Azure model still doesn’t show up in the list after that, restart Cursor.
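Since Azure routes requests by deployment name rather than a generic model name, the name entered in Cursor has to match the Azure deployment exactly. As a minimal sketch (illustrative names, not Cursor's internals) of how the endpoint is assembled from the base URL and deployment name:

```python
# Sketch: build the Azure OpenAI chat-completions URL from the base URL
# and deployment name. A typo in the deployment name produces a URL that
# points at a non-existent deployment, which Azure rejects.
def azure_chat_url(base_url: str, deployment: str,
                   api_version: str = "2024-06-01") -> str:
    """Assemble the per-deployment chat-completions endpoint."""
    return (f"{base_url.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

url = azure_chat_url("https://example.openai.azure.com", "gpt-4o-mini")
# → https://example.openai.azure.com/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-06-01
```

Comparing the deployment segment of this URL against what the Azure Portal shows for the deployment is a quick way to catch a name mismatch.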

I changed the deployment name to gpt-4.1-mini

Base URL - https://tigestai8810248134.openai.azure.com

and I'm getting this error:

{"error":{"type":"client","reason":"invalid_input","message":"model is required","retryable":false}}

I added the model, and it now shows in place of Auto.

I verified in the Azure playground that this model works, and it works with other projects, but not in Cursor.

Request details - Request ID: 05dcfb3d-a8ba-47f8-988e-fdc0bd06951c
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Streaming error\n\nAPI Error:\n\n\n{\"error\":{\"type\":\"client\",\"reason\":\"invalid_input\",\"message\":\"model is required\",\"retryable\":false}}\n\n","additionalInfo":{},"buttons":,"planChoices":},"isExpected":true}
ConnectError: [invalid_argument] Error
at dou.$endAiConnectTransportReportError (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12706:475325)
at JXe._doInvokeHandler (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:13633:23170)
at JXe._invokeHandler (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:13633:22912)
at JXe._receiveRequest (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:13633:21544)
at JXe._receiveOneMessage (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:13633:20361)
at pMt.value (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:13633:18388)
at Ce._deliver (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:49:2962)
at Ce.fire (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:49:3283)
at jyt.fire (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12691:12156)
at MessagePort. (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:15681:18406)

Thanks for the info. Your Azure configuration looks correct, and the gpt-4.1-mini deployment is active.

The “model is required” error is a Cursor-side bug where the model parameter isn’t being passed correctly to the Azure API. Similar issues were fixed in recent versions.

Please check:

  1. Cursor version: Go to Help → About Cursor and share the version. If it's below 2.3.34, update it (Cursor Settings → Beta → Early Access, then restart).

  2. Try Ask mode: At the bottom of the chat, switch from “Agent” to “Ask”. Agent mode has more issues with Azure BYOK.

If it still doesn’t work after updating and switching to Ask mode, send me your Cursor version and we’ll dig deeper. I already have the Request ID.
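For context on what the error text means: "model is required" implies the request body sent upstream is missing (or has an empty) model field, so the provider rejects it before routing. As a rough sketch of the assumed payload shape (not Cursor's actual code; names are illustrative), a well-formed chat-completions body looks like:

```python
import json

def build_chat_payload(model: str, user_message: str, stream: bool = True) -> str:
    """Serialize a minimal chat-completions request body."""
    payload = {
        # The key the error complains about: if "model" is absent or empty,
        # the provider returns invalid_input / "model is required".
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    }
    return json.dumps(payload)

body = build_chat_payload("gpt-4.1-mini", "Hello")
```

That the same error appears for both Azure and Gemini keys is consistent with the field being dropped on Cursor's side rather than a provider-specific configuration problem.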

Request ID: f5fd9c19-229e-45d3-aa4d-13d8901d1db0
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Streaming error\n\nAPI Error:\n\n\n{\"error\":{\"type\":\"client\",\"reason\":\"invalid_input\",\"message\":\"model is required\",\"retryable\":false}}\n\n","additionalInfo":{},"buttons":,"planChoices":},"isExpected":true}

I'm on version 2.3.41 (Universal).

I tried restarting and also tried Ask mode.
This isn't just an issue with Azure; my Gemini keys are also giving me the same error:

Request ID: 04129f73-3ef7-4da2-9f67-a79350a2a159
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Streaming error\n\nAPI Error:\n\n\n{\"error\":{\"type\":\"client\",\"reason\":\"invalid_input\",\"message\":\"model is required\",\"retryable\":false}}\n\n","additionalInfo":{},"buttons":,"planChoices":},"isExpected":true}

Can you please help me?

Hey @deanrie

Can you please check this?

I saw your latest update. The issue is still happening on version 2.3.41, even after switching to Ask mode and restarting. This is clearly more serious than expected, especially since it affects both Azure and Gemini BYOK.

I’ll pass this to the dev team for a deeper investigation.