Azure Claude 4.5 Sonnet Error

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

When I try to connect my Claude 4.5 Sonnet deployment from Azure under the Azure models section, I get an error.

Steps to Reproduce

  1. Deploy a Claude 4.5 Sonnet model in Azure AI Foundry
  2. Add the model in Cursor's IDE settings
  3. Test the model

Expected Behavior

The model connects and can be used in the Agent section like any other model.

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.0.77
VSCode Version: 1.99.3
Commit: ba90f2f88e4911312761abab9492c42442117cf0
Date: 2025-11-13T23:10:43.113Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Darwin arm64 25.1.0

For AI issues: which model did you use?

Sonnet 4.5

Does this stop you from using Cursor

No - Cursor works, but with this issue


Hey, thanks for the report. To understand what’s causing the Azure Claude 4.5 Sonnet connection error, please share:

  • The exact error message you see when testing the model
  • A screenshot of the error and your Azure model configuration in Cursor Settings
  • Azure deployment details:
    • Endpoint URL format
    • API version
    • Deployment name
  • Console errors: open Help → Toggle Developer Tools → Console tab, test the model again, and share any errors that appear
  • Request ID: the request ID generated after you try to use the model

This will help us see whether it’s a configuration issue or an API compatibility problem similar to other Azure model integration issues we’ve seen.

  1. Request failed with status code 404: {"error":{"code":"404","message":"Resource not found"}}
  2. Details:
    1. Endpoint = "https://{myresource}.services.ai.azure.com/anthropic/"
    2. API version = not needed
    3. deployment_name = "claude-sonnet-4-5"
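One way to narrow down whether the 404 comes from Cursor or from the deployment itself is to hit the endpoint directly, outside Cursor. Below is a minimal sketch using only the standard library; the `/v1/messages` path under the `/anthropic` base, the `x-api-key` header, and the `{myresource}` placeholder are assumptions based on the public Anthropic Messages API shape, not verified Foundry behavior.

```python
import json
import urllib.request

# Assumed placeholders: substitute your real resource name and API key.
endpoint = "https://{myresource}.services.ai.azure.com/anthropic/v1/messages"
api_key = "<your-api-key>"

# Minimal Messages API payload; "model" carries the deployment name.
payload = {
    "model": "claude-sonnet-4-5",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "ping"}],
}

req = urllib.request.Request(
    endpoint,
    data=json.dumps(payload).encode(),
    headers={"x-api-key": api_key, "content-type": "application/json"},
    method="POST",
)

# Uncomment once real values are filled in; a 200 here but a 404 from
# Cursor would point at Cursor's request format rather than the deployment.
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))

print(req.get_method(), req.full_url)
```

If this direct call succeeds while Cursor still returns 404, the deployment is healthy and the mismatch is in how Cursor constructs the request.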

Here is the Azure Foundry example that runs perfectly when I use the model:

from anthropic import AnthropicFoundry

# Azure AI Foundry endpoint and deployment details
endpoint = "https://{myresource}.openai.azure.com/anthropic"
deployment_name = "claude-sonnet-4-5"
api_key = "<your-api-key>"

client = AnthropicFoundry(
    api_key=api_key,
    base_url=endpoint,
)

message = client.messages.create(
    model=deployment_name,  # the Azure deployment name, not the raw model ID
    messages=[
        {"role": "user", "content": "What is the capital of France?"}
    ],
    max_tokens=1024,
)

print(message.content)

Thanks for providing those details! I noticed an important mismatch:

In your Cursor settings you have:
https://{myresource}.services.ai.azure.com/anthropic

But in your working Python example you use:
https://{myresource}.openai.azure.com/anthropic

Please update your Cursor settings to use the same endpoint URL format as in your working Python code (with openai.azure.com instead of services.ai.azure.com) and test again.

Also, please share the console errors as I asked above (Help → Toggle Developer Tools → Console tab, then test the model).

Hi! I already tried both endpoints and neither worked.


Thanks for the logs. I can see the issue now: Cursor is showing ERROR_OPENAI and [invalid_argument] errors, which means Cursor is treating your Azure Anthropic deployment as an Azure OpenAI endpoint.

The problem is that Anthropic models in Azure AI Foundry use a different API format than Azure OpenAI. Your Python code works because the AnthropicFoundry client uses the correct Anthropic API format, while Cursor’s Azure integration currently only supports the Azure OpenAI API format.

Fixing this will require engineering work to add full support for Anthropic in Azure AI Foundry. I’ll pass this to the team as a feature request/bug.
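To make the mismatch concrete, here is a rough side-by-side of the two request shapes. This is illustrative only: the paths, headers, and field names are assumptions drawn from the public Azure OpenAI and Anthropic API documentation, not from Cursor's internals.

```python
# Shape of an Azure OpenAI-style request (what Cursor appears to send):
# the deployment is encoded in the path, auth uses an "api-key" header,
# and no "model" field is required in the body.
azure_openai_style = {
    "path": "/openai/deployments/claude-sonnet-4-5/chat/completions"
            "?api-version=2024-06-01",
    "auth_header": "api-key: <key>",
    "body": {"messages": [{"role": "user", "content": "hi"}]},
}

# Shape of an Anthropic Messages API request (what the deployment serves):
# a fixed /v1/messages path, "x-api-key" auth, and "model" plus a
# mandatory "max_tokens" in the body.
anthropic_style = {
    "path": "/anthropic/v1/messages",
    "auth_header": "x-api-key: <key>",
    "body": {
        "model": "claude-sonnet-4-5",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "hi"}],
    },
}

# A client speaking the Azure OpenAI format POSTs to a path the Anthropic
# deployment does not serve, which surfaces as the 404 "Resource not found".
print(azure_openai_style["path"])
print(anthropic_style["path"])
```

The working Python example succeeds because AnthropicFoundry emits the second shape; Cursor's Azure integration emits the first.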


Yes, there’s a problem with Sonnet 4.5; I’m having the same issue. You’re asking for logs, but this isn’t specific to one setup; it’s a general problem. I haven’t been able to use it for a week, and I was spending $200 a day.

Hoping to get full support for Anthropic models through Azure AI Foundry. For anybody looking for a workaround, you can use the Claude Code extension. It’s not the same, but at least I can keep using Claude models after I hit the Cursor Pro usage limits.

Hi @deanrie!

Did you already add this feature request, or should I file it?

Thanks for all the support, by the way.

Hey @Gus, yes, I already passed this to the team.

The team is tracking it in their backlog. Thanks for the report.


Hi! Where can I check on progress for this?

+1…

+1…

+1…

+1…