Azure Anthropic Proxy for Cursor (Claude Opus 4.5)

About a week ago, something changed on Cursor’s side that I couldn’t pin down, and the Azure API configuration stopped working for me.
Around the same time, Azure announced support for Claude Opus 4.5, and several articles mentioned the integration, but it still seems delayed and isn’t officially usable in Cursor yet.

Instead of waiting, I used Cursor-Azure-GPT-5 as a reference and set up my own Azure Anthropic proxy, deployed on Railway, so I could keep using Azure + Claude inside Cursor.
If you’re facing the same issue or want to use Claude via Azure in Cursor, you can try this approach.

Azure Anthropic Proxy for Cursor

A lightweight proxy server that connects Cursor IDE to Azure Anthropic API (Claude) by exposing OpenAI-compatible endpoints.
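
To give a sense of what the proxy does internally, here is a minimal TypeScript sketch of the OpenAI-to-Anthropic translation, assuming an Express server on Node 18+ with a global fetch. The Azure endpoint path, headers, and response mapping below are assumptions based on the public Anthropic Messages API, not the exact code in the repo, and streaming is omitted.

    // Minimal sketch (not the actual repo code) of the OpenAI -> Anthropic
    // translation, assuming Express on Node 18+ with a global fetch.
    // AZURE_ENDPOINT is assumed to accept the Anthropic Messages API shape.
    import express from "express";

    const app = express();
    app.use(express.json());

    app.post(["/chat/completions", "/v1/chat/completions"], async (req, res) => {
      const { messages = [], max_tokens, model } = req.body;

      // OpenAI keeps system prompts inside the messages array; Anthropic
      // expects them in a separate top-level `system` field.
      const system = messages
        .filter((m: any) => m.role === "system")
        .map((m: any) => m.content)
        .join("\n");
      const userMessages = messages.filter((m: any) => m.role !== "system");

      const upstream = await fetch(`${process.env.AZURE_ENDPOINT}/v1/messages`, {
        method: "POST",
        headers: {
          "content-type": "application/json",
          "x-api-key": process.env.AZURE_API_KEY ?? "",
          "anthropic-version": "2023-06-01",
        },
        body: JSON.stringify({
          model: process.env.AZURE_DEPLOYMENT_NAME ?? "claude-opus-4-5",
          system: system || undefined,
          messages: userMessages,
          max_tokens: max_tokens ?? 4096,
        }),
      });
      const data: any = await upstream.json();

      // Map the Anthropic response back into the chat.completion shape Cursor expects.
      res.json({
        id: data.id,
        object: "chat.completion",
        model,
        choices: [{
          index: 0,
          message: { role: "assistant", content: data.content?.[0]?.text ?? "" },
          finish_reason: data.stop_reason === "max_tokens" ? "length" : "stop",
        }],
      });
    });

    app.listen(Number(process.env.PORT ?? 8080));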

:globe_with_meridians: Production URLs

  • Base URL:
    https://cursor-azure-claude-proxy-production.up.railway.app/

  • Health Check:
    https://cursor-azure-claude-proxy-production.up.railway.app/health


:clipboard: Supported Endpoints

  • POST /chat/completions — Main endpoint for Cursor (OpenAI format; see the example request after this list)

  • POST /v1/chat/completions — OpenAI-compatible

  • POST /v1/messages — Native Anthropic format

  • GET /health — Server health check
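
For a quick manual test outside Cursor, the main endpoint can be exercised with the same OpenAI-style body that Cursor sends. The script below is a hypothetical smoke test; substitute your own deployment URL and key.

    // Hypothetical smoke test against POST /chat/completions, using the same
    // OpenAI-style request body that Cursor sends. BASE_URL and the key are
    // placeholders for your own deployment values.
    const BASE_URL = "https://cursor-azure-claude-proxy-production.up.railway.app";

    async function smokeTest(): Promise<void> {
      const res = await fetch(`${BASE_URL}/chat/completions`, {
        method: "POST",
        headers: {
          "content-type": "application/json",
          authorization: `Bearer ${process.env.SERVICE_API_KEY}`,
        },
        body: JSON.stringify({
          model: "claude-opus-4-5",
          messages: [{ role: "user", content: "Say hello in one word." }],
          max_tokens: 32,
        }),
      });
      const data: any = await res.json();
      console.log(res.status, data.choices?.[0]?.message?.content);
    }

    smokeTest().catch(console.error);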


:rocket: How to use with Cursor IDE

  1. Open Cursor Settings

  2. Go to Models / Model Settings

  3. Select Opus 4.5

  4. Set OpenAI Custom API URL to:

    https://cursor-azure-claude-proxy-production.up.railway.app
    
  5. Set OpenAI API Key to the same value as SERVICE_API_KEY on the server

:warning: Important:
The API key in Cursor Settings must exactly match SERVICE_API_KEY.
If it doesn’t match, requests will be rejected with an authentication error.
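
As a sketch of how that check could be enforced on the server side (assumed Express middleware; the actual repo may implement it differently): Cursor sends its OpenAI API Key as a Bearer token, and anything that does not match SERVICE_API_KEY is rejected before the request is forwarded to Azure.

    // Assumed Express middleware enforcing the SERVICE_API_KEY match described
    // above; the actual repo may implement this check differently.
    import type { NextFunction, Request, Response } from "express";

    export function requireServiceKey(req: Request, res: Response, next: NextFunction): void {
      const header = req.headers.authorization ?? "";
      const token = header.startsWith("Bearer ") ? header.slice(7) : header;

      if (!process.env.SERVICE_API_KEY || token !== process.env.SERVICE_API_KEY) {
        // Mismatched keys are rejected before anything is forwarded to Azure.
        res.status(401).json({ error: { message: "Invalid API key" } });
        return;
      }
      next();
    }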


:key: Required Environment Variables

AZURE_ENDPOINT         Azure Anthropic endpoint
AZURE_API_KEY          Azure API key
SERVICE_API_KEY        Auth key for Cursor requests
AZURE_DEPLOYMENT_NAME  Default: claude-opus-4-5
PORT                   Default: 8080
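
A rough sketch of how the server might read these, with the documented defaults (variable names here are illustrative, not necessarily what the repo uses):

    // Illustrative config loading with the documented defaults; the three
    // required variables fail fast if they are missing.
    const config = {
      azureEndpoint: process.env.AZURE_ENDPOINT ?? "",
      azureApiKey: process.env.AZURE_API_KEY ?? "",
      serviceApiKey: process.env.SERVICE_API_KEY ?? "",
      deploymentName: process.env.AZURE_DEPLOYMENT_NAME ?? "claude-opus-4-5",
      port: Number(process.env.PORT ?? 8080),
    };

    for (const [name, value] of Object.entries({
      AZURE_ENDPOINT: config.azureEndpoint,
      AZURE_API_KEY: config.azureApiKey,
      SERVICE_API_KEY: config.serviceApiKey,
    })) {
      if (!value) throw new Error(`Missing required environment variable: ${name}`);
    }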

:locomotive: Deployment (Railway – quick summary)

  • Deploy as a Node.js project

  • Set the environment variables above

  • Railway will auto-deploy and provide a public URL

  • Verify with:

    GET /health
    
    

    Expected response:

    { "status": "ok" }
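
If you prefer a scripted check, a few lines like these confirm the same thing (substitute your own Railway URL):

    // Quick scripted health check; substitute your own Railway URL.
    (async () => {
      const res = await fetch(
        "https://cursor-azure-claude-proxy-production.up.railway.app/health",
      );
      const body: any = await res.json();
      if (res.ok && body.status === "ok") {
        console.log("Proxy is healthy");
      } else {
        console.error("Unexpected health response:", res.status, body);
      }
    })();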
    
    

:memo: License

MIT


:folded_hands: Reference

Inspired by Cursor-Azure-GPT-5, a proxy service that enables Cursor to work with Azure GPT deployments.

Hey, thanks for the detailed solution; this is a great workaround for the community.

Just confirming this is a known limitation. Cursor currently supports only the Azure OpenAI API format, and Anthropic models in Azure AI Foundry use a different API format. The team is tracking this in the backlog.

Your proxy approach could help other users who want to use Azure Anthropic before official support lands.


Hi DonNguyen,
is there a specific repo you can share? Which one should be deployed on Railway to get Claude working?


Oh, I forgot to attach it: GitHub - bcat95/Cursor-Azure-Claude. It’s a proxy server that connects Cursor IDE with the Azure Anthropic API (Claude), transforms the OpenAI format into the Anthropic format, and supports Claude Opus 4.5, streaming, and Railway deployment.


Hey, thanks for confirming!
Yes, that makes sense. I also ran into this limitation and am using this proxy as a temporary workaround.
Hopefully official support for Azure Anthropic models will land soon — in the meantime, I’m glad this approach can help others in the community.
