Cursor cannot connect to Azure OpenAI behind a company VPN

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

I have never been able to add my company’s self-hosted Azure OpenAI LLM to the Cursor IDE; every attempt fails with the error below:

```json
{"error":{"type":"client","reason":"ssrf_blocked","message":"connection to private IP is blocked","retryable":false}}
```

Our Azure OpenAI endpoint sits behind the company’s VPN. What I don’t understand is why other tools, such as Chatwise and the Roo Code and Kilo Code plugins, have no issue connecting to it directly from my local machine.
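
For reference, this is roughly how I verified the endpoint is reachable directly from my machine while on the VPN; the resource name, deployment, API version, and key below are placeholders, not our real values:

```python
# Direct call to the Azure OpenAI chat completions endpoint from my laptop.
# All values below are placeholders -- not our real resource, deployment, or key.
import json
import urllib.request

BASE_URL = "https://<your-resource-name>.openai.azure.com"
DEPLOYMENT = "gpt-5-chat"        # Azure deployment name
API_VERSION = "2024-02-01"       # whichever API version your resource supports
API_KEY = "<azure-api-key>"

url = (f"{BASE_URL}/openai/deployments/{DEPLOYMENT}"
       f"/chat/completions?api-version={API_VERSION}")
body = json.dumps({"messages": [{"role": "user", "content": "ping"}]}).encode()

req = urllib.request.Request(
    url,
    data=body,
    headers={"Content-Type": "application/json", "api-key": API_KEY},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
    print(resp.status, reply["choices"][0]["message"]["content"])
```

A call like this succeeds from my machine, which is why those other tools work and why the Cursor error surprised me.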

I’m unsure how Cursor communicates with the self-hosted LLM. It appears my prompt must be sent to the Cursor server before being forwarded to the LLM provider.

Is there a workaround or another option to resolve this?

Steps to Reproduce

  • Open Cursor Settings
  • Go to the Models page
  • Turn on Azure OpenAI
  • Fill in the Base URL (example format shown below this list)
  • Provide the API key
  • Click Add Custom Model and add the model gpt-5-chat
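
For reference, the values I filled in follow the usual Azure OpenAI shape (placeholders below, not my real resource or key; I’m assuming the custom model name has to match the Azure deployment name):

```python
# Placeholder settings used in the steps above -- not my real endpoint or key.
base_url = "https://<your-resource-name>.openai.azure.com"  # typical Azure OpenAI base URL
api_key = "<azure-api-key-from-the-portal>"
model = "gpt-5-chat"  # assumed to match the deployment name on the Azure side
```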

Expected Behavior

I should be able to communicate with my self-hosted Azure OpenAI LLM directly from my local machine.

Screenshots / Screen Recordings

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.3.34 (Universal)
VSCode Version: 1.105.1
Commit: 643ba67cd252e2888e296dd0cf34a0c5d7625b90
Date: 2026-01-10T21:17:10.428Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Darwin arm64 25.2.0

For AI issues: which model did you use?

gpt-5-chat

Does this stop you from using Cursor

No - Cursor works, but with this issue

Hey @shawnzhang.dev

When you configure Azure OpenAI in Cursor with BYOK (Bring Your Own Key), the request flow is:

Your Machine → Cursor Servers → Azure OpenAI Endpoint

Because requests are proxied through Cursor’s infrastructure, we have SSRF (Server-Side Request Forgery) protection that blocks connections to private/internal IP ranges.
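
To make that concrete, here’s a rough sketch (not our actual implementation) of the kind of check a proxy runs before forwarding a BYOK request; a base URL that points at a private or internal address fails it:

```python
# Illustrative only -- not Cursor's code. Resolve the configured host and refuse
# to connect if any resolved address is private, loopback, or link-local.
import ipaddress
import socket
from urllib.parse import urlparse

def is_ssrf_blocked(base_url: str) -> bool:
    host = urlparse(base_url).hostname
    infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    for *_, sockaddr in infos:
        ip = ipaddress.ip_address(sockaddr[0].split("%")[0])  # drop IPv6 scope id
        if ip.is_private or ip.is_loopback or ip.is_link_local:
            # surfaces as {"type": "client", "reason": "ssrf_blocked", ...}
            return True
    return False
```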

The other IDEs you mention likely make direct connections from your local machine to Azure OpenAI, so they can reach endpoints behind your VPN. Cursor’s architecture routes BYOK requests through our servers, which cannot access your private network.

To use Azure OpenAI with Cursor, your endpoint needs to be publicly reachable (secured via API keys).


Hi @Colin
Thank you, this is clear to me now.

I have a side question about BYOK (Bring Your Own Key) mode, now that I understand requests are proxied through Cursor’s infrastructure. I noticed that Cursor offers two privacy modes: Privacy Mode (Legacy) and the newer Privacy Mode, which indicates that code may be stored for background agents and other features.

I read this article: https://cursor.com/security#codebase-indexing. It seems that Cursor uses a Merkle tree of hashes and only stores embeddings in a vector database.

I want to double-check with you: if I choose the newer Privacy Mode, the Cursor server does not store my code in plaintext; only vector data (embeddings) is stored. Is that correct?

I’ll defer to our notes in the Data Use & Privacy Overview:

If you choose to index your codebase, Cursor will upload your codebase in small chunks to our server to compute embeddings, but all plaintext code for computing embeddings ceases to exist after the life of the request. The embeddings and metadata about your codebase (hashes, file names) may be stored in our database.
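
As a toy illustration of that description (this is not our indexing code), the plaintext chunks are only needed transiently to compute embeddings; what can persist server-side are hashes and embeddings, roughly along these lines:

```python
# Toy sketch, not Cursor's implementation: split a file into chunks, hash each
# chunk, and fold the hashes into a Merkle-style root. Only hashes (and the
# embeddings computed server-side) are kept; the plaintext chunks are discarded.
import hashlib

def chunk_hashes(text: str, chunk_size: int = 512) -> list[str]:
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return [hashlib.sha256(c.encode()).hexdigest() for c in chunks]

def merkle_root(hashes: list[str]) -> str:
    level = list(hashes) or [hashlib.sha256(b"").hexdigest()]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last hash on odd-length levels
            level.append(level[-1])
        level = [hashlib.sha256((a + b).encode()).hexdigest()
                 for a, b in zip(level[::2], level[1::2])]
    return level[0]
```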
