Issue Description:
I am using Cursor to connect to OpenAI through a local proxy server (http://127.0.0.1:55552/openai/v1). However, when attempting to use the Chat functionality, I am encountering the following issues:
- Requests not hitting the local server: When I send a request through the Cursor UI, it never reaches my local proxy server, so the request is not processed by my server.
- Error message: In the Cursor UI, I see the following:
  - Error: "ERROR_OPENAI → Unable to reach the model provider → Request failed with status code 403: Access to private networks is forbidden"
  - Message: "The model my-proxy-llm does not work with your current plan or API key"
- These errors suggest that Cursor is not using my local proxy, but is instead sending requests directly to the official OpenAI services, which results in the 403 error.
- The model "my-proxy-llm" is remapped to gpt-4o-mini in code.
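For context, the remapping mentioned above can be sketched as a simple alias table; the dict and function names below are illustrative, not the actual source:

```python
# Illustrative model-name remapping (hypothetical names, not the real code).
MODEL_ALIASES = {
    "my-proxy-llm": "gpt-4o-mini",
}

def resolve_model(requested: str) -> str:
    """Translate a client-facing alias to the upstream OpenAI model name."""
    return MODEL_ALIASES.get(requested, requested)

print(resolve_model("my-proxy-llm"))  # → gpt-4o-mini
```

Unknown names pass through unchanged, so only the alias is rewritten.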
Steps Taken:
- Confirmed the local proxy is working:
  - I successfully sent a request to http://127.0.0.1:55552/openai/v1/responses using APIPost, and the server responded correctly, confirming the proxy server is functioning properly.
  - Example request body:
    { "model": "gpt-4o-mini", "input": "Write a one-sentence bedtime story about a unicorn." }
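The same verification can be reproduced without APIPost; this is a minimal sketch using Python's standard library, assuming the base URL and request body from above (if nothing is listening on port 55552, it just reports the connection failure instead of crashing):

```python
import json
import urllib.error
import urllib.request

# Proxy base URL and request body taken from this report.
BASE_URL = "http://127.0.0.1:55552/openai/v1"
body = {
    "model": "gpt-4o-mini",
    "input": "Write a one-sentence bedtime story about a unicorn.",
}

req = urllib.request.Request(
    f"{BASE_URL}/responses",
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        print("proxy responded with status", resp.status)
except (urllib.error.URLError, OSError) as exc:
    # A connection-refused error here still shows the request
    # targets the local machine, not a remote service.
    print("proxy not reachable:", exc)
```

A request like this appearing in the proxy's logs is the behavior Cursor's UI requests should match but do not.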
- Checked Cursor Configuration:
  - In Settings → Models → API Keys, I have enabled Override OpenAI Base URL and set it to http://127.0.0.1:55552/openai/v1.
  - I have verified that the OpenAI API Key is correct and properly configured.
Issue Symptoms:
- When I make a simple request from the Cursor UI, the response shows the error "Access to private networks is forbidden", and my server does not receive any request logs.
- This indicates that the request is not hitting my local proxy; instead, Cursor's own internal settings cause it to attempt to connect directly to OpenAI, which results in the 403 error.
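The wording "Access to private networks is forbidden" is consistent with the proxy address being rejected for being non-public: 127.0.0.1 is a loopback address, which any private-network filter would block. A quick illustration with Python's standard `ipaddress` module (the host string is taken from the proxy URL above):

```python
import ipaddress

# Host portion of the proxy base URL from this report.
proxy_host = "127.0.0.1"

addr = ipaddress.ip_address(proxy_host)
# Loopback addresses are also classified as private, so a service
# that forbids private networks would refuse to connect to this host.
print(addr.is_loopback, addr.is_private)  # → True True
```

This is only an illustration of the address classification, not a claim about where exactly Cursor performs that check.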