Azure OpenAI "on premise"

Hi there :wave: (first post, so I think it’s polite to say “hi”),

I’d like to use Cursor with our corporate Azure OpenAI provider.
After reading the docs, I thought it would be straightforward to connect: I created an API key in our corporate Azure portal and entered the baseUrl, deployment, and API key, but the connection wasn’t accepted (invalid credentials).

I guess this feature is only intended for Azure OpenAI deployments hosted in the Azure public cloud, is that right?
I think I’d at least need to be able to configure ?api-version={api-version}.
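For reference, here is a minimal sketch of the URL shape Azure OpenAI expects, including the mandatory api-version query parameter; the resource name, deployment name, and api-version value below are placeholders, not my actual setup:

```python
from urllib.parse import urlencode

def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    """Build an Azure OpenAI chat-completions URL.

    Azure OpenAI rejects requests that omit the api-version query
    parameter, which is the part I can't configure in Cursor.
    """
    base = (
        f"https://{resource}.openai.azure.com"
        f"/openai/deployments/{deployment}/chat/completions"
    )
    return f"{base}?{urlencode({'api-version': api_version})}"

# Placeholder values for illustration only:
print(azure_chat_url("my-resource", "gpt-4o", "2024-02-01"))
# https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-01
```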

It would be super helpful if somebody could at least point me to logs showing which request is performed when “validating the connection”.

Thanks!

Oliver

Hey, just to confirm - in this setup, are the LLMs running on machines local to you, or still running on the cloud?

Cursor should be happy with any publicly accessible LLM provider. But there are some more complex Azure setups where Cursor may not work as intended.

The model is hosted in a corporate Azure account.
I can connect to it from localhost via VPN using an API key. However, I need to specify an api-version parameter when calling the API (see the docs).

I cannot configure this in Cursor, and there seems to be no other app that allows a consumer to determine the available versions.

Unfortunately, the model has to be publicly accessible, as there is no way to configure our servers to tunnel through your VPN to access your Azure instance.

The LLM request is made from Cursor’s servers, not your own machine, so even though the endpoint is reachable from your machine, Cursor is unfortunately unable to use it!
