Custom models usage

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

We’re experiencing an issue with locally deployed LLMs since 10-04-2025. Before this date, we were able to use both types of LLMs: the models included in the team plan we purchased from you, and our local LLMs configured via a custom API key + custom OpenAI base URL.

I would also like to add that, even after capturing traffic on the Nginx instance deployed on our server, we saw no request from Cursor at all. This indicates that Cursor never sent the request and instead immediately displayed the error message we already shared with you: “The model gpt-oss:20b does not work with your current plan or API key.”

Steps to Reproduce

Please find our answers below:
• We are using the Ollama API, which exposes an OpenAI-compatible API. We had been working with it through Cursor for over two weeks.
• We have tried multiple models, such as gpt-oss:20b, Mistral, and LLaMA. All of them were working properly up until Friday, October 3, 2025. The issue started the following Monday.
• We are using open-source LLMs deployed on our own infrastructure. As mentioned, we have tested across multiple models.
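To rule out the server side, the endpoint can be exercised directly, bypassing Cursor entirely. Below is a minimal sketch of such a check; the base URL `https://llm.example.com/v1` and the bearer token are hypothetical placeholders for whatever is configured in Cursor’s custom OpenAI base URL field (Ollama itself does not validate the API key, but OpenAI-style clients require one to be present).

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request against an
    Ollama-backed gateway. base_url is the same custom OpenAI base
    URL configured in Cursor (hypothetical host below)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama",  # placeholder; Ollama ignores the key itself
        },
        method="POST",
    )

req = build_chat_request("https://llm.example.com/v1", "gpt-oss:20b", "ping")
print(req.full_url)
# Sending it (requires the server to be reachable):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

If this request succeeds from the same machine while Cursor still shows the plan/API-key error without any line appearing in the Nginx access log, that points at Cursor rejecting the model client-side rather than a server problem.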

Expected Behavior

Ollama exposes the same API structure as OpenAI, which is what allowed us to work with it for more than two weeks. We expect local models configured via the custom base URL to keep working as before; instead, we started encountering this error on Monday, October 6, 2025.

Screenshots / Screen Recordings

Operating System

Windows 10/11

Current Cursor Version (Menu → About Cursor → Copy)

version: 1.7.40
vscode version: 1.99.3

For AI issues: which model did you use?

Free Ollama models: gpt-oss:20b, mistral-small3.2:latest, etc.

Does this stop you from using Cursor

Yes - Cursor is unusable


Hi Karim - thanks for raising this! We’ve passed this on to our team for further investigation.