Cursor blocking calls to local models

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

I’ve been digging into this, and it seems that Cursor is blocking the calls to my local LLM. The model works when called from other clients, but Cursor insists the current model is not available on my plan (I’m on Pro+, though the plan should be irrelevant here), even though the model is running on my machine and I’ve even added a certificate and a custom local domain for it.

I’ve even tried aliasing codellama:34b to gpt-4, and then Cursor tells me I need to pay for the Ultra plan. This most likely means someone had the moronic idea of hardcoding a list of model names, and the program checks it before even attempting to perform the request.

Steps to Reproduce

Configure any local model and attempt to use it from the Cursor chat
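For context, here is a hypothetical sketch of the setup, assuming the local model is served by Ollama (the report mentions codellama:34b, an Ollama-style model name; the port and paths are Ollama defaults, not something stated in the report):

```shell
# Pull and serve the model locally (Ollama assumed).
ollama pull codellama:34b
ollama serve &   # exposes an OpenAI-compatible API on http://localhost:11434

# Verify the endpoint works outside Cursor:
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "codellama:34b", "messages": [{"role": "user", "content": "hello"}]}'
```

If the curl call returns a completion but the same model fails in Cursor chat, the request is being rejected before it reaches the local endpoint.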

Expected Behavior

I expect Cursor to send chat requests to the configured local endpoint and use the model’s responses.

Screenshots / Screen Recordings

Operating System

MacOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 1.5.11 (Universal)
VSCode Version: 1.99.3
Commit: 2f2737de9aa376933d975ae30290447c910fdf40
Date: 2025-09-05T03:48:32.332Z
Electron: 34.5.8
Chromium: 132.0.6834.210
Node.js: 20.19.1
V8: 13.2.152.41-electron.0
OS: Darwin arm64 24.6.0

Does this stop you from using Cursor

Yes - Cursor is unusable

Hey, local models are not supported, but you can check out this topic for more information:

It’s not that “it’s not supported”; it’s that it’s being blocked on purpose.

Alright, let me explain. Cursor doesn’t support local models because it routes all requests through our backend. Since local models don’t have an external URL, they can’t be reached from there. To use a local model, you need an external URL. You can try using ngrok to route your local requests through an external gateway, but even then, local models are not guaranteed to work. So, yes, it’s not supported.
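The ngrok workaround mentioned above could be sketched like this (hypothetical: it assumes the local model serves an OpenAI-compatible API on port 11434, Ollama’s default, and that Cursor is pointed at the tunnel via its base-URL override setting):

```shell
# Expose the local endpoint through an external gateway.
# The generated URL changes on every run unless you reserve a domain.
ngrok http 11434

# Then paste the printed https://... forwarding URL (plus /v1)
# into Cursor's OpenAI Base URL override in the model settings.
```

Even with the tunnel in place, requests still pass through Cursor’s backend first, which is why this is not a guaranteed fix.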

This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.