Trying to use local models, getting an error after the first edit_file tool call

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

I’m trying to use local models with Cursor via the ‘bring your own API key’ feature.
I’ve set up LM Studio on another computer and exposed its API publicly using Ngrok. The connection works and the model responds successfully.
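
Here’s roughly how I’m verifying the endpoint from outside Cursor. This is only a minimal sketch: the Ngrok URL, API key placeholder, and model name below are illustrative, not my actual values, and it assumes LM Studio’s OpenAI-compatible server behind the tunnel.

```python
# Minimal connectivity check against the tunnelled LM Studio endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-ngrok-subdomain>.ngrok-free.app/v1",  # placeholder URL
    api_key="lm-studio",  # LM Studio's local server doesn't validate the key; any non-empty string should do
)

resp = client.chat.completions.create(
    model="local-model",  # placeholder; use the identifier LM Studio reports for the loaded model
    messages=[{"role": "user", "content": "Say hi in one word."}],
)
print(resp.choices[0].message.content)
```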

Everything works fine until the first edit_file tool call in agent mode. After that, I get the error:
“We’re having trouble connecting to the model provider. This might be temporary - please try again in a moment.”

I’ve tried several prompts of varying complexity, and the error always happens right after the edit_file tool call. The model completes the tool call, a couple of seconds pass, and then I get the error.

Note that the model does complete the edit tool call; it’s just that Cursor doesn’t continue prompting the model after the edit.
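
To narrow it down, I can replay a tool-call round trip against the same endpoint by hand, outside Cursor. Again just a sketch: the edit_file schema below is hypothetical (I don’t know Cursor’s exact payload), and it assumes the model loaded in LM Studio supports tool calling at all.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://<your-ngrok-subdomain>.ngrok-free.app/v1",  # placeholder URL
    api_key="lm-studio",
)

# Hypothetical edit_file tool schema -- NOT Cursor's actual payload,
# just something shaped like it to exercise the round trip.
tools = [{
    "type": "function",
    "function": {
        "name": "edit_file",
        "description": "Apply an edit to a file",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string"},
                "contents": {"type": "string"},
            },
            "required": ["path", "contents"],
        },
    },
}]

messages = [{"role": "user", "content": "Use the edit_file tool to add a comment to main.py."}]

# First request: the model should answer with a tool call.
first = client.chat.completions.create(model="local-model", messages=messages, tools=tools)
call = first.choices[0].message.tool_calls[0]

# Second request: feed a (faked) tool result back -- the step where Cursor stops.
messages.append(first.choices[0].message)
messages.append({"role": "tool", "tool_call_id": call.id, "content": "edit applied"})
second = client.chat.completions.create(model="local-model", messages=messages, tools=tools)
print(second.choices[0].message.content)
```

If the second request succeeds here but fails through Cursor, the problem is more likely in how the follow-up request is handled by Cursor or the tunnel than in LM Studio itself.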

Steps to Reproduce

  • Set up LM Studio on a computer
  • Expose the LM Studio API publicly using Ngrok or LocalTunnel
  • Use the public URL as the OpenAI base URL in Cursor and select a model
  • Ask the model to edit a file. All non-editing tool calls work, but after the very first edit_file tool call, Cursor shows the error above

Expected Behavior

After the edit_file tool call completes, Cursor should continue the agent run.

Screenshots / Screen Recordings

I instructed the AI to make two read_file tool calls and two edit_file tool calls. The read_file calls complete fine, but the run fails right after the first edit_file call.

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 1.4.5
VSCode Version: 1.99.3
Commit: af58d92614edb1f72bdd756615d131bf8dfa5290
Date: 2025-08-13T02:08:56.371Z
Electron: 34.5.8
Chromium: 132.0.6834.210
Node.js: 20.19.1
V8: 13.2.152.41-electron.0
OS: Darwin arm64 24.6.0

Does this stop you from using Cursor?

Yes - Cursor is unusable

Hey, thanks for the report. Since Cursor doesn’t support local models, what you’re doing is a workaround and isn’t guaranteed to be stable. You can try setting it up through OpenRouter, and it should work.
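
For example, a quick sanity check against OpenRouter’s OpenAI-compatible endpoint could look like the sketch below; the model slug is just an illustration, and you’d put the same base URL into Cursor’s OpenAI override settings.

```python
# Verify an OpenRouter key against its OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",  # your OpenRouter key
)
resp = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # example slug; use any model available on your account
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```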

Hi, thanks for the reply.

I’d rather not use OpenRouter, as that would defeat the purpose of using local models. Do you have any idea why this issue is happening, so I might be able to fix it on my end? Perhaps by changing the backend or the tunnel?
