Support local LLMs

Yes. Please allow us to use local models, even if it’s a paid feature.

3 Likes

In the meantime, Continue published a post about using Mistral AI’s Codestral 22B, DeepSeek Coder 6.7B, and Llama 3 8B.

4 Likes

If I buy a new laptop, I’d definitely be looking for something with enough RAM to run some of these locally. Pretty incredible!

Managed to connect a local Coder-V2 by using Ollama and a cloudflared tunnel. The 16B model is super slow on my Mac M1 Pro; it takes a minute to process any request, even the simplest ones.
But the main issue is that it breaks the code: it injects weird markdown, cuts lines, and so on. It seems like it needs some additional instruction and prompt tuning on the Cursor side.
So, technically it’s possible, but as of now it’s far from a ready-to-use state.
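For anyone who wants to try the same route, here is a minimal sketch of that kind of setup. It assumes Ollama’s default port (11434), cloudflared’s quick-tunnel mode, and the deepseek-coder-v2:16b tag from the Ollama library; these are assumptions, not the exact commands used above.

# Pull the model and start Ollama (serves an OpenAI-compatible API on port 11434).
ollama pull deepseek-coder-v2:16b
ollama serve

# In a second terminal, open a quick tunnel; cloudflared prints a public
# https://<random>.trycloudflare.com URL that forwards to the local server.
cloudflared tunnel --url http://localhost:11434

# In Cursor, override the OpenAI Base URL with https://<random>.trycloudflare.com/v1
# and add deepseek-coder-v2:16b as a custom model name.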

1 Like

I am using LM Studio. It gives me the base URL http://localhost:1234/v1, and I tried it in Cursor, but it isn’t working. Here is the curl command that works locally:

curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer lm-studio" \
  -d '{
    "messages": [
      { "role": "system", "content": "You are a test assistant." },
      { "role": "user", "content": "Testing. Just say hi and nothing else." }
    ],
    "model": "bartowski/DeepSeek-Coder-V2-Lite-Instruct-GGUF"
  }'

Can someone please tell me how to use a local LLM with Cursor?

1 Like

You must have an externally reachable address, because Cursor sends its requests from its own servers, so it can’t reach localhost on your machine. You can set this up through various proxying services or find something ready-made. For example, this one works, though I’m not sure about it long-term: GitHub - kcolemangt/llm-router: Access models from OpenAI, Groq, local Ollama, and others by setting llm-router as Cursor's Base URL
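If you would rather wire it up yourself with a plain tunnel instead of a ready-made router, the idea is the same as the Ollama setup earlier in the thread, just pointed at LM Studio’s default port. This is only a rough sketch; the hostname is whatever cloudflared prints for you.

# Expose LM Studio's local server (default port 1234) on a public URL.
cloudflared tunnel --url http://localhost:1234

# Point Cursor's OpenAI Base URL override at https://<printed-hostname>/v1 and add
# the LM Studio model identifier (e.g. the bartowski/... name above) as a custom model.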

1 Like

This needs to happen. Let me experiment with Cursor and the Llama 3.1 I have on my own machine.

I won’t be using Cursor for serious work until this is in place.

2 Likes

Are there any plans to support Model Catalog - LM Studio?

@Jakob Since that reply, DeepSeek-R1 has been released, so the argument that “the state of OSS at the moment” would lead to a “much worse product experience” no longer holds. Can this topic be marked as unanswered?

1 Like

How did you manage to do this? I am using a Cloudflare tunnel to access my Ollama service. A curl command reaches it with no problem, but I can’t get requests from Cursor to show up in the Ollama console.
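One thing I still need to rule out (just a guess on my side): Cursor talks to the base URL using the OpenAI chat-completions format, so for Ollama the base URL has to end in /v1 and the model name in Cursor has to match the Ollama tag exactly. Something like this roughly mimics what Cursor would send through the tunnel; the hostname and model tag are placeholders for your own values.

# Replace the hostname with your tunnel URL and the model with your exact Ollama tag.
curl https://<your-tunnel>.trycloudflare.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-coder-v2:16b", "messages": [{"role": "user", "content": "hi"}]}'

# If this works from outside but Cursor still produces nothing in the Ollama console,
# the base URL or model name configured in Cursor is the likeliest mismatch.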