Make Local Hosting of Llama 3.1 Nemotron 70B Possible

Previously I saw that the Cursor devs didn’t want to introduce open-source local models because of worries about experience degradation. I can understand that. In fact, I agreed with them.

However, with locally hostable models like the one in the title that can match or outperform GPT-4o and Claude 3.5 Sonnet, I think it would be extremely valuable for us to get local model support.

I am trying hard to get Cursor into the standard tooling of my startup right now, and I am getting pushback due to security reasons, especially the inability to run Cursor on local models.

Can we all discuss this? Anyone who also wants this? Cursor team? What do you guys think?

Hi @AntreasAntoniou

I understand your wish to run local models from Cursor, but that’s not possible for several reasons. Cursor receives responses via its server, so everything has to go through it. Additionally, the codebase is stored in cloud storage, which makes this feature impractical. If you were to use a local model, you’d lose access to features like Composer, Apply Code, and Cursor Tab. If you’re willing to give those up, that’s fine by me, but what benefits would you still get from Cursor?

Thank you for your response.

I am a bit confused, however. Doesn’t Cursor support using the Anthropic, Google, and OpenAI APIs? If everything has to go through Cursor’s servers, does that mean Cursor is querying those APIs from its own servers on my behalf? If that’s correct, it should be made clearer in the application, as it might help inform security concerns.

Thank you for enlightening me.

With all that being said, a large share of potential customers would want a locally hosted version of Cursor for security and privacy requirements that simply cannot be bent (because laws are involved), which can make Cursor impossible for them to use.

I love Cursor and want to advocate for it to my own startup team and others, but the lack of locally hosted LLMs makes this basically impossible as soon as strict privacy concerns are introduced.

Our site has a couple of sections dedicated to security. Feel free to check them out.

I agree it would be an extremely valuable feature to have local models.

I have tried Cursor on weekend projects and love it, but I can’t touch it for work because our code can’t be shared with any third-party services.

So I was searching for how I could run Nemotron locally with absolute privacy. Although I have not tried it yet, it seems Continue was built for this: Using Llama 3.1 with Continue | Continue
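For anyone who wants to sanity-check the local half of that setup before wiring up an editor: here’s a minimal sketch that queries a locally served model through Ollama’s HTTP API. It assumes Ollama is running on its default port and that the `nemotron` tag matches whatever build you actually pulled (check Ollama’s model library):

```python
# Smoke test for a locally hosted model via Ollama's HTTP API.
# Nothing in this request ever leaves your machine.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    json={
        "model": "nemotron",  # assumption: the model tag you pulled locally
        "prompt": "Write a docstring for a function that merges two sorted lists.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```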


Can you upvote this post? We should get more attention on this sort of thing if we want the Cursor team to look at it and potentially make changes.

As @AntreasAntoniou said, Cursor has to make model requests either way. Swapping the base URL for the API requests doesn’t change anything if the API adheres to the schema of either OpenAI or Anthropic.

I also don’t see why the requests have to go through Cursor, if this is the reason why they don’t want to let the user customise the model endpoint URL.
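To make that concrete: from the client side, an OpenAI-schema endpoint is a drop-in swap. This is only a sketch of the general pattern (here assuming Ollama’s OpenAI-compatible endpoint on its default port, with a hypothetical local model tag), not how Cursor is actually wired internally:

```python
# The official OpenAI client pointed at a local OpenAI-compatible server.
# Only base_url changes; the request/response schema stays the same.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # e.g. Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # the client requires a key; a local server ignores it
)

completion = client.chat.completions.create(
    model="nemotron",  # assumption: whatever model tag your local server exposes
    messages=[{"role": "user", "content": "Explain this stack trace briefly."}],
)
print(completion.choices[0].message.content)
```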

If this is true, and Cursor hosts their infrastructure on AWS and Fireworks as described in the Security page @deanrie mentioned, I may have an even better solution for us.

I don’t mind paying Cursor to use its awesome features with consistent quality (which not a single LLM provider is delivering lately; Anthropic’s Claude in particular is having worse mood swings than humans).

Why not offer Llama 3.2 as one of the options in Cursor, like cursor-mini? If it works and helps, it’s welcome! And if it’s cheaper for the Cursor team, I’m happy for them to get a better ROI and invest in the product instead of raising prices, as OpenAI is rumored to be planning with o1 next year. Plus, keeping CPU/RAM free on dev machines is a must-have. And bring in more models; we’re already used to switching between them in Cursor, so why not?

For privacy and compliance, have you looked into AWS Bedrock, GCP Vertex, etc.? I’ve set it up myself previously when it was a need from a compliance perspective.
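For reference, here’s roughly what that looks like on the AWS side; the requests stay inside your own AWS account, which is usually what satisfies compliance teams. A minimal sketch using boto3’s Converse API (assumes your credentials are configured, the model is enabled in your Bedrock console, and the model ID/region are verified against AWS’s current list):

```python
# Calling a Llama model through AWS Bedrock so traffic stays in your AWS account.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="meta.llama3-1-70b-instruct-v1:0",  # verify the exact ID for your region
    messages=[
        {"role": "user", "content": [{"text": "Summarise what this function does."}]}
    ],
)
print(response["output"]["message"]["content"][0]["text"])
```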