I’m wondering if there are plans in the future to support local LLMs within Cursor? While today you support GPT-3.5 & GPT-4, it would be great if we could point Cursor at a local LLM on the machine that has been specifically tuned on a particular codebase (or codebases).
No, it’s noticeably worse, but good enough for syntax questions, “what does this error message mean”, “how do these pieces of the web app stack work together”, etc. Definitely worth playing around with via ollama if you have a Mac.
https://ollama.ai/ - via the CLI tool I’ve found mistral and codellama most useful, but they have others. I have a 16 GB M2 and they run pretty well. https://continue.dev/ is a VS Code extension (works in Cursor as well) that lets you use ollama + codellama in a similar way to Cursor - I think that, being just an extension, they are going to get eaten by Copilot X / Cursor.
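A minimal sketch of the ollama CLI workflow described above (assumes ollama is installed from https://ollama.ai; the prompts and web-stack question are just examples):

```shell
# Download a model once, then chat with it entirely locally --
# no internet connection needed afterwards.
ollama pull codellama
ollama run codellama "What does this Python error mean: TypeError: 'NoneType' object is not iterable"

# mistral works the same way and is a good general-purpose alternative.
ollama pull mistral
ollama run mistral "How do an nginx reverse proxy and an app server fit together?"
```

`ollama run` also drops you into an interactive REPL if you omit the prompt argument.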
I had a period of a few weeks where I was frequently without internet and found these very useful.
Fair enough! I assumed it would be difficult because the features are tuned around the capabilities of 3.5/4, so a drop-in replacement with some lesser model would be a poor experience.
Have you reconsidered this feature? Mistral models are getting pretty good at code, and using LM Studio, for example, could be a really amazing alternative solution.
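For context, LM Studio (like ollama) exposes an OpenAI-compatible HTTP server locally, so in principle a configurable base URL is all a client needs to switch between OpenAI and a local model. A minimal stdlib-only sketch of the request body such a server accepts (the `localhost:1234` address is LM Studio’s default local-server port and `mistral` is a placeholder model name — both are assumptions to verify in the app):

```python
import json

# LM Studio's local server speaks the OpenAI chat-completions wire format,
# so the same request body works against api.openai.com or localhost --
# only the base URL differs. (Port is the app's default; confirm it in
# LM Studio's "Local Server" tab.)
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "mistral") -> str:
    """Build the JSON body for POST {BASE_URL}/chat/completions."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })

body = build_chat_request("What does this error message mean?")
print(body)
```

Because the payload is identical either way, supporting local models is largely a matter of letting users point the existing OpenAI integration at a different base URL.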
For the future, it would be nice if it were possible to add base URLs, not just modify the current one. For instance, I sometimes switch back to using my OpenAI key, but then I have to remove the custom base URL and re-enter the key. If possible, please add this.