Allow the use of local LLMs

With the steam local LLMs have been getting lately, several tools now provide the ability to use “Copilot”-style features with models running on your local machine.
I think it’d be a great addition for Cursor to invest in creating this feature.

There are a LOT of AI tools out there, and they all require a subscription; it’s turning into the “streaming wars”: subscription here, subscription there. The cost is starting to become unrealistic when the machines most people run on can handle something like Code Llama or other open-source LLMs that can provide the copilot experience.
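As a sketch of how little plumbing this would need: many local runners (e.g. Ollama or llama.cpp’s server) expose an OpenAI-compatible HTTP API, so an editor could talk to a local model much like a hosted one. The endpoint URL and model name below are assumptions for illustration, not actual Cursor settings.

```python
import json
import urllib.request

# Assumed local endpoint (Ollama's default OpenAI-compatible route) and an
# assumed locally pulled model name; both are illustrative, not Cursor config.
ENDPOINT = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "codellama",
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
}

# Build the request the same way a hosted-API client would; only the base URL
# differs, which is why an editor could swap backends with a single setting.
req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# With a local server running, you would send it with:
#   urllib.request.urlopen(req)
# and read the JSON completion from the response body.
```

The point is that no new protocol is needed: a “local model” option could just be a user-configurable base URL.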
