Cursor - Community Forum
Using Local LLMs with Cursor: Is it Possible?
Feature Requests
fuxi
September 13, 2024, 7:19am
Have you done it successfully?
Related topics:
- Make Local Hosting on LLAMA 3.1 Nemotron 70B Possible (Feature Requests): 8 replies, 670 views, November 12, 2024
- Support local LLM's (Feature Requests): 46 replies, 15477 views, August 10, 2024
- Add Support for Ollama natively (Feature Requests): 6 replies, 3623 views, January 9, 2025
- How to use Cursor with external models (such as Groq LLMs)? (How To): 1 reply, 674 views, September 3, 2024
- Can use cursor without login (Discussion): 1 reply, 233 views, October 9, 2024