Hi, how can we use Ollama or LM Studio to set up a public server with an API key for use with Cursor? It would be a great feature for those of us who prefer to use and test open-source models with our favorite coding tool.
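For context, the rough idea would be a thin auth layer in front of Ollama's OpenAI-compatible endpoint (it serves one at http://localhost:11434/v1), so the public URL requires a key before anything reaches the local model. Below is a minimal sketch of that idea, assuming FastAPI and httpx; `MY_PROXY_KEY`, the port, and the upstream URL are placeholders, not anything Cursor or Ollama actually ships:

```python
# Minimal sketch: an API-key-checking proxy in front of Ollama's
# OpenAI-compatible endpoint. Streaming responses are not handled here.
import os

import httpx
from fastapi import FastAPI, HTTPException, Request
from fastapi.responses import Response

OLLAMA_OPENAI_BASE = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API
PROXY_API_KEY = os.environ.get("MY_PROXY_KEY", "change-me")  # hypothetical key name

app = FastAPI()

@app.post("/v1/{path:path}")
async def forward(path: str, request: Request):
    # Reject callers that don't present our key as a Bearer token.
    auth = request.headers.get("authorization", "")
    if auth != f"Bearer {PROXY_API_KEY}":
        raise HTTPException(status_code=401, detail="invalid api key")

    # Forward the request body untouched to the local Ollama server.
    body = await request.body()
    async with httpx.AsyncClient(timeout=None) as client:
        upstream = await client.post(
            f"{OLLAMA_OPENAI_BASE}/{path}",
            content=body,
            headers={"content-type": "application/json"},
        )
    return Response(
        content=upstream.content,
        status_code=upstream.status_code,
        media_type=upstream.headers.get("content-type"),
    )
```

Run it with something like `uvicorn proxy:app --host 0.0.0.0 --port 8000`, then point the OpenAI base-URL override at `https://your-host:8000/v1` with the proxy key as the API key (assuming the override accepts custom endpoints). LM Studio's local server at http://localhost:1234/v1 could be swapped in as the upstream in the same way.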
Hi, I partially solved this problem by installing the "Continue" extension. It has a lot of features; try it out.
With this approach, can you still use the web-scraping capabilities of cursor.sh?
Yes, you can: Context Providers | Continue
Curious about your development setup. From my quick look over the Continue docs, a lot of the functionality overlaps between Cursor and Continue. What advantages do you get with Cursor + the Continue extension over VS Code + Continue?
I use Continue exclusively for testing LLMs.
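For quick sanity checks outside any editor, you can also hit Ollama's OpenAI-compatible endpoint directly with the standard openai client. A small sketch, where "llama3" just stands in for whichever model you have pulled locally:

```python
# Quick test of a local model through Ollama's OpenAI-compatible API,
# independent of Cursor or Continue. "llama3" is an example model name.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the client requires one
)

resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Write a Python one-liner that reverses a string."}],
)
print(resp.choices[0].message.content)
```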
I’ve created an experimental backend to support this request - let me know how it works for you: