I was able to run large models locally via Ollama, connected to Cursor, but after this update I no longer can.
Why? Is this not allowed?
Or maybe the functionality has changed and my previous approach is no longer feasible.
Hey, can you describe the method you used? We’ll try to figure it out.
Okay, let me explain it to you.
As you can see,
I have this version of Cursor,
and I have a Codestral model running on Ollama.
I can add the local model directly to Cursor like this and it works fine.
But on my other computer, I can’t do that.
The update changed too many things and affected my work.
Hey, actually, it doesn’t work like that. Local models from Ollama don’t work by default in Cursor, because all requests go through our servers. The reason the Codestral model works for you is that it runs through Cursor itself; it’s not your local model. To use a local model in Cursor, you need to set up an API gateway, for example using ngrok.
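For anyone hitting the same problem, here is a rough sketch of the gateway approach described above, assuming Ollama's default port (11434) and ngrok as the tunnel. The exact names of the Cursor settings may differ between versions, and the ngrok URL shown is a placeholder:

```shell
# Start the local Ollama server (serves an OpenAI-compatible API under /v1
# on port 11434 by default).
ollama serve &

# Expose it publicly through an ngrok tunnel. Rewriting the Host header is
# often needed so Ollama accepts the forwarded requests.
ngrok http 11434 --host-header="localhost:11434"

# ngrok prints a public URL such as https://<random-id>.ngrok-free.app (placeholder).
# In Cursor: Settings -> Models -> enable the OpenAI Base URL override and set it to
#   https://<random-id>.ngrok-free.app/v1
# then add your local model's name (e.g. "codestral") to the model list and verify.
```

Since Cursor's backend makes the model requests, the tunnel is what lets those requests reach the machine where Ollama is running; features that depend on Cursor's own models may still not work with this setup.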
Oh, so even if I delete Ollama, it won’t have any impact, is that what you mean?
Yes, you can verify this by closing the Ollama application.
OMG, you are right, I am such a fool. I deleted Ollama and nothing changed.