Setup Ollama (local model) in Cursor

Hi all,

I’m running a local model as described below. Since I’m on the free plan, I assume I can only use Ask/Plan mode, right?

What I did: turned off all model toggles, added a custom model named “deepseek-r1:70b”, turned off all API key toggles, enabled only “Override OpenAI Base URL”, and set the URL to https://45e0ff4fbaa8.ngrok-free.app/v1

It says: “Invalid model
The model deepseek-r1:70b does not work with your current plan or api key”

Please let me know if I did anything wrong. Thanks
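Before assuming Cursor is at fault, it can help to confirm the tunnel actually serves Ollama’s OpenAI-compatible API. A quick sketch (using the ngrok URL and model name from the post above; substitute your own, and note these commands depend on your tunnel being live):

```shell
# List the models Ollama exposes through its OpenAI-compatible endpoint.
# Replace the ngrok URL with your own tunnel address.
curl https://45e0ff4fbaa8.ngrok-free.app/v1/models

# Send a minimal chat completion to the local model.
curl https://45e0ff4fbaa8.ngrok-free.app/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1:70b",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If both commands return valid JSON, the endpoint itself is working, and the “Invalid model” error is on Cursor’s side. Also double-check that the custom model name in Cursor matches the Ollama tag exactly, including the `:70b` suffix.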


I have the same issue. I wish I had something to add for you, but it seems to be an issue with the latest version of Cursor.

I also have the same issue. Hopefully we can connect our own GPUs soon

Hey! Any updates? I am facing the same issue.
I am trying to run Nemotron with Cursor.

Seems like Cursor doesn’t work with local Ollama anymore. Ollama now supports cloud models like glm-4.7:cloud. Anyway, I’m moving back to Claude Code now.