Using Local LLMs with Cursor: Is it Possible?

Yes, it is possible. To set it up:

  1. Enter the model name exactly as your server expects it (it is case-sensitive and whitespace-sensitive).
  2. Enable the custom OpenAI API option.
  3. Override the OpenAI base URL with your server's URL (it needs to be publicly accessible).
  4. Select the model name in Cmd+K or chat.
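Before pointing Cursor at your server, it can help to confirm the server really exposes an OpenAI-style `chat/completions` endpoint under the base URL you plan to enter. A minimal sketch of how such a request is shaped (the base URL, tunnel domain, and model name below are placeholders, not values from Cursor itself):

```python
import json
import urllib.request

def chat_completions_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request against a custom base URL."""
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({
        "model": model,  # must match the name you typed into Cursor exactly
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: a local server exposed through a public tunnel (placeholder URL).
req = chat_completions_request("https://example.ngrok.io/v1", "llama3", "Hello")
print(req.full_url)  # https://example.ngrok.io/v1/chat/completions
```

If sending this request (e.g. with `urllib.request.urlopen`) returns a JSON body with a `choices` field, the same base URL should work in Cursor's override setting.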