A lot of the magic of Cursor happens on our servers, which process your requests before sending them off to an LLM. Locally hosted models running on your own machine therefore won't work in the current setup, because our servers can't reach them!
If you are happy to make your self-hosted LLM public-facing (secured with an API key), you should be able to override the OpenAI base URL to point at your own endpoint and get this working.
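For this to work, your endpoint needs to speak the OpenAI chat-completions wire format, since Cursor will send requests in that shape. Here's a rough sketch of what such a request looks like, using only the Python standard library; the URL, API key, and model name are placeholders you'd replace with your own:

```python
import json
import urllib.request

# Placeholders: substitute your own public endpoint, key, and model name.
BASE_URL = "https://llm.example.com/v1"
API_KEY = "sk-your-key"


def chat_request(prompt: str) -> urllib.request.Request:
    """Build a chat-completions request in the OpenAI wire format."""
    body = json.dumps({
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )


req = chat_request("Hello")
# Actually sending the request is omitted here, since the host is a placeholder.
```

If a plain `curl` against `/v1/chat/completions` on your server returns a valid completion with that same header and body shape, the URL override should behave the same way.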