Hey, as @deanrie said, this isn’t something supported by Cursor right now, as it’s not your client that makes the final request to the LLM, but our backend servers.
Because of that, our backend cannot route requests to a locally hosted LLM. While you may be able to expose a local LLM publicly (e.g. with ngrok), this isn’t something we support or recommend, due to the security issues that could come with it!