Right now, Cursor supports the “Override OpenAI Base URL” option in the model settings, but it has limitations: it requires a publicly accessible HTTPS endpoint, and direct connections to localhost or a LAN IP aren’t supported yet.
In addition, all requests are routed through Cursor’s servers, which build the prompts.
This is indeed a problem. We also need the ability to connect directly to a local model without relaying through Cursor’s servers; skipping that relay would make requests to the local model much faster.
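Until direct local connections are supported, one common workaround is to expose the local model behind a public HTTPS tunnel so the Base URL override can reach it. This is only a sketch: it assumes Ollama serving its OpenAI-compatible API on the default port 11434 and an ngrok account; any other OpenAI-compatible local server and tunnel tool would work the same way.

```shell
# Start the local model server (Ollama exposes an OpenAI-compatible
# API at http://localhost:11434/v1 by default).
ollama serve &

# Open a public HTTPS tunnel to the local port; ngrok prints a
# generated https:// URL on startup.
ngrok http 11434

# In Cursor, set "Override OpenAI Base URL" to the printed URL plus /v1,
# e.g. https://<generated-subdomain>.ngrok-free.app/v1
```

Note that even with this setup, traffic still round-trips through Cursor’s servers for prompt construction, so it does not deliver the latency win that a true direct localhost connection would.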