Hey, unfortunately, Cursor’s servers do a lot of the heavy lifting when communicating with the LLM, so it’s unlikely there will be offline / local model support in the near future!
Sorry to be the bearer of bad news here, though. I do sympathise that there are workflows with Cursor that could benefit from offline / local LLM support!