I’m trying to use ngrok to forward Cursor’s traffic to a local LLM. In the Cursor settings, after entering the dynamic ngrok address as the OpenAI base URL, clicking Verify succeeds, but when I actually use the chat it times out before the response arrives from my local LLM.
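For context, this is roughly my setup (a sketch; the local server and port are assumptions on my side, e.g. an OpenAI-compatible server like LM Studio listening on port 1234):

```shell
# Local OpenAI-compatible server is assumed to listen on port 1234.
# Expose it over HTTPS with ngrok:
ngrok http 1234

# Then the generated https://xxx.ngrok-free.app address goes into
# Cursor's "Override OpenAI Base URL" field, and Verify succeeds.
```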
The error in Cursor is: “Error reaching xxx.ngrok-free.app”
In ngrok’s traffic inspector, I can see the request from Cursor and the response once it completes.
Any idea how to solve this? Do I need to upgrade to a paid ngrok plan, or is there another way to do the port forwarding?
Thanks