Custom OpenAI endpoint not working

I’m trying to run Cursor through a local LiteLLM proxy (partly to have a fallback once I run out of ‘fast’ queries), but it doesn’t seem to work. See screenshot:

The ‘Verify’ button works, but the chat doesn’t actually use the endpoint. With a packet sniffer I can see traffic going to the local proxy when I click ‘Verify’, but not during normal Cursor usage. I’m assuming the custom endpoint is being ignored.
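
In case it’s relevant, my LiteLLM setup looks roughly like this (a minimal sketch; the model alias, key reference, and port are placeholders for my actual config):

```yaml
# config.yaml — minimal LiteLLM proxy config (sketch)
model_list:
  - model_name: gpt-4o            # alias Cursor would request
    litellm_params:
      model: openai/gpt-4o        # upstream provider/model
      api_key: os.environ/OPENAI_API_KEY
```

Started with `litellm --config config.yaml --port 4000`, and the base URL in Cursor set to `http://localhost:4000`. Hitting the proxy directly with `curl http://localhost:4000/v1/chat/completions` works fine, which is why I think the problem is on Cursor’s side.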