Hi,
I am hosting Claude 3.5 Sonnet on Vertex AI and I want to set it as the LLM model, but just setting the Google Vertex API key is not enough. How can I specify which model to use with which provider?
You can use LiteLLM to expose Google Vertex AI’s interface as an OpenAI-compatible API, and then point Cursor at it. I can connect and chat normally, but it cannot generate code; I am still investigating why.
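For reference, here is a minimal sketch of what a LiteLLM proxy config for this setup could look like. The project ID, region, and model version string below are placeholders, so check LiteLLM’s Vertex AI provider docs for the exact model name your deployment uses:

```yaml
# config.yaml — LiteLLM proxy routing a Vertex AI-hosted Claude model
model_list:
  - model_name: claude-3-5-sonnet          # name clients will request
    litellm_params:
      model: vertex_ai/claude-3-5-sonnet@20240620  # provider/model (placeholder version)
      vertex_project: my-gcp-project-id    # placeholder GCP project
      vertex_location: us-east5            # placeholder region
```

Then start the proxy with `litellm --config config.yaml` and set Cursor’s OpenAI base URL to the proxy address (e.g. `http://localhost:4000`), using `claude-3-5-sonnet` as the model name.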
Have you resolved this?