Hi,
as far as I know, one cannot use OpenAI and OpenRouter models at the same time, because using OpenRouter models requires repurposing the fields (API key and base URL) that are designated for OpenAI. It would be nice to have an additional field where one can configure services like OpenRouter separately.
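For context, the reason those fields get reused is that OpenRouter exposes an OpenAI-compatible endpoint, so overriding the OpenAI base URL and key is currently the only way to reach it. Here is a minimal sketch of the equivalent call outside Cursor, using the OpenAI Python SDK (the model ID is just an illustrative OpenRouter ID):

```python
from openai import OpenAI

# OpenRouter speaks the OpenAI-compatible API, which is why Cursor's
# OpenAI key / base URL fields are the ones being overridden.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key="sk-or-...",                      # an OpenRouter key, not an OpenAI key
)

# Illustrative model ID; OpenRouter namespaces models by provider.
response = client.chat.completions.create(
    model="anthropic/claude-3.5-sonnet",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```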
Hey, you can see my comment on this here, but we have already logged this request internally and will look at adding it in a future update!
Hey, I don’t think we have a good solution for having both an OpenAI API key and a custom LLM provider working at the same time right now, but I’m adding this to an internal log to see if it’s something we can add in the future.
It gets even worse, because you cannot use APIs from non-OpenAI providers at all. I think this is a serious design flaw, and it should probably be prioritized.
Now that I double-check, this has stopped working entirely; it is not even working with non-OpenAI models. Even if I disable the OpenAI key (with the modified base URL pointing to OpenRouter), trying to use Google models still fails because Cursor keeps sending requests to the custom OpenAI base URL. In other words, once you modify the base URL, Cursor will always route requests to it instead of to the appropriate endpoints for Anthropic, Google, etc.
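To illustrate the failure mode (a hypothetical sketch, assuming Cursor simply reuses whatever base URL is configured for every request): a request for a Google model ends up at the OpenRouter endpoint, where the native Google model name is not recognised, so the call fails.

```python
from openai import OpenAI

# Hypothetical illustration of the reported behaviour: once the base URL is
# overridden, every request goes to that endpoint, regardless of provider.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # the overridden "OpenAI" base URL
    api_key="sk-or-...",
)

# A native Google model name sent to OpenRouter is not a valid ID there,
# which mirrors the failure seen in Cursor when selecting a Google model.
try:
    client.chat.completions.create(
        model="gemini-1.5-pro",  # native Google ID, not OpenRouter's namespaced form
        messages=[{"role": "user", "content": "Hello"}],
    )
except Exception as err:
    print("Request failed:", err)
```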