Hi,
Is there a way to configure multiple providers for the same LLM model?
For example (some use cases can be combined):
- I have the Pro plan, but want to use my own OpenAI API key once I run out of premium requests.
- I want to use my own API key as well as an OpenRouter model.
- I want to use my own API key for OpenAI, OpenRouter models, and other proxies for models such as DeepSeek or other custom providers.
I think the solution would be to have custom models with individual provider settings for each; a rough sketch of what I mean is below.
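
Roughly, I'm imagining something like this (TypeScript only to show the shape; every field name here is made up and not an existing setting, it's just to illustrate the idea of per-model provider settings with a fallback order):

```ts
// Purely illustrative sketch -- none of these fields exist today,
// they only show the kind of per-model provider settings I have in mind.
interface ProviderConfig {
  provider: "builtin" | "openai" | "openrouter" | "custom"; // where requests go
  baseUrl?: string;       // override for proxies / custom providers
  apiKeyEnvVar?: string;  // which environment variable holds the API key
}

interface CustomModel {
  model: string;                // e.g. "gpt-4o"
  providers: ProviderConfig[];  // array order = fallback order once a provider is exhausted
}

// Example: use the Pro plan's requests first, then my own OpenAI key,
// then route through OpenRouter.
const models: CustomModel[] = [
  {
    model: "gpt-4o",
    providers: [
      { provider: "builtin" },
      { provider: "openai", apiKeyEnvVar: "OPENAI_API_KEY" },
      {
        provider: "openrouter",
        baseUrl: "https://openrouter.ai/api/v1",
        apiKeyEnvVar: "OPENROUTER_API_KEY",
      },
    ],
  },
];
```

That way each of the use cases above is just a different providers list for the same model.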