Issue: Inability to Use Both OpenAI and OpenRouter API Keys

My company pays for my OpenAI API usage, but I also need to use OpenRouter models.

Pain points:

  • Switching requires:
    • Deactivation/activation of models.
    • Changing the base URL.
  • These steps are tedious to repeat every time and involve numerous clicks: the settings live in a separate tab, and the base URL is hidden inside a disclosure container.

Implementing a solution seems to require only:

  • A simple UI change.
  • From my understanding, a simple revision of the request implementation (see the rough sketch after this list).
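
Roughly what I have in mind on the request side, as a sketch only: both providers speak the OpenAI-compatible chat completions API, so a per-request choice of client/base URL would be enough. The routing rule and model names below are just placeholders, not how Cursor actually does it.

```python
# Sketch only: both endpoints are assumed to speak the OpenAI-compatible
# chat completions API. The routing rule and model names are placeholders.
import os
from openai import OpenAI

openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
openrouter_client = OpenAI(
    api_key=os.environ["OPENROUTER_API_KEY"],
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
)

def complete(model: str, prompt: str) -> str:
    # Placeholder routing rule: OpenRouter model IDs carry a vendor prefix
    # like "anthropic/claude-3.5-sonnet", plain OpenAI IDs do not.
    client = openrouter_client if "/" in model else openai_client
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(complete("gpt-4o-mini", "Hello"))                  # routed to OpenAI
print(complete("anthropic/claude-3.5-sonnet", "Hello"))  # routed to OpenRouter
```
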

This issue is enough of a challenge that it’s making me reconsider Cursor as my primary IDE. Fixing it would be a major UX improvement, especially weighed against the cost of implementing it.

Thank you, and great work guys.

Did you find a solution for this?

Hey, I don’t think we have a good solution for having both an OpenAI API key and a custom LLM provider working at the same time right now, but I’m adding this to an internal log to see if it’s something we can add in the future.

Thanks for your answer. It would also be very cool to have a way to use APIs from other providers, for example DeepSeek, and maybe even local endpoints like Ollama or LM Studio.
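
For what it’s worth, all of those expose OpenAI-compatible endpoints, so on the API side it mostly comes down to a configurable base URL. A quick sketch (base URLs/ports are the providers’ defaults as far as I know; model names are placeholders):

```python
# Sketch: the same OpenAI-compatible client pointed at different base URLs.
# Base URLs/ports are the providers' defaults as far as I know; model names
# are placeholders for whatever you have configured locally.
from openai import OpenAI

deepseek = OpenAI(api_key="sk-...", base_url="https://api.deepseek.com/v1")
ollama = OpenAI(api_key="ollama", base_url="http://localhost:11434/v1")      # local Ollama server
lmstudio = OpenAI(api_key="lm-studio", base_url="http://localhost:1234/v1")  # LM Studio's default server port

for client, model in [(deepseek, "deepseek-chat"), (ollama, "llama3.1"), (lmstudio, "local-model")]:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "ping"}],
    )
    print(reply.choices[0].message.content)
```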