Add support for OpenAI-compatible models with different customized APIs

There are many LLM providers compatible with the OpenAI API, such as Groq.com, OpenRouter.ai, and DeepSeek. I hope support can be added for using these models in addition to GPTs, with customized API keys and base URLs.
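For context, these providers all expose the same Chat Completions interface, so the stock OpenAI client can already talk to them once it is pointed at a different base URL. A rough Python sketch, assuming the official openai package; the key is a placeholder, and the endpoint and model name are taken from the config example below:

# Minimal sketch: the same OpenAI client works against an
# OpenAI-compatible provider once base_url and api_key are swapped.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",       # placeholder, not a real key
    base_url="https://api.deepseek.com",   # provider's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)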

Ideally, we could set a different API key and endpoint for each model. In continue.dev, this is done by editing the JSON config file:

{
  "models": [
    {
      "model": "deepseek-coder",
      "title": "model 1",
      "systemMessage": "You always respond in English.",
      "apiKey": REDACTED,
      "provider": "openai",
      "apiBase": "https://api.deepseek.com"
    },
    {
      "model": "deepseek-chat",
      "title": "model 2",
      "systemMessage": "You always respond in English.",
      "apiKey": REDACTED,
      "provider": "openai",
      "apiBase": "https://api.groq.com"
    }
  ]
}
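To make the request concrete, here is a rough sketch of how a tool could consume a config like the one above and route requests to the right provider. This is purely illustrative, not continue.dev's actual implementation, and the config.json filename is an assumption:

# Sketch only: load a "models" array like the one above and build one
# OpenAI-compatible client per entry, keyed by its "title".
import json
from openai import OpenAI

with open("config.json") as f:   # hypothetical path to the config above
    config = json.load(f)

clients = {
    m["title"]: OpenAI(api_key=m["apiKey"], base_url=m["apiBase"])
    for m in config["models"]
}

# Route a request to "model 1" (the deepseek-coder entry), passing its
# systemMessage along as the system prompt.
entry = next(m for m in config["models"] if m["title"] == "model 1")
reply = clients["model 1"].chat.completions.create(
    model=entry["model"],
    messages=[
        {"role": "system", "content": entry["systemMessage"]},
        {"role": "user", "content": "Write hello world in Python."},
    ],
)
print(reply.choices[0].message.content)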