Support for OpenRouter.ai

Ok, I think this is starting to make more sense.

It looks like even if the ‘verify’ step errors out, the connection can still work - for both OpenRouter (when it fails) and Groq (using just https://api.groq.com/openai/v1 as the base URL).

So I guess errors can be ignored in the short term? :man_shrugging:
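For anyone wanting to sanity-check this outside the plugin, here's a minimal sketch (Python, using the official `openai` package v1+) of the kind of OpenAI-compatible request that works against Groq's base URL regardless of what the verify step reports. The model name and the `GROQ_API_KEY` env var are assumptions - substitute whatever you actually use:

```python
# Minimal sketch: any OpenAI-compatible client can point at Groq by
# overriding the base URL. Assumes the `openai` package (v1+) is
# installed and GROQ_API_KEY is set in the environment.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
    api_key=os.environ["GROQ_API_KEY"],
)

# Model name is a placeholder; use whichever model Groq currently serves.
response = client.chat.completions.create(
    model="llama-3.2-3b-preview",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

If this returns a completion, the connection itself is fine and the verify error is cosmetic.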

At the moment, whenever anything uses OpenAI’s connection, I’m turning off all models that aren’t part of the custom connection - it’s hard to tell whether that’s always required. The behaviour is a bit inconsistent and fussy.

@deanrie if/when there’s a chance to implement updates to the model settings that can group models by provider, it’d be a huge win for cases like this.

Also, an interesting side note: it looks like Llama 3.2 tends to always include chain of thought and source references in its responses (via either Groq or OpenRouter).