I have installed Qwen3-coder with Ollama and exposed it via an OpenAI-compatible service on localhost.
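For reference, the local endpoint itself can be sanity-checked directly. This is a minimal sketch assuming Ollama's default port 11434 and that the model was pulled under the tag `qwen3-coder`; adjust both if your setup differs:

```shell
# Query Ollama's OpenAI-compatible endpoint directly
# (assumes the default port 11434 and the model tag "qwen3-coder")
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3-coder",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If this returns a normal chat completion, the local service side is working and the problem lies in how the chat client resolves the model.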
But no matter what I do, when I pick the model in the chat window it says the model is not available, either because my plan doesn't support it or because I don't have permission.
Can this be done? Am I missing something?