Please allow the maximum number of tokens supported by each model

I’m not sure if it is possible, but I have added a feature request to display the number of tokens used in a chat, as an indicator of whether the chat has grown too long: