It seems like at a certain point, the long context chat starts to refuse to continue. I am set up for optional runs and was charged for 6 extras, but after that it stopped working. Do I have to start a new chat? I have been judicious with the context window and selective about which files it looks at, staying under the 200k mark. A clear count of the remaining tokens would be a nice feature, and something I mentioned in another thread is that when the long context chat is generating, the entire IDE slows down significantly. Thanks!
Related feature request:
Yes, the token count appears next to the file or folder when you add it to the chat, but a master token counter would be great for both long context and normal chat to keep track of where we are. Also, the long context chat really bogs down the entire IDE, which I mentioned before. It seems like there should be a way to separate the AI features from regular IDE usage so you can keep working while the chat is generating.
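Until a built-in master counter exists, one rough workaround is to estimate token counts locally before attaching files. This is just a sketch, assuming the open-source tiktoken library and its cl100k_base encoding, which may not match the tokenizer the IDE's model actually uses, so treat the numbers as approximate:

```python
# Rough local estimate of how many tokens a set of files will consume
# before attaching them to a long context chat.
# Assumption: tiktoken with the cl100k_base encoding approximates the
# model's real tokenizer closely enough for budgeting purposes.
import sys
import tiktoken

BUDGET = 200_000  # the 200k mark mentioned above

def estimate_tokens(paths):
    enc = tiktoken.get_encoding("cl100k_base")
    total = 0
    for path in paths:
        with open(path, "r", encoding="utf-8", errors="ignore") as f:
            count = len(enc.encode(f.read()))
        print(f"{path}: ~{count} tokens")
        total += count
    print(f"total: ~{total} / {BUDGET} tokens")
    return total

if __name__ == "__main__":
    estimate_tokens(sys.argv[1:])
```

Running it over the files you plan to attach (e.g. `python estimate_tokens.py src/*.py`) gives a ballpark figure to compare against the budget before starting a long context chat.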