Basically what I want to ask is: I've been using some very long chats for my development, so does that affect the model's performance? BTW, the model I use is Sonnet 4.5.
Obviously, any model is gonna take a hit as the context gets longer, but larger models like Claude 4.5 Sonnet tend to have much better "retention" — the max context limits give you a rough reference point. That said, some models definitely start hallucinating or forgetting things long before they actually hit that token limit.
Yes! Please avoid long chats — start a new chat once you've already had a fair number of interactions.
Not only does a long chat use more credits, the results are usually not as good.
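One rough way to act on that advice: keep a running estimate of how many tokens a conversation holds and restart once it crosses a soft budget. This is just a sketch — the 4-characters-per-token heuristic and the 50k budget are illustrative assumptions, not values from any Claude API:

```python
# Illustrative sketch: estimate conversation size and flag when to restart.
# APPROX_CHARS_PER_TOKEN and TOKEN_BUDGET are assumed values, not API facts.

APPROX_CHARS_PER_TOKEN = 4
TOKEN_BUDGET = 50_000  # hypothetical soft limit, well below the model's hard cap

def estimate_tokens(messages):
    """Approximate total tokens across all messages in the chat."""
    total_chars = sum(len(m) for m in messages)
    return total_chars // APPROX_CHARS_PER_TOKEN

def should_start_new_chat(messages, budget=TOKEN_BUDGET):
    """True once the running conversation exceeds the soft budget."""
    return estimate_tokens(messages) > budget

chat = ["Refactor this function to use a dict lookup instead of if/elif."] * 10
print(should_start_new_chat(chat))  # a short chat stays under budget: False
```

The exact numbers matter less than the habit: checking a budget well below the hard context limit keeps you away from the zone where quality degrades.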