Anyone know the context size?
Note the model is capable of 128k context, but Cursor only uses 10k (see "Why are the models on using your own keys so much better? - #6 by arvid220u"). Not sure how much of that is input vs. output.
I think it is combined. Once I put in a super long custom rule, and the output got cut off super early.
I made a feature request for this, lads: long context mode for GPT-4o.
Please like it so it gets attention.