Is claude-3-5-sonnet-200k always a slow request?

I am using long context chat and chose "claude-3-5-sonnet-200k" because I want to chat with Claude 3.5 with more context. But when I use this model and send a request, my Premium models usage does not increase.
In addition, I found that it takes a long time for the request to return: usually 2-3 minutes, and sometimes more than 10 minutes or a timeout. If I switch to normal context chat and use claude-3.5-sonnet, a similar question returns much faster and my Premium models usage increases with each request.

I need some clarification on whether the claude-3-5-sonnet-200k model is ALWAYS tied to slow requests, which seems to be the case for me.

You get 10 free requests per day for the claude-3-5-sonnet-200k model. After that, it switches to slow mode. If you need fast requests for this model beyond that, you'll need to pay 20 cents per request.

How do you select long context in v0.43.5? The dropdown menu in the chat pane has been removed.


This answer is marked as a solution, but I can't find a way to select any of the long context models in the chat pane after the update. Would you mind explaining how we can access the 10 claude-3-5-sonnet-200k requests per day that are included in the Pro subscription?

Have you found the answer to this?

No, they removed the long context models from Cursor, and it doesn't seem like they plan on bringing them back.

How are they going to handle AI assistance for multi-modal code that needs long context? This seems counter-productive.