Just upgraded to 0.47.6, how do I use this new Sonnet MAX thing? My model list didn’t get any updates by default.
It has been pretty weird. I’ve used it about 3 times now, and each time it goes away for about 10-15 minutes. My suspicion is the servers aren’t serving it right now for some reason. I want to say it’s because of how many people are trying to use it, but I don’t even think this was fully announced as released today, so I can’t imagine the load is too high.
How effective was it when you used it?
Oh, it was pretty cool. I haven’t had it long enough for a full test, but I gave it a really complex program idea and tried one-shotting it. It gave me about 8k lines of code from just one prompt because of the new 200-step chain limit. It seems promising because they maxed out its 200k context and maxed out its thinking. But again, I just haven’t had it long enough, unfortunately.
I would also like to add that the discussion post that showcased 3.7 MAX seems to have been taken down : (
Maybe use “Add model” in the Models tab?
(After typing the model name “claude-3.7-sonnet-max”, don’t click the “Add model” button; just press the Enter key.)
What’s different from the normal model?
Just update to 0.47.7.
It’s back in 0.47.7, but I can’t find any announcement or specific details anywhere.
There’s also a non-thinking MAX version, so it can’t just be the standard version with the thinking-token budget maxed out.
Ok, so you basically NERF the context length on Sonnet 3.7 and 3.5, then reintroduce it as 3.7 MAX lol. Win.
I might as well use Cline / Roo Code with a direct API key in that case, right? I mean, it’s not just $0.05 per query: the system prompt is probably going to split the task up into multiple steps, and each tool-call step is another $0.05.
Tell me I’m wrong.
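A rough way to see how that per-step billing adds up. This is just a sketch: the $0.05 figure is the one quoted above, and the tool-call counts are hypothetical, not Cursor’s actual billing logic.

```python
# Rough cost sketch for a MAX request, assuming the thread's $0.05 figure
# applies to the initial prompt and to every tool-call step separately.
# The step counts below are made up for illustration.

PRICE_PER_CALL = 0.05  # dollars per prompt or tool call (figure from the thread)

def max_request_cost(tool_calls: int) -> float:
    """Initial prompt plus each tool-call step, each billed at PRICE_PER_CALL."""
    return PRICE_PER_CALL * (1 + tool_calls)

# A single agent task that fans out into 10 tool calls:
print(f"${max_request_cost(10):.2f}")  # → $0.55
```

So a task the agent breaks into ten steps would cost around eleven times the headline per-query price, which is the complaint being made here.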
Seriously? That’s bad
I shed tears of poverty.
It’s not Cursor’s fault. It’s my fault. I’m sorry.
To use your words, all the models in Cursor are “nerfed” in this way.
In the past, models have performed poorly if you throw entire files at them and expect them to use the relevant sections and ignore the irrelevant ones.
As such, one thing Cursor has always done is curate the context we send to the model, to make sure it has the best possible input to produce the best possible output!
Claude 3.7 is the first model that performs much better with the larger context than any existing model we have, and to make use of that, Claude Max was born!
Is it not possible to leave the context un-“nerfed” and trade off other qualities instead? For a Pro user, it feels like we’re missing a lot this way.
We already have the long-context setting, which sends a larger context to the model in exchange for using more fast requests; in effect, this is similar to the context-window changes in Claude Max!
Hey, it would be useful to have a notification in the chat window when we reach the end of the context window, with the option to enable context window extension.
Check the documentation /s