Did Cursor remove the unlimited slow request option?

I checked the pricing on the Cursor site. Previously, I saw that the Pro subscription included unlimited slow premium requests. Now, it has been removed from the description. Is this true, or was it just not mentioned in the pricing section?


Before I consider starting a subscription again, I’d like to know: what’s the situation with the unlimited slow premium requests right now?

I want to know this too!

They are still working – I just tried. However, the removal from the pricing page raises concerns.

Yeah me too!

Thanks for checking and confirming it’s still working!
However, I agree – removing it from the pricing page without any official statement creates uncertainty.

@deanrie / @ravirahman – Could we please get an official clarification on this?
Is the unlimited slow request feature still officially part of the Pro plan, or is it planned to be phased out?

A short update from your side would really help avoid confusion for both current and potential subscribers.

Thanks in advance!


Hey, it’s back – it shouldn’t have been removed!

If we were to make any pricing changes, there would be a proper announcement - sorry for any confusion :folded_hands:


Only in text, right? Because I still can’t make slow requests after using up my quota :upside_down_face:

Hey, you should be able to use slow requests after you use your 500 fast requests!

What happens when you try, do you see an error or anything?

Slow requests were by far Cursor’s best feature until 0.50.

Now, in summary, slow requests are either stuck, erroring out, or otherwise failing, as reported in the threads below:

Slow requests should, in theory, still be working fine for everyone. We are still working to fix any bugs we find, but the option remains available to users on every plan.


For several days now, any attempt to use claude-3.5-sonnet or 3.7 stays at “Generating…” forever (I’ve waited over 30 minutes), so all recent work is being done with Haiku or other models, with mixed, unsatisfactory results.
I have tried my own OpenRouter and Anthropic API keys to work around this, of course, but then I get “The model ■■■ does not work with your current plan or api key”. I did find a solution for that once, only to hit a different restriction regarding context size.
Then there is the “Connection failed. If the problem persists, please check your internet connection or VPN” error every 20 minutes. And don’t get me started on Python MCP servers!