Slow Pool Information

To be clear, you can avoid this problem entirely by ensuring you have Premium model requests available. Check https://www.cursor.com/settings to see how many you’ve consumed this month.

You get 500 of these Premium requests per month with your $20/month subscription, and you can purchase more at any time in allotments of 500 for an additional $20 each.

For example, I use about 3,000 Premium model requests a month, so my monthly subscription is $120. That comes out to $0.04 per message and reply. I never use the slow pool because I keep my credits stocked, so I’ve had no issues while Cursor/Anthropic deals with this capacity problem. For me, one of Cursor’s superpowers is its speed, so it’s well worth paying to ensure my responses are as fast as possible.
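The arithmetic above can be sketched as follows; this is just an illustration assuming the pricing quoted in this thread ($20/month base including 500 Premium requests, plus $20 per additional allotment of 500):

```python
import math

def monthly_cost(requests_used: int) -> float:
    """Total monthly cost in dollars for a given number of Premium requests,
    assuming the $20 base plan (500 requests included) plus $20 per extra
    allotment of 500 requests, as described in the post above."""
    base = 20.0                          # subscription, includes first 500 requests
    extra = max(0, requests_used - 500)  # requests beyond the included 500
    allotments = math.ceil(extra / 500)  # extra requests are bought 500 at a time
    return base + allotments * 20.0

cost = monthly_cost(3000)  # ~3,000 requests/month, as in the post
print(cost)                # 120.0
print(cost / 3000)         # 0.04 dollars per message and reply
```

Note that because requests are sold in fixed allotments, the per-request cost only works out to an even $0.04 when usage lands exactly on an allotment boundary, as it does at 3,000.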

I know not everyone has the budget to pay more than $20/month, but I’m posting this for anyone impacted by this issue who isn’t aware it can be solved immediately by buying another 500 requests. IMO, this is a small price to pay for the benefit Cursor gives me, and it’s still far less expensive than competing products.

5 Likes

But you get no greater context depth for $120, correct?

This is exactly what I have been saying, and I have no problem paying to get work done at the speed I’m able to.

I will likely hit roughly $200 this month before my subscription resets, and I am fine with that until other models are introduced, etc.

1 Like

I believe the situation here is as follows: obviously, everyone needs to pay their bills at the end of the month, and Cursor is no exception. The pricing may have been inconsistent with the volume of requests many users were making, and now that they’ve completed a Series B funding round, it’s possible their investors are pushing them to focus on profitability rather than just customer acquisition.

I personally have no issue with Cursor’s decision—it’s clearly within their rights. However, I can’t deny that the way this change was handled was somewhat chaotic.

First, they mentioned the limits were due to Anthropic’s constraints. Fine. But then why is OpenAI also rate-limited?

Second, why does it seem like the wait times are now hard-set? Previously, they varied based on the time of day, with shorter waits at some times and longer at others. Now, it seems like all users are stuck with the same long, predefined wait time, regardless of the day or hour.

Third, if you’re going to implement such a drastic change, it would be ideal to notify your users a few days in advance so they can decide whether to keep their subscriptions. Instead, the change was made abruptly and announced casually on a forum that likely fewer than 10% of users ever visit.

In the end, I think this indecision and lack of communication are what have caused the most problems here. Changing the pricing model and encouraging people to purchase additional requests? That’s fine, but at the very least, there should be a formal announcement across all social media platforms.

Is this a temporary change? Okay, but is there any timeline for things to return to normal?

Wouldn’t a fair usage policy also make sense to prevent users from abusing the system in some way?

Of course, the product is marketed as “slow mode unlimited,” not “slow mode (<1 min) unlimited.” But when users experience service at a certain level and it then worsens twentyfold, such a drastic change becomes hard to defend.

14 Likes

Correct. You are just not at the mercy of the current performance of the slow pool of requests. You are served immediately every time.

The OP is a founder at Cursor. I don’t think he’s lying about this being an Anthropic capacity issue as cover to extract more profit from customers. A company at this stage and funding level is focused on growth, not nickel-and-diming its customers.

Both Cursor and Anthropic are seeing explosive growth. It’s reasonable to think an issue like this could surface as capacity gets constrained, especially with the introduction of Agent mode, which has likely spiked LLM call volume significantly.

In his post, the OP states that in situations like this, your wait times get worse and worse the more slow pool requests you make, so it also makes sense that performance continues to worsen for some users.

I don’t have an answer for why other slow pool models are also limited, but I wouldn’t jump to conclusions or conspiracies. Perhaps they don’t have logic baked in to throttle by model just yet and they’re working on that too.

Again, drop $20 and buy another 500 Premium credits if this is really an issue for you. The problem is solved with a couple of clicks.

I see Premium credits as the professional pricing plan. If Cursor is critical for your work and you’re making money with this tool, it’s a very small price to pay to ensure you always have fast access. It’s like paying for cable or DSL versus dial-up … that is, if you remember the painful days of dial-up.

1 Like

If it’s already hard for you Americans, who get paid in dollars, imagine how it is for me, a Brazilian earning in reais. Here, 1 dollar is worth 6 reais, and 95% of the population earns less than $1,000 per month (around $250 per week). More than 90% of the population lives on just $180 per week. For me, it’s not just bad; it’s terrible.

At the moment, this is the only chance I have to provide a better quality of life for my family. That’s why I work 10 hours a day, and when I get home, I spend another 4 to 6 hours studying on Cursor, trying to build something that can sustain my household. However, with these recent changes, this dream seems to be slipping further away…

I want to thank the Cursor team for igniting in me the hope of changing my life, but unfortunately, I feel that this same hope has now been taken away…

9 Likes

I apologize for the outburst, but I’m truly exhausted. The hopes I once had are fading, and all that’s left is to try to accept the situation.

1 Like

If it helps, look into GH Copilot. They offer Sonnet 3.5 as well, and it’s truly unlimited. I personally use it alongside Cursor; a lot of the queries I make are just questions in Chat, and I don’t need Composer all the time, so it makes my 500 fast requests last much longer. Maybe Copilot would be enough for all your needs, and it’s half the price.

3 Likes

Very well said.

Excellent news: DeepSeek V3 is available now, and it can also be used in Composer/normal mode with quick responses!

Yeah, the problem is that it’s not as capable as Claude or GPT, and it can’t use Agent mode.

@hiteck How did you manage to get it to work in Composer / agent mode?

Only normal mode works. But that’s fine for me; I don’t like using Agent mode anyway.

Hey! Do you know if DeepSeek models or Gemini models count toward the Premium fast request quota?

Enable Composer mode for DeepSeek V3 exclusively for Premium subscribers. That way, you can increase revenue by attracting more subscribers rather than relying on additional usage charges. What you’re doing now is comparable to Netflix charging for both the subscription and the minutes watched. Focus on gaining as many subscribers as Netflix has. The unique value of Cursor lies in the intelligence of the tool, since LLM models are widely available on the market.

1 Like

Do you know if DeepSeek supports images? I’ve found that very useful with Claude.

DeepSeek does support images, but not in Cursor yet.

Does DeepSeek V3 count toward fast requests?

I just tested, and for now it doesn’t count anywhere.