I asked DeepSeek-R1 about Cursor's pricing model

Hello,

TL;DR: “does / will Cursor consider DeepSeek models as premium models and limit them to 500 queries per month as well?”

First of all, great respect for building such an awesome tool. Just wanted to share some thoughts about Cursor’s pricing model with regard to the new DeepSeek-R1 model, etc…

Actually, I asked DeepSeek-R1 about Cursor’s pricing and gave it the API costs for each model, and this was the response:

1. Cost per Query (Output-Centric):
Assumptions: 1 query = 1k output tokens (primary cost driver).

  • Claude Sonnet 3.5:
    $15.00 / 1M tokens → $0.015 per query
  • DeepSeek-R1:
    $2.19 / 1M tokens → $0.00219 per query (6.8x cheaper than Sonnet)
  • DeepSeek-Chat:
    $0.28 / 1M tokens → $0.00028 per query (53.6x cheaper than Sonnet)

2. Total Cost for 500 Queries:

  • Sonnet 3.5: 500 * $0.015 = $7.50
  • DeepSeek-R1: 500 * $0.00219 = $1.10
  • DeepSeek-Chat: 500 * $0.00028 = $0.14

Cursor charges $20/month for 500 queries → a $12.50 markup over raw Sonnet costs (167% markup).


3. Price/Performance Gap:

  • Equivalent $20 Subscription with DeepSeek-R1:
    $20 / $0.00219 → 9,132 queries (18x more than current 500).
  • With DeepSeek-Chat:
    $20 / $0.00028 → 71,428 queries (143x more).

4. Output Price Multiples (Per Million Tokens):

  • Sonnet 3.5 vs. DeepSeek-R1: $15.00 / $2.19 = 6.85x
  • Sonnet 3.5 vs. DeepSeek-Chat: $15.00 / $0.28 = 53.57x

Conclusion:
Cursor’s $20 for 500 queries (Sonnet 3.5) is ~18–143x costlier than equivalent DeepSeek-based pricing.
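
Just to sanity-check the numbers it gave me, here’s a minimal Python sketch of the same arithmetic (the output prices are the API list prices quoted above, and the 1 query ≈ 1k output tokens assumption and the $20 / 500-query plan are the same ones used in the breakdown):

```python
# Re-running the arithmetic above (all prices USD; assumption: 1 query ≈ 1k output tokens).
output_price_per_m = {            # API list prices quoted in this post, $ per 1M output tokens
    "Claude Sonnet 3.5": 15.00,
    "DeepSeek-R1": 2.19,
    "DeepSeek-Chat": 0.28,
}
tokens_per_query = 1_000
subscription = 20.00              # Cursor Pro, per month
included_queries = 500            # premium-request limit per month

for model, price in output_price_per_m.items():
    per_query = price / 1_000_000 * tokens_per_query
    print(f"{model:18s} ${per_query:.5f}/query, "
          f"${per_query * included_queries:.2f} for 500 queries, "
          f"~{subscription / per_query:,.0f} queries per $20")

sonnet = output_price_per_m["Claude Sonnet 3.5"] / 1_000_000 * tokens_per_query * included_queries
print(f"Markup over raw Sonnet cost: ${subscription - sonnet:.2f} "
      f"({(subscription - sonnet) / sonnet:.0%})")   # -> $12.50 (167%)
```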


By the way, don’t get me wrong: personally, I’m willing to pay Cursor a lot more than $20, as long as it makes sense. I know Cursor has a lot of features that are really interesting and very useful for a dev workflow, and I’m willing to pay for those. What bothers me is the 500-query limit, especially when new models like DeepSeek-R1 emerge that are on par with o1 and that I can use as an architect paired with another (instruction-following) model like DeepSeek-V3.

I’m okay with DeepSeek-V3 doing the instruction-following even if it’s a bit weaker than Sonnet, given the price gap (I’m willing to spend an extra prompt or two with V3 for a lot less money)! What’s important is that I have a “stronger” model (R1) watching over and guiding it, roughly as sketched below… And to be fair, Sonnet is good, but sometimes it also struggles, especially in medium-to-large codebases…
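
Here’s a minimal sketch of that architect / instruction-follower split, assuming DeepSeek’s OpenAI-compatible API (the model IDs deepseek-reasoner for R1 and deepseek-chat for V3 are DeepSeek’s documented ones; the task string and prompts are just placeholders, and this is how I’d wire it up by hand, not how Cursor does it):

```python
# Architect (R1) plans, editor (V3) implements, via DeepSeek's OpenAI-compatible API.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",       # placeholder
    base_url="https://api.deepseek.com",
)

task = "Add retry-with-backoff to the HTTP client in utils/http.py"  # example task

# 1) The "stronger" reasoning model (R1) drafts a step-by-step plan.
plan = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user",
               "content": f"Plan, step by step, how to implement: {task}"}],
).choices[0].message.content

# 2) The cheaper instruction-following model (V3) turns the plan into concrete edits.
edits = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user",
               "content": f"Follow this plan exactly and produce the code changes:\n{plan}"}],
).choices[0].message.content

print(edits)
```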

So, in a nutshell: does / will Cursor consider DeepSeek models (Chat / R1) premium models? Will they also be subject to the 500-queries-per-month limit?

Looking forward to reading the Cursor dev team’s and the community’s takes on this.

(sorry for the long thread)


I understand your logic, but I think we’re worrying in advance. The way things are going, I believe it would actually be better if DeepSeek were included in the 500 requests.
We could use Sonnet for the fast requests and then, for the unlimited slow ones, use DeepSeek, whose queue will probably be several times shorter than Sonnet’s, haha.
So for me it’s great.

I’m getting “Agent composer is currently only supported using Anthropic models or GPT-4o, please reselect the model and try again.” after using DeepSeek for a minute. Did they decide to nix it?

In another thread a community manager said the team is working on an R1 agent.

@helLf1nGer is correct, we are actively working on getting DeepSeek as a functional agent in Composer!


@danperks: and is it going to be considered a “premium” model?

Considering that DeepSeek is open source, they could just host it themselves and provide it alongside or in place of cursor-small. It would give users better output, so there wouldn’t be as much reliance on third-party APIs, and the cost could make more sense for them. DeepSeek changed the game, people.

This is amazing!

I think I heard that DeepSeek is a normal model, but DeepSeek R1 is a premium model!


Cursor AI, you should reflect on this.

How long do you think you can sustain such an expensive per-query pricing model?

Reflect on it and lower the price.


My company has business licenses for Cursor AI. I asked the same thing, hoping they would allow using my own DeepSeek API key, but they seem to be stalling, and I don’t blame them, as DeepSeek is upending the business model for so many AI companies.

GitHub Copilot now has agent mode as well: GitHub Copilot: The agent awakens - The GitHub Blog

I think the best way for CursorAI to move forward is to accept the changes in the AI landscape and facilitate the use of our own API keys from DeepSeek, or our own instance of DeepSeek running on Hugging Face or elsewhere.

I think they can still make money by offering more tools.


Hey. Is there ever going to be an option to use Composer with Groq? R1 on Groq and R1 on Fireworks are like a Bugatti and a Camry.

Cursor is using Fireworks (US-hosted) for DeepSeek R1; it costs $8 / 1M output tokens.