Cursor & DeepSeek

I think you are underestimating the costs of running DeepSeek at scale.

Reserving enough compute to satisfy peak demand is much more expensive than you might think, to the point where, with current compute constraints, it can actually cost more to run DeepSeek than to use proprietary APIs.

Further, Cursor's value proposition extends far beyond API requests. Anyone can access these APIs directly at cost, or run the models locally.

What Cursor offers is that all of this is embedded within an IDE, letting you interact seamlessly with your codebase. That delivers substantial productivity benefits beyond making the API requests yourself or flip-flopping between chat windows in ChatGPT/Claude.

And that doesn’t even consider Composer, which turns those API requests directly into changes in your codebase.

To date, the Cursor team has shown a commitment to keeping the product competitively priced, and as these models become more efficient and cost-effective to run at scale, they will continue to ensure paying users get far more value than the subscription price.
