How does Cursor use ChatGPT so cheaply?

I used my own OpenAI key and the price skyrocketed, so I decided to go with their Pro plan. How is their model sustainable, and how do they get the ChatGPT API at discounted prices?

We use OpenAI’s reserved capacity option. This gives us a set of dedicated servers that serve only Cursor traffic, which 1) drastically improves the caching behavior of our prompts, 2) means pricing for us is driven by the time it takes to process a request rather than by token counts, and 3) effectively gives us a volume discount.
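To make the time-vs-tokens point concrete, here is a rough sketch with purely hypothetical numbers (the per-token price, hourly instance cost, and throughput below are assumptions for illustration, not figures from this thread): with reserved capacity, the cost of a request depends on how long the instance spends processing it, not on how many tokens it contains.

```python
# Hypothetical comparison of metered per-token pricing vs. reserved (dedicated) capacity.
# All numbers are illustrative assumptions, not actual OpenAI or Cursor figures.

PER_TOKEN_PRICE = 0.03 / 1000       # assumed $/token on the pay-as-you-go API
RESERVED_COST_PER_HOUR = 50.0       # assumed flat $/hour for a dedicated instance
TOKENS_PER_SECOND = 2000            # assumed throughput the dedicated instance sustains

def pay_as_you_go_cost(tokens: int) -> float:
    """Cost grows linearly with the number of tokens on the metered API."""
    return tokens * PER_TOKEN_PRICE

def reserved_cost(tokens: int) -> float:
    """With reserved capacity you pay for time, so cost depends on how long
    the instance takes to process the tokens, not on the count itself."""
    seconds = tokens / TOKENS_PER_SECOND
    return (seconds / 3600) * RESERVED_COST_PER_HOUR

for tokens in (1_000_000, 10_000_000, 100_000_000):
    print(f"{tokens:>12,} tokens: "
          f"metered ${pay_as_you_go_cost(tokens):>10,.2f} vs "
          f"reserved ${reserved_cost(tokens):>8,.2f}")
```

The point of the sketch: once the dedicated instance is kept busy, the flat hourly cost amortizes across a large volume of requests, which is what produces the effective volume discount described above.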

Are the dedicated instances actually from Azure, or from OpenAI? Do they include gpt-4-32k?

That’s really neat, and it helps us out a lot. Since OpenAI recently announced the enterprise option for ChatGPT, what might this mean for Cursor? Could we see 32k contexts offered more broadly?