DeepSeek V3 in Cursor is far slower than DeepSeek V3 from other tools like Aider / Cline

Firstly, thank you so much for adding DeepSeek V3 to Cursor. I’ve been using it since yesterday, and for my tasks my dependency on Claude has reduced a lot.

However, I’m noticing that DeepSeek V3 on Cursor is far slower than using it from other tools. For example, I added the same context and asked Cursor to complete a function using DeepSeek, and it took almost a minute before it started responding. The same task in Aider started responding almost immediately.
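For anyone who wants to quantify “slow to start responding” rather than eyeball it, the usual metric is time-to-first-token on a streaming response. A minimal sketch (the helper name is mine, and the stream can be any iterator of chunks, e.g. from an OpenAI-compatible streaming client; here a fake stream simulates the stall):

```python
import time

def time_to_first_token(stream):
    """Consume a stream of text chunks; return (seconds until the
    first chunk arrived, list of all chunks)."""
    start = time.monotonic()
    first_token_at = None
    chunks = []
    for chunk in stream:
        if first_token_at is None:
            first_token_at = time.monotonic() - start
        chunks.append(chunk)
    return first_token_at, chunks

# Fake stream that stalls before the first chunk, simulating
# provider-side queueing / cold start:
def fake_stream():
    time.sleep(0.2)
    yield "def add(a, b):"
    yield " return a + b"

ttft, chunks = time_to_first_token(fake_stream())
print(f"time to first token: {ttft:.2f}s over {len(chunks)} chunks")
```

Running the same prompt through both tools with a helper like this would turn “almost a minute vs. almost immediately” into a number you can report.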

I understand it’s experimental, so there will be some rough edges. But please look into the latency issues.

Thanks.


Hey, are you using this model in Composer? If so, there are many other processes happening behind the scenes in Composer, and yes, it’s currently experimental. Thanks for sharing this.

Hi, no, I’m not using it in Composer. I’m using it directly in the Chat tab.


Ditto, slow in chat as well


Yes, I think I understand the issue. We’re using the API not directly from DeepSeek but through our provider, Fireworks. They run the model on their own hardware, which is why the speed is significantly lower right now. I believe they’ll add more capacity soon, and the model will noticeably speed up.


May I ask how to enable other models? I can only access Sonnet 3.5 for now. Thanks!

Go to Cursor settings (on Mac that’s Cmd + ,), click the Cursor tab in the side panel, click “Show full settings”, then pick Models. There you can tick the other models you want to enable. Once you do that, in Chat and Composer you can change the model by clicking the downward arrow beside the model name.

I manually added the deepseek-chat model and the deepseek.com base URL where Cursor asks for the OpenAI URL and API key. Works great. Extremely fast even with a full-codebase call.
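For anyone wondering why this hack works at all: DeepSeek exposes an OpenAI-compatible API, so a client that speaks the OpenAI protocol just needs the base URL and key swapped. A sketch of the request this resolves to, using only the standard library (the key is a placeholder, and the request is built but not sent):

```python
import json
import urllib.request

BASE_URL = "https://api.deepseek.com"  # the base URL entered in Cursor
API_KEY = "sk-..."                     # placeholder; use your own DeepSeek key

# OpenAI-style chat-completions body, pointed at the deepseek-chat model:
payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Complete: def add(a, b):"}],
}

req = urllib.request.Request(
    BASE_URL + "/chat/completions",  # compatible clients append this path
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.full_url)
# urllib.request.urlopen(req) would actually send it; skipped here
# since that needs a real key and a network call.
```

This is just an illustration of the wire format, not Cursor’s actual code, but it shows why the “OpenAI URL” field accepts a DeepSeek endpoint.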

Glad to hear it’s helping reduce your dependency on Claude! It could be worth checking your internet connection or trying at different times, as latency might vary based on server load. Hopefully, they’ll optimize it soon.

How do you add that? I’ve tried many times and it always gives me an error.

Do you suggest waiting a couple of weeks before trying it, since you mention it’s experimental? Are you doing ongoing fine-tuning to improve its performance?

Other than speed differences, is there any functionality you miss out on when adding it? Also, any advice on how to add it?

It changes every day. @deanrie could tell you more. It worked great for weeks using the base URL https://api.deepseek.com/chat/completions, but then @deanrie changed something, so I removed the /chat/completions and it started pumping out correct code again.
The way to do it: open Cursor Preferences > Cursor Settings > Models.
In the Models section, under models, click the + icon.
Add deepseek-chat and deepseek-reasoner, OR @deanrie may advise you to use the DeepSeek clones that were recently added to the default list of models. Isn’t that right, @deanrie???

Next, and this is the most fun part for me: still in the Models section of settings, scroll down to where it asks for an OpenAI key (LMMFAO) and put in your DeepSeek API key (still laughing 🙂). Put the base URL in above. You may need to try a few different base URLs; @deanrie may tell you the right one to use. Dismiss all the “warnings” and make sure the little green light stays green. For the last few weeks, @deanrie and team seemed to be trying to ???what’s the word??? deepseek, and the little green light would go off for no reason. Just click it back on and go 🙂
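On “you may need to try a few different base URLs”: a common failure mode is that OpenAI-compatible clients append /chat/completions to the base URL themselves, so a base URL that already includes that path gets it doubled and the requests fail. A sketch of the behavior (the join logic here is illustrative, not Cursor’s actual code):

```python
def resolved_endpoint(base_url: str) -> str:
    # Roughly what most OpenAI-compatible clients do with the
    # configured base URL:
    return base_url.rstrip("/") + "/chat/completions"

# Base URL without the path: the expected endpoint.
print(resolved_endpoint("https://api.deepseek.com"))
# → https://api.deepseek.com/chat/completions

# Base URL that already includes the path: doubled, so requests fail.
print(resolved_endpoint("https://api.deepseek.com/chat/completions"))
# → https://api.deepseek.com/chat/completions/chat/completions
```

That would be consistent with dropping /chat/completions from the base URL fixing things, as described earlier in the thread.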


@blu3 If you do this, does that leave you with less functionality in Cursor? Also, did you have to get an API key to do this? I think DeepSeek is not providing API keys any more; last time I checked, their site was under maintenance.

When it was running well, I don’t think Cursor wanted it to be used; it was just a hack. Instead of adding an UnopenAI key, I just added the DS key, and instead of an UnopenAI URL endpoint, I just put in the DS endpoint. At first I could use deepseek-coder, but then something changed, so I had to switch to deepseek-chat (at the time I was on the free version and Composer wasn’t an option). Then back to coder. THEN everyone started to notice how much better DS performed than all the other “unopen” versions… and the attacks began. Less functionality??? You’ll have to ask the overlords of the Unopened for that answer.