Kimi K2 is now on Cursor - Consider Provider

Cursor has now added the Kimi K2 model.

While the model performs quite well, the token generation speed appears to be somewhat slower compared to other models. According to the usage dashboard, it is currently being served via Fireworks.

Since Groq offers significantly faster speeds—around 200 to 300 tokens per second—it may be beneficial for Cursor to consider integrating Kimi K2 through Groq for improved performance.
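For anyone who wants to sanity-check the speed difference themselves, here is a rough sketch that times a streamed completion and reports an approximate tokens-per-second figure. It assumes Groq and Fireworks are reached through their OpenAI-compatible endpoints, that your keys are in the GROQ_API_KEY and FIREWORKS_API_KEY environment variables, and that the Kimi K2 model IDs shown are correct for each provider (they are assumptions, so verify them before trusting the numbers).

```python
# Minimal sketch: estimate streamed output tokens per second for a provider.
# Assumes OpenAI-compatible endpoints for Groq and Fireworks; the Kimi K2
# model IDs below are assumptions and may differ from the providers' actual IDs.
import os
import time

from openai import OpenAI  # pip install openai


def tokens_per_second(base_url: str, api_key: str, model: str, prompt: str) -> float:
    client = OpenAI(base_url=base_url, api_key=api_key)
    start = time.monotonic()
    chunks = 0
    stream = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=1024,
        stream=True,
    )
    for chunk in stream:
        # Each streamed chunk carries roughly one token's worth of new text,
        # so counting chunks gives an approximate token count.
        if chunk.choices and chunk.choices[0].delta.content:
            chunks += 1
    elapsed = time.monotonic() - start
    return chunks / elapsed


if __name__ == "__main__":
    prompt = "Write a 500-word overview of binary search trees."
    print(
        "Groq:",
        tokens_per_second(
            "https://api.groq.com/openai/v1",
            os.environ["GROQ_API_KEY"],
            "moonshotai/kimi-k2-instruct",  # assumed Groq model ID
            prompt,
        ),
    )
    print(
        "Fireworks:",
        tokens_per_second(
            "https://api.fireworks.ai/inference/v1",
            os.environ["FIREWORKS_API_KEY"],
            "accounts/fireworks/models/kimi-k2-instruct",  # assumed Fireworks model ID
            prompt,
        ),
    )
```

The chunk count is only an approximation of the real token count, but it is close enough to tell a ~50 tokens/s deployment apart from a 200 to 300 tokens/s one.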

12 Likes

Hey, thanks for the feedback. Since fireworks.ai is our partner, it was chosen as the provider for this model. As for generation speed, I believe they will soon increase their capacity and speeds will improve.

2 Likes

There is currently a bug with Kimi: after some time, its answer breaks off mid-response.
Request ID, if it helps: a8848a3b-9f19-4e05-a18e-eba0138f446d

15 Likes

Are the servers in the US, or are requests routed to China?

They are partnered with the inference provider Fireworks AI, which hosts DeepSeek and now Kimi and is a US-based company, so there is no routing to China.

1 Like

I see no point in releasing it in this state; it’s practically unusable.

1 Like

It’s not working for me. Also, can you add the free moonshotai model or the paid one?

same

Congrats!

1 Like

Is the model unlimited with the Pro version?

It has just been added; we have to await the full public announcement. Models other than Auto count toward usage costs.

Note that its capabilities still need more testing, and any issues found should be reported so they can be fixed.

1 Like

For those who want it served through Groq for speed: they use something like a Q4 quant, which reduces quality, and the max output is 16k tokens.

1 Like

I’m trying it with Kilo Code; Kimi was very smart and cheap.
My first impression is that this model could be a Claude 4 replacement :joy:

I toggled on Kimi-K2 Instruct under Models in Cursor Settings; however, I don’t see it in the chat model selector… I saw something on the forum about regions… Is it because I am in Canada? I don’t see any warning when I toggle on Kimi-K2 in Settings > Models…


@Martin_Perreault please check whether you can scroll in the model selection; Kimi may show up at the end there.

Too slow to use this model; bad user experience.

2 Likes

@ayqy thanks for the feedback, here is more info:

Come on, Cursor! Please improve the Kimi-K2 LLM’s performance quickly!

1 Like

@condor I feel so dumb LOL… yes, of course, scrolling! I didn’t realize because there was no scroll bar. Happy to report that scrolling works! Thanks!

Looking forward to being able to actually use it, though… I tried it just now and had to wait 17 minutes for it to handle my request to create two readme.md files: the first was completed after 15 minutes, and the second was aborted (generation failed) at 17 minutes…

1 Like

@Martin_Perreault no worries, there are still improvements coming and the provider is scaling the model up.

1 Like