Kimi K2 Thinking in Cursor

Feature request for product/service

AI Models

Describe the request

Please add Kimi K2 Thinking; I would love to try it in Cursor. On benchmarks it is on par with the best coding models, and it is much cheaper (see the rough cost comparison under Pricing below).

Pricing

  • $0.60/M input tokens
  • $2.50/M output tokens
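
For a rough sense of how much cheaper that is, here is a back-of-the-envelope comparison against Claude Sonnet 4.5 list pricing, which is assumed below to be $3/M input and $15/M output; check the current pricing pages before relying on the exact numbers.

```python
# Back-of-the-envelope cost comparison for a single agentic session.
# Kimi K2 Thinking prices are from this post; the Claude Sonnet 4.5 prices
# are an assumption here -- verify against the official pricing page.
PRICES_PER_M = {
    "kimi-k2-thinking": {"input": 0.60, "output": 2.50},
    "claude-sonnet-4.5": {"input": 3.00, "output": 15.00},  # assumed list price
}

def session_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one session for the given model."""
    p = PRICES_PER_M[model]
    return input_tokens / 1e6 * p["input"] + output_tokens / 1e6 * p["output"]

# Example: a long agent run with 2M input tokens and 300K output tokens.
for model in PRICES_PER_M:
    print(f"{model}: ${session_cost(model, 2_000_000, 300_000):.2f}")
# kimi-k2-thinking:  $1.95
# claude-sonnet-4.5: $10.50
```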

53 Likes

kimi-k2-thinking is a SOTA-level LLM for code and agentic work, and it is much cheaper than Claude Sonnet 4.5. Please add it to Cursor.

10 Likes

Hey, thanks for the feature request, we’ll consider it.

19 Likes

composer is cooked

8 Likes

Yes please! We need this model, it is extremely good.

5 Likes

They haven’t bothered to fix the formatting issue with regular Kimi K2, an issue that is on Cursor’s side, for at least 6 months, so don’t hold your breath.

11 Likes

New model - new set of bugs :smirking_face:

5 Likes

Assuming they fix the K2 formatting issue first, the net effect: if Cursor supports kimi-k2-thinking as a selectable model, with self-hosted and API options, we’d get better long-horizon reasoning for code changes without breaking the bank. In my neck of the woods, that’s like swapping in a fresh timing belt instead of praying the old one holds.

1 Like

+1, and this time, Cursor, please test the model before releasing it. Regular Kimi K2 has a formatting bug and agentic tool bugs that were never fixed, even after many people asked for a solution and you had many, many months to fix them.

8 Likes

I’m not one to pick sides between countries in the AI race, but ■■■■ it, some open source models would be great to use in Cursor.

Granted, after the recent Claude AI rug pull on a certain IDE I won’t name, I’m thinking Cursor has a shot to capitalize on open source models.

2 Likes

Please offer a diverse set of open source models that are cost-efficient and just as performant as their closed-source counterparts.

It’s just absolutely ridiculous that there’s no proper integration with OpenRouter/Groq at this point. The amount of friction you face trying to use an alternative OpenAI endpoint is ridiculous. I think it’s about time the community made an improved version of Cursor that supports other SOTA models: a version of Cursor that doesn’t trade performance for profit and focuses purely on performance… The Cursor team doesn’t deserve our money at this point, with all the obscure price changes, the avoidance of publishing certain models, and the prioritizing of others. Now that they’re publishing their own model, their intentions have become evident.

8 Likes

We need a swift and bug-free implementation to prove that Cursor is not married to its own model (and the business upside), or to agreements with monolithic providers. Thanks.

6 Likes

+1 . We want it!

2 Likes

I added it using OpenRouter, but it fails miserably. I prompt it to do a thing, it thinks for a long time, then responds that it will do the thing, and that’s it. That’s the whole response. It doesn’t actually do the thing.

I push it to do the thing and it starts on the thing, then just finishes again, still without doing the thing.

Not sure if this is an OpenRouter setup issue or a K2 Thinking issue, but it’s useless at this point.
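
To narrow down where the failure sits, here is a minimal tool-calling test against OpenRouter outside of Cursor. It’s only a sketch: it assumes the OpenRouter slug is `moonshotai/kimi-k2-thinking` and that the key lives in an `OPENROUTER_API_KEY` environment variable. If the model returns tool calls here but only prose inside Cursor, the problem is on the integration side rather than the model.

```python
# Minimal tool-calling sanity check against OpenRouter, outside Cursor.
import os
from openai import OpenAI  # OpenRouter exposes an OpenAI-compatible API

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
)

# One dummy tool: if the model is willing to act, it should call this
# instead of merely saying that it will.
tools = [{
    "type": "function",
    "function": {
        "name": "write_file",
        "description": "Write text content to a file at the given path.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string"},
                "content": {"type": "string"},
            },
            "required": ["path", "content"],
        },
    },
}]

resp = client.chat.completions.create(
    model="moonshotai/kimi-k2-thinking",  # assumed OpenRouter slug
    messages=[{"role": "user",
               "content": "Create hello.txt containing 'hi'. Use the tool."}],
    tools=tools,
)

msg = resp.choices[0].message
# Populated tool_calls => the model acts fine on its own and the issue is in
# the OpenRouter-in-Cursor wiring; prose only => the issue is upstream.
print("tool_calls:", msg.tool_calls)
print("content:", msg.content)
```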

4 Likes

Thanks for the post, I was about to do that

3 Likes

We need Kimi K2 in Cursor

1 Like

I believe it is an OpenRouter + Cursor issue. There’s always a problem when using models this way compared to built-in ones, and more importantly compared to models the Cursor team has fine-tuned into its workflow.

How did we get like 10 models for GPT-5.1 so fast but still no Kimi K2 Thinking? Does Cursor have a relationship with OpenAI that speeds things up? Is there less profit margin for Cursor with Kimi K2 Thinking? Just curious. If Kimi K2 Thinking is really as good and cheap as people are saying, we need it, please. Personally, I love working in Cursor, but even the Ultra plan lasts me less than two weeks, and I can’t afford to work all month on hobby projects.

5 Likes