Kimi K2 Thinking in Cursor

Are you adding these by overriding the OpenAI Base URL?


Yes, 5,000 requests per day for open-source models.


Could you share documentation or a guide for doing this? So far I've only tried a GitHub repository that exposes GitHub Copilot's premium/free requests as an OpenAI-style API so I can use them in Cursor: GitHub - ericc-ch/copilot-api: Turn GitHub Copilot into OpenAI/Anthropic API compatible server. Usable with Claude Code!

So like I said, it’s just Q&A…

To make it work, it basically goes like this: open the OpenAI API settings inside Cursor, paste your provider's API key there, then disable every model currently listed in Cursor and add the name of the model you want to the model list.
Once you flip the switch to enable it, you'll notice the models enable themselves and start working with all of Cursor's tool calls, provided the model you added is compatible. If it isn't compatible, it won't work.
For example, GLM 4.6, or models like Qwen, MiniMax, and others already support tool calling, so they will use Cursor's tools by default.
Finally, all you need to do is add a prompt explaining how the tool calls should be made. After that, it's very simple: they'll work normally.
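The setup described above boils down to pointing Cursor's OpenAI-compatible settings at another provider. As a rough sketch of what "the model must be compatible with tool calls" means in practice (the base URL, API key, model name, and tool definition below are all placeholders, not real Cursor internals): the provider has to accept the `tools` field of an OpenAI-style chat completion request and return `tool_calls` in its responses.

```python
import json

# Placeholder endpoint and key -- substitute your provider's actual values.
BASE_URL = "https://example-provider.com/v1"  # what you'd set as "OpenAI Base URL" in Cursor
API_KEY = "sk-..."                            # your provider's API key

def build_tool_call_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat payload of the kind agent clients send.

    A model only works as an agent if the provider accepts this `tools`
    field and emits `tool_calls` in its completions; otherwise the client
    falls back to plain text and tool use silently fails.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    # Illustrative tool, not an actual Cursor tool name.
                    "name": "read_file",
                    "description": "Read a file from the workspace.",
                    "parameters": {
                        "type": "object",
                        "properties": {"path": {"type": "string"}},
                        "required": ["path"],
                    },
                },
            }
        ],
    }

payload = build_tool_call_request("kimi-k2-thinking", "Open README.md")
print(json.dumps(payload, indent=2))
```

Sending this payload to the provider's `/v1/chat/completions` endpoint and checking whether the response contains `tool_calls` is a quick way to verify compatibility before adding the model in Cursor.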


My friend, I really think you didn't watch the video, so you're missing what happened. Didn't you see that I used the debug mode available inside Cursor? It interacted directly with my whole IDE, manipulating files, making specific corrections, and performing tool calls, all 100% autonomously.

Open source FTW baby.

We can’t understand anything you’re saying. Can’t you simply use AI to translate your messages instead of forcing an Anglo-Spanish mix on us?

When we ask how you do it, could you be any more imprecise and brief? /s
You mention 5,000 requests per day for open-source models. What would it cost you to share the actual API endpoint address and the provider? We assume it's Chutes.ai.

Your "proof" videos never show K2 Thinking running in Cursor, only in Roo Code. And the second video shows MiniMax M2 in Cursor, which is not the subject of this topic.