Are you adding these by overriding the OpenAI Base URL?
Yes, 5000 requests per day for open-source models.
Could you give a documentation or guideline link for doing this? The only thing I've tried is a GitHub repository that exposes GitHub Copilot's premium/free requests as an OpenAI-compatible API so I can use it in Cursor: GitHub - ericc-ch/copilot-api: Turn GitHub Copilot into OpenAI/Anthropic API compatible server. Usable with Claude Code!
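For context, a proxy like copilot-api exposes the standard OpenAI chat-completions shape, so any OpenAI-style client works once you point it at the proxy's base URL. Here is a minimal stdlib-only sketch of what such a request looks like; the localhost port and bearer token are assumptions for illustration, not the project's documented defaults (check its README):

```python
import json
import urllib.request

# Hypothetical local proxy exposing an OpenAI-compatible endpoint.
BASE_URL = "http://localhost:4141/v1"  # assumed port, not an official default

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build (but do not send) a standard OpenAI-style chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer dummy-token",  # the proxy may ignore or require this
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello")
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) only works once the proxy is actually running locally.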
So like I said, it's just Q&A…
To make it work, it basically goes like this: go into the OpenAI API settings inside Cursor, paste your API key there, then disable all of the models currently listed in Cursor and add the name of the model you want to use.
Once you flip the switch to enable it, you'll notice the models auto-enable themselves and start working with all of Cursor's tool calls, provided the model you added is compatible. If it isn't compatible, it won't work.
For example, GLM 4.6, or models like Qwen, MiniMax, and others already support tool calls, so they will use Cursor's tools by default.
Finally, all you need to do is add a prompt explaining how the tool calls should be made. After that it's very simple; they'll work normally.
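As a rough sketch of that last step, this is what an OpenAI-style tool declaration plus a short system prompt nudging the model toward it could look like. The tool name, prompt wording, and model name here are all made up for illustration and are not Cursor's actual internals:

```python
import json

# Hypothetical tool schema in the OpenAI function-calling format.
tools = [{
    "type": "function",
    "function": {
        "name": "read_file",  # illustrative name, not Cursor's real tool
        "description": "Read a file from the workspace.",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

# Short system prompt telling the model how tool calls should be made.
system_prompt = (
    "When you need information from the workspace, respond with a tool call "
    "using the declared functions instead of guessing file contents."
)

request_body = {
    "model": "your-model-name",  # whatever model name you added in the settings
    "messages": [{"role": "system", "content": system_prompt}],
    "tools": tools,
    "tool_choice": "auto",
}
print(json.dumps(request_body, indent=2))
```

A model that supports function calling will then return structured `tool_calls` in its response instead of plain text; one that doesn't will simply ignore the schema, which matches the compatibility caveat above.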
Open source FTW baby.
We can't understand anything you're saying. Can't you simply use AI to translate your messages instead of forcing an Anglo-Spanish mix on us?
When we ask you how you do it, could you be any more imprecise and brief? /s
You mention 5000 requests per day for open-source models. What would it cost you to provide the actual API endpoint address and the provider? We imagine it's Chutes.ai here.
Your "proof" videos never show K2 Thinking working in Cursor, only in Roo Code. Then the second video shows us MiniMax M2 in Cursor, which is not the subject of this topic.