I'd like to know whether there is any information available on this topic, or any plans to implement it in the near future:
This August, xAI released a beta API program for CodeGrok and Grok 2. During my testing and evaluation of the model on X's platform, I found it quite impressive and comparable to GPT-4/4o. While I have not yet had the opportunity to test CodeGrok, I believe it would be beneficial to offer it as a premium usage option.
Grok isn't currently available natively in Cursor, but you can use the OpenRouter connector by supplying the OpenRouter URL as the "Custom OpenAI URL" in Cursor's settings. You can then add Grok as an LLM option in Cursor.
Note that this bypasses Cursor's subscription entirely, and your queries will be billed via OpenRouter at their rates.
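For reference, here is a minimal sketch of the request Cursor effectively sends once the Custom OpenAI URL points at OpenRouter. The base URL comes from OpenRouter's documentation; the `x-ai/grok-beta` model id is an assumption to verify against their current model list.

```python
# Minimal sketch of an OpenAI-compatible chat request routed through
# OpenRouter. The "x-ai/grok-beta" model id is an assumption -- check
# OpenRouter's model list before relying on it.
import json

OPENROUTER_BASE = "https://openrouter.ai/api/v1"  # goes in "Custom OpenAI URL"

def build_chat_request(api_key: str, prompt: str, model: str = "x-ai/grok-beta"):
    """Return (url, headers, body) for a chat-completions call."""
    url = f"{OPENROUTER_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # your OpenRouter key, not a Cursor key
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body
```

Because the request shape is the standard OpenAI one, only the URL and key change on Cursor's side; billing then happens on the OpenRouter side.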
The API is public today with free credits.
It’s fully compatible with the OpenAI REST API, so hopefully not too difficult to support natively. https://docs.x.ai/api/endpoints
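To illustrate the compatibility claim, here is a hedged sketch: the same OpenAI-style payload targets either provider, and only the base URL, key, and model string differ. The xAI base URL is taken from the docs linked above; the `grok-beta` model id is an assumption to check against the current documentation.

```python
# Sketch: one OpenAI-style payload, two providers. Only the base URL and
# model name change. xAI base URL per https://docs.x.ai/api/endpoints;
# the "grok-beta" model id is an assumption to verify.

def chat_payload(model: str, prompt: str) -> dict:
    """Standard OpenAI chat-completions body shared by both APIs."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def endpoint(base_url: str) -> str:
    """Append the standard chat-completions path to a provider base URL."""
    return f"{base_url.rstrip('/')}/chat/completions"

# Identical request shape, different target:
openai_url = endpoint("https://api.openai.com/v1")
xai_url = endpoint("https://api.x.ai/v1")
openai_body = chat_payload("gpt-4o", "Write a haiku about GPUs.")
xai_body = chat_payload("grok-beta", "Write a haiku about GPUs.")
```

Since every field other than the model name is identical, native support should mostly be a matter of adding the base URL and key handling.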
With Grok 3, trained on the world's largest GPU cluster, expected by EOY (Elon Time), it would be nice to get a performance baseline with Grok Beta (the successor to Grok 2).
From xAI's "API General Access" announcement:

> November 4, 2024 — API Public Beta
> Starting today, developers can build on our Grok foundation models using our newly released API. We will run a public beta program until the end of 2024 during which everyone will get $25 of free API credits per month.
So here we are 5 months later, a few weeks away from the release of “scary smart” Grok 3, trained on the world’s largest supercomputer, yet Cursor still doesn’t support xAI.
Elon Musk: "Grok 3 has very powerful reasoning capabilities, so in the tests that we've done thus far, Grok 3 is outperforming anything that's been released, that we're aware of, so that's a good sign."
"We think it'll be better than anything else, and then maybe this might be the last time that any AI is better than Grok."
Hacking in a custom OpenAI URL is not an acceptable solution, because it bypasses the subscription users are paying for and requires fiddling with the settings and disabling all other models to make it work.
Why not just add a separate OpenAI-compatible API endpoint for models without native support, one that doesn't bypass our subscription?
I'd love to know why there's an option for Azure, which maybe one person uses, but not OpenRouter?