How to use Cursor with external models (such as Groq LLMs)?

For example:
When I use this model for code editing purposes, it gives me good results:

Meta Llama 3 70B

Can I configure Cursor to use such a model? I currently am on a free plan for Cursor. Is it possible to do that on a free plan?

how to change inference engine?

This is my exact question. Did you figure this out? If so, lmk.

Check it out: it's a template I got working with Groq in Cursor. It lets you run any LLM or provider you want. You have to do the hard work of integrating it yourself, but there are two working examples in the codebase.
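The template itself isn't linked here, but the general trick such proxies use can be sketched: accept OpenAI-style requests and forward them to another provider's OpenAI-compatible endpoint (Groq exposes one at https://api.groq.com/openai/v1). Below is a minimal sketch of just the URL/model rewriting step such a proxy performs; the model mapping and the `rewrite` helper are illustrative assumptions, not code from the template:

```python
# Hypothetical rewrite step of an OpenAI-to-Groq proxy: map an incoming
# OpenAI model name to a Groq model ID and retarget the request URL.
GROQ_BASE = "https://api.groq.com/openai/v1"  # Groq's OpenAI-compatible endpoint

# Illustrative mapping; substitute whichever Groq models you actually use.
MODEL_MAP = {"gpt-4": "llama3-70b-8192"}

def rewrite(path: str, payload: dict) -> tuple[str, dict]:
    """Return the forwarded URL plus a copy of the payload with the
    model name swapped to its Groq equivalent (pass-through if unmapped)."""
    forwarded = dict(payload)
    forwarded["model"] = MODEL_MAP.get(payload["model"], payload["model"])
    return f"{GROQ_BASE}{path}", forwarded

url, body = rewrite("/chat/completions", {"model": "gpt-4", "messages": []})
print(url)            # → https://api.groq.com/openai/v1/chat/completions
print(body["model"])  # → llama3-70b-8192
```

The actual proxy would then POST `body` to `url` with a Groq API key and stream the response back unchanged, which is what makes it look like OpenAI to Cursor.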

DOPE! Thank you!
For those who are not familiar with GitHub, here is a video on how to download and use a project from GitHub:

😜

In Cursor Settings, then Models, you have the option to change the Base URL for OpenAI. Since most other inference providers expose an OpenAI-compatible API, this one setting lets you point Cursor at any of them.

For instance, if you wanted to use OpenRouter.ai, you would add https://openrouter.ai/api/v1 as the Base URL to be used.

Once you’ve overridden the OpenAI Base URL, click where it reads + Add model and enter the model name exactly as the API you’re now querying expects it. For instance, if you wanted to use DeepSeek V3 Chat Free from OpenRouter, you would enter deepseek/deepseek-chat:free and then click + Add model again to add it.

The UI may not refresh on its own; I have to close Cursor Settings and reopen it to see the new model. Then, in the agent chat, switch to the model you’d like to use, and you’re good to go.

Thank you!!! I appreciate your response broski!
Blessings your way!

Any time.