Custom model section in Settings > Models

feature request: ability to add custom models (such as those from OpenRouter, Groq, etc.) and still use the built-in Cursor models.

suggestion:
in Settings > Models, add a “Custom Model” section with:

  • base URL
  • model name
  • API key

this could be a catch-all for OpenRouter and similar services that expose a chat/completions endpoint. (no more hacking the “OpenAI” section to use OpenRouter etc., and no more missing out on the proprietary Cursor sauce just because we want to add a model of our own.)

it's fine to assume the endpoint must be OpenAI-client compatible.
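Under that assumption, a custom model entry would only need those three fields to issue a standard chat/completions call. A minimal sketch (the base URL, API key, and model name below are placeholders, using an OpenRouter-style URL):

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble the URL, headers, and JSON body for an
    OpenAI-compatible chat/completions call."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

# Placeholder values, for illustration only:
url, headers, body = build_chat_request(
    "https://openrouter.ai/api/v1", "sk-placeholder",
    "meta-llama/llama-3-70b", "hello",
)
```

Since the request shape is identical across these providers, the same three settings fields would cover OpenRouter, Groq, an Azure-style gateway, or anything else speaking the OpenAI wire format.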

related: Openrouter.ai stopped working - #12 by deanrie


this is what it should look like: [screenshot omitted]


This isn’t the first request for this I’ve seen on the forum, and I’ve logged it on our internal feedback log, so you may see this in a future update!


I could not upvote this enough! Please, this would be really appreciated. Thanks!


especially considering all of the issues you’re having with Anthropic keeping up with Cursor’s usage, it would really do us all a favor if you made it easier to access other models.

Sonnet is particularly slow today (despite us being an enterprise customer with the extra add-on payments set up)… and it really makes me want to switch to a different product, just because you’re being strangled by your own popularity.

cc @danperks


soo many Anthropic errors today

+1000 for this one. I was testing a custom model deployed on our intranet but could not get it working. The current approach of “reusing” the OpenAI API credentials is very confusing. I also built a simple wrapper to make the APIs compatible, but even though the cURL request displayed by Cursor succeeded in my terminal, Cursor would still throw an error and fail to use the model.

I hope you can improve this setup in the coming weeks 🙂, that would be awesome ❤️
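A compatibility wrapper like the one described above would mostly be response translation: mapping the custom API's output into the OpenAI chat/completions response shape. A minimal sketch, where the intranet response format (`{"answer": ...}`) is purely hypothetical:

```python
def to_openai_format(custom_response: dict, model: str) -> dict:
    """Translate a hypothetical intranet model response into the
    OpenAI chat/completions response shape that clients expect."""
    return {
        "object": "chat.completion",
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {
                    "role": "assistant",
                    "content": custom_response["answer"],
                },
                "finish_reason": "stop",
            }
        ],
    }
```

In practice the wrapper also has to handle streaming (`data:` server-sent events) and error bodies, which is often where clients reject an otherwise-working endpoint.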


Hey, just to clarify, Cursor does not work with locally hosted models, as Cursor’s servers need to be able to access the model for everything to work correctly!

However, we are hoping to improve the UX around custom models within the editor soon.


Thanks for the clarification, I understand now. Maybe it would be possible via some kind of proxy installed as a Cursor extension. You could even encrypt some parts if you’re concerned about know-how, and then just call a privately deployed model. This would definitely be interesting for business/enterprise customers.

i think the solution for this is:

  • when the user can choose a model, they can fully choose the model and the provider (i.e., the user can also choose their own base_url, api_key, model_id)
  • when there is a proprietary model doing something cool in the background that the user isn’t aware of anyway, just have cursor do that. (we don’t need to see or replace the secret fast file-editing llm.)

if the user really wants to host locally, they need to run their own proxy server (hence their own base_url, api_key, model_id), so cursor’s servers can reach it
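The core of such a proxy is small: check a private API key, then forward the chat/completions request to the locally hosted model. A minimal sketch of that forwarding logic, where the local model URL (an Ollama-style port) and the key are hypothetical placeholders:

```python
import urllib.request

# Hypothetical values: the locally hosted model's OpenAI-compatible
# endpoint and the key the proxy accepts from Cursor's servers.
LOCAL_MODEL_URL = "http://127.0.0.1:11434/v1/chat/completions"
PROXY_API_KEY = "my-private-key"

def build_upstream_request(headers: dict, body: bytes) -> urllib.request.Request:
    """Validate the caller's API key, then build the POST request the
    proxy would forward to the locally hosted model."""
    auth = headers.get("Authorization", "")
    if auth != f"Bearer {PROXY_API_KEY}":
        raise PermissionError("invalid API key")
    return urllib.request.Request(
        LOCAL_MODEL_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

The proxy itself would just wrap this in any HTTP server reachable from the public internet, which is the part Cursor's servers actually need.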
