Everything was working fine, then on one of the messages it threw an error. Now the key doesn't pass the check. I tried creating a new key; nothing changed.
Hey, try disabling all other models and keeping only the one you added manually. Based on the screenshot, the error seems to have been caused by the cursor-small model.
How is cursor-small related to OpenAI?
This is our own model that’s included in the subscription.
I know that. I've been on the Pro plan with you for about two months now, with a paid subscription. What I mean is: what does a third-party model added via an API key have to do with this model?
I disabled cursor-small. And that helped. I’ll enable it when needed. Thank you.
New problem: after some time it stopped working again. It runs fine for a while, then stops on its own.
I just tested this and didn’t encounter the error. However, I’ve seen this issue discussed on the forum, and the suggested solution was to disable models from your subscription.
I’ll check this.
Are you trying to use this model in Composer? At the moment, your API key is only functional in the chat.
Yes, in Composer. Not in the agent mode, the standard one. So it works for a while, then breaks. Everything worked fine before. Why is that?
I want to use my OpenRouter AI key with my new WordPress site for blogging automation, but every time the plugin processes a request, I get this error: "OpenAI API error: License is not valid, please visit the plugin settings page and add your correct purchase code. No links found for this keyword."
I added $5 to my OpenRouter account, yet it still doesn't work. What can I do to resolve this issue?
Hey, try disabling all models and keeping only the ones related to OpenRouter.
Feature request: the ability to add OpenRouter models and still use the built-in Cursor models. Preferably there could just be a "custom model" section with: base URL, model name, API key. This could be a catch-all for OpenRouter and similar services (not having to hack the "OpenAI" section, and not missing out on the proprietary Cursor sauce just because you want to add a model of your own).
Edit: please upvote, all y'all OpenRouter peeps!
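To make the request concrete, the proposed "custom model" section could look something like this. This is a purely hypothetical settings sketch based on the three fields named above (base URL, model name, API key), not an existing Cursor configuration format:

```json
{
  "customModels": [
    {
      "name": "my-openrouter-model",
      "baseUrl": "https://openrouter.ai/api/v1",
      "model": "anthropic/claude-3.5-sonnet",
      "apiKey": "sk-or-..."
    }
  ]
}
```

The point is that each custom entry carries its own base URL and key, so adding one would not override the routing for the built-in models.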
This is mostly caused by leaving OpenAI models selected. If I keep only Anthropic or Google models enabled, there's no error. Please consider separating the custom inference API configuration from the OpenAI API configuration. This is kind of an urgent feature tbh.
Now that I double-check, this stops working; it isn't even working with non-OpenAI models. Even with the OpenAI key disabled (and its base URL modified to point to OpenRouter), trying to use Google models still fails, because Cursor keeps using the custom OpenAI base URL. In other words, once you modify the base URL, Cursor always routes requests through it instead of using the appropriate endpoints for Anthropic, Google, etc.
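For anyone debugging this: OpenRouter exposes an OpenAI-compatible endpoint, so the same chat-completions request shape works against it just by swapping the base URL. A minimal sketch of how such a request is assembled (the key and model name below are placeholders, and the helper function is mine, not a Cursor or OpenRouter API):

```python
# Sketch of an OpenAI-compatible chat request aimed at OpenRouter.
# The API key and model name are placeholders, not working values.

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Return the (url, headers, body) triple for a chat-completions call."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, body

# Everything hinges on base_url: a client that hard-codes one base URL for
# all providers will send Anthropic/Google traffic there too, which matches
# the behavior described above.
url, headers, body = build_chat_request(
    "https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    "sk-or-...",                     # placeholder key
    "google/gemini-pro",             # OpenRouter-style provider/model id
    "Hello",
)
```

This is why pointing the OpenAI base URL at OpenRouter hijacks every provider's requests: the URL, not the model name, decides where the call goes.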
Same error here as well, with only OpenRouter models selected.
Hey, which models are you using?