OpenRouter gemini-2.5-pro is invalid

I got this error when using OpenRouter gemini-2.5-pro:

Request failed with status code 400: {"error":{"message":"gemini-2.5-pro-exp-03-25:with_thoughts is not a valid model ID","code":400},"user_id":"user_2qIKG6H58ZFfBNAdmxx8hgcJsDa"}

3 Likes

@deanrie

You need to disable the other models and enable the custom OpenAI settings.


Hey, add this base URL, add your OpenRouter API key, then enable and verify the key.

Also try disabling all non-custom models in your models list, then verify again.
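For anyone who wants to check their key outside of Cursor first, here is a minimal sketch of the OpenAI-compatible request that a verify step would make against OpenRouter. The base URL is OpenRouter's documented API root; the key value is a placeholder, and `build_chat_request` is just an illustrative helper name (the request is constructed but not sent).

```python
# Sketch of an OpenAI-style chat completion request aimed at OpenRouter.
# API_KEY is a placeholder; substitute your own OpenRouter key before sending.
import json
import urllib.request

BASE_URL = "https://openrouter.ai/api/v1"  # base URL to paste into Cursor
API_KEY = "sk-or-..."                      # placeholder, not a real key

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a chat completion request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("google/gemini-2.5-pro-exp-03-25:free", "ping")
print(req.full_url)  # https://openrouter.ai/api/v1/chat/completions
```

Sending this with a valid key (e.g. via `urllib.request.urlopen(req)`) should return 200 for a model ID OpenRouter recognizes, which helps separate a key problem from a model-name problem.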

I think I configured it correctly, because it worked yesterday and now it doesn't.

I just found from the error message that Cursor changes my model: I entered the model name as google/gemini-2.5-pro-exp-03-25:free, but the error message shows it becomes gemini-2.5-pro-preview-03-25:with_thoughts.

Error message:

Request failed with status code 400: {"error":{"message":"gemini-2.5-pro-preview-03-25:with_thoughts is not a valid model ID","code":400},"user_id":"user_2qIKG6H58ZFfBNAdmxx8hgcJsDa"}

Select your Gemini model here too, then try again.


I think I chose the right model.

If I choose another model, it works, so I think there is something wrong with gemini-2.5-pro in Cursor.

Yeah, then it might be an issue with the Gemini model.

But I can request OpenRouter gemini-2.5-pro from other chatbot platforms; are you sure this is not a bug in Cursor?

It could be; I faced these kinds of issues a few days back.

I just found that the built-in gemini-2.5-pro in Cursor now outputs reasoning content, and it didn't before. Maybe this change breaks the OpenRouter gemini-2.5-pro behavior.

1 Like

If there is any progress, please let me know.

Any progress? @deanrie

1 Like

Any progress? @deanrie

Hey, we’re still looking into this but can’t see a reason why this model in particular is not working!

Thank you for your hard work on this issue. I believe the error occurs because Cursor is appending :with_thoughts after the actual model name.
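To make the suspected bug concrete: a sketch of what a fix or proxy workaround could look like, stripping a suffix that OpenRouter does not recognize before forwarding the model ID. The set of accepted suffixes and the function name here are assumptions for illustration, not Cursor's or OpenRouter's actual code.

```python
# Sketch: Cursor seems to append ":with_thoughts" to the model id, which
# OpenRouter rejects. A normalizing step could drop unknown suffixes.
KNOWN_SUFFIXES = {"free", "extended"}  # assumption: variants OpenRouter accepts

def normalize_model_id(model: str) -> str:
    """Drop a trailing ":suffix" that OpenRouter would not recognize."""
    base, sep, suffix = model.rpartition(":")
    if sep and suffix not in KNOWN_SUFFIXES:
        return base  # e.g. strip ":with_thoughts"
    return model

print(normalize_model_id("gemini-2.5-pro-preview-03-25:with_thoughts"))
# gemini-2.5-pro-preview-03-25
print(normalize_model_id("google/gemini-2.5-pro-exp-03-25:free"))
# google/gemini-2.5-pro-exp-03-25:free
```

The second call shows that a legitimate variant suffix like `:free` passes through untouched, which matches the observation above that the user-entered model ID was valid before Cursor rewrote it.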

2 Likes

Same here