Using Gemini 3 Pro Preview with OpenRouter Fails

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

After submitting a prompt (Ask mode) with Gemini 3 Pro Preview (through OpenRouter), I get an error popup stating “Unable to reach the model provider” (see the error JSON below); however, OpenRouter is charging me for the request and generating tokens. Using a different model through OpenRouter with the same prompt and the same OpenRouter API key works fine, so I know the API key is good.

Steps to Reproduce

Submit a prompt.
The agent displays “Explored 3 files” under the prompt. Nothing else happens until I click in the Cursor chat pane, at which point the “Unable to reach the model provider” dialog appears at the bottom of the pane (see screenshot).

Expected Behavior

The agent should do the work and output the results under the prompt.

Screenshots / Screen Recordings

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.1.50 (Universal)
VSCode Version: 1.105.1
Commit: 56f0a83df8e9eb48585fcc4858a9440db4cc7770
Date: 2025-12-06T23:39:52.834Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Darwin arm64 23.6.0

For AI issues: which model did you use?

Gemini 3 Pro Preview with OpenRouter

For AI issues: add Request ID with privacy disabled

0998c0ef-e0bd-434a-8086-24c1c828e2ef

Additional Information

This is the JSON copied from the error dialog:

Request ID: 0998c0ef-e0bd-434a-8086-24c1c828e2ef
{"error":"ERROR_OPENAI","details":{"title":"Unable to reach the model provider","detail":"We encountered an issue when using your API key: Provider was unable to process your request\n\nAPI Error:\n\n\nRequest failed with status code 400: {\"error\":{\"message\":\"Provider returned error\",\"code\":400,\"metadata\":{\"raw\":\"Gemini models require OpenRouter reasoning details to be preserved in each request. Please refer to our docs: https://openrouter.ai/docs/guides/best-practices/reasoning-tokens#preserving-reasoning-blocks. Upstream error: {\\n \\\"error\\\": {\\n \\\"code\\\": 400,\\n \\\"message\\\": \\\"Function call is missing a thought_signature in functionCall parts. This is required for tools to work correctly, and missing thought_signature may lead to degraded model performance. Additional data, function call `default_api:list_dir` , position 2. Please refer to https://ai.google.dev/gemini-api/docs/thought-signatures for more details.\\\",\\n \\\"status\\\": \\\"INVALID_ARGUMENT\\\"\\n }\\n}\\n\",\"provider_name\":\"Google AI Studio\"}},\"user_id\":\"HIDDEN_FOR_PRIVACY\"}\n","additionalInfo":{},"buttons":,"planChoices":},"isExpected":true}
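For context on what the error is asking for: per the OpenRouter docs linked above, when a Gemini model returns a tool call through OpenRouter, the assistant message also carries a `reasoning_details` field (which wraps Gemini's `thought_signature`), and the client must echo that field back unchanged in the follow-up request. The sketch below illustrates that preservation step with hypothetical message contents; the field names follow the OpenRouter chat-completions format, but this is not Cursor's actual code.

```python
# Minimal sketch of preserving `reasoning_details` across a tool-call turn
# when talking to Gemini via OpenRouter. Dropping this field from the
# assistant message is what triggers the upstream 400
# ("Function call is missing a thought_signature").
# All message contents below are hypothetical placeholders.

def build_followup_request(history, assistant_msg, tool_result):
    """Append the assistant turn (keeping reasoning_details) and the tool
    result, then return the payload for the next chat-completions call."""
    preserved = {
        "role": "assistant",
        "content": assistant_msg.get("content"),
        "tool_calls": assistant_msg.get("tool_calls"),
    }
    # The critical step: echo reasoning_details back verbatim.
    if "reasoning_details" in assistant_msg:
        preserved["reasoning_details"] = assistant_msg["reasoning_details"]
    messages = history + [preserved, tool_result]
    return {"model": "google/gemini-3-pro-preview", "messages": messages}

# Hypothetical assistant turn as OpenRouter might return it:
assistant = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{"id": "call_1", "type": "function",
                    "function": {"name": "list_dir", "arguments": "{}"}}],
    "reasoning_details": [{"type": "reasoning.encrypted",
                           "data": "<opaque thought_signature blob>"}],
}
tool_msg = {"role": "tool", "tool_call_id": "call_1", "content": "src/ docs/"}
payload = build_followup_request(
    [{"role": "user", "content": "List the files"}], assistant, tool_msg)
```

A client that rebuilds the assistant message from only `content` and `tool_calls` (as many OpenAI-compatible integrations do) silently discards the signature, which would explain why the request is billed but then rejected upstream.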

Does this stop you from using Cursor

Yes - Cursor is unusable

Hey, thanks for the report. OpenRouter isn’t officially supported in Cursor, so new models like Gemini 3 Pro Preview via OpenRouter often fail with request format errors (for example, a required thought_signature). Confirmation and details here: Cursor is practically unusable with any new model through OpenRouter

What you can do:

  • Use built-in Cursor Pro models
  • Or add an official provider API key in Settings → Models: Google (for Gemini), Anthropic, OpenAI, Azure, AWS Bedrock - API Keys | Cursor Docs

Could you try a Google AI key for Gemini and let us know if it helps? If the issue remains after that, please share a screenshot of the error and the new Request ID.

Sorry, I should have caught this myself.

We are a startup and need not only to control our costs but also to have fine-grained control over models.

Thanks for your help and responsiveness; greatly appreciated!

Kelly K.

I am sorry, but even the older models from OpenRouter don’t work! You say this is specific to new models, but the Gemini 2.5 models don’t work, and neither do Sonnet 4 or Opus 4.1; the list is long.

Cursor doesn’t support OpenRouter; I haven’t found a single model that works! And to answer your request: we can’t use the providers’ keys or Azure/Bedrock, as we only have access to OpenRouter via external projects.

This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.