Wrong tools handling on OpenAI compatible endpoint

Where does the bug appear (feature/product)?

Cursor IDE

Describe the Bug

Cursor sends an Anthropic-style request to an OpenAI-compatible endpoint when using an Anthropic model.

Steps to Reproduce

  1. Set up a LiteLLM proxy
  2. Expose Anthropic models from Vertex AI
  3. Use the endpoint in Cursor and invoke tool use
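For context, a LiteLLM proxy exposing a Vertex-hosted Anthropic model through an OpenAI-compatible endpoint is typically configured like this. The model name, Vertex model ID, project, and region below are placeholders, not the actual config from this report:

```yaml
# Hypothetical config.yaml for a LiteLLM proxy routing an OpenAI-compatible
# endpoint to an Anthropic model hosted on Vertex AI.
model_list:
  - model_name: claude-4.5-opus-high-thinking   # name Cursor requests, mapped 1:1
    litellm_params:
      model: vertex_ai/claude-opus-4-5          # placeholder Vertex model ID
      vertex_project: my-gcp-project            # placeholder GCP project
      vertex_location: us-east5                 # placeholder region
```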

Expected Behavior

No error. Cursor should send OpenAI-format payloads to an OpenAI-compatible endpoint, and the endpoint should return OpenAI-format responses.

Operating System

macOS

Current Cursor Version (Menu → About Cursor → Copy)

Version: 2.2.0-pre.31.patch.0
VSCode Version: 1.105.1
Commit: 638f7f37c8f4b6a0323f5fd46b815c78185bd580
Date: 2025-12-08T08:24:14.597Z
Electron: 37.7.0
Chromium: 138.0.7204.251
Node.js: 22.20.0
V8: 13.8.258.32-electron.0
OS: Darwin arm64 25.0.0

For AI issues: which model did you use?

Claude 4.5 opus / sonnet / haiku

For AI issues: add Request ID with privacy disabled

Request ID: 95222a4f-8abd-431a-a5b8-549c887a5ed3
{“error”:“ERROR_OPENAI”,“details”:{“title”:“Unable to reach the model provider”,“detail”:“We encountered an issue when using your API key: Provider was unable to process your request\n\nAPI Error:\n\n\nRequest failed with status code 500: {\"error\":{\"message\":\"litellm.APIConnectionError: Invalid user message={'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_vrtx_01FFHWXQRAEDhjwwmqWXF8f6', 'content': [{'type': 'text', 'text': '/\\\\n - user-GitLab/\\\\n - tools/\\\\n - create_issue.json\\\\n - create_merge_request.json\\\\n - create_workitem_note.json\\\\n - get_issue.json\\\\n - get_mcp_server_version.json\\\\n - get_merge_request_commits.json\\\\n - get_merge_request_diffs.json\\\\n - get_merge_request_pipelines.json\\\\n - get_merge_request.json\\\\n - get_pipeline_jobs.json\\\\n - get_workitem_notes.json\\\\n - gitlab_search.json\\\\n - semantic_code_search.json\\\\n - user-XcodeBuildMCP/\\\\n - resources/\\\\n - devices.json\\\\n - doctor.json\\\\n - simulators.json\\\\n - tools/\\\\n - boot_sim.json\\\\n - build_device.json\\\\n - build_macos.json\\\\n - build_run_macos.json\\\\n - build_run_sim.json\\\\n - build_sim.json\\\\n - button.json\\\\n - clean.json\\\\n - describe_ui.json\\\\n - discover_projs.json\\\\n - doctor.json\\\\n - erase_sims.json\\\\n - gesture.json\\\\n - get_app_bundle_id.json\\\\n - get_device_app_path.json\\\\n - get_mac_app_path.json\\\\n - get_mac_bundle_id.json\\\\n - get_sim_app_path.json\\\\n - install_app_device.json\\\\n - install_app_sim.json\\\\n - key_press.json\\\\n - key_sequence.json\\\\n - launch_app_device.json\\\\n - launch_app_logs_sim.json\\\\n - launch_app_sim.json\\\\n - launch_mac_app.json\\\\n - list_devices.json\\\\n - list_schemes.json\\\\n - list_sims.json\\\\n - long_press.json\\\\n - open_sim.json\\\\n - record_sim_video.json\\\\n - reset_sim_location.json\\\\n - scaffold_ios_project.json\\\\n - scaffold_macos_project.json\\\\n - screenshot.json\\\\n - set_sim_appearance.json\\\\n 
- set_sim_location.json\\\\n - show_build_settings.json\\\\n - sim_statusbar.json\\\\n - start_device_log_cap.json\\\\n - start_sim_log_cap.json\\\\n - stop_app_device.json\\\\n - stop_app_sim.json\\\\n - stop_device_log_cap.json\\\\n - stop_mac_app.json\\\\n - stop_sim_log_cap.json\\\\n - swift_package_build.json\\\\n - swift_package_clean.json\\\\n - swift_package_list.json\\\\n - swift_package_run.json\\\\n - swift_package_stop.json\\\\n - swift_package_test.json\\\\n - swipe.json\\\\n - tap.json\\\\n - test_device.json\\\\n - test_macos.json\\\\n - test_sim.json\\\\n - touch.json\\\\n - type_text.json\\\\n'}], 'cache_control': {'type': 'ephemeral'}}]} at index 6. Please ensure all user messages are valid OpenAI chat completion messages.\\nTraceback (most recent call last):\\n File \\\"/usr/lib/python3.13/site-packages/litellm/utils.py\\\", line 6956, in validate_chat_completion_user_messages\\n raise Exception(\\\"invalid content type\\\")\\nException: invalid content type\\n\\nDuring handling of the above exception, another exception occurred:\\n\\nTraceback (most recent call last):\\n File \\\"/usr/lib/python3.13/site-packages/litellm/main.py\\\", line 591, in acompletion\\n init_response = await loop.run_in_executor(None, func_with_context)\\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\\n File \\\"/usr/lib/python3.13/concurrent/futures/thread.py\\\", line 59, in run\\n result = self.fn(*self.args, **self.kwargs)\\n File \\\"/usr/lib/python3.13/site-packages/litellm/utils.py\\\", line 1103, in wrapper\\n result = original_function(*args, **kwargs)\\n File \\\"/usr/lib/python3.13/site-packages/litellm/main.py\\\", line 1041, in completion\\n messages = validate_and_fix_openai_messages(messages=messages)\\n File \\\"/usr/lib/python3.13/site-packages/litellm/utils.py\\\", line 6903, in validate_and_fix_openai_messages\\n return validate_chat_completion_user_messages(messages=new_messages)\\n File 
\\\"/usr/lib/python3.13/site-packages/litellm/utils.py\\\", line 6963, in validate_chat_completion_user_messages\\n raise Exception(\\n f\\\"Invalid user message={m} at index {idx}. Please ensure all user messages are valid OpenAI chat completion messages.\\\"\\n )\\nException: Invalid user message={'role': 'user', 'content': [{'type': 'tool_result', 'tool_use_id': 'toolu_vrtx_01FFHWXQRAEDhjwwmqWXF8f6', 'content': [{'type': 'text', 'text': '\\\n - user-GitLab/\\\\n - tools/\\\\n - create_issue.json\\\\n - create_merge_request.json\\\\n - create_workitem_note.json\\\\n - get_issue.json\\\\n - get_mcp_server_version.json\\\\n - get_merge_request_commits.json\\\\n - get_merge_request_diffs.json\\\\n - get_merge_request_pipelines.json\\\\n - get_merge_request.json\\\\n - get_pipeline_jobs.json\\\\n - get_workitem_notes.json\\\\n - gitlab_search.json\\\\n - semantic_code_search.json\\\\n - user-XcodeBuildMCP/\\\\n - resources/\\\\n - devices.json\\\\n - doctor.json\\\\n - simulators.json\\\\n - tools/\\\\n - boot_sim.json\\\\n - build_device.json\\\\n - build_macos.json\\\\n - build_run_macos.json\\\\n - build_run_sim.json\\\\n - build_sim.json\\\\n - button.json\\\\n - clean.json\\\\n - describe_ui.json\\\\n - discover_projs.json\\\\n - doctor.json\\\\n - erase_sims.json\\\\n - gesture.json\\\\n - get_app_bundle_id.json\\\\n - get_device_app_path.json\\\\n - get_mac_app_path.json\\\\n - get_mac_bundle_id.json\\\\n - get_sim_app_path.json\\\\n - install_app_device.json\\\\n - install_app_sim.json\\\\n - key_press.json\\\\n - key_sequence.json\\\\n - launch_app_device.json\\\\n - launch_app_logs_sim.json\\\\n - launch_app_sim.json\\\\n - launch_mac_app.json\\\\n - list_devices.json\\\\n - list_schemes.json\\\\n - list_sims.json\\\\n - long_press.json\\\\n - open_sim.json\\\\n - record_sim_video.json\\\\n - reset_sim_location.json\\\\n - scaffold_ios_project.json\\\\n - scaffold_macos_project.json\\\\n - screenshot.json\\\\n - set_sim_appearance.json\\\\n - 
set_sim_location.json\\\\n - show_build_settings.json\\\\n - sim_statusbar.json\\\\n - start_device_log_cap.json\\\\n - start_sim_log_cap.json\\\\n - stop_app_device.json\\\\n - stop_app_sim.json\\\\n - stop_device_log_cap.json\\\\n - stop_mac_app.json\\\\n - stop_sim_log_cap.json\\\\n - swift_package_build.json\\\\n - swift_package_clean.json\\\\n - swift_package_list.json\\\\n - swift_package_run.json\\\\n - swift_package_stop.json\\\\n - swift_package_test.json\\\\n - swipe.json\\\\n - tap.json\\\\n - test_device.json\\\\n - test_macos.json\\\\n - test_sim.json\\\\n - touch.json\\\\n - type_text.json\\\\n'}], 'cache_control': {'type': 'ephemeral'}}]} at index 6. Please ensure all user messages are valid OpenAI chat completion messages.\\n. Received Model Group=claude-4.5-opus-high-thinking\\nAvailable Model Group Fallbacks=None\",\"type\":null,\"param\":null,\"code\":\"500\"}}\n”,“additionalInfo”:{},“buttons”:,“planChoices”:},“isExpected”:true}
ConnectError: [unauthenticated] Error
at BHc.$endAiConnectTransportReportError (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12147:454002)
at HFo._doInvokeHandler (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12799:22831)
at HFo._invokeHandler (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12799:22573)
at HFo._receiveRequest (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12799:21335)
at HFo._receiveOneMessage (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12799:20152)
at gLt.value (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12799:18244)
at Te._deliver (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:49:2962)
at Te.fire (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:49:3283)
at Gpt.fire (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:12132:12156)
at MessagePort. (vscode-file://vscode-app/Applications/Cursor.app/Contents/Resources/app/out/vs/workbench/workbench.desktop.main.js:14817:18433)

Does this stop you from using Cursor?

Yes - Cursor is unusable

Hey, thanks for the report.

Looks like Cursor is sending an Anthropic-style tool_result to an OpenAI endpoint, and LiteLLM crashes on it. To reproduce, we need details about your setup:

  • LiteLLM config (version, routing for Anthropic via Vertex)
  • Base URL and model in Cursor Settings > Models
  • LiteLLM logs with the incoming request from Cursor
  • Cursor console (Help > Toggle Developer Tools) at the time of the error

Temporary workarounds:

  • Use a direct Anthropic API key in Cursor: API Keys | Cursor Docs
  • Or switch to an OpenAI model through your endpoint
  • Temporarily disable MCP servers (Settings > Features > disable MCP)

Send the logs and config so we can move forward.

Yep, exactly: when using an OpenAI endpoint, Cursor should send OpenAI-compatible payloads. One solution here would be to introduce a Vertex AI integration, since you already have Azure, Bedrock, Anthropic, OpenAI, Google API… take a look at OpenCode; it’s open-source, so you could see how it’s done.

  • LiteLLM config (version, routing for Anthropic via Vertex)

    • It routes through an OpenAI-compatible LiteLLM endpoint to Vertex AI-hosted Anthropic models. I do not have an Anthropic API key, nor a way to obtain one.
  • Base URL and model in Cursor Settings > Models

    • Can’t post it publicly, as the LiteLLM instance is hosted internally by my company (with a public endpoint, of course). Models are mapped 1:1 to what Cursor uses, for example: claude-4.5-opus-high-thinking
  • LiteLLM logs with the incoming request from Cursor

    • Sure, I’ll paste examples below
  • Cursor console (Help > Toggle Developer Tools) at the time of the error

    • Sure, I’ll paste example below

Temporary workarounds:

  • Use a direct Anthropic API key in Cursor: API Keys | Cursor Docs

    • As mentioned, I have no way to obtain an Anthropic API key
  • Or switch to an OpenAI model through your endpoint

    • I use LiteLLM specifically to access my Anthropic models on Vertex AI
  • Temporarily disable MCP servers (Settings > Features > disable MCP)

    • This makes Cursor unusable as my workflow uses local tools exposed via MCP

There is a 32k-character limit, so I had to paste the logs over to Pastebin.

LiteLLM logs with the SUCCESSFUL incoming request from Cursor

LiteLLM logs with the FAILED incoming request from Cursor

Cursor logs at time of error

In short, again: the request is not OpenAI-compatible.

Thanks for the detailed logs. I can confirm the bug: with an OpenAI-compatible endpoint, Cursor sends the Anthropic-format tool_result as a user message to /v1/chat/completions after a tool_use. This causes the LiteLLM 1.80.5 validator to fail with “Invalid user message”. I’ll pass it to the team.
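For illustration, the translation Cursor (or a proxy shim) would need to perform can be sketched as below: an Anthropic-style tool_result block inside a user message becomes an OpenAI chat-completions message with role "tool". This is a hypothetical sketch, not Cursor's or LiteLLM's actual code; the function name and the flattening behavior are assumptions.

```python
def anthropic_tool_result_to_openai(message: dict) -> list[dict]:
    """Convert a user message carrying Anthropic tool_result blocks into
    OpenAI-compatible messages. Sketch only: real conversion would also
    need to pair tool_call_id with a preceding assistant tool_calls turn."""
    out = []
    for block in message.get("content", []):
        if isinstance(block, dict) and block.get("type") == "tool_result":
            # Flatten nested Anthropic text blocks into one string, since the
            # OpenAI tool message content is plain text here.
            text = "".join(
                part.get("text", "")
                for part in block.get("content", [])
                if isinstance(part, dict) and part.get("type") == "text"
            )
            out.append({
                "role": "tool",
                "tool_call_id": block["tool_use_id"],
                "content": text,
            })
        else:
            # Non-tool content stays a regular user message.
            out.append({"role": "user", "content": [block]})
    return out


# Example shaped like the message LiteLLM rejected in the logs above
# (file listing abbreviated).
bad = {
    "role": "user",
    "content": [{
        "type": "tool_result",
        "tool_use_id": "toolu_vrtx_01FFHWXQRAEDhjwwmqWXF8f6",
        "content": [{"type": "text",
                     "text": "- tools/\n  - create_issue.json\n"}],
    }],
}
fixed = anthropic_tool_result_to_openai(bad)
```

With this shape, the `role: "tool"` message passes OpenAI-style validation, whereas the original `tool_result` content block is what LiteLLM's `validate_chat_completion_user_messages` rejects.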


This topic was automatically closed 22 days after the last reply. New replies are no longer allowed.