Extend MCP Server Calling to More Cursor Use Cases

Problem:
In my experience, MCP server integration in Cursor is currently limited to Agent mode when used with Claude models (e.g., claude-3.5-haiku). This restriction means that other valuable use cases within Cursor do not benefit from MCP server capabilities. Specifically:

  • The Ask mode, even when using Claude models, does not call MCP servers.
  • All modes (Agent, Ask, Edit) when using other LLM providers such as OpenAI, Gemini, and DeepSeek do not call MCP servers.

This limited scope of MCP server utilization creates an inconsistent user experience and potentially underutilizes the capabilities of MCP servers within Cursor.

Impact:
The current restriction on MCP server calls has several negative impacts:

  • Reduced Functionality: Users are unable to leverage the potential benefits of MCP servers in Ask and Edit modes, and when using LLMs other than Claude in Agent mode. This may limit the effectiveness and capabilities of Cursor in these scenarios.
  • Inconsistent User Experience: The inconsistent application of MCP server calls across different modes and LLM providers can be confusing for users and make it harder to understand when and how MCP servers are being utilized.
  • Missed Opportunities: By not fully leveraging MCP servers across all relevant use cases, Cursor may be missing opportunities to enhance its performance, features, and overall value proposition to users.

Solution:
To address these issues and maximize the benefits of MCP server integration, we propose extending MCP server calling to the following use cases:

  • Enable MCP server calls in Ask mode when using Claude models. This will ensure feature parity with Agent mode for Claude models regarding MCP server utilization.
  • Enable MCP server calls in Agent, Ask, and Edit modes when using other LLM providers (OpenAI, Gemini, DeepSeek, etc.). This will broaden the benefits of MCP servers to a wider range of LLM options within Cursor, providing a more consistent and powerful experience regardless of the chosen LLM (see the server sketch below).
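
To illustrate why this should be feasible, here is a minimal sketch of an MCP server written with the official Python SDK's FastMCP helper (assuming the `mcp` package is installed; the server name and tool are hypothetical, chosen only for illustration). Nothing in it is tied to a particular model or mode, so which modes and providers can call it is purely a question of how the client side (Cursor) is wired up:

```python
# Minimal MCP server sketch (hypothetical tool, for illustration only).
# Requires the official Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short, made-up forecast for the given city."""
    return f"Sunny in {city}"

if __name__ == "__main__":
    # Serve over stdio, a transport commonly used by MCP clients such as Cursor.
    mcp.run(transport="stdio")
```

Since the tool schema and transport are defined by the protocol rather than by any one provider, the same server could in principle back Ask and Edit modes, or an OpenAI/Gemini/DeepSeek-driven Agent, without any server-side changes.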

Benefits:
Implementing this feature request will result in:

  • Enhanced Functionality across Cursor Modes: Users will benefit from potentially improved performance and features in Ask and Edit modes, and with a wider range of LLM providers, due to MCP server integration.
  • Consistent and Intuitive User Experience: A more consistent approach to MCP server utilization will simplify the user experience and make it clearer when MCP servers are active.
  • Increased Value and Versatility of Cursor: By fully leveraging MCP servers, Cursor will become a more powerful and versatile tool, capable of delivering enhanced performance and features across a broader range of use cases and LLM providers.

Yeah it would make sense to have things equal everywhere.

You can create custom modes nowadays with MCP support.

Also, Agent mode depends on certain features that only some models have; that's why some models are enabled for Agent and others are not. The Cursor team has mentioned several times in the forum that they are looking into adding more models for agent use, and that they need to test and adjust things to work with each model.

As there are people who don't want to use MCP in Ask mode, or who want no tools or edit features at all, it makes sense that the default modes have those presets.

This would be an excellent feature to enhance all the different modes in Cursor!


@T1000 Thank you for sharing the information. The Custom modes feature looks great.

From what I can tell, the tools can only be configured with Claude models. However, I confirmed that I could set up a customized mode similar to Ask mode when using Claude models. So it would be great to extend tool customization to other models in addition to Claude.

Here is my version info:

Version: 0.48.1
VSCode Version: 1.96.2
Commit: 0139db98f117ab50fcaaf7a0b1c69d345bd98a10
Date: 2025-03-24T21:08:12.186Z
Electron: 34.3.4
Chromium: 132.0.6834.210
Node.js: 20.18.3
V8: 13.2.152.41-electron.0
OS: Darwin arm64 23.5.0

Custom mode with claude-3.5-haiku: the configuration form for tools was shown.

Custom mode with gemini-2.0-flash: the configuration form for tools wasn't shown.

Gemini is not listed as a model configured for Agent mode, which is required to use tools.

Haiku is also not on the Agent-capable list.

Likely the small models have a capability/context limitation that prevents this from being practical for most users.

It makes sense that a larger model is needed to handle context appropriately and call MCP tools during automated runs in Agent mode. Meanwhile, we may want to utilize MCP servers just in Ask mode, with Cursor acting as an MCP client. I think Cursor has huge potential to be not only the center of development but also the client for handling everyday tasks.

Aside from that, in my experience, even models that are not so large, such as gemini-2.0-flash, can understand the context of what I want reasonably well. For instance, I have played around with Goose, a desktop chat app that can be integrated with MCP servers, and I was able to take advantage of MCP servers as expected.
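
For comparison, this is roughly what an MCP client such as Goose (or, hypothetically, Cursor's Ask mode) does under the hood, sketched with the Python SDK; the server launch command and tool name below are made up and simply mirror the server sketch earlier in the thread:

```python
# Rough sketch of an MCP client session (hypothetical server command and tool).
# Requires the official Python SDK: pip install mcp
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical command to launch the server; any stdio MCP server works the same way.
server = StdioServerParameters(command="python", args=["weather_server.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # discover the tools the server exposes
            print([tool.name for tool in tools.tools])
            result = await session.call_tool("get_forecast", {"city": "Tokyo"})
            print(result.content)

asyncio.run(main())
```

Nothing in this handshake depends on which LLM generates the tool call, which is why extending MCP support beyond Claude-backed Agent mode looks like a client-side decision rather than a protocol limitation.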