Run MCP servers in WSL

Yes it is

But don’t expect me to reply every time someone asks me that

It might make more sense to assume it is still working for me and instead post an explanation of why it is not working for you. That way someone might actually be able to help with your problem, instead of me just replying “yes it is”, which doesn’t really help you or anybody else on this forum.


For anyone seeking a specific working example of the GitHub MCP server:

{
  "mcpServers": {
    "github": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "'source /home/devlinux/.nvm/nvm.sh && export NVM_DIR=/home/devlinux/.nvm && [ -s \"$NVM_DIR/nvm.sh\" ] && . \"$NVM_DIR/nvm.sh\" && cd /home/devlinux/devlinux/mcp/src/github && GITHUB_PERSONAL_ACCESS_TOKEN=ghp_the_rest_of_your_token /home/devlinux/.nvm/versions/node/v20.11.1/bin/npx -y @modelcontextprotocol/server-github'"
      ],
      "enabled": true
    }
  }
} 

Same issue here - I haven’t been able to get the fix working.

My JSON:

{
  "mcpServers": {
    "memorymesh": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "source /home/penguin/.nvm/nvm.sh && /home/penguin/.nvm/versions/node/v23.10.0/bin/node /home/penguin/rpg-game/dist/index.js"
      ],
      "enabled": true
    }
  }
}

I’m running a server called MemoryMesh (CheMiguel23/MemoryMesh v0.2.8, a knowledge graph server that uses the Model Context Protocol to provide structured memory persistence for AI models) in the rpg-game directory, if that matters.

Also, I’ll admit I didn’t fully understand loubacker’s suggestion, but figured someone might spot an issue I missed here. Note that it runs using node instead of npx.

Cheers!

If NPM is installed on WSL, and the path is correct, does this simplified version work:

No luck with that one either, I’m afraid.

However, after feeding this discussion to Sonnet and doing some trial and error, I got the following configuration:

{
  "mcpServers": {
    "memorymesh": {
      "command": "C:\\Windows\\System32\\wsl.exe",
      "args": [
        "bash",
        "-c",
        "cd /home/penguin/rpg-game && /home/penguin/.nvm/versions/node/v23.10.0/bin/node /home/penguin/rpg-game/dist/index.js"
      ],
      "enabled": true
    }
  }
}

Which seems to work! Thanks for the examples everyone, that really helped the AI (and me).

The solution below ended up working for me. However, there was a key detail I needed to figure out before it did:

wsl commands run in the default distro, which for me turned out to be the Docker Desktop integration distro. So everything looked normal on my end, while the commands were actually being sent to Docker’s WSL distro. Make sure your distro is set as the default!
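To check and fix this, the standard wsl.exe flags can be used from a Windows terminal (the distro name below is just an example, substitute your own):

```shell
# List installed distros; the default one is marked with an asterisk
wsl.exe -l -v

# Make your development distro the default (replace "Ubuntu" with yours)
wsl.exe --set-default Ubuntu
```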

Has anyone had success getting something like Brave Search MCP working in WSL2?

I want to add it through UI so it’s global in Cursor.

Settings:

name: brave-search

type: command

command: export BRAVE_API_KEY=XXXXXXXXXXXX && [ -x "/home/miikka/.nvm/versions/node/v18.19.1/bin/npx" ] && /home/miikka/.nvm/versions/node/v18.19.1/bin/npx -y @modelcontextprotocol/server-brave-search

is not giving me any success, just “Client closed”.

From what I’ve researched, it appears that just using npx does not work with WSL.

Thanks, I managed to get it to work with bun.

{
  "mcpServers": {
    "context7": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "'/home/user/.bun/bin/bunx -y @upstash/context7-mcp@latest'"
      ],
      "enabled": true
    }
  }
}

Managed to get context7 working at least.

Like so:

    "context7": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "'source /home/miikka/.nvm/nvm.sh && /home/miikka/.nvm/versions/node/v22.15.0/bin/npx -y @upstash/context7-mcp@latest'"
      ],
      "enabled": true
    },

Thanks to HYPRGK’s info in another thread.
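The same pattern should in principle carry over to Brave Search, which I couldn’t get running earlier (an untested sketch reusing the paths from above; the API key value is a placeholder):

```json
{
  "mcpServers": {
    "brave-search": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "'source /home/miikka/.nvm/nvm.sh && BRAVE_API_KEY=XXXXXXXXXXXX /home/miikka/.nvm/versions/node/v22.15.0/bin/npx -y @modelcontextprotocol/server-brave-search'"
      ],
      "enabled": true
    }
  }
}
```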


This is the way; it works with Python MCP servers as well, even inside a venv. The two key points, in case anyone is still struggling with this:

  1. When the wsl command is run, it is not run in a login shell, so your PATH is not configured. For something like nvm, that means you first need to source nvm via <absolute path to your home dir>/.nvm/nvm.sh, and the rest of the command has to run in the same session, chained with && . For Python, if the MCP server requires uvx, use <path-to-home>/.local/bin/uvx (or whatever path which uvx reports). For a venv it is similar, using the Python interpreter inside the venv: /path/to/venv/bin/python.
  2. This is the thing that tripped me up. Everything after bash -c needs to be a single-quoted string inside a double-quoted string. Notice in @Miikka’s example above that everything is inside a single-quoted string, which is in turn surrounded by double quotes ("), since double quotes are required by JSON.
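For the Python case in point 1, the config follows the same shape (a sketch only; the server name, package, and paths are placeholders for your own setup):

```json
{
  "mcpServers": {
    "my-python-server": {
      "command": "wsl",
      "args": [
        "bash",
        "-c",
        "'/home/<you>/.local/bin/uvx <some-mcp-server-package>'"
      ],
      "enabled": true
    }
  }
}
```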

For those who don’t understand but are curious: the reason for the single quotes is that the entire command must reach Bash without the outer shell interpreting any symbols. Without them, when WSL receives bash -c source /home/miikka/.nvm/nvm.sh && /home/miikka/.nvm/versions/node/v22.15.0/bin/npx -y @upstash/context7-mcp@latest, it passes only source /home/miikka/.nvm/nvm.sh to Bash to execute, and interprets the && as a second command to run after the bash -c ... command.
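You can reproduce the difference in any shell with a minimal echo example (unrelated to MCP itself):

```shell
# Unquoted: bash -c receives only "echo" ("first" becomes $0),
# and the outer shell treats "&& echo second" as a separate command.
bash -c echo first && echo second
# prints an empty line, then "second"

# Single-quoted: the entire string is one command line for the inner bash.
bash -c 'echo first && echo second'
# prints "first", then "second"
```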

I have this fixed and will get it out, ideally in 0.50.4 (we’re rolling out 0.50.3 now).
If anyone would like to beta test to verify it works (running MCP stdio servers in the WSL environment), please let me know.


yes, please!

I’m also open to beta test.

i’d love to beta test this

The changelog shows “Run stdio from remote workspace (WSL, Remote SSH)”. How do we get cursor to run the mcp server command within WSL?

I hope this contribution is useful for the community. I found a way to run MCP servers inside WSL2 on Windows:

{
  "mcpServers": {
    "taskmaster-ai": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "'source /home/mpc/.nvm/nvm.sh && /home/mpc/.nvm/versions/node/v22.15.1/bin/npx -y --package=task-master-ai task-master-ai'"
      ],
      "env": {
        "OPENROUTER_API_KEY": "your-secret"
      }
    },
    "sequential-thinking": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "'source /home/mpc/.nvm/nvm.sh && /home/mpc/.nvm/versions/node/v22.15.1/bin/npx -y @modelcontextprotocol/server-sequential-thinking'"
      ]
    }
  }
}

No need for workarounds anymore in 0.50.4, just run npx and they work!
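In other words, as of 0.50.4 a plain config like this should be enough when Cursor is attached to a WSL workspace (sketched here with the context7 package from earlier in the thread):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp@latest"]
    }
  }
}
```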


I figured out why it wasn’t working for me.

The folder I have opened for my project is /home/kyle/Projects/deno/juniper. In my project’s .cursor/.mcp.json, I had the following.

{
  "mcpServers": {
    "juniper": {
      "command": "deno",
      "args": [
        "run",
        "-A",
        "mcp/server.ts"
      ]
    }
  }
}

Cursor was saying no tools were available. It’s really hard to tell what is going wrong from Cursor’s MCP output.

If I open up my terminal and run deno run -A mcp/server.ts, the file runs; I can tell because the console.log output appears. I also have test cases using the @modelcontextprotocol/sdk client verifying that it works.


I accidentally found this, but if I add ./ to the start of the file path, it works in cursor now.

{
  "mcpServers": {
    "juniper": {
      "command": "deno",
      "args": [
        "run",
        "-A",
        "./mcp/server.ts"
      ]
    }
  }
}

They probably added detection so that if you are using a remote environment like WSL, Cursor tries to execute the MCP server commands inside the WSL instance instead of on the host OS.