Perplexity AI in Cursor via MCP — perplexity-tools + perplexity-chat

Why this guide exists:

I invested a few hours figuring out the best options for invoking Perplexity AI within Cursor as of 08/11/25; this write-up may save others the same effort.

Cursor currently prioritizes “models” over MCP tools, and Perplexity isn’t a selectable model yet. SSE configs don’t allow auth headers, so they hang on “Loading tools.” The reliable path today is two local stdio MCP servers that read your Perplexity API key from env, announce tools immediately, and respond over stdio. This guide should be first-time-right.

Prerequisites/My Stack

  • macOS (Apple Silicon or Intel)

  • Node.js 18+ (use Homebrew)

  • Perplexity API key (pplx-…)

  • Cursor (latest)

Install Node (if needed)

  • brew install node

Create folders

  • mkdir -p ~/mcp/perplexity ~/mcp/perplexity-chat

Initialize “perplexity-tools”

  • cd ~/mcp/perplexity

  • npm init -y

  • npm install node-fetch@3

  • Edit the generated package.json so it includes:
    {
      "type": "module",
      "dependencies": { "node-fetch": "^3.3.2" }
    }

Server file 1 — ~/mcp/perplexity/index.js
import fetch from 'node-fetch';

const KEY = process.env.PPLX_API_KEY;
if (!KEY) { console.error('Missing PPLX_API_KEY'); process.exit(1); }

const tools = [{
  name: 'perplexity.search',
  description: 'Web search via Perplexity; returns summarized result text',
  inputSchema: { type: 'object', properties: { q: { type: 'string' } }, required: ['q'] }
}];

const send = (o) => process.stdout.write(JSON.stringify(o) + '\n');
send({ type: 'tool_list', tools, capabilities: { tools: {} } });

process.stdin.on('data', async (buf) => {
  const lines = buf.toString().trim().split('\n').filter(Boolean);
  for (const line of lines) {
    let msg; try { msg = JSON.parse(line); } catch { continue; }
    if (msg.type === 'tool_call' && msg.tool === 'perplexity.search') {
      const q = msg.args?.q ?? '';
      try {
        const res = await fetch('https://api.perplexity.ai/chat/completions', {
          method: 'POST',
          headers: { 'Authorization': `Bearer ${KEY}`, 'Content-Type': 'application/json' },
          body: JSON.stringify({
            model: 'sonar-pro',
            messages: [
              { role: 'system', content: 'You are a research assistant. Provide concise answers.' },
              { role: 'user', content: q }
            ],
            temperature: 0.2
          })
        });
        const json = await res.json();
        const text = json?.choices?.[0]?.message?.content ?? JSON.stringify(json);
        send({ type: 'tool_result', callId: msg.callId, content: text });
      } catch (e) {
        send({ type: 'tool_error', callId: msg.callId, error: String(e) });
      }
    }
  }
});

Server file 2 — ~/mcp/perplexity-chat/index.js
import fetch from 'node-fetch';

const KEY = process.env.PPLX_API_KEY;
if (!KEY) { console.error('Missing PPLX_API_KEY'); process.exit(1); }

const tools = [{
  name: 'perplexity.chat',
  description: 'Free-form chat completion via Perplexity; returns assistant message content',
  inputSchema: { type: 'object', properties: { message: { type: 'string' } }, required: ['message'] }
}];

const send = (o) => process.stdout.write(JSON.stringify(o) + '\n');
send({ type: 'tool_list', tools, capabilities: { tools: {} } });

process.stdin.on('data', async (buf) => {
  const lines = buf.toString().trim().split('\n').filter(Boolean);
  for (const line of lines) {
    let msg; try { msg = JSON.parse(line); } catch { continue; }
    if (msg.type === 'tool_call' && msg.tool === 'perplexity.chat') {
      const message = msg.args?.message ?? '';
      try {
        const res = await fetch('https://api.perplexity.ai/chat/completions', {
          method: 'POST',
          headers: { 'Authorization': `Bearer ${KEY}`, 'Content-Type': 'application/json' },
          body: JSON.stringify({
            model: 'sonar-pro',
            messages: [
              { role: 'system', content: 'You are a helpful assistant.' },
              { role: 'user', content: message }
            ],
            temperature: 0.3
          })
        });
        const json = await res.json();
        const text = json?.choices?.[0]?.message?.content ?? JSON.stringify(json);
        send({ type: 'tool_result', callId: msg.callId, content: text });
      } catch (e) {
        send({ type: 'tool_error', callId: msg.callId, error: String(e) });
      }
    }
  }
});

Cursor MCP config — ~/.cursor/mcp.json
Replace USERNAME and pplx-YOUR_API_KEY, and make sure the command path matches the output of which node. Absolute paths are required.

{
  "mcpServers": {
    "perplexity-tools": {
      "type": "stdio",
      "command": "/opt/homebrew/bin/node",
      "args": ["/Users/USERNAME/mcp/perplexity/index.js"],
      "env": { "PPLX_API_KEY": "pplx-YOUR_API_KEY" }
    },
    "perplexity-chat": {
      "type": "stdio",
      "command": "/opt/homebrew/bin/node",
      "args": ["/Users/USERNAME/mcp/perplexity-chat/index.js"],
      "env": { "PPLX_API_KEY": "pplx-YOUR_API_KEY" }
    }
  }
}

Restart Cursor

  • Settings → Tools & Integrations → MCP Tools

  • You should see two entries, each with a green dot and “1 tools enabled”

How to use in Composer

  • Explicit tool calls (since Perplexity is not a selectable model yet):

    • Use tool perplexity.search with { "q": "Neon Postgres branching best practices 2025" }

    • Use tool perplexity.chat with { "message": "Summarize ARCHITECTURE-Perplexity_v2.1.md highlights." }
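For context, the raw stdio traffic behind an explicit tool call looks roughly like this (a sketch using the custom message shapes defined in the index.js files above; the callId value is made up for illustration):

```javascript
// What Cursor writes to the server's stdin (one JSON object per line):
const toolCall = {
  type: 'tool_call',
  tool: 'perplexity.search',
  callId: 'call-1', // hypothetical id; the client chooses this
  args: { q: 'Neon Postgres branching best practices 2025' }
};
const wireIn = JSON.stringify(toolCall) + '\n';

// What the server writes back to stdout on success; it echoes the callId
// so the client can match the result to the request:
const toolResult = {
  type: 'tool_result',
  callId: 'call-1',
  content: '...summarized answer from Sonar Pro...'
};
const wireOut = JSON.stringify(toolResult) + '\n';

console.log(wireIn.trim());
console.log(wireOut.trim());
```

Because each message is a single newline-terminated JSON object, the servers can split stdin on newlines and parse line by line, which is exactly what the data handlers above do.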

What this setup solves

  • No “Loading tools” spinner (servers emit tool_list immediately)

  • No SSE/CORS/auth issues (env var holds your API key)

  • Works with Cursor’s current MCP tool model (not chat models)

Known limitations

  • Perplexity won’t appear in Cursor’s model picker. This is expected until Cursor adds it as a first-class model.

  • Tools are invoked explicitly; they won’t “run the show” unless the selected model chooses to call them.

  • API usage bills against your Perplexity key; be mindful of call volume.

Troubleshooting quick hits

  • Spinner stuck: Your server didn’t print tool_list on start, or wrong path. Run it manually:

    • PPLX_API_KEY="pplx-XXX" node ~/mcp/perplexity/index.js

    • Expect one JSON line with type: "tool_list"

  • 401/403: Bad/missing key. Confirm PPLX_API_KEY in mcp.json env.

  • “command not found”: Use absolute node path (e.g., /opt/homebrew/bin/node on Apple Silicon).

  • "0 tools enabled": Ensure send({ type: 'tool_list', tools, capabilities: { tools: {} } }) is present and executes at startup.

  • Works in terminal but not in Cursor: Ensure config is in ~/.cursor/mcp.json, then fully restart Cursor.

Now to clean up the clumps of hair I pulled out getting the above working! It's still not quite what I hoped to achieve, but as implemented, with instructions in .cursorrules to call Perplexity AI (e.g., for rifle-shot lookups of the latest UI changes), it beats relying on top-tier LLMs with 2024-era training cutoffs. Yes, some models can be forced to perform a web search, but Perplexity AI operates in near real-time, which is why I wanted to invite it into my coding corral.


I wanted to give Perplexity AI more capabilities beyond the default "1 Tool" search-only setup. I was pleased to discover that a fuller suite of MCP tools could be enabled, which for my flow makes Perplexity AI fully functional, but with limitations compared to the LLMs listed in Cursor's native Model Picker.

Setup Steps (1 Tool → 12 Tools):

  1. Initial Setup: Created ~/.cursor/mcp.json with stdio server configuration

  2. Basic Tool: Started with single perplexity.search tool

  3. Expansion: Added 11 filesystem tools (read_file, write_file, list_directory, etc.)

  4. Protocol Fix: Implemented proper JSON-RPC protocol with initialize and tools/list handlers

  5. Debugging: Added console logging to troubleshoot connection issues

  6. Final Config: Server shows green dot with 12 tools enabled
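The JSON-RPC handlers mentioned in step 4 can be sketched as follows. The method names (initialize, tools/list) follow the MCP convention the post describes, but the protocol version string, the serverInfo fields, and the abbreviated tool set here are illustrative assumptions, not the exact code behind the 12-tool server:

```javascript
// Illustrative tool registry: one Perplexity tool plus one of the
// filesystem tools; the real server registers 12.
const tools = [
  { name: 'perplexity.search', description: 'Web search via Perplexity',
    inputSchema: { type: 'object', properties: { q: { type: 'string' } }, required: ['q'] } },
  { name: 'read_file', description: 'Read a file inside the workspace',
    inputSchema: { type: 'object', properties: { path: { type: 'string' } }, required: ['path'] } }
  // ...write_file, list_directory, etc.
];

function handleMessage(msg) {
  const reply = (result) => ({ jsonrpc: '2.0', id: msg.id, result });
  switch (msg.method) {
    case 'initialize':
      // Fields and version string are assumptions for the sketch.
      return reply({
        protocolVersion: '2024-11-05',
        capabilities: { tools: {} },
        serverInfo: { name: 'perplexity-tools', version: '0.1.0' }
      });
    case 'tools/list':
      return reply({ tools });
    default:
      // Standard JSON-RPC "method not found" error code.
      return { jsonrpc: '2.0', id: msg.id,
               error: { code: -32601, message: `Method not found: ${msg.method}` } };
  }
}

// Wired to stdin/stdout the same way as the earlier servers:
// one JSON object in, one JSON object out, per line.
console.log(JSON.stringify(handleMessage({ jsonrpc: '2.0', id: 1, method: 'tools/list' })));
```

Responding to initialize and tools/list synchronously at startup is what keeps the green dot lit instead of the "Loading tools" spinner.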

Key Capabilities:

✅ Powerful Features:

  • Real-time web search with current information and citations

  • Filesystem operations (read/write/list/create/delete/search files)

  • Workspace-aware with security boundaries

  • JSON-RPC protocol for reliable communication

  • Debug logging for troubleshooting
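The "workspace-aware with security boundaries" point can be enforced with a simple path check before any filesystem tool touches disk. This is a minimal sketch; WORKSPACE and safeResolve are hypothetical names I've introduced for illustration, not identifiers from the actual server:

```javascript
import path from 'node:path';

// Hypothetical workspace root; the real server would take this from config.
const WORKSPACE = '/Users/USERNAME/projects/demo';

// Resolve every requested path against the workspace root and reject
// anything that escapes it (e.g. via ../ traversal).
function safeResolve(requested) {
  const resolved = path.resolve(WORKSPACE, requested);
  // path.relative() starting with '..' means the target is outside the root.
  const rel = path.relative(WORKSPACE, resolved);
  if (rel.startsWith('..') || path.isAbsolute(rel)) {
    throw new Error(`Path escapes workspace: ${requested}`);
  }
  return resolved;
}

console.log(safeResolve('src/index.js'));            // allowed
try { safeResolve('../../etc/passwd'); }             // blocked
catch (e) { console.log(e.message); }
```

Gating every read_file/write_file/delete call through a check like this is what makes it safe to hand filesystem tools to an external model.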

❌ Limitations vs. Native Perplexity LLM:

  • No session memory - each tool call is independent

  • No conversational context - can’t maintain chat history

  • Tool-only interface - no direct chat completion

  • Limited to 12 specific functions vs. full LLM capabilities

  • Requires explicit tool invocation vs. natural language interaction

  • No model selection - fixed to Sonar Pro for search

Why This Matters:

MCP Server Advantages:

  • Integration - Works within Cursor’s existing AI workflow

  • Specialization - Focused tools for specific tasks

  • Reliability - Consistent API for web search and file operations

  • Security - Workspace-bounded filesystem access

Use Case: Perfect for research-heavy development where you need current information and file manipulation, but not full conversational AI capabilities.

Bottom Line: MCP server = specialized tools for specific tasks; Native LLM = full conversational AI with memory and context.