This Perplexity MCP server bridges Claude and Perplexity AI's language models, exposing Perplexity through tool use. It implements two main tools: perplexity_chat for full chat completions with message history support, and perplexity_ask for simplified single-query interactions. The server is written in TypeScript, uses the @modelcontextprotocol/sdk to implement the Model Context Protocol, and communicates over stdio transport. It features comprehensive error handling, environment-based configuration, and a modular tool system with type-safe handlers. This implementation is useful for developers and researchers who want to leverage Perplexity AI's capabilities from within Claude, enabling applications such as advanced chatbots, question-answering systems, and AI-assisted research tools.
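To make the architecture concrete, here is a minimal sketch of how such a server is typically wired up with @modelcontextprotocol/sdk over stdio: register the two tools in a ListTools handler, route CallTool requests to type-safe handlers, and connect a stdio transport. The server name, version, and handler bodies below are illustrative assumptions, not the project's actual code.

```typescript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

// Illustrative server identity; the real project may use different values.
const server = new Server(
  { name: "perplexity-mcp", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the two tools with JSON Schema input definitions.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "perplexity_ask",
      description: "Send a simple query to Perplexity AI",
      inputSchema: {
        type: "object",
        properties: {
          query: { type: "string", description: "The question or prompt to send" },
          model: { type: "string", description: "Optional model override" },
        },
        required: ["query"],
      },
    },
    // perplexity_chat would be declared here in the same way.
  ],
}));

// Route tool invocations to their handlers; the Perplexity API call is elided.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "perplexity_ask") {
    const { query } = request.params.arguments as { query: string };
    return { content: [{ type: "text", text: `(Perplexity response for: ${query})` }] };
  }
  throw new Error(`Unknown tool: ${request.params.name}`);
});

async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}

main().catch((err) => {
  console.error("Server error:", err);
  process.exit(1);
});
```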
perplexity_chat: Generate a chat completion using Perplexity AI. Parameters: model (optional string - The Perplexity model to use), messages (array of {role, content} objects - The conversation history), temperature (optional number - Sampling temperature between 0 and 2)
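A handler for this tool would typically forward the arguments to Perplexity's OpenAI-compatible chat completions endpoint. The sketch below assumes the API key is supplied via a PERPLEXITY_API_KEY environment variable and falls back to one of the models listed under perplexity_ask; both are assumptions about the configuration, and handlePerplexityChat is a hypothetical helper name.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatArgs {
  model?: string;
  messages: ChatMessage[];
  temperature?: number;
}

// Hypothetical handler: POSTs the conversation to Perplexity's chat
// completions API and returns the assistant's reply text.
async function handlePerplexityChat(args: ChatArgs): Promise<string> {
  const response = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // PERPLEXITY_API_KEY is an assumed environment variable name.
      Authorization: `Bearer ${process.env.PERPLEXITY_API_KEY}`,
    },
    body: JSON.stringify({
      model: args.model ?? "llama-3.1-sonar-small-128k-online",
      messages: args.messages,
      temperature: args.temperature,
    }),
  });

  if (!response.ok) {
    // Surface API errors so the MCP layer can report them back to the client.
    throw new Error(`Perplexity API error ${response.status}: ${await response.text()}`);
  }

  const data = await response.json();
  return data.choices[0].message.content;
}
```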
Send a simple query to Perplexity AI. Parameters: query (string - The question or prompt to send), model (optional string - One of: llama-3.1-sonar-small-128k-online, llama-3.1-sonar-large-128k-online, llama-3.1-sonar-huge-128k-online)
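Since perplexity_ask is a single-query convenience, one plausible implementation simply wraps the query as a one-message conversation and delegates to the chat handler sketched above (handlePerplexityChat remains a hypothetical helper):

```typescript
// Hypothetical perplexity_ask handler: treat the query as a single user
// message and reuse the chat completion path.
async function handlePerplexityAsk(query: string, model?: string): Promise<string> {
  return handlePerplexityChat({
    model,
    messages: [{ role: "user", content: query }],
  });
}
```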