The 'mcp-perplexity-search' project is a Model Context Protocol (MCP) server that integrates Perplexity's AI API with large language models (LLMs). It enhances chat interactions by providing advanced completion capabilities through Perplexity's models, such as Sonar and LLaMA. The server is particularly useful for generating technical documentation, analyzing security best practices, conducting code reviews, and producing structured API documentation. It supports multiple output formats, including text, markdown, and JSON, and allows prompt templates to be customized for specific use cases. Users can configure model parameters such as temperature and max tokens, and optionally include source URLs in responses. The server works across a range of environments: it can be registered through Cline's MCP settings and, for Claude Desktop, run via WSL. Overall, the project is valuable for developers and organizations looking to leverage Perplexity's AI to automate and enhance content generation and analysis tasks.
Generate chat completions using the Perplexity API, with support for specialized prompt templates.

Parameters:
- messages (array, required)
- prompt_template (string, optional)
- custom_template (object, optional)
- format (string, optional)
- include_sources (boolean, optional)
- model (string, optional)
- temperature (number, optional)
- max_tokens (number, optional)
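As a rough illustration, a call to this tool might pass arguments along these lines. The parameter names come from the list above; the message content, the role/content shape of each message, the format value, and the numeric settings are placeholders assumed for this sketch, not values taken from this listing:

{
  "messages": [
    {
      "role": "user",
      "content": "Summarize current best practices for securing a REST API."
    }
  ],
  "format": "markdown",
  "include_sources": true,
  "temperature": 0.2,
  "max_tokens": 1024
}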
After downloading, you can run the MCP server in any client or IDE:
node path/to/downloaded/file.mjs
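Alternatively, a client such as Cline or Claude Desktop can launch the server itself once it is registered in the client's MCP settings. A minimal sketch of such an entry is shown below; the server name, the exact settings file location, and the PERPLEXITY_API_KEY variable name are assumptions rather than details confirmed by this listing:

{
  "mcpServers": {
    "perplexity-search": {
      "command": "node",
      "args": ["path/to/downloaded/file.mjs"],
      "env": {
        "PERPLEXITY_API_KEY": "your-api-key-here"
      }
    }
  }
}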