Perplexity
Summary
The 'mcp-perplexity-search' project is a Model Context Protocol (MCP) server that exposes Perplexity's AI API to large language model (LLM) clients. It adds chat-completion capabilities backed by Perplexity models such as Sonar and LLaMA, and is well suited to generating technical documentation, analyzing security best practices, conducting code reviews, and producing structured API documentation. The server supports text, markdown, and JSON output formats, customizable prompt templates tailored to specific use cases, configurable model parameters such as temperature and max tokens, and optional source URLs in responses. It can be configured for a range of environments, including Cline's MCP settings and Claude Desktop under WSL. Overall, the project is useful for developers and organizations that want to leverage Perplexity's AI to automate and enhance content generation and analysis tasks.
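As a rough illustration of how a host might launch and connect to this server, the following TypeScript sketch uses the MCP TypeScript SDK's stdio client transport. The npm package name, launch command, and PERPLEXITY_API_KEY environment variable are assumptions for illustration; the project's own documentation defines the actual configuration keys.

```typescript
// Sketch: spawning the server over stdio and connecting an MCP client to it.
// Package name, command, and env var name below are assumptions, not the
// project's documented values.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "mcp-perplexity-search"],          // hypothetical launch command
  env: { PERPLEXITY_API_KEY: "<your-api-key>" },  // assumed env var name
});

const client = new Client(
  { name: "example-host", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// List the tools the server exposes (expected to include chat_completion).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```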
Available Actions (1)
chat_completion
Generate chat completions using the Perplexity API with support for specialized prompt templates. Parameters:
- messages (array, required)
- prompt_template (string, optional)
- custom_template (object, optional)
- format (string, optional)
- include_sources (boolean, optional)
- model (string, optional)
- temperature (number, optional)
- max_tokens (number, optional)
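Continuing the connection sketch above, a host could invoke this action roughly as follows. The argument names follow the parameter list; the prompt_template value, message content, and model identifier are illustrative assumptions.

```typescript
// Sketch: calling the chat_completion tool with typical arguments.
// Template name, message text, and model id are placeholders.
const result = await client.callTool({
  name: "chat_completion",
  arguments: {
    messages: [
      { role: "user", content: "Summarize best practices for REST API versioning." },
    ],
    prompt_template: "technical_docs",  // hypothetical template id
    format: "markdown",
    include_sources: true,
    model: "sonar",                     // assumed model identifier
    temperature: 0.2,
    max_tokens: 1024,
  },
});

console.log(result.content);
```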