MCP Conversation Server provides a standardized interface for managing conversations with OpenRouter's language models. It enables applications to create and manage multiple conversations, send messages with streaming support, and persist conversation state to the filesystem. Built with TypeScript and the OpenAI SDK, it features automatic token counting, context window management, and support for various models including Claude 3 Opus, Claude 3 Sonnet, and Llama 2 70B. The server loads configuration from YAML files, handles error states appropriately, and is particularly valuable for developers who need a unified conversation management system across different AI models without managing provider-specific implementations.
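As a rough sketch of how a client might talk to the server, the snippet below connects over stdio using the MCP TypeScript SDK; the launch command and entry-point path are assumptions rather than values taken from the project's documentation.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the conversation server as a child process over stdio.
// The command and entry-point path below are assumptions; adjust to your install.
const transport = new StdioClientTransport({
  command: "node",
  args: ["dist/index.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// List the tools the server exposes (the conversation tools described below).
const tools = await client.listTools();
console.log(tools);
```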
Creates a new conversation with a specified OpenRouter model. Parameters: provider (string, always 'openrouter'), model (string, OpenRouter model ID), title (optional string, title of the conversation)
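A minimal call sketch, assuming the tool is registered under a name like `create-conversation` (the exact tool name is not given above and may differ):

```typescript
// Hypothetical tool name; check the server's tool list for the real one.
const created = await client.callTool({
  name: "create-conversation",
  arguments: {
    provider: "openrouter",              // always 'openrouter'
    model: "anthropic/claude-3-opus",    // an OpenRouter model ID
    title: "Design review chat",         // optional conversation title
  },
});
console.log(created.content);
```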
Sends a message to an ongoing conversation. Parameters: conversationId (string, ID of the conversation), content (string, content of the message), stream (optional boolean, enable streaming responses)
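Continuing the same sketch, a message could then be sent to that conversation; the tool name `send-message` and the conversation ID shown here are illustrative assumptions:

```typescript
// Hypothetical tool name and conversation ID.
const reply = await client.callTool({
  name: "send-message",
  arguments: {
    conversationId: "conv_123",   // ID returned when the conversation was created
    content: "Summarize the main design trade-offs.",
    stream: false,                // set true to enable streaming responses
  },
});
console.log(reply.content);
```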
Lists all active conversations with optional filtering. Parameters: filter (optional object) containing model (string, filter by model), startDate (string, filter by start date), and endDate (string, filter by end date)
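A filtered listing might look like the following, again assuming a tool name such as `list-conversations` and simple ISO-style date strings:

```typescript
// Hypothetical tool name; the filter fields mirror the parameters described above.
const conversations = await client.callTool({
  name: "list-conversations",
  arguments: {
    filter: {
      model: "anthropic/claude-3-opus",
      startDate: "2024-01-01",
      endDate: "2024-12-31",
    },
  },
});
console.log(conversations.content);
```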