MCP Conversation Server provides a standardized interface for managing conversations with OpenRouter's language models. It enables applications to create and manage multiple conversations, send messages with streaming support, and persist conversation state to the filesystem. Built with TypeScript and the OpenAI SDK, it features automatic token counting, context window management, and support for various models including Claude 3 Opus, Claude 3 Sonnet, and Llama 2 70B. The server loads configuration from YAML files, handles error states appropriately, and is particularly valuable for developers who need a unified conversation management system across different AI models without managing provider-specific implementations.
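Because the server speaks the Model Context Protocol, any MCP client can drive it. The following is a minimal sketch, assuming the official @modelcontextprotocol/sdk client package and that the built server is started locally over stdio with `node ./dist/index.js` (both are assumptions; adjust to your setup). It connects to the server and lists the tools it exposes.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the conversation server as a child process and talk to it over stdio.
// The command and args are assumptions; point them at your built server entry point.
const transport = new StdioClientTransport({
  command: "node",
  args: ["./dist/index.js"],
});

const client = new Client(
  { name: "example-client", version: "1.0.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Discover the tools the server exposes (conversation creation, messaging, listing).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
```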
Create a new conversation with the specified OpenRouter model. Parameters: provider (string, always 'openrouter'), model (string, OpenRouter model ID), title (optional string, conversation title). A combined usage sketch for these three tools follows the list below.
Send a message to an existing conversation. Parameters: conversationId (string, ID of the target conversation), content (string, message content), stream (optional boolean, enable streaming responses).
Retrieve a list of active conversations with optional filters. Parameters: filter (optional object; may include model (string), startDate (string), and endDate (string)).
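Below is a hedged end-to-end sketch of calling the three tools above from a TypeScript MCP client. The tool names (create-conversation, send-message, list-conversations), the server launch command, and the shape of the returned content are assumptions inferred from the descriptions; verify them against the server's listTools() output before relying on them.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Same connection setup as the earlier sketch; the launch command is an assumption.
const transport = new StdioClientTransport({ command: "node", args: ["./dist/index.js"] });
const client = new Client({ name: "tool-demo", version: "1.0.0" }, { capabilities: {} });
await client.connect(transport);

// 1. Create a conversation. The tool name is hypothetical -- confirm it via listTools().
const created = await client.callTool({
  name: "create-conversation",
  arguments: {
    provider: "openrouter",
    model: "anthropic/claude-3-opus", // an OpenRouter model ID
    title: "Demo conversation",
  },
});
console.log(created.content); // assumed to include the new conversation's ID

// 2. Send a message to that conversation. Replace the placeholder with the ID
//    returned by the create call; set stream to true for streaming responses.
await client.callTool({
  name: "send-message",
  arguments: {
    conversationId: "<id-from-create-conversation>",
    content: "Summarize our options in one paragraph.",
    stream: false,
  },
});

// 3. List active conversations, filtered by model and date range (all filter
//    fields are optional per the description above).
const listed = await client.callTool({
  name: "list-conversations",
  arguments: {
    filter: {
      model: "anthropic/claude-3-opus",
      startDate: "2024-01-01",
      endDate: "2024-12-31",
    },
  },
});
console.log(listed.content);
```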