MCP Conversation Server provides a standardized interface for managing conversations with OpenRouter's language models. It lets applications create and manage multiple conversations, send messages with optional streaming, and persist conversation state to the filesystem. Built with TypeScript on top of the OpenAI SDK, it offers automatic token counting, context window management, and support for a range of models, including Claude 3 Opus, Claude 3 Sonnet, and Llama 2 70B. The server loads its configuration from YAML files and handles error states gracefully, making it useful for developers who want a single conversation-management layer across different AI models without maintaining provider-specific integrations.
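As a rough illustration of how a client might talk to the server, the sketch below launches it over stdio and lists its tools using the MCP TypeScript SDK. The entry-point command, config flag, and file paths are assumptions, not part of the server's documented interface; substitute whatever your installation actually uses.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the conversation server as a child process over stdio.
  // The entry point and config flag below are assumptions.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js", "--config", "config.yaml"],
  });

  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Discover the tools the server exposes (conversation creation,
  // message sending, conversation listing).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);
```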
Create a new conversation with a specified OpenRouter model (all three tools are exercised in the sketch after these descriptions). Parameters: provider (string, always 'openrouter'), model (string, OpenRouter model ID), title (optional string, conversation title)
Send a message to a specific conversation. Parameters: conversationId (string, the ID of the conversation), content (string, the message content), stream (optional boolean, enable streaming responses)
List all active conversations with optional filtering. Parameters: filter (optional object, can include model (string), startDate (string), endDate (string))
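A minimal end-to-end sketch of the three tools described above might look like the following, again using the MCP TypeScript SDK. The tool names (create_conversation, send_message, list_conversations), the model ID, and the response shape are assumptions; check the server's actual tool listing for the exact identifiers before relying on them.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"], // assumed entry point
  });
  const client = new Client(
    { name: "example-client", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Create a conversation. The tool name and model ID are assumptions.
  const created = await client.callTool({
    name: "create_conversation",
    arguments: {
      provider: "openrouter",
      model: "anthropic/claude-3-opus",
      title: "Design review",
    },
  });

  // Assume the conversation ID comes back as JSON in the first text content
  // item; adjust to the server's actual response shape.
  const createdContent = created.content as Array<{ type: string; text: string }>;
  const { id: conversationId } = JSON.parse(createdContent[0].text);

  // Send a message to that conversation, without streaming.
  await client.callTool({
    name: "send_message",
    arguments: {
      conversationId,
      content: "Summarize the trade-offs of streaming responses.",
      stream: false,
    },
  });

  // List conversations, filtered by model and start date.
  const listed = await client.callTool({
    name: "list_conversations",
    arguments: {
      filter: { model: "anthropic/claude-3-opus", startDate: "2024-01-01" },
    },
  });
  console.log(listed);

  await client.close();
}

main().catch(console.error);
```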