This OpenRouter MCP server, developed by bossying, provides a unified interface for accessing various AI models through OpenRouter.ai. Built with TypeScript and leveraging the OpenAI SDK, it offers tools for model management, chat completion, and detailed model information retrieval. The server implements intelligent rate limiting, caching, and error handling to optimize API usage. Key features include model search, capability validation, and default model configuration. By abstracting the complexities of API communication and model selection, it enables AI systems to easily leverage a wide range of language models. This implementation is particularly valuable for applications requiring flexible access to multiple AI models, facilitating use cases such as chatbots, content generation, and AI-assisted analysis across various domains and capabilities.
Sends a request to the OpenRouter Chat Completions API. Parameters: model (optional string), messages (array, required), temperature (optional number), max_tokens (optional number), provider (optional object with routing configuration).
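The argument shape for this tool can be sketched as follows. This is a minimal illustration based only on the parameter list above; the interface names, the provider-routing keys, and the `buildChatArgs` helper are assumptions, not the server's actual types.

```typescript
// Hypothetical shapes for the chat completion tool's arguments,
// inferred from the parameter list above.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatCompletionArgs {
  model?: string;          // optional; the server can fall back to a default model
  messages: ChatMessage[]; // required
  temperature?: number;
  max_tokens?: number;
  provider?: { order?: string[] }; // routing configuration (assumed key)
}

// Build a minimal, valid argument object, rejecting empty message lists.
function buildChatArgs(
  messages: ChatMessage[],
  opts: Omit<ChatCompletionArgs, "messages"> = {}
): ChatCompletionArgs {
  if (messages.length === 0) {
    throw new Error("messages is required and must be non-empty");
  }
  return { messages, ...opts };
}

const args = buildChatArgs(
  [{ role: "user", content: "Summarize OpenRouter in one sentence." }],
  { model: "openai/gpt-4o-mini", temperature: 0.2, max_tokens: 256 }
);
```

Keeping `messages` as the only required field mirrors the description above, where everything else is optional.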
Search and filter available models based on query, provider, minimum context length, and capabilities. Parameters: query (optional string), provider (optional string), minContextLength (optional number), capabilities (optional object with functions and vision flags).
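The filtering semantics described above can be sketched client-side like this. The `ModelInfo` shape, the substring matching on `query`, and the `provider/model` ID convention are assumptions for illustration; the server's actual matching rules may differ.

```typescript
// Assumed model record shape for illustrating the search filters.
interface ModelInfo {
  id: string; // e.g. "openai/gpt-4o" (provider/model convention)
  contextLength: number;
  capabilities: { functions: boolean; vision: boolean };
}

interface SearchFilters {
  query?: string;
  provider?: string;
  minContextLength?: number;
  capabilities?: { functions?: boolean; vision?: boolean };
}

// Apply each filter only when it is present; all present filters must match.
function searchModels(models: ModelInfo[], f: SearchFilters): ModelInfo[] {
  return models.filter((m) => {
    if (f.query && !m.id.toLowerCase().includes(f.query.toLowerCase())) return false;
    if (f.provider && !m.id.startsWith(f.provider + "/")) return false;
    if (f.minContextLength !== undefined && m.contextLength < f.minContextLength) return false;
    if (f.capabilities?.functions && !m.capabilities.functions) return false;
    if (f.capabilities?.vision && !m.capabilities.vision) return false;
    return true;
  });
}

const catalog: ModelInfo[] = [
  { id: "openai/gpt-4o", contextLength: 128000, capabilities: { functions: true, vision: true } },
  { id: "mistralai/mistral-7b-instruct", contextLength: 32768, capabilities: { functions: false, vision: false } },
];

const hits = searchModels(catalog, { provider: "openai", capabilities: { vision: true } });
```

Note that omitted filters match everything, so an empty filter object returns the full catalog.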
Get detailed information about a specific model. Parameters: model (string, required).
Check if a model ID is valid. Parameters: model (string, required).
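Validation along these lines can be sketched as a check against a cached model list, consistent with the caching the server description mentions. The cache contents, the `validateModel` name, and the ID pattern are illustrative assumptions.

```typescript
// Illustrative cache of known model IDs; in practice this would be
// populated from the OpenRouter models endpoint.
const cachedModelIds = new Set<string>([
  "openai/gpt-4o",
  "anthropic/claude-3.5-sonnet",
]);

function validateModel(model: string): boolean {
  // OpenRouter IDs follow a "provider/model" convention (assumed pattern).
  const wellFormed = /^[\w.-]+\/[\w.:-]+$/.test(model);
  return wellFormed && cachedModelIds.has(model);
}
```

Checking the format before the set lookup cheaply rejects obviously malformed IDs without touching the cache.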