Just Prompt is a lightweight MCP server that provides a unified interface to multiple LLM providers including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It offers tools for sending prompts to multiple models in parallel, automatic model name correction, and saving responses to files. The implementation supports provider shorthand notation, thinking tokens for Claude models, and includes comprehensive test coverage for all providers. Ideal for developers who need to interact with multiple LLM APIs through a single, consistent interface.
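The provider shorthand notation mentioned above can be illustrated with a small parser sketch. The exact prefix letters here are assumptions for illustration (the server's actual mapping may differ), and `parse_model_ref` is a hypothetical helper, not just-prompt's API:

```python
# Hypothetical sketch of provider-prefixed model references like "openai:o3".
# The shorthand letters below are assumed, not taken from just-prompt's source.
PROVIDER_PREFIXES = {
    "openai": "openai", "o": "openai",
    "anthropic": "anthropic", "a": "anthropic",
    "gemini": "gemini", "g": "gemini",
    "groq": "groq", "q": "groq",
    "deepseek": "deepseek", "d": "deepseek",
    "ollama": "ollama", "l": "ollama",
}

def parse_model_ref(ref: str) -> tuple[str, str]:
    """Split 'provider:model' into (provider, model), expanding shorthand."""
    prefix, _, model = ref.partition(":")
    provider = PROVIDER_PREFIXES.get(prefix.lower())
    if provider is None or not model:
        raise ValueError(f"unrecognized model reference: {ref!r}")
    return provider, model

print(parse_model_ref("a:claude-3-5-haiku"))
```

With a table like this, both the full provider name and a one-letter shorthand resolve to the same backend, which is what makes compact references practical on the command line.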
- Send a prompt to multiple LLM models. Parameters: text (string), models_prefixed_by_provider (optional list of strings)
- Send a prompt from a file to multiple LLM models. Parameters: file (string), models_prefixed_by_provider (optional list of strings)
- Send a prompt from a file to multiple LLM models and save responses as markdown files. Parameters: file (string), models_prefixed_by_provider (optional list of strings), output_dir (default: '.')
- Send a prompt to multiple 'board member' models and have a 'CEO' model make a decision based on their responses. Parameters: file (string), models_prefixed_by_provider (optional list of strings), output_dir (default: '.'), ceo_model (default: 'openai:o3')
- List all available LLM providers. Parameters: None
- List all available models for a specific LLM provider. Parameters: provider (string)
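The parallel fan-out that the first tool describes can be sketched with a thread pool. This is a minimal illustration under stated assumptions: `call_model` is a hypothetical stand-in for a real provider SDK call, not just-prompt's implementation:

```python
# Sketch of sending one prompt to several models concurrently.
# `call_model` is a placeholder; a real version would hit each provider's API.
from concurrent.futures import ThreadPoolExecutor


def call_model(model_ref: str, text: str) -> str:
    # Placeholder response; real code would dispatch on the provider prefix.
    return f"[{model_ref}] response to: {text}"


def prompt_many(text: str, model_refs: list[str]) -> dict[str, str]:
    """Send one prompt to every model in parallel, keyed by model reference."""
    with ThreadPoolExecutor() as pool:
        futures = {ref: pool.submit(call_model, ref, text) for ref in model_refs}
        return {ref: fut.result() for ref, fut in futures.items()}


results = prompt_many("hello", ["openai:o3", "anthropic:claude-3-5-haiku"])
```

Because each provider call is independent, running them concurrently means total latency is bounded by the slowest model rather than the sum of all of them.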