The LLM Responses MCP Server enables multiple AI agents to share and read each other's responses to the same prompt, facilitating collaborative analysis and reflection. Built in TypeScript on the Model Context Protocol SDK, it provides two tools: 'submit-response', which lets an LLM submit its answer to a prompt, and 'get-responses', which retrieves all responses other LLMs have given to that prompt. The implementation ships with Docker configuration for deployment to EC2 instances and uses Bun as its JavaScript runtime. It is particularly useful when multiple AI agents need to analyze the same problem and learn from each other's perspectives.
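A minimal sketch of how such a server might be wired up with the MCP TypeScript SDK. The tool names and parameters come from the listing below; the in-memory store, server name, and stdio transport are assumptions, not details of the actual implementation.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical in-memory store; the real server's storage is not shown here.
type StoredResponse = { llmId: string; prompt: string; response: string };
const responses: StoredResponse[] = [];

const server = new McpServer({ name: "llm-responses", version: "1.0.0" });

// 'submit-response': record one LLM's answer to a prompt.
server.tool(
  "submit-response",
  { llmId: z.string(), prompt: z.string(), response: z.string() },
  async ({ llmId, prompt, response }) => {
    responses.push({ llmId, prompt, response });
    return { content: [{ type: "text", text: "Response recorded." }] };
  }
);

// 'get-responses': return all stored responses, optionally filtered by prompt.
server.tool(
  "get-responses",
  { prompt: z.string().optional() },
  async ({ prompt }) => {
    const matches = prompt
      ? responses.filter((r) => r.prompt === prompt)
      : responses;
    return { content: [{ type: "text", text: JSON.stringify(matches) }] };
  }
);

// Serve over stdio; the transport the actual server uses may differ.
await server.connect(new StdioServerTransport());
```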
submit-response: Submit an LLM's response to a prompt. Parameters: llmId (string), prompt (string), response (string).
get-responses: Retrieve all LLM responses, optionally filtered by prompt. Parameters: prompt (string, optional).
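For illustration, a hypothetical client session exercising both tools through the SDK's stdio transport. The command, file path, agent ID, and example strings are all assumptions made for this sketch.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch and connect to the server over stdio (command and path assumed).
const transport = new StdioClientTransport({
  command: "bun",
  args: ["run", "src/index.ts"],
});
const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);

// Submit this agent's answer to a shared prompt.
await client.callTool({
  name: "submit-response",
  arguments: {
    llmId: "agent-a",
    prompt: "What are the trade-offs of microservices?",
    response: "Microservices improve scaling but add operational overhead.",
  },
});

// Fetch every response agents have submitted for the same prompt.
const result = await client.callTool({
  name: "get-responses",
  arguments: { prompt: "What are the trade-offs of microservices?" },
});
console.log(result.content);
```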