MCP-Gemini-Server provides a robust interface to Google's Gemini AI models through a collection of specialized tools. Built with TypeScript and the Model Context Protocol SDK, it enables AI assistants to leverage Gemini's capabilities including content generation, chat functionality, function calling, and file/cache management. The server implements both streaming and non-streaming content generation, supports stateful chat sessions with function execution, and offers comprehensive file and cache operations for optimizing performance. This implementation stands out by providing fine-grained control over generation parameters, safety settings, and tool configurations while handling error cases gracefully. It's particularly valuable for developers building AI applications that need direct access to Gemini's advanced features through a standardized protocol.
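Because the server speaks the standard MCP wire protocol, any MCP client can drive it. Below is a minimal sketch using the MCP TypeScript SDK over stdio; the launch command, entry-point path, and API key variable name are assumptions and should be adjusted to the actual build.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server as a child process over stdio. The command, entry-point
  // path, and API key variable name are assumptions; adjust them to your setup.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/server.js"],
    env: {
      ...(process.env as Record<string, string>),
      GOOGLE_GEMINI_API_KEY: "your-api-key", // assumed variable name
    },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Discover the tools described below.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);
```

The snippets in the rest of this section reuse the connected `client` from this sketch and run inside the same async context.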
Generates non-streaming text content from a prompt. Required Params: prompt (string). Optional Params: modelName (string), generationConfig (object), safetySettings (array).
Generates text content via streaming. Required Params: prompt (string). Optional Params: modelName (string), generationConfig (object), safetySettings (array).
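The two generation tools above take the same core arguments. A hedged example of calling the non-streaming variant follows; the tool name is assumed (confirm it against the `listTools()` output), and `generationConfig`/`safetySettings` use the Gemini API's standard field names.

```typescript
// Tool name assumed; confirm against listTools(). generationConfig and
// safetySettings use the Gemini API's standard field names.
const result = await client.callTool({
  name: "gemini_generateContent",
  arguments: {
    prompt: "Summarize the Model Context Protocol in two sentences.",
    modelName: "gemini-1.5-flash",
    generationConfig: { temperature: 0.4, maxOutputTokens: 256 },
    safetySettings: [
      { category: "HARM_CATEGORY_HARASSMENT", threshold: "BLOCK_MEDIUM_AND_ABOVE" },
    ],
  },
});

// The generated text comes back as MCP content blocks.
console.log(result.content);
```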
Sends a prompt and function declarations to the model, returning either a text response or a requested function call object (as a JSON string). Required Params: prompt (string), functionDeclarations (array). Optional Params: modelName (string), generationConfig (object), safetySettings (array), toolConfig (object).
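A sketch of the function-calling flow, again with an assumed tool name. The declaration shape mirrors Gemini's FunctionDeclaration schema; the exact JSON returned for a requested call may differ from what the comment shows.

```typescript
// Tool name assumed. The declaration shape mirrors Gemini's FunctionDeclaration schema.
const fcResult = await client.callTool({
  name: "gemini_functionCall",
  arguments: {
    prompt: "What is the weather in Berlin right now?",
    functionDeclarations: [
      {
        name: "get_weather",
        description: "Look up the current weather for a city",
        parameters: {
          type: "OBJECT",
          properties: { city: { type: "STRING" } },
          required: ["city"],
        },
      },
    ],
  },
});

// The result is either plain text or a JSON string describing the requested
// call, e.g. {"name":"get_weather","args":{"city":"Berlin"}} (shape assumed).
const blocks = fcResult.content as Array<{ type: string; text?: string }>;
console.log(blocks.find((b) => b.type === "text")?.text);
```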
Initiates a new stateful chat session and returns a unique sessionId. Required Params: None. Optional Params: modelName (string), history (array), tools (array), generationConfig (object), safetySettings (array).
Sends a message within an existing chat session. Required Params: sessionId (string), message (string). Optional Params: generationConfig (object), safetySettings (array), tools (array), toolConfig (object).
Sends the result of a function execution back to a chat session. Required Params: sessionId (string), functionResponses (array). Optional Params: generationConfig (object), safetySettings (array).
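Taken together, the three chat tools above support a full round-trip: start a session, exchange messages, and hand function results back to the model. The sketch below assumes both the tool names and the result shapes (in particular, that the sessionId is returned as the first text block), so treat it as illustrative.

```typescript
// Tool names and result shapes assumed; this sketch treats the returned
// sessionId as the first text block of the start-chat result.
const started = await client.callTool({
  name: "gemini_startChat",
  arguments: { modelName: "gemini-1.5-flash" },
});
const sessionId =
  (started.content as Array<{ type: string; text?: string }>)[0]?.text ?? "";

// Exchange a message within the session.
await client.callTool({
  name: "gemini_sendMessage",
  arguments: { sessionId, message: "Hello! What can you do?" },
});

// If a reply requested a function call, execute it locally and return the result.
await client.callTool({
  name: "gemini_sendFunctionResult",
  arguments: {
    sessionId,
    functionResponses: [
      // Response payload shape assumed; mirror the functionResponse parts
      // expected by the Gemini API.
      { name: "get_weather", response: { temperatureC: 18, condition: "cloudy" } },
    ],
  },
});
```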
Uploads a file from a local path. Required Params: filePath (string - must be an absolute path). Optional Params: displayName (string), mimeType (string).
Lists previously uploaded files. Required Params: None. Optional Params: pageSize (number), pageToken (string).
Retrieves metadata for a specific uploaded file. Required Params: fileName (string - e.g., files/abc123xyz).
Deletes an uploaded file. Required Params: fileName (string - e.g., files/abc123xyz).
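The four file tools above cover the full lifecycle of an uploaded file. A hedged walkthrough with assumed tool names and a placeholder file name:

```typescript
// Tool names assumed. filePath must be absolute; fileName values come from the
// upload/list results (the "files/abc123xyz" below is a placeholder).
await client.callTool({
  name: "gemini_uploadFile",
  arguments: {
    filePath: "/absolute/path/to/report.pdf",
    displayName: "Quarterly report",
  },
});

await client.callTool({ name: "gemini_listFiles", arguments: { pageSize: 10 } });

await client.callTool({
  name: "gemini_getFile",
  arguments: { fileName: "files/abc123xyz" },
});

await client.callTool({
  name: "gemini_deleteFile",
  arguments: { fileName: "files/abc123xyz" },
});
```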
Creates cached content for compatible models (e.g., gemini-1.5-flash). Required Params: contents (array). Optional Params: modelName (string), displayName (string), systemInstruction (object), ttl (string - e.g., '3600s').
Lists existing cached content. Required Params: None. Optional Params: pageSize (number), pageToken (string).
Retrieves metadata for specific cached content. Required Params: cacheName (string - e.g., cachedContents/abc123xyz).
Updates metadata (TTL, displayName) for cached content. Required Params: cacheName (string). Optional Params: ttl (string), displayName (string).
Deletes cached content. Required Params: cacheName (string - e.g., cachedContents/abc123xyz).
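Similarly, the cache tools manage cached content end to end. The sketch below, with assumed tool names, creates a cache with a one-hour TTL, extends it, and then deletes it; `contents` follows Gemini's Content/Part structure.

```typescript
// Tool names assumed. contents uses Gemini's Content/Part structure, and the
// "cachedContents/abc123xyz" name is a placeholder for the value returned by
// the create/list calls.
await client.callTool({
  name: "gemini_createCache",
  arguments: {
    modelName: "gemini-1.5-flash",
    displayName: "product-docs",
    ttl: "3600s", // one hour
    contents: [
      {
        role: "user",
        parts: [{ text: "Long reference document to reuse across requests..." }],
      },
    ],
  },
});

// Extend the TTL to two hours, then clean up.
await client.callTool({
  name: "gemini_updateCache",
  arguments: { cacheName: "cachedContents/abc123xyz", ttl: "7200s" },
});

await client.callTool({
  name: "gemini_deleteCache",
  arguments: { cacheName: "cachedContents/abc123xyz" },
});
```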