Gemini AI

Summary

MCP-Gemini-Server provides a robust interface to Google's Gemini AI models through a collection of specialized tools. Built with TypeScript and the Model Context Protocol SDK, it enables AI assistants to leverage Gemini's capabilities including content generation, chat functionality, function calling, and file/cache management. The server implements both streaming and non-streaming content generation, supports stateful chat sessions with function execution, and offers comprehensive file and cache operations for optimizing performance. This implementation stands out by providing fine-grained control over generation parameters, safety settings, and tool configurations while handling error cases gracefully. It's particularly valuable for developers building AI applications that need direct access to Gemini's advanced features through a standardized protocol.
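
The sketch below shows one way a client might connect to this server using the MCP TypeScript SDK over stdio. The launch command, entry-point path, and client name are placeholders for a local build, and any API-key environment variables the server requires are assumed to be configured separately.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server as a child process and talk to it over stdio.
  // The command and entry-point path are placeholders for your local build.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });

  const client = new Client({ name: "gemini-mcp-example", version: "1.0.0" });
  await client.connect(transport);

  // List the gemini_* tools the server exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));
}

main().catch(console.error);
```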

Available Actions (15)

gemini_generateContent

Generates non-streaming text content from a prompt. Required Params: prompt (string); Optional Params: modelName (string), generationConfig (object), safetySettings (array)
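
As a rough illustration, reusing the connected `client` from the sketch above, a call might look like the following. The model name and generationConfig values are illustrative, not documented defaults.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
export async function generateContentExample(client: Client) {
  const result = await client.callTool({
    name: "gemini_generateContent",
    arguments: {
      prompt: "Summarize the Model Context Protocol in two sentences.",
      // Optional parameters; these values are illustrative, not defaults.
      modelName: "gemini-1.5-flash",
      generationConfig: { temperature: 0.2, maxOutputTokens: 256 },
    },
  });
  console.log(result); // the generated text arrives in the MCP tool result
}
```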

gemini_generateContentStream

Generates text content via streaming. Required Params: prompt (string); Optional Params: modelName (string), generationConfig (object), safetySettings (array)
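
The streaming variant takes the same argument shape; how partial chunks are surfaced back through the MCP result is defined by the server, so this sketch only shows the request side.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// Only the request side is shown; how streamed chunks appear in the result
// is not assumed here.
export async function generateContentStreamExample(client: Client) {
  const result = await client.callTool({
    name: "gemini_generateContentStream",
    arguments: { prompt: "Write a short haiku about TypeScript." },
  });
  console.log(result);
}
```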

gemini_functionCall

Sends a prompt and function declarations to the model, returning either a text response or a requested function call object. Required Params: prompt (string), functionDeclarations (array); Optional Params: modelName (string), generationConfig (object), safetySettings (array), toolConfig (object)
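
A hedged sketch of the declaration format follows, using a hypothetical get_current_weather function; the parameters object is written in JSON-Schema style, though the exact casing the server's validation expects may differ.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// get_current_weather is a hypothetical function used only for illustration.
export async function functionCallExample(client: Client) {
  const result = await client.callTool({
    name: "gemini_functionCall",
    arguments: {
      prompt: "What is the weather in Berlin right now?",
      functionDeclarations: [
        {
          name: "get_current_weather",
          description: "Returns the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      ],
    },
  });
  // The result is either plain text or a function-call object for the caller to execute.
  console.log(result);
}
```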

gemini_startChat

Initiates a new stateful chat session and returns a unique sessionId. Required Params: None; Optional Params: modelName (string), history (array), tools (array), generationConfig (object), safetySettings (array)

gemini_sendMessage

Sends a message within an existing chat session. Required Params: sessionId (string), message (string); Optional Params: generationConfig (object), safetySettings (array), tools (array), toolConfig (object)

gemini_sendFunctionResult

Sends the result of a function execution back to a chat session. Required Params: sessionId (string), functionResponses (array); Optional Params: generationConfig (object), safetySettings (array)
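
The three chat tools above compose into a single flow: start a session, send a message, and, if the model requests a function call, execute it locally and report the result back. The sketch below assumes the sessionId comes back as JSON text in the first content item, which may not match the server's actual payload shape, and the convert_temperature function is hypothetical.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// The session-id extraction and the convert_temperature function are assumptions
// made for illustration; adapt them to the server's actual response shapes.
export async function chatWorkflowExample(client: Client) {
  // 1. Start a stateful session.
  const startRes = await client.callTool({ name: "gemini_startChat", arguments: {} });
  const sessionId: string = JSON.parse((startRes as any).content[0].text).sessionId;

  // 2. Send a user message into the session.
  const reply = await client.callTool({
    name: "gemini_sendMessage",
    arguments: { sessionId, message: "Convert 72°F to Celsius with the converter tool." },
  });
  console.log(reply);

  // 3. If the model requested a function call, run it locally and send the result back.
  await client.callTool({
    name: "gemini_sendFunctionResult",
    arguments: {
      sessionId,
      functionResponses: [{ name: "convert_temperature", response: { celsius: 22.2 } }],
    },
  });
}
```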

gemini_uploadFile

Uploads a file from a local path. Required Params: filePath (string - must be an absolute path); Optional Params: displayName (string), mimeType (string)
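
A minimal call might look like this; the absolute path, display name, and MIME type are placeholders.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// The absolute path, display name, and MIME type are placeholders.
export async function uploadFileExample(client: Client) {
  const result = await client.callTool({
    name: "gemini_uploadFile",
    arguments: {
      filePath: "/home/user/reports/q1-report.pdf",
      displayName: "Q1 Report",
      mimeType: "application/pdf",
    },
  });
  console.log(result); // returned metadata should include the generated files/... name
}
```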

gemini_listFiles

Lists previously uploaded files. Required Params: None; Optional Params: pageSize (number), pageToken (string)

gemini_getFile

Retrieves metadata for a specific uploaded file. Required Params: fileName (string - e.g., files/abc123xyz)

gemini_deleteFile

Deletes an uploaded file. Required Params: fileName (string - e.g., files/abc123xyz)
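
Together, gemini_listFiles, gemini_getFile, and gemini_deleteFile support a simple housekeeping loop, sketched below with a placeholder fileName in the documented files/... format.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// The fileName is a placeholder in the documented files/... format.
export async function fileHousekeepingExample(client: Client) {
  const listing = await client.callTool({
    name: "gemini_listFiles",
    arguments: { pageSize: 10 },
  });
  console.log(listing);

  const fileName = "files/abc123xyz";
  const metadata = await client.callTool({
    name: "gemini_getFile",
    arguments: { fileName },
  });
  console.log(metadata);

  await client.callTool({ name: "gemini_deleteFile", arguments: { fileName } });
}
```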

gemini_createCache

Creates cached content for compatible models. Required Params: contents (array); Optional Params: modelName (string), displayName (string), systemInstruction (object), ttl (string - e.g., '3600s')
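
A sketch of a cache-creation call follows; the contents array is written in the Gemini role/parts format, and the model name and one-hour TTL are illustrative values, since cache support varies by model.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// The contents array uses the Gemini role/parts format; the model name
// and one-hour TTL are illustrative.
export async function createCacheExample(client: Client) {
  const result = await client.callTool({
    name: "gemini_createCache",
    arguments: {
      modelName: "gemini-1.5-flash",
      displayName: "product-docs",
      contents: [
        { role: "user", parts: [{ text: "<large reference document pasted here>" }] },
      ],
      ttl: "3600s",
    },
  });
  console.log(result); // returned metadata should include the cachedContents/... name
}
```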

gemini_listCaches

Lists existing cached content. Required Params: None; Optional Params: pageSize (number), pageToken (string)

gemini_getCache

Retrieves metadata for specific cached content. Required Params: cacheName (string - e.g., cachedContents/abc123xyz)

gemini_updateCache

Updates metadata (TTL, displayName) for cached content. Required Params: cacheName (string); Optional Params: ttl (string), displayName (string)

gemini_deleteCache

Deletes cached content. Required Params: cacheName (string - e.g., cachedContents/abc123xyz)
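
The remaining cache tools cover the rest of the lifecycle. The sketch below lists caches, reads one back, extends its TTL, and finally deletes it, using a placeholder cacheName in the documented cachedContents/... format.

```typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// `client` is the connected MCP client from the connection sketch above.
// The cacheName is a placeholder in the documented cachedContents/... format.
export async function cacheLifecycleExample(client: Client) {
  const caches = await client.callTool({ name: "gemini_listCaches", arguments: {} });
  console.log(caches);

  const cacheName = "cachedContents/abc123xyz";
  await client.callTool({ name: "gemini_getCache", arguments: { cacheName } });

  // Extend the TTL and rename the cache.
  await client.callTool({
    name: "gemini_updateCache",
    arguments: { cacheName, ttl: "7200s", displayName: "product-docs-v2" },
  });

  await client.callTool({ name: "gemini_deleteCache", arguments: { cacheName } });
}
```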

Last Updated: April 17, 2025


Coming soon to Highlight AI

Language

TypeScript
