MCP Gemini Server provides a Flask-based implementation that enables AI assistants to interact with Google's Gemini API through a standardized protocol. It supports three key operations: text generation, text analysis (with sentiment, summary, and keyword extraction capabilities), and chat conversations. The server handles client-server communication through a RESTful API, processes requests with appropriate error handling, and securely manages API keys through environment variables. It is particularly useful for developers who want to extend AI assistant capabilities with Google's generative models without writing a direct API integration.
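To make the request flow concrete, the following is a minimal sketch of what one such Flask endpoint could look like when wrapping Gemini text generation. The route path, model name, payload fields, and use of the google-generativeai package are assumptions made for illustration; they are not taken from the actual server.

```python
# Minimal sketch, not the server's actual implementation.
# Assumptions: the google-generativeai package, a GEMINI_API_KEY
# environment variable, and a /generate route accepting JSON.
import os

from flask import Flask, jsonify, request
import google.generativeai as genai

app = Flask(__name__)

# The API key is read from the environment, never hard-coded.
genai.configure(api_key=os.environ["GEMINI_API_KEY"])


@app.route("/generate", methods=["POST"])
def generate():
    payload = request.get_json(silent=True) or {}
    prompt = payload.get("prompt")
    if not prompt:
        # Reject requests missing the required parameter.
        return jsonify({"error": "prompt is required"}), 400
    try:
        model = genai.GenerativeModel("gemini-1.5-flash")
        response = model.generate_content(
            prompt,
            generation_config=genai.types.GenerationConfig(
                temperature=payload.get("temperature", 0.7),
                max_output_tokens=payload.get("max_tokens", 1024),
            ),
        )
        return jsonify({"text": response.text})
    except Exception as exc:  # surface Gemini/API errors as a 500
        return jsonify({"error": str(exc)}), 500


if __name__ == "__main__":
    app.run(port=5000)
```

The same pattern (parse JSON, validate required parameters, call Gemini, return JSON or an error status) extends naturally to the analysis and chat operations.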
The three tools and their parameters:

- Generate text content with Gemini. Parameters: prompt (required string), temperature (optional float), max_tokens (optional integer).
- Analyze text content. Parameters: text (required string), analysis_type (optional string: 'sentiment', 'summary', 'keywords', or 'general').
- Have a conversation with Gemini. Parameters: messages (required array of message objects with 'role' and 'content'), temperature (optional float). Example requests for all three tools are sketched after this list.
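To show how a client might exercise these tools over the RESTful API, here is an illustrative usage sketch. The endpoint paths (/generate, /analyze, /chat), the port, and the response fields are assumptions for illustration; consult the server's own documentation for the real routes.

```python
# Hypothetical client calls for the three operations; endpoint paths,
# port, and response shape are illustrative assumptions.
import requests

BASE_URL = "http://localhost:5000"

# Text generation: prompt is required; temperature and max_tokens are optional.
gen = requests.post(f"{BASE_URL}/generate", json={
    "prompt": "Write a haiku about autumn.",
    "temperature": 0.7,
    "max_tokens": 256,
})
print(gen.json())

# Text analysis: text is required; analysis_type selects the kind of analysis.
analysis = requests.post(f"{BASE_URL}/analyze", json={
    "text": "The new release fixed every bug I reported. Fantastic work!",
    "analysis_type": "sentiment",
})
print(analysis.json())

# Chat: messages is a required array of {'role', 'content'} objects.
chat = requests.post(f"{BASE_URL}/chat", json={
    "messages": [
        {"role": "user", "content": "What can the Gemini API do?"},
    ],
    "temperature": 0.5,
})
print(chat.json())
```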