MCP Gemini Server provides a Flask-based implementation that enables AI assistants to interact with Google's Gemini API through a standardized protocol. It supports three key operations: text generation, text analysis (with sentiment, summary, and keyword extraction capabilities), and chat conversations. The server handles client-server communication through a RESTful API, processes requests with appropriate error handling, and securely manages API keys through environment variables. It is particularly useful for developers who want to extend AI assistant capabilities with Google's generative models without writing their own API integration.
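To illustrate the overall shape of such a server, here is a minimal sketch of one Flask endpoint backed by the google-generativeai client. The endpoint path, environment variable name, model name, and default parameter values are assumptions for illustration, not the project's actual configuration.

```python
import os

from flask import Flask, request, jsonify
import google.generativeai as genai

# API key is read from an environment variable rather than hard-coded.
genai.configure(api_key=os.environ["GEMINI_API_KEY"])  # variable name is assumed

app = Flask(__name__)
model = genai.GenerativeModel("gemini-pro")  # model name is an assumption

@app.route("/generate_text", methods=["POST"])  # endpoint path is hypothetical
def generate_text():
    payload = request.get_json(silent=True) or {}
    prompt = payload.get("prompt")
    if not prompt:
        # Basic error handling: reject requests missing the required field.
        return jsonify({"error": "prompt is required"}), 400
    response = model.generate_content(
        prompt,
        generation_config={
            "temperature": payload.get("temperature", 0.7),
            "max_output_tokens": payload.get("max_tokens", 1024),
        },
    )
    return jsonify({"text": response.text})

if __name__ == "__main__":
    app.run(port=5000)
```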
Generate text content with Gemini. Parameters: prompt (required string), temperature (optional float), max_tokens (optional integer)
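A sample client call for this operation might look like the following; the endpoint URL is a placeholder and the parameter names follow the description above.

```python
import requests

# Endpoint path and port are assumptions; adjust to the server's actual configuration.
resp = requests.post(
    "http://localhost:5000/generate_text",
    json={
        "prompt": "Write a haiku about distributed systems.",  # required
        "temperature": 0.4,                                     # optional
        "max_tokens": 128,                                      # optional
    },
)
print(resp.json())
```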
Analyze text content. Parameters: text (required string), analysis_type (optional string - 'sentiment', 'summary', 'keywords', or 'general')
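A hedged example request for text analysis, again with a hypothetical endpoint path:

```python
import requests

# Hypothetical endpoint path; analysis_type is one of the values listed above.
resp = requests.post(
    "http://localhost:5000/analyze_text",
    json={
        "text": "The new release fixed every crash I was hitting. Great work!",  # required
        "analysis_type": "sentiment",  # optional: 'sentiment', 'summary', 'keywords', or 'general'
    },
)
print(resp.json())
```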
Have a conversation with Gemini. Parameters: messages (required array of message objects with 'role' and 'content'), temperature (optional float)
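For the chat operation, the messages array carries role/content objects as described above. The endpoint path below is assumed for illustration.

```python
import requests

# Hypothetical endpoint path; messages follow the role/content format described above.
resp = requests.post(
    "http://localhost:5000/chat",
    json={
        "messages": [
            {"role": "user", "content": "What is the MCP protocol?"},
            {"role": "assistant", "content": "It standardizes how assistants call external tools."},
            {"role": "user", "content": "How does this server use it with Gemini?"},
        ],
        "temperature": 0.7,  # optional
    },
)
print(resp.json())
```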