MCP-Qdrant Server provides a vector database integration for AI assistants, pairing a Qdrant vector database with a specialized MCP server for knowledge storage and retrieval. The implementation runs both the Qdrant database and the MCP server in Docker containers, with the server using a sentence-transformers embedding model to convert natural language into vector representations. It exposes two primary tools: one for storing code snippets with natural language descriptions, and another for searching the knowledge base with semantic queries. This setup is particularly useful for AI assistants that need to maintain persistent memory of code examples and technical knowledge across conversations.
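As a rough illustration of the mechanism, the sketch below embeds text with a sentence-transformers model, upserts it into a local Qdrant instance, and runs a semantic search using the qdrant-client library. The model name, collection name, and payload keys are assumptions for the example, not the server's actual configuration.

```python
# Sketch of the store/search cycle: embed text, upsert into Qdrant, query by vector.
# Model name, collection name, and payload keys are illustrative assumptions.
from qdrant_client import QdrantClient
from qdrant_client.models import Distance, PointStruct, VectorParams
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")      # assumed embedding model
client = QdrantClient(url="http://localhost:6333")   # Qdrant container's default port

collection = "mcp-memory"                            # assumed collection name
if not client.collection_exists(collection):
    client.create_collection(
        collection_name=collection,
        vectors_config=VectorParams(
            size=model.get_sentence_embedding_dimension(),
            distance=Distance.COSINE,
        ),
    )

# Store: embed a natural-language description and keep the original text as payload.
text = "Use functools.lru_cache to memoize a pure Python function"
client.upsert(
    collection_name=collection,
    points=[PointStruct(id=1, vector=model.encode(text).tolist(),
                        payload={"information": text})],
)

# Search: embed the query and return the closest stored entries.
hits = client.search(
    collection_name=collection,
    query_vector=model.encode("how do I cache results of a function?").tolist(),
    limit=3,
)
for hit in hits:
    print(hit.score, hit.payload["information"])
```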
Stores information in the Qdrant database. Parameters: information (string): the information to be stored; metadata (JSON, optional): metadata to attach to the entry.
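From an MCP client, a store call might look like the following sketch, written with the official Python mcp SDK. The launch command and the tool name "qdrant-store" are assumptions based on the upstream mcp-server-qdrant project and may differ in this setup.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def store_example() -> None:
    # Command and tool name are assumptions from the upstream
    # mcp-server-qdrant project; adjust both to match this deployment.
    params = StdioServerParameters(command="uvx", args=["mcp-server-qdrant"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            await session.call_tool(
                "qdrant-store",
                arguments={
                    "information": "Use functools.lru_cache to memoize pure functions",
                    "metadata": {"language": "python", "topic": "caching"},
                },
            )

asyncio.run(store_example())
```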
Searches for relevant information in the Qdrant database. Parameters: query (string): the query to use for the search.
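A search call follows the same pattern; the hypothetical helper below assumes an already-initialized session (as in the previous sketch) and the upstream tool name "qdrant-find".

```python
from mcp import ClientSession

async def find_snippets(session: ClientSession, query: str) -> None:
    # Hypothetical helper; tool name "qdrant-find" is assumed from the upstream project.
    result = await session.call_tool("qdrant-find", arguments={"query": query})
    for item in result.content:
        # Matching entries are returned as text content items.
        print(getattr(item, "text", item))
```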