An MCP server that enables semantic search capabilities through Qdrant vector database integration. It allows AI assistants to retrieve semantically similar documents across multiple collections using natural language queries, with configurable result counts and collection source tracking. The server supports both stdio and HTTP transports, includes REST API endpoints with OpenAPI documentation, and uses embedding models like Xenova/all-MiniLM-L6-v2 to generate vector representations for similarity matching. Particularly useful for knowledge retrieval workflows where semantic understanding is more important than exact keyword matching.
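The sketch below illustrates the general flow such a server follows: embed each query with the model, then run a vector search against each requested collection and tag every hit with its source. It assumes the `@xenova/transformers` and `@qdrant/js-client-rest` packages; the function name `retrieve`, the `text` payload field, and the `QDRANT_URL` environment variable are illustrative assumptions, not confirmed internals of this server.

```typescript
// Illustrative sketch of the retrieval flow (not the server's actual source).
import { pipeline } from '@xenova/transformers';
import { QdrantClient } from '@qdrant/js-client-rest';

const qdrant = new QdrantClient({ url: process.env.QDRANT_URL ?? 'http://localhost:6333' });

// Load the embedding model once and reuse it across requests.
const embed = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

async function retrieve(collectionNames: string[], queries: string[], topK: number) {
  const results = [];
  for (const query of queries) {
    // Mean-pooled, normalized sentence embedding for the query text.
    const output = await embed(query, { pooling: 'mean', normalize: true });
    const vector = Array.from(output.data as Float32Array);

    for (const collectionName of collectionNames) {
      // Top-K nearest neighbors in this collection, payload included.
      const hits = await qdrant.search(collectionName, {
        vector,
        limit: topK,
        with_payload: true,
      });
      for (const hit of hits) {
        results.push({
          query,
          collectionName,
          text: (hit.payload as Record<string, unknown> | undefined)?.text,
          score: hit.score,
        });
      }
    }
  }
  return results;
}
```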
Retrieves semantically similar documents from multiple Qdrant vector store collections based on multiple queries. Inputs: collectionNames (string[]), topK (number), query (string[]). Returns: a results array whose entries each contain the originating query, the collectionName, the matched text, and the similarity score.
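As a rough client-side usage sketch, the tool can be invoked over MCP stdio with the documented argument names. The tool name `retrieve_documents`, the server launch command, and the entry-point path are hypothetical placeholders; only the argument shape follows the description above.

```typescript
// Hypothetical MCP client call; tool name and server command are placeholders.
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

const transport = new StdioClientTransport({
  command: 'node',
  args: ['dist/index.js'], // placeholder path to the server's stdio entry point
});

const client = new Client({ name: 'example-client', version: '1.0.0' }, { capabilities: {} });
await client.connect(transport);

// Argument names match the tool's documented inputs.
const response = await client.callTool({
  name: 'retrieve_documents', // hypothetical tool name
  arguments: {
    collectionNames: ['docs', 'wiki'],
    query: ['how do I rotate API keys?'],
    topK: 5,
  },
});

// Each result carries the query, collectionName, text, and score.
console.log(response.content);
```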