This Replicate MCP server implementation bridges the Model Context Protocol and Replicate's AI model hosting platform. Developed as an open-source project, it exposes Replicate's diverse collection of machine learning models through a standardized MCP interface. The server supports key operations such as listing available models, creating and managing predictions, and accessing model metadata. Built with TypeScript on top of the official Replicate API, it offers robust error handling, caching, and webhook support. It is particularly useful for developers and researchers who want to integrate Replicate's hosted models into MCP-compatible applications, giving them easy access to pre-trained models for tasks like image generation and text processing.
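To use a server like this, an MCP-compatible client is typically pointed at it through a configuration entry. The sketch below follows the common `mcpServers` config shape; the command name `mcp-replicate` is an assumption for illustration — check the project's README for the actual launch command and environment variables:

```json
{
  "mcpServers": {
    "replicate": {
      "command": "mcp-replicate",
      "env": {
        "REPLICATE_API_TOKEN": "<your token>"
      }
    }
  }
}
```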
- Find models using semantic search.
- Browse available models.
- Get details about a specific model.
- Browse model collections.
- Get details about a specific collection.
- Run a model with your inputs.
- Run a model with your inputs and wait for it to complete.
- Check a prediction's status.
- Stop a running prediction.
- See your recent predictions.
- Open an image in your browser.
- Clean up cached images.
- Check cache usage.
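Each of the operations above is exposed as an MCP tool, which a client invokes with a `tools/call` JSON-RPC 2.0 request. The sketch below builds such a request; the tool name `create_prediction` and its argument shape (`version`, `input`) are assumptions for illustration — the server's actual tool names and schemas may differ:

```typescript
// Shape of the JSON-RPC 2.0 message an MCP client sends to invoke a tool.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

// Build a tools/call request for a named tool with the given arguments.
function buildToolCall(
  name: string,
  args: Record<string, unknown>,
  id = 1,
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Hypothetical model identifier and input for an image-generation model.
const req = buildToolCall("create_prediction", {
  version: "stability-ai/sdxl",
  input: { prompt: "a watercolor fox" },
});

console.log(JSON.stringify(req, null, 2));
```

The server translates such a call into the corresponding Replicate API request and streams the result back as MCP tool output.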