
Replicate
Summary
This Replicate MCP server implementation provides a bridge between the Model Context Protocol and Replicate's AI model hosting platform. Developed as an open-source project, it enables seamless interaction with Replicate's diverse collection of machine learning models through a standardized MCP interface. The server supports key operations such as listing available models, creating and managing predictions, and accessing model metadata. Built with TypeScript and leveraging the official Replicate API, it offers robust error handling, caching mechanisms, and webhook support. It is particularly useful for developers and researchers who want to integrate Replicate's hosted models into MCP-compatible applications, giving them access to a wide range of pre-trained models for tasks such as image generation and text processing.
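Since the server exposes its operations through the standardized MCP interface, a client invokes them with an ordinary MCP `tools/call` request (JSON-RPC 2.0). The sketch below builds such a request for the `create_prediction` action; the tool name comes from the action list on this page, while the `version` and `input` argument fields are assumptions modeled on Replicate's prediction API, not a confirmed schema for this server.

```typescript
// Hypothetical sketch: the JSON-RPC 2.0 message an MCP client would send
// to invoke this server's create_prediction tool.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;
    arguments: Record<string, unknown>;
  };
}

const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "create_prediction",
    arguments: {
      // Hypothetical model identifier and input; the real argument
      // schema is defined by the server's tool listing.
      version: "stability-ai/sdxl:latest",
      input: { prompt: "a watercolor fox" },
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

In practice an MCP client library would construct and transport this message for you; the point is only that every action listed below is reachable through the same uniform request shape.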
Available Actions (13)
search_models
Find models using semantic search.
list_models
Browse available models.
get_model
Get details about a specific model.
list_collections
Browse model collections.
get_collection
Get details about a specific collection.
create_prediction
Run a model with your inputs.
create_and_poll_prediction
Run a model with your inputs and wait until it's completed.
get_prediction
Check a prediction's status.
cancel_prediction
Stop a running prediction.
list_predictions
See your recent predictions.
view_image
Open an image in your browser.
clear_image_cache
Clean up cached images.
get_image_cache_stats
Check cache usage.
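The difference between create_prediction and create_and_poll_prediction is that the latter waits for a terminal status before returning. A minimal sketch of that polling loop is shown below; the status names follow Replicate's public prediction lifecycle (starting → processing → succeeded/failed/canceled), but the helper itself and the mock backend are illustrations, not the server's actual code.

```typescript
// Statuses from Replicate's documented prediction lifecycle.
type Status = "starting" | "processing" | "succeeded" | "failed" | "canceled";

interface Prediction {
  id: string;
  status: Status;
  output?: unknown;
}

const TERMINAL: Status[] = ["succeeded", "failed", "canceled"];

// Poll a prediction until it reaches a terminal status, roughly what
// create_and_poll_prediction does on top of create/get_prediction.
async function pollPrediction(
  getPrediction: (id: string) => Promise<Prediction>,
  id: string,
  intervalMs = 1000,
): Promise<Prediction> {
  for (;;) {
    const p = await getPrediction(id);
    if (TERMINAL.includes(p.status)) return p;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

// Mock backend standing in for the real API: "finishes" on the third poll.
let calls = 0;
const mockGet = async (id: string): Promise<Prediction> => {
  calls++;
  return calls < 3
    ? { id, status: "processing" }
    : { id, status: "succeeded", output: ["https://example.com/out.png"] };
};

const result = await pollPrediction(mockGet, "pred_123", 1);
console.log(result.status); // "succeeded"
```

Using the one-shot create_prediction plus get_prediction instead of create_and_poll_prediction is preferable for long-running models, where the caller can check status later (or rely on webhook support) rather than holding a connection open.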