This MCP server provides a bridge between AI applications and Crawlab, a web-based distributed crawler admin platform. Developed by the Crawlab team, it offers tools for spider and task management, file operations, and resource access. The server uses FastMCP and integrates with Crawlab's API, enabling AI-driven web scraping, task automation, and data extraction workflows.
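As a rough illustration of that design, the sketch below shows how a FastMCP tool could wrap one of Crawlab's REST endpoints. The base URL, auth header, and `/spiders/{spider_id}` path are illustrative assumptions, not necessarily how this server implements it.

```python
# Minimal sketch, assuming a Crawlab instance reachable over HTTP.
# The endpoint path and auth header are illustrative assumptions.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("crawlab")

CRAWLAB_API = os.environ.get("CRAWLAB_API_URL", "http://localhost:8080/api")
CRAWLAB_TOKEN = os.environ.get("CRAWLAB_API_TOKEN", "")


@mcp.tool()
def get_spider(spider_id: str) -> dict:
    """Fetch details of a single spider from the Crawlab API."""
    resp = httpx.get(
        f"{CRAWLAB_API}/spiders/{spider_id}",
        headers={"Authorization": CRAWLAB_TOKEN},
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default
```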
Spider management tools (usage sketch below):

- Get details of a specific spider. Parameters: spider_id (string)
- Create a new spider. Parameters: spider_details (object)
- Update an existing spider. Parameters: spider_id (string), updates (object)
- Delete a spider. Parameters: spider_id (string)
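The following is a client-side sketch of calling these spider tools with the MCP Python SDK. The tool names (`get_spider`, `create_spider`, `update_spider`, `delete_spider`), the `spider_details` fields, and the launch command are assumptions inferred from the descriptions above, not confirmed names from the server.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch command and tool names are illustrative assumptions.
params = StdioServerParameters(command="python", args=["server.py"])


async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Create a spider, inspect it, update it, then remove it.
            await session.call_tool(
                "create_spider",
                {"spider_details": {"name": "quotes-spider", "cmd": "python main.py"}},
            )
            await session.call_tool("get_spider", {"spider_id": "<spider-id>"})
            await session.call_tool(
                "update_spider",
                {"spider_id": "<spider-id>", "updates": {"description": "Scrapes quotes"}},
            )
            await session.call_tool("delete_spider", {"spider_id": "<spider-id>"})


asyncio.run(main())
```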
Task management tools (workflow sketch below):

- Get details of a specific task. Parameters: task_id (string)
- Run a spider. Parameters: spider_id (string), options (optional object)
- Cancel a running task. Parameters: task_id (string)
- Restart a task. Parameters: task_id (string)
- Get logs for a task. Parameters: task_id (string)
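A typical task workflow runs a spider, checks the resulting task, and pulls its logs. The fragment below follows the same pattern as the earlier client sketch and carries the same caveat: the tool names and option fields are assumptions, and `session` is an already-initialized `ClientSession`.

```python
async def task_workflow(session, spider_id: str) -> None:
    """Run a spider and follow the resulting task (hypothetical tool names)."""
    await session.call_tool("run_spider", {"spider_id": spider_id, "options": {}})
    await session.call_tool("get_task", {"task_id": "<task-id>"})
    await session.call_tool("get_task_logs", {"task_id": "<task-id>"})

    # Cancel or restart the task if the run misbehaves.
    await session.call_tool("cancel_task", {"task_id": "<task-id>"})
    await session.call_tool("restart_task", {"task_id": "<task-id>"})
```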
File operation tools (example below):

- List files for a spider. Parameters: spider_id (string)
- Get content of a specific file. Parameters: spider_id (string), file_name (string)
- Save content to a file. Parameters: spider_id (string), file_name (string), content (string)
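A similar fragment for the file tools, again with hypothetical tool names and an already-initialized `session`; it lists a spider's files, reads one, and writes new content back.

```python
async def edit_spider_file(session, spider_id: str) -> None:
    """List, read, and overwrite a spider file (hypothetical tool names)."""
    await session.call_tool("list_spider_files", {"spider_id": spider_id})
    await session.call_tool(
        "get_file_content", {"spider_id": spider_id, "file_name": "main.py"}
    )
    await session.call_tool(
        "save_file",
        {"spider_id": spider_id, "file_name": "main.py", "content": "# updated script\n"},
    )
```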