
Ollama for Cline


Summary

This MCP server, developed by Matt Green, provides integration with Ollama for local large language model inference. Built using Python and leveraging the MCP CLI, it enables AI assistants to interact with Ollama's API for listing available models, retrieving model details, and generating text completions. The implementation focuses on providing a standardized interface to Ollama's capabilities, making it easier to incorporate local LLM inference into AI workflows. It's particularly useful for developers and researchers who want to leverage locally-run open-source language models, enabling use cases such as private AI assistants, custom model fine-tuning, and AI-augmented development without relying on cloud APIs.
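To make the integration concrete, below is a hedged sketch of how a server like this might be registered in Cline's MCP settings file. The server name (`ollama`), the command (`uvx`), and the package name are illustrative assumptions only; this listing does not specify the actual install command, so consult the project's own README for the real values.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["ollama-mcp-server"]
    }
  }
}
```

Once registered, Cline exposes the server's tools (listed below) to the assistant automatically.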

Available Actions (3)

list_models

List all downloaded Ollama models.

show_model

Get detailed information about a specific model.

ask_model

Ask a question to a specified model.
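Ollama's REST API (served on `http://localhost:11434` by default) provides endpoints that correspond naturally to these three tools: `GET /api/tags` lists downloaded models, `POST /api/show` returns model details, and `POST /api/generate` produces a completion. The sketch below, in TypeScript (the language this listing reports), shows one plausible mapping from each tool to its request; the function names and `OllamaRequest` shape are illustrative assumptions, not the server's actual implementation.

```typescript
// Hypothetical mapping from each MCP tool to the Ollama REST call it
// likely wraps. The request-builder shape here is an assumption for
// illustration; only the endpoints themselves come from Ollama's API.

type OllamaRequest = {
  method: "GET" | "POST";
  path: string;          // appended to the Ollama base URL, e.g. http://localhost:11434
  body?: object;         // JSON payload for POST requests
};

// list_models → GET /api/tags (returns all locally downloaded models)
function listModelsRequest(): OllamaRequest {
  return { method: "GET", path: "/api/tags" };
}

// show_model → POST /api/show with the model name
function showModelRequest(model: string): OllamaRequest {
  return { method: "POST", path: "/api/show", body: { model } };
}

// ask_model → POST /api/generate with a prompt; stream: false asks
// Ollama for a single JSON response instead of a token stream
function askModelRequest(model: string, prompt: string): OllamaRequest {
  return {
    method: "POST",
    path: "/api/generate",
    body: { model, prompt, stream: false },
  };
}
```

Each builder could then be passed to `fetch` against a running Ollama instance; keeping the builders pure makes the tool handlers easy to test without a live server.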

Last Updated: April 28, 2025



Language

TypeScript

