LMStudio-MCP creates a bridge between Claude and language models running locally in LM Studio, allowing Claude to interact with private models on your machine. The server exposes tools to check API health, list available models, identify the currently loaded model, and generate completions from local models through LM Studio's OpenAI-compatible API endpoints. This lets users combine Claude's interface and capabilities with private, locally hosted language models.
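The OpenAI-compatible surface described above can also be probed directly. A minimal sketch of listing available models, assuming LM Studio's default local server address (`http://localhost:1234/v1`); the function names and the sample model ID are illustrative, not the server's actual code:

```python
import json
from urllib.request import urlopen

# LM Studio's local server defaults to this base URL; adjust it if you
# changed the port in LM Studio's server settings.
BASE_URL = "http://localhost:1234/v1"

def parse_model_ids(models_json: str) -> list[str]:
    """Extract model IDs from an OpenAI-style /v1/models response body."""
    return [m["id"] for m in json.loads(models_json)["data"]]

def list_models(base_url: str = BASE_URL) -> list[str]:
    """Fetch and parse the list of models LM Studio has available."""
    with urlopen(f"{base_url}/models") as resp:
        return parse_model_ids(resp.read().decode())

# Example response shape from an OpenAI-compatible server
# (the model ID here is hypothetical):
sample = '{"data": [{"id": "llama-3.1-8b-instruct", "object": "model"}]}'
print(parse_model_ids(sample))  # ['llama-3.1-8b-instruct']
```

Checking API health amounts to the same kind of request: if `GET /v1/models` answers at all, the server is up.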
The server provides four tools:

- Verify that the LM Studio API is reachable.
- List all models available in LM Studio.
- Identify which model is currently loaded.
- Generate text from your local model. Parameters: `prompt` (string), `system_prompt` (string), `temperature` (float), `max_tokens` (int).
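A minimal sketch of how those generation parameters map onto an OpenAI-style `/v1/chat/completions` request body; the helper name and default values are assumptions for illustration, not the server's actual implementation:

```python
import json

def build_chat_payload(prompt: str, system_prompt: str = "",
                       temperature: float = 0.7, max_tokens: int = 256) -> dict:
    """Map the tool's parameters onto an OpenAI-style chat-completion body."""
    messages = []
    if system_prompt:
        # The system prompt, when given, becomes the first message.
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    return {
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

payload = build_chat_payload("Summarize MCP in one sentence.",
                             system_prompt="Be concise.",
                             temperature=0.2, max_tokens=64)
print(json.dumps(payload, indent=2))
```

POSTing this JSON to `http://localhost:1234/v1/chat/completions` (LM Studio's default local address) returns a completion from whichever model is currently loaded.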