LMStudio-MCP

Summary

LMStudio-MCP creates a bridge between Claude and LLMs running locally through LM Studio, allowing Claude to interact with private models on your machine. The server exposes tools to check API health, list available models, identify the currently loaded model, and generate completions, all via LM Studio's OpenAI-compatible API endpoints. This lets users combine Claude's interface and capabilities with private, locally hosted language models.
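
Under the hood, each action maps onto LM Studio's OpenAI-compatible HTTP endpoints. The sketch below shows roughly what those calls look like in TypeScript, assuming LM Studio's local server is running at its default address (http://localhost:1234) and that a model is already loaded; the base URL, placeholder model id, and request values are illustrative, not part of the server's actual implementation.

```typescript
// Minimal sketch of the LM Studio endpoints the bridge wraps.
// Assumed default local server address; adjust host/port to your LM Studio settings.
const BASE_URL = "http://localhost:1234/v1";

// Roughly what health_check, list_models and get_current_model rely on:
// the OpenAI-style /models listing.
async function listLocalModels(): Promise<string[]> {
  const res = await fetch(`${BASE_URL}/models`);
  if (!res.ok) throw new Error(`LM Studio API unreachable: ${res.status}`);
  const body = await res.json();
  return body.data.map((m: { id: string }) => m.id);
}

// Roughly what chat_completion forwards: an OpenAI-style chat request
// answered by whichever model is currently loaded in LM Studio.
async function localChatCompletion(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-model", // LM Studio typically serves the currently loaded model regardless of this value
      messages: [{ role: "user", content: prompt }],
      temperature: 0.7,
      max_tokens: 256,
    }),
  });
  const body = await res.json();
  return body.choices[0].message.content;
}
```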

Available Actions (4)

health_check

Verify that the LM Studio API is accessible.

list_models

Get a list of all available models in LM Studio.

get_current_model

Identify which model is currently loaded.

chat_completion

Generate text from your local model. Parameters: prompt (string), system_prompt (optional string), temperature (float), max_tokens (integer).
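
For orientation, below is a minimal sketch of how an MCP client could invoke these actions programmatically, assuming the @modelcontextprotocol/sdk TypeScript client and a stdio launch of the bridge; the launch command, client name, and prompt text are placeholders rather than the server's documented setup.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Placeholder launch command -- point this at your LMStudio-MCP installation.
  const transport = new StdioClientTransport({
    command: "python",
    args: ["lmstudio_mcp_server.py"],
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Check that LM Studio's API is reachable and see which model is loaded.
  console.log(await client.callTool({ name: "health_check", arguments: {} }));
  console.log(await client.callTool({ name: "get_current_model", arguments: {} }));

  // Generate text with the locally loaded model.
  const completion = await client.callTool({
    name: "chat_completion",
    arguments: {
      prompt: "Summarize why someone might run models locally.",
      system_prompt: "You are a concise assistant.",
      temperature: 0.7,
      max_tokens: 256,
    },
  });
  console.log(completion);

  await client.close();
}

main().catch(console.error);
```

In practice Claude (or another MCP host) issues these tool calls itself; the snippet only illustrates the request shape each action expects.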

Last Updated: April 17, 2025

Coming soon to Highlight AI

Language

TypeScript

Category

Tags