Claude-LMStudio Bridge enables Claude to interact with local LLMs running in LM Studio through a robust MCP server implementation. It provides tools for checking connectivity, listing available models, generating text, and handling chat completions with local models. The bridge communicates with LM Studio's server over its API using httpx, supports customizable parameters such as temperature and token limits, and includes comprehensive error handling. It is particularly valuable for users who want to leverage their local models' capabilities directly within Claude conversations, reducing reliance on cloud-based LLMs while maintaining a seamless conversation experience.
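A minimal sketch of the request pattern described above, assuming LM Studio's OpenAI-compatible local server at its default address (http://localhost:1234/v1). The function name, default parameter values, and error-handling style are illustrative, not the bridge's actual code.

```python
import httpx

LMSTUDIO_BASE_URL = "http://localhost:1234/v1"  # assumed LM Studio default address

async def chat_completion(message: str, temperature: float = 0.7,
                          max_tokens: int = 512) -> str:
    """Send a single-turn chat completion to the local LM Studio server."""
    payload = {
        "messages": [{"role": "user", "content": message}],
        "temperature": temperature,   # customizable sampling temperature
        "max_tokens": max_tokens,     # customizable token limit
    }
    async with httpx.AsyncClient(timeout=60.0) as client:
        try:
            resp = await client.post(f"{LMSTUDIO_BASE_URL}/chat/completions",
                                     json=payload)
            resp.raise_for_status()
            return resp.json()["choices"][0]["message"]["content"]
        except httpx.HTTPError as exc:
            # Surface connection or HTTP failures as a readable message to Claude
            return f"Error communicating with LM Studio: {exc}"
```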
Available tools:
- Check whether the LM Studio server is running and reachable.
- List the models available in the local LM Studio instance.
- Generate text using a local model. Parameters: prompt (string).
- Send a chat completion request to the local LLM. Parameters: message (string).
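The tool set above maps naturally onto an MCP server. The sketch below shows how such tools could be registered with the MCP Python SDK's FastMCP helper; the tool names, default parameters, and endpoint paths are assumptions for illustration, following LM Studio's OpenAI-compatible API rather than the bridge's actual source.

```python
import httpx
from mcp.server.fastmcp import FastMCP

LMSTUDIO_BASE_URL = "http://localhost:1234/v1"  # assumed LM Studio default address

mcp = FastMCP("lmstudio-bridge")  # hypothetical server name

@mcp.tool()
def check_lmstudio_connection() -> str:
    """Check if the LM Studio server is running."""
    try:
        httpx.get(f"{LMSTUDIO_BASE_URL}/models", timeout=5.0).raise_for_status()
        return "LM Studio server is running."
    except httpx.HTTPError as exc:
        return f"LM Studio server is not reachable: {exc}"

@mcp.tool()
def list_models() -> str:
    """List the models available in the local LM Studio instance."""
    resp = httpx.get(f"{LMSTUDIO_BASE_URL}/models", timeout=10.0)
    resp.raise_for_status()
    return ", ".join(m["id"] for m in resp.json().get("data", []))

@mcp.tool()
def generate_text(prompt: str, temperature: float = 0.7, max_tokens: int = 512) -> str:
    """Generate text using a local model via the completions endpoint."""
    resp = httpx.post(
        f"{LMSTUDIO_BASE_URL}/completions",
        json={"prompt": prompt, "temperature": temperature, "max_tokens": max_tokens},
        timeout=60.0,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"]

@mcp.tool()
def chat_completion(message: str, temperature: float = 0.7, max_tokens: int = 512) -> str:
    """Send a chat completion request to the local LLM."""
    resp = httpx.post(
        f"{LMSTUDIO_BASE_URL}/chat/completions",
        json={
            "messages": [{"role": "user", "content": message}],
            "temperature": temperature,
            "max_tokens": max_tokens,
        },
        timeout=60.0,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    mcp.run()  # defaults to stdio transport for use with Claude's MCP configuration
```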