Claude-LMStudio Bridge enables Claude to interact with local LLMs running in LM Studio through a robust MCP server implementation. It provides tools for checking connectivity, listing available models, generating text, and handling chat completions with local models. The bridge uses httpx for API communication with LM Studio's server, supports customizable parameters like temperature and token limits, and includes comprehensive error handling. It is particularly valuable for users who want to leverage their local models' capabilities directly within Claude conversations, reducing reliance on cloud-based LLMs while maintaining a seamless conversation experience.
Check if the LM Studio server is running.
List the models available in the local LM Studio instance.
Generate text using a local LLM. Parameters: prompt (string).
Send a chat completion request to the local LLM. Parameters: message (string).