@houtini/lm
An MCP (Model Context Protocol) server for local LLMs — connects to LM Studio or any OpenAI-compatible endpoint.
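The description mentions connecting to LM Studio or any OpenAI-compatible endpoint. As a minimal sketch of what such a request looks like — assuming LM Studio's default local address (`http://localhost:1234/v1`) and a placeholder model name, neither of which is specified by this listing:

```python
import json
from urllib import request

# A standard OpenAI-style chat-completion payload. The model name is a
# placeholder; any model loaded in LM Studio (or another OpenAI-compatible
# server such as Ollama or vLLM) would take its place.
payload = {
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

def chat(base_url="http://localhost:1234/v1"):
    """POST the payload to an OpenAI-compatible /chat/completions route."""
    req = request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# The payload itself is plain JSON and can be inspected without a server:
print(json.dumps(payload))
```

Because the request shape follows the OpenAI chat-completions convention, the same client code works against any of the backends tagged below.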
Homepage
Repo
Tags: ai-ml, mcp, model-context-protocol, mcp-server, lm-studio, ollama, vllm, openai, openai-compatible, local-llm, claude, ai-tools, llama-cpp, ai, llm
Editorial scores (agent friendliness, security, reliability): not yet evaluated.