ai_assistant_core
Simple, ergonomic Rust client & server for local LLMs (Ollama, LM Studio, OpenAI-compatible). Chat, list models, stream responses, serve your model remotely.
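Since the crate targets OpenAI-compatible backends, the request shape it sends is the standard chat-completions JSON. As a dependency-free sketch (the function name, model name, and Ollama's default endpoint `http://localhost:11434/v1/chat/completions` are assumptions based on the OpenAI-compatible convention, not this crate's actual API):

```rust
// Sketch: build the JSON body an OpenAI-compatible chat client POSTs to an
// endpoint such as http://localhost:11434/v1/chat/completions (Ollama's
// default port; assumption). Uses only std string formatting so it compiles
// without serde; a real client would serialize a typed struct instead.
fn chat_request_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","messages":[{{"role":"user","content":"{}"}}],"stream":true}}"#,
        model, prompt
    )
}

fn main() {
    let body = chat_request_body("llama3", "Hello!");
    println!("{}", body);
}
```

With `"stream":true`, such servers reply with server-sent `data:` chunks rather than a single JSON object, which is what the crate's streaming support would consume.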