llm-relay
Shared Rust crate for LLM API types, format conversion, and an HTTP client. It uses the Anthropic message format as the canonical internal representation and converts other providers' formats at the API boundary.
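The "convert at the boundary" pattern the description refers to can be sketched as follows. This page does not show llm-relay's actual API, so every type and function name below is hypothetical; the sketch only illustrates normalizing an OpenAI-style message list into an Anthropic-style request, where the system prompt is a top-level field rather than a message role.

```rust
/// Canonical internal message, modeled on Anthropic's message shape:
/// roles are "user" or "assistant", and the system prompt lives
/// outside the message list. Names are illustrative, not llm-relay's API.
#[derive(Debug, Clone, PartialEq)]
struct CanonicalMessage {
    role: String,
    content: String,
}

/// Canonical request: Anthropic keeps `system` as a top-level field.
#[derive(Debug, Default)]
struct CanonicalRequest {
    system: Option<String>,
    messages: Vec<CanonicalMessage>,
}

/// An OpenAI-style chat message as it might arrive at the boundary,
/// where "system" is just another role inside the message list.
struct OpenAiMessage {
    role: String,
    content: String,
}

/// Convert at the API boundary: hoist the system message into the
/// top-level field and pass the remaining messages through unchanged.
fn from_openai(msgs: Vec<OpenAiMessage>) -> CanonicalRequest {
    let mut req = CanonicalRequest::default();
    for m in msgs {
        if m.role == "system" {
            req.system = Some(m.content);
        } else {
            req.messages.push(CanonicalMessage {
                role: m.role,
                content: m.content,
            });
        }
    }
    req
}

fn main() {
    let req = from_openai(vec![
        OpenAiMessage { role: "system".into(), content: "Be brief.".into() },
        OpenAiMessage { role: "user".into(), content: "Hello".into() },
    ]);
    println!("{:?}", req);
}
```

Keeping one canonical shape means each new provider needs only a single pair of conversions (provider ↔ canonical) instead of conversions to every other provider.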
Evaluated Mar 18, 2026
⚙ Agent Friendliness: 61/100 (Can an agent use this?)
🔒 Security: 49/100 (Is it safe for agents?)
⚡ Reliability: 48/100 (Does it work consistently?)
Score Breakdown

⚙ Agent Friendliness
  MCP Quality: --
  Documentation: --
  Error Messages: --
  Auth Simplicity: 20
  Rate Limits: 75

🔒 Security
  TLS Enforcement: 70
  Auth Strength: 30
  Scope Granularity: 50
  Dependency Hygiene: 50
  Secret Handling: 50

⚡ Reliability
  Uptime/SLA: 40
  Version Stability: 60
  Breaking Changes: 50
  Error Recovery: 40
Interface
  REST API: Yes
  GraphQL: No
  gRPC: No
  MCP Server: Yes
  SDK: No
  Webhooks: No

Authentication
  OAuth: No
  Scopes: No

Pricing
  Free tier: No
  Requires credit card: No

Agent Metadata
  Idempotent: Unknown
  Retry Guidance: Not documented
Full Evaluation Report ($99)
  Comprehensive deep-dive: security analysis, reliability audit, agent experience review, cost modeling, competitive positioning, and an improvement roadmap for llm-relay.
  AI-powered analysis · PDF + markdown · Delivered within 30 minutes

Package Brief ($3)
  Quick verdict, integration guide, cost projections, gotchas with workarounds, and an alternatives comparison.
  Delivered within 10 minutes

Score Monitoring ($3/mo)
  Get alerted when this package's agent friendliness, security, or reliability scores change significantly. Stay ahead of regressions.
  Continuous monitoring
Scores are editorial opinions as of 2026-03-18.