{"id":"nutanix-nai-kserve-huggingfaceserver","name":"nai-kserve-huggingfaceserver","af_score":31.5,"security_score":31.5,"reliability_score":17.5,"what_it_does":"The nai-kserve-huggingfaceserver package integrates Hugging Face model serving with KServe, exposing Hugging Face model inference behind a KServe-compatible endpoint.","best_when":"You already run Kubernetes with KServe and want to deploy Hugging Face models with minimal custom serving code.","avoid_when":"You do not operate Kubernetes/KServe, or you need a turnkey hosted API with built-in usage analytics and rate-limiting policies.","last_evaluated":"2026-04-04T21:34:16.842749+00:00","has_mcp":false,"has_api":false,"auth_methods":[],"has_free_tier":false,"known_gotchas":["No evidence of an agent-facing interface contract (OpenAPI/SDK/MCP) in the provided information","KServe deployments typically delegate auth and rate limiting to Kubernetes ingress, so agent callers must conform to the ingress configuration"],"error_quality":0.0}