frappe-mcp-server
Provides an MCP server (and a local HTTP endpoint) that lets AI assistants query and manipulate ERPNext/Frappe data through generic “doctype” tools, plus wrapper tools for project analytics. It connects to ERPNext via the Frappe/ERPNext APIs and to an OpenAI-compatible LLM provider (including local Ollama).
Score Breakdown
⚙ Agent Friendliness
🔒 Security
Security details are not evidenced in the provided README (e.g., TLS requirements for the local HTTP API, auth mechanism for ERPNext, request validation, audit logging, or how secrets are stored/handled). The project’s approach of allowing generic CRUD across doctypes elevates the need for strict least-privilege ERPNext credentials and network isolation. Treat the README’s “production ready” claim as non-verifiable marketing without additional evidence.
⚡ Reliability
Best When
You have an ERPNext/Frappe deployment and want to integrate AI agents via MCP with tools that map to Frappe doctypes, optionally using a local model (Ollama) for privacy.
Avoid When
You need a fully standardized, audited auth model (e.g., OAuth scopes per action) for third-party access, or you cannot validate the server’s security posture yourself (the docs reviewed here do not document detailed security guarantees).
Use Cases
- Answer natural-language questions about ERPNext/Frappe documents (any doctype, standard or custom)
- Assist with CRUD-style operations on Frappe documents via MCP tools
- Search and analyze documents and related records
- Provide project-oriented summaries/metrics such as status and portfolio dashboards
Not For
- Directly exposing ERPNext production data to untrusted clients without proper network/auth controls
- Use as a general-purpose ERP integration API independent of Frappe/ERPNext models
- Use where strict enterprise compliance requirements (SOC2, ISO, etc.) must be contractually documented
Interface
Authentication
Auth for ERPNext/Frappe appears to be handled via config.yaml, but the provided README does not specify whether it uses API keys, session cookies, OAuth, or fine-grained permissions. LLM provider auth is described as a simple api_key for OpenAI-compatible endpoints.
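To illustrate what such a config.yaml might look like, here is a hedged sketch. All field names (`erpnext`, `url`, `api_key`, `api_secret`, `llm`, `base_url`, `model`) are assumptions for illustration only; the reviewed README does not document the actual schema:

```yaml
# Hypothetical config.yaml sketch — key names are assumptions, not the
# project's documented schema.
erpnext:
  url: https://erp.example.com
  api_key: "<api-key>"        # prefer a dedicated, least-privilege ERPNext user
  api_secret: "<api-secret>"
llm:
  base_url: http://localhost:11434/v1   # e.g., a local Ollama OpenAI-compatible endpoint
  api_key: "ollama"                     # OpenAI-compatible clients typically require some key
  model: llama3
```

If the server does use Frappe's API key/secret pair, scope that user's role permissions to only the doctypes the agent genuinely needs.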
Pricing
Open-source project (MIT). Costs depend on your chosen LLM provider and your ERPNext hosting.
Agent Metadata
Known Gotchas
- ⚠ CRUD tools for “ANY doctype” increase the risk of accidental writes/overwrites if the agent is not constrained; ensure the agent is granted least-privilege permissions in ERPNext.
- ⚠ If LLM outputs unstructured or ambiguous intents, the server may attempt broader queries (e.g., search/analyze) that can be slow or return large result sets; constrain queries via doctype and filters.
- ⚠ Because pagination/limits aren’t evidenced in the README, agents may request large datasets without safeguards; implement client-side limits if needed.
Alternatives
Full Evaluation Report
Comprehensive deep-dive: security analysis, reliability audit, agent experience review, cost modeling, competitive positioning, and improvement roadmap for frappe-mcp-server.
AI-powered analysis · PDF + markdown · Delivered within 30 minutes
Package Brief
Quick verdict, integration guide, cost projections, gotchas with workarounds, and alternatives comparison.
Delivered within 10 minutes
Score Monitoring
Get alerted when this package's AF, security, or reliability scores change significantly. Stay ahead of regressions.
Continuous monitoring
Scores are editorial opinions as of 2026-04-04.