databricks-mcp-server-by-cdata

Provides a local, read-only Model Context Protocol (MCP) server that exposes Databricks data through CData’s JDBC driver as MCP tools (e.g., get_tables, get_columns, run_query) for LLM clients like Claude Desktop.
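Because the server speaks MCP over stdio, an MCP client launches it as a local subprocess. A typical Claude Desktop registration might look like the following sketch; the jar and `.prp` file names are illustrative assumptions, not paths documented by the repository:

```json
{
  "mcpServers": {
    "databricks": {
      "command": "java",
      "args": ["-jar", "/path/to/databricks-mcp-server.jar", "/path/to/databricks.prp"]
    }
  }
}
```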

Evaluated Apr 04, 2026
Category: API Gateway · Tags: mcp, databricks, jdbc, cdata, claude-desktop, java, data-access, read-only
⚙ Agent Friendliness: 52/100 — Can an agent use this?
🔒 Security: 48/100 — Is it safe for agents?
⚡ Reliability: 25/100 — Does it work consistently?

Score Breakdown

⚙ Agent Friendliness

MCP Quality: 75
Documentation: 70
Error Messages: 0
Auth Simplicity: 65
Rate Limits: 10

🔒 Security

TLS Enforcement: 60
Auth Strength: 55
Scope Granularity: 30
Dep. Hygiene: 40
Secret Handling: 50

Security posture benefits from local stdio operation (reduced network exposure), but secrets/credentials are placed in a local .prp file whose storage and protection are not described. The README does not document TLS requirements, auth scopes/granularity, or safe-query controls (e.g., SQL injection protections at the MCP layer). Because run_query executes caller-supplied SQL, agent-driven query composition should be treated as a data-exfiltration risk, bounded only by the permissions granted to the configured Databricks credentials.
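Since the MCP layer is not documented to restrict what run_query executes, a client-side harness could apply its own guard before forwarding SQL. The sketch below is a hypothetical allow-list check, not a feature of this server, and it cannot prevent exfiltration via legitimate SELECTs:

```python
import re

# Keywords that indicate a write/DDL statement; crude, illustrative list.
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|merge|grant|revoke)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Reject multi-statement payloads and anything that is not a bare SELECT."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:  # refuse stacked statements like "SELECT 1; DROP ..."
        return False
    if not stripped.lower().startswith("select"):
        return False
    return not FORBIDDEN.search(stripped)

print(is_read_only("SELECT id, name FROM customers LIMIT 10"))  # True
print(is_read_only("DROP TABLE customers"))                     # False
```

A real deployment would be better served by a read-only Databricks service principal, since keyword filtering is easy to bypass.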

⚡ Reliability

Uptime/SLA: 0
Version Stability: 35
Breaking Changes: 40
Error Recovery: 25

Best When

You want a local MCP integration for read-only querying of Databricks via CData JDBC and you can run the server on the same machine as the MCP client.

Avoid When

You need robust server-side controls for query safety, multi-tenant isolation, or a network-accessible API; also avoid if you cannot provide/handle JDBC driver licensing and connection configuration securely.

Use Cases

  • Allow an LLM to query live Databricks data using natural language without writing SQL
  • Discover available Databricks tables/columns via MCP tools
  • Run read-only SQL SELECT queries through an MCP tool interface
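Over stdio, an MCP client would invoke the run_query tool with a standard MCP `tools/call` request, roughly as below. The tool name comes from the README; the `sql` argument name and the sample table are assumptions, since the exact tool schema is not reproduced here:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_query",
    "arguments": { "sql": "SELECT * FROM samples.nyctaxi.trips LIMIT 5" }
  }
}
```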

Not For

  • Internet-facing or remote MCP access (server uses stdio and is intended to run locally)
  • Use cases requiring full CRUD or write/update/delete operations
  • Environments that require an HTTP API, webhooks, or centralized API gateways

Interface

REST API: No
GraphQL: No
gRPC: No
MCP Server: Yes
SDK: No
Webhooks: No

Authentication

Methods:
  • JDBC connection via the CData JDBC Driver for Databricks (may use OAuth, depending on connection settings)
  • CData JDBC driver licensing via `java -jar ... --license` (trial or license key)

OAuth: Yes
Scopes: No

The README notes that OAuth may require browser authentication when testing the JDBC connection string, but it does not describe MCP-level auth. The MCP server relies on the JDBC driver configuration in the .prp file.
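The shape of that .prp file is not spelled out in this evaluation; the fragment below follows the property names used in other CData MCP server examples and may differ for this repository. Treat every value as a placeholder, and keep the file out of version control since it holds connection secrets:

```properties
# Illustrative only; property names and JDBC URL parameters are assumptions.
Prompt=databricks
DriverPath=/path/to/cdata.jdbc.databricks.jar
DriverClass=cdata.jdbc.databricks.DatabricksDriver
JdbcUrl=jdbc:databricks:Server=...;HTTPPath=...;Token=...;
Tables=
```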

Pricing

Free tier: No
Requires CC: No

No pricing numbers are provided for this repository itself; it depends on CData JDBC driver licensing and potentially CData’s managed MCP offerings.

Agent Metadata

Pagination: none
Idempotent: No
Retry Guidance: Not documented

Known Gotchas

  • Server uses stdio; only works with MCP clients running on the same machine.
  • Tool outputs are described as CSV; agents may need to parse CSV reliably.
  • No explicit mention of rate limiting, pagination, or query size limits; large queries may fail or time out depending on Databricks/JDBC driver behavior.
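Given that tool outputs are described as CSV, an agent harness should use a real CSV parser rather than splitting on commas, since quoted fields may contain delimiters. A minimal sketch, assuming a header row and standard quoting (neither is documented):

```python
import csv
import io

def parse_tool_csv(text: str) -> list[dict]:
    """Parse a CSV tool result into a list of row dicts keyed by header."""
    reader = csv.DictReader(io.StringIO(text))
    return list(reader)

sample = 'id,name\n1,"Acme, Inc."\n2,Globex\n'
rows = parse_tool_csv(sample)
print(rows[0]["name"])  # Acme, Inc.
```

Note that `csv.DictReader` returns every value as a string; numeric columns need explicit conversion downstream.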


Scores are editorial opinions as of 2026-04-04.
