Dive

An open-source desktop application that serves as an MCP host, connecting any LLM (OpenAI, Anthropic, Ollama, and other OpenAI-compatible APIs) to MCP servers through a unified chat interface. Supports the stdio and SSE MCP transports, includes built-in tools (Fetch, File Manager, Bash), supports 24+ interface languages, and provides an installer agent for automatic MCP server configuration. Available in Electron and Tauri variants across Windows, macOS, and Linux.
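The stdio and SSE transports mentioned above are typically wired up through a JSON server registry. The `mcpServers` shape below follows the convention shared by MCP hosts; the exact file name and field names Dive uses are an assumption, not taken from its documentation.

```json
{
  "mcpServers": {
    "filesystem": {
      "transport": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    },
    "remote-tools": {
      "transport": "sse",
      "url": "http://localhost:8000/sse"
    }
  }
}
```

A stdio entry launches the server as a local subprocess, while an SSE entry points at an already-running HTTP endpoint.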

Evaluated Mar 06, 2026 · v0.14.1
Homepage · Repo · Category: AI & Machine Learning · Tags: mcp, desktop-app, electron, tauri, ollama, openai, anthropic, multi-llm, chat-interface, mcp-host
⚙ Agent Friendliness: 47/100 (Can an agent use this?)
🔒 Security: 70/100 (Is it safe for agents?)
⚡ Reliability: 64/100 (Does it work consistently?)

Score Breakdown

⚙ Agent Friendliness

MCP Quality: 0
Documentation: 65
Error Messages: 50
Auth Simplicity: 68
Rate Limits: 55

🔒 Security

TLS Enforcement: 80
Auth Strength: 75
Scope Granularity: 60
Dep. Hygiene: 70
Secret Handling: 65

Community/specialized tool: apply the standard security practices for this category and review the documentation for specific security requirements.

⚡ Reliability

Uptime/SLA: 70
Version Stability: 65
Breaking Changes: 60
Error Recovery: 60

Best When

You want a visual desktop app to interact with MCP servers using various LLM providers, especially if you use local models via Ollama and want a polished chat UI.

Avoid When

You need headless/programmatic MCP hosting, prefer CLI workflows, or already have a well-configured MCP client like Claude Code or Cursor.

Use Cases

  • Using local Ollama models with MCP tools via a desktop GUI
  • Connecting multiple LLM providers to MCP servers without coding
  • Having a unified chat interface that works with any MCP-compatible tool
  • Quick MCP server testing and experimentation through a visual interface
  • Non-technical users who want MCP tool access without CLI setup

Not For

  • Headless or server-side MCP hosting (it is a desktop GUI app)
  • Production agent deployment pipelines
  • Users who prefer CLI-only workflows
  • Building custom MCP servers (it consumes them, not creates them)
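To make the host/server distinction concrete: a host like Dive sends JSON-RPC 2.0 messages to MCP servers; it does not serve them. The sketch below constructs the three core message types defined by the MCP spec (`initialize`, `tools/list`, `tools/call`). The tool name `fetch` and its arguments are illustrative, not tied to any particular server.

```python
import json

def make_request(req_id, method, params):
    """Build one JSON-RPC 2.0 request, serialized for a newline-delimited stdio stream."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# 1. Handshake: the host announces its protocol version and capabilities.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-host", "version": "0.1"},
})

# 2. Discover the server's tools, then invoke one on the user's behalf.
list_tools = make_request(2, "tools/list", {})
call_tool = make_request(3, "tools/call", {
    "name": "fetch",
    "arguments": {"url": "https://example.com"},
})

print(json.loads(call_tool)["method"])  # → tools/call
```

Building the server side of this exchange (registering tools and answering `tools/call`) is what Dive does not do; that is the job of an MCP server SDK.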

Interface

REST API: No
GraphQL: No
gRPC: No
MCP Server: No
SDK: No
Webhooks: No

Authentication

OAuth: No · Scopes: No

No authentication for the app itself. Users provide their own API keys for LLM providers (OpenAI, Anthropic, etc.) via local configuration. MCP server auth support is noted as unstable.
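As a sketch of what that local configuration might look like, the snippet below is purely hypothetical: the field names (`activeProvider`, `configs`, `baseURL`, `apiKey`) are illustrative stand-ins, not Dive's documented schema.

```json
{
  "activeProvider": "ollama",
  "configs": {
    "ollama": { "baseURL": "http://localhost:11434", "model": "llama3.1" },
    "openai": { "apiKey": "<your OpenAI key>", "model": "gpt-4o-mini" }
  }
}
```

Because keys live in a local file rather than an app-level account, securing them is entirely the user's responsibility.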

Pricing

Model: open_source
Free tier: Yes
Requires CC: No

MIT licensed. Fully free and open source. OAPHub.ai cloud integration is available for managed MCP servers, but its pricing is not documented.

Agent Metadata

Pagination: Unknown
Idempotent: Unknown
Retry Guidance: Not documented

Known Gotchas

  • This is an MCP host/client, not an MCP server; it does not expose tools over MCP
  • MCP server authentication support is marked as unstable
  • macOS requires manually installing Python and Node.js
  • The Electron and Tauri variants differ in platform availability
  • The built-in Bash tool gives LLMs shell access, which has security implications



Scores are editorial opinions as of 2026-03-06.
