Packages
1176 results

muapi-cli
Official CLI for muapi.ai platform — generates images, videos, and audio from the terminal with 14 AI models. Supports text-to-image, image editing, upscaling, background removal, face swap, text-to-video, lip-sync, and audio generation. Includes MCP server mode for AI agent integration.
Advanced GitLab MCP Server
Community MCP server for GitLab providing AI agents access to repositories, merge requests, issues, pipelines, and project management operations via the GitLab REST API.
Google Vertex AI MCP Server
MCP server for Google Vertex AI — Google Cloud's enterprise ML platform providing access to Gemini models, PaLM, Imagen, and hundreds of third-party models (Llama, Mistral, Claude via Model Garden). Enables AI agents to call Vertex AI models with enterprise compliance, GCP IAM, VPC integration, and Google's data processing commitments.
HMDA LAR Compliance Checker MCP (Clarid AI)
MCP server for validating HMDA (Home Mortgage Disclosure Act) LAR (Loan Application Register) files against CFPB edit checks. Built for community banks and credit unions needing AI-assisted HMDA compliance validation — checking LAR data for regulatory errors before filing.
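To give a feel for what an edit check does, here is a toy sketch in the spirit of LAR validation. The field names and rules below are hypothetical illustrations, not actual CFPB edit specifications.

```python
# Toy "edit check" sketch. Field names and rules are hypothetical
# examples, not real CFPB edits.

def check_lar_row(row: dict) -> list[str]:
    """Return human-readable error strings for one LAR row."""
    errors = []
    # Loan amount must be a positive number (hypothetical syntactic check).
    amount = row.get("loan_amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        errors.append("loan_amount must be a positive number")
    # Action taken must be a code from an enumerated set (hypothetical codes).
    if row.get("action_taken") not in {1, 2, 3, 4, 5, 6, 7, 8}:
        errors.append("action_taken must be a code from 1-8")
    return errors

print(check_lar_row({"loan_amount": 250000, "action_taken": 1}))  # []
print(check_lar_row({"loan_amount": -5, "action_taken": 9}))
```

A real checker runs hundreds of such syntactic, validity, and quality edits over every row before filing.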
Iron Manus MCP
Iron Manus MCP server providing multi-agent orchestration capabilities — managing task graphs, coordinating parallel agent execution, maintaining shared state between agents, and enabling complex multi-step workflows where multiple AI agents collaborate on a single task. Designed as an orchestration layer for building sophisticated agent pipelines using the MCP protocol.
J-Grants MCP Server
Official MCP server from Digital Agency Japan (digital-go-jp org) for J-Grants — Japan's national government grant and subsidy portal. Enables AI agents to search and query Japanese government grants, subsidies, and public funding opportunities through the official J-Grants database API.
JEB MCP Server
JEB MCP server enabling AI agents to interact with JEB — PNF Software's commercial Android and native binary decompiler — querying decompiled Dalvik/DEX bytecode, retrieving Java pseudocode, accessing cross-references and symbol information, and integrating JEB's Android analysis capabilities into agent-driven mobile app reverse engineering workflows.
MCP Checklists
MCP server for managing checklists and structured task lists. Enables AI agents to create, read, update, and manage checklists — tracking completion status, organizing items, and supporting workflow management through AI-assisted checklist operations.
MCP Proxy Server
MCP Proxy Server that aggregates multiple MCP servers into a single endpoint — acting as a multiplexer that allows a single MCP client connection to access tools from multiple backend MCP servers, simplifying agent configuration by presenting a unified tool namespace from many servers.
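The core of such a multiplexer is namespacing: tools from several backends are merged into one flat list, with each name prefixed by its server alias so the proxy can route calls back. A minimal sketch of that idea, with hypothetical server aliases and tool names:

```python
# Sketch of tool-namespace merging in an MCP proxy. The aliases and
# tool names are hypothetical.

def merge_tool_namespaces(backends: dict[str, list[str]]) -> dict[str, tuple[str, str]]:
    """Map 'alias/tool' -> (alias, tool) for routing calls to backends."""
    merged = {}
    for alias, tools in backends.items():
        for tool in tools:
            merged[f"{alias}/{tool}"] = (alias, tool)
    return merged

routing = merge_tool_namespaces({
    "gitlab": ["list_issues", "create_merge_request"],
    "sql":    ["run_query", "list_tables"],
})
print(sorted(routing))
```

The client sees one tool list (`gitlab/list_issues`, `sql/run_query`, ...); the proxy strips the prefix and forwards the call to the matching backend connection.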
MS SQL Server MCP
MS SQL Server MCP server enabling AI agents to interact with Microsoft SQL Server databases — executing T-SQL queries, listing tables and schemas, describing table structure, running stored procedures, and integrating SQL Server data access into agent-driven database administration and analytics workflows.
Moondream MCP
Moondream MCP server enabling AI agents to use Moondream — a tiny but capable vision language model that runs locally. Provides image understanding capabilities (describing images, answering questions about images, detecting objects) that can run on CPU or modest hardware. Enables privacy-preserving local image analysis without cloud API dependencies.
OmniFocus MCP Server
OmniFocus MCP server enabling AI agents to interact with OmniFocus — the premium macOS/iOS GTD task manager — querying tasks, projects, and tags, creating and completing tasks, and integrating OmniFocus data into agent-driven productivity and task management workflows on Apple platforms.
ProxyPin MCP Server
Official ProxyPin MCP server enabling AI agents to interact with ProxyPin — an open-source HTTP/HTTPS debugging proxy — capturing network traffic, inspecting request and response data, replaying HTTP requests, analyzing API calls, and integrating traffic capture and analysis into agent-driven API debugging and network analysis workflows.
Rhino MCP Server
Rhino 3D MCP server enabling AI agents to interact with Rhinoceros 3D (Rhino) — the professional NURBS-based 3D modeling software widely used in architecture, industrial design, and engineering. Enables agents to create and modify 3D geometry, run Grasshopper scripts, query model data, and integrate AI assistance into Rhino-based design workflows.
VibeCoder MCP — AI Coding Workflow Orchestrator
VibeCoder MCP server providing an opinionated AI coding workflow orchestrator — managing context across long coding sessions, coordinating file edits, tracking task state, running tests, and implementing structured coding workflows that help AI agents write code more effectively and systematically for complex software projects.
Wazuh MCP Server
Wazuh MCP server enabling AI agents to interact with Wazuh SIEM/XDR platform — querying security alerts and events, retrieving agent status and inventory, searching threat intelligence data, accessing compliance reports, and integrating Wazuh's open-source security monitoring into agent-driven threat detection, incident response, and security operations center (SOC) automation workflows.
jCodeMunch MCP
MCP server that indexes codebases using tree-sitter AST parsing, enabling agents to retrieve specific symbols with byte-level precision instead of loading entire files, reducing token consumption by up to 99.5%.
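The savings come from serving byte spans rather than files: the AST index records each symbol's start and end byte, and a lookup returns only that slice. A minimal sketch of the idea (the index here is hard-coded; in the real server it would come from tree-sitter parsing):

```python
# Sketch of byte-precise symbol retrieval. A real index would be built
# by tree-sitter; here one entry is computed by hand for illustration.

source = b"def helper():\n    return 1\n\ndef main():\n    return helper()\n"

# Hypothetical index entry: (start_byte, end_byte) for the `main` symbol.
index = {"main": (source.find(b"def main"), len(source))}

def get_symbol(name: str) -> str:
    start, end = index[name]
    return source[start:end].decode()

snippet = get_symbol("main")
print(snippet)
```

An agent asking for `main` receives a few dozen bytes instead of the whole file, which is where the quoted token reduction on large files comes from.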
AgentMail MCP Server
Official MCP server from AgentMail (agentmail-to org) for AgentMail — an email inbox API purpose-built for AI agents. Gives agents their own email inboxes to send, receive, and manage email programmatically. Designed specifically for agent-to-agent and agent-to-human email workflows, not a wrapper around personal email accounts.
Azure MCP (microsoft/mcp)
Microsoft's official MCP implementation repository providing tools and integrations for Azure AI services — Azure OpenAI, Cognitive Services, and Azure AI Studio via the Model Context Protocol.
Cerebras Inference MCP Server
MCP server for Cerebras AI inference — providing ultra-fast LLM inference using Cerebras custom AI chips (CS-3). Enables AI agents to call open-weight models (Llama 3.3 70B, etc.) at speeds far exceeding GPU-based providers (~2000 tokens/second vs ~50-100 tokens/second on GPUs). Best-in-class latency for interactive agents.
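A quick back-of-envelope check of what those throughput figures mean for an interactive agent, using the rough numbers quoted above (2000 tok/s vs a 75 tok/s midpoint of the 50-100 GPU range):

```python
# Rough latency comparison for streaming a 500-token response,
# using the approximate throughput figures quoted above.

tokens = 500
cerebras_s = tokens / 2000   # ~0.25 s
gpu_s = tokens / 75          # ~6.7 s
print(f"Cerebras: {cerebras_s:.2f}s, GPU: {gpu_s:.1f}s, "
      f"speedup ~{gpu_s / cerebras_s:.0f}x")
```

A multi-step agent that makes ten such calls in sequence feels the difference as seconds versus minutes of end-to-end wait.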