Packages
systembridge-mcp
System Bridge MCP Server — token management, transformation, validation, and evolution
systeminfo_rust_mcp
A Model Context Protocol (MCP) server for system information retrieval
systemprompt
systemprompt.io - Extensible AI agent orchestration framework
systemprompt-code-orchestrator
MCP server for orchestrating AI coding agents (Claude Code CLI & Gemini CLI). Features task management, process execution, Git integration, and dynamic resource discovery. Full TypeScript implementation with Docker support and Cloudflare Tunnel integration.
systemprompt-mcp-core
The core MCP extension for Systemprompt MCP multimodal client
systemprompt-mcp-notion
A Model Context Protocol (MCP) server that integrates Notion into your AI workflows, enabling seamless access to Notion through MCP so AI agents can interact with pages, databases, and comments.
systemprompt-mcp-server
A complete, production-ready implementation of a Model Context Protocol (MCP) server demonstrating OAuth 2.1, tools, prompts, resources, sampling, and notifications using Reddit as a real-world integration example.
systemprompt-template
Production AI agent mesh in 3 commands. MCP servers, playbooks, and multi-agent orchestration built on systemprompt-core.
sysutils-rust
System Utilities MCP in Rust
sysutils-stdiokey-rust
System Utilities with API Key MCP Stdio transport in Rust
szge-lolwiki-mcp
Generate friendly greetings for any audience. Toggle Pirate Mode for a playful, swashbuckling styl…
tabby-mcp-server
MCP server for controlling the Tabby terminal
tabby-vscode-agent
MCP server for controlling the Tabby terminal
tablestore-mcp-server
MCP server for retrieving context from a Tablestore vector database
tacacs_server
tachibot-mcp
Multi-model AI orchestration with 31 tools, YAML workflows, and 5 token-optimized profiles.
tadaaa-mcp
TADAAA! MCP server
tagesschau-mcp-server
An MCP server for tagesschau.de
tagger-server
tagny-mcp-server
An MCP server with web search, URL text fetching, and more tools to enhance locally served LLMs