37 open source tools compared. Sorted by stars. Scroll down for our analysis.
| Tool | Stars | Velocity | Score |
|---|---|---|---|
| **memory**: persistent memory via a local knowledge graph, letting Claude remember information about the user across chats | 84.0k | — | 60 |
| **time**: current-time queries and timezone conversion | 84.0k | — | 60 |
| **filesystem**: Node.js server implementing MCP filesystem operations | 84.0k | — | 60 |
| **sequentialthinking**: dynamic, reflective problem-solving through a structured thinking process | 84.0k | — | 60 |
| **git**: tools for reading and manipulating Git repositories | 84.0k | — | 60 |
| **everything**: reference server exercising every MCP capability | 84.0k | — | 60 |
| **fetch**: web content fetching with conversion to markdown | 84.0k | — | 60 |
| **github-mcp-server**: GitHub's official MCP server | 29.0k | — | 57 |
| — | 17.3k | — | 63 |
| — | 15.9k | — | 67 |
| — | 14.4k | — | 73 |
| **ecs-mcp-server**: containerizing applications, deploying them to Amazon ECS, troubleshooting deployments, and managing ECS resources | 8.8k | — | 53 |
| **memcached-mcp-server**: interacting with Amazon ElastiCache Memcached through a secure and reliable connection | 8.8k | — | 53 |
| **eks-mcp-server**: resource management tools and real-time cluster state visibility for Amazon EKS | 8.8k | — | 53 |
| **amazon-keyspaces-mcp-server**: interacting with Amazon Keyspaces (for Apache Cassandra) and Apache Cassandra | 8.8k | — | 53 |
| — | 8.0k | — | 53 |
| **XcodeBuildMCP**: building, testing, and managing Xcode projects with AI assistants | 5.2k | — | 67 |
| **mobile-mcp**: iOS and Android device automation via accessibility APIs | 4.6k | — | 61 |
| — | 4.3k | — | 59 |
| — | 4.2k | — | 63 |
| — | 4.1k | — | 57 |
| — | 3.9k | — | 55 |
| **frontend**: Google's MCP frontend library for client-side integrations | 3.9k | — | 47 |
| **server**: Google's official MCP server for Google Cloud services | 3.9k | — | 47 |
| — | 3.7k | — | 57 |
| — | 3.6k | — | 51 |
| **fastmcp**: a TypeScript framework for building MCP servers | 3.0k | — | 47 |
| **spelling**: runs cspell (Code Spell Checker) on the repository | 3.0k | — | 47 |
| **vscode**: Microsoft's MCP integration for VS Code | 3.0k | — | 47 |
| **vsix-tools**: packaging and managing VS Code extension (.vsix) files | 3.0k | — | 47 |
| **supabase-mcp**: connect Supabase to your AI assistants | 2.6k | — | 47 |
| **agentgateway**: next-generation agentic proxy for AI agents and MCP servers | 2.4k | — | 47 |
| **tavily-mcp**: production-ready MCP server with real-time search, extract, map, and crawl | 1.8k | — | 43 |
| **platform**: core platform component of Azure's MCP ecosystem | 1.2k | — | 43 |
| **spelling**: cspell spell checking within Azure's MCP monorepo | 1.2k | — | 43 |
| **wrapper**: standardized access-pattern layer in Azure's MCP ecosystem | 1.2k | — | 43 |
| **git-mcp-server**: lets LLMs and AI agents work with Git repositories via comprehensive tools (clone, commit, branch, diff, log, status, push, pull, merge, rebase, worktree, and tag management) over STDIO and HTTP | 207 | — | 34 |
Gives your AI assistant persistent memory via a local knowledge graph. It stores entities, relationships, and observations in a JSON file that survives between sessions. The model can remember context about you, your projects, and your preferences. You get entity creation, relationship mapping, and observation tracking. Data lives in a local file you control. No cloud, no API keys, no external dependencies. Completely free. Maintained by the official Model Context Protocol team. If you want your AI to build up context over time instead of starting fresh every session, this is the simplest path. Lightweight and effective.
Gives your AI assistant accurate time and timezone awareness. The model can get the current time in any timezone and convert between them. Simple, but it solves the real problem of models not knowing what time it is. You get current time queries and timezone conversions. No API keys, no external services, runs entirely locally. Completely free. Maintained by the official Model Context Protocol team. A small utility, but surprisingly useful when your AI workflow involves scheduling, deadlines, or coordinating across timezones. Trivial to install, no reason not to have it.
Gives your AI assistant direct access to your local file system. Read files, write files, create directories, move things around. The model becomes a hands-on collaborator instead of just a talker. You get full CRUD on files and directories, with configurable root paths to keep the model sandboxed where you want it. Setup takes one config entry. No API keys needed. Completely free with no underlying costs. Maintained by the official Model Context Protocol team. This is foundational infrastructure. If you want your AI to actually do work on your machine, not just talk about it, this is non-negotiable.
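That one config entry can be sketched as a block in your MCP client config (for example `claude_desktop_config.json`); the path shown is a placeholder for the directory you want the model sandboxed to:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/projects"
      ]
    }
  }
}
```

Paths listed after the package name become the allowed roots; the server rejects operations outside them.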
Gives your AI assistant a structured reasoning tool. Instead of jumping straight to answers, the model can break problems into numbered steps, revise earlier thinking, and branch into alternative approaches. Think of it as a scratchpad for complex problem-solving. You get step-by-step reasoning with revision and branching capabilities. No external dependencies, no API keys, no data leaves your machine. Completely free. Maintained by the official Model Context Protocol team. Useful for complex debugging, architectural decisions, or any task where thinking out loud produces better results. Low-cost addition to any MCP setup.
Connects your AI assistant to Git repositories. It can read commit history, diff branches, stage changes, and commit code. The model gets real version control awareness instead of guessing at repo state. You get log inspection, diff generation, branch operations, and commit creation. Point it at any local repo and it works. No API keys, no GitHub account needed, just a local Git installation. Completely free. Maintained by the official Model Context Protocol team. Essential for any AI-assisted development workflow. If your agent writes code, it should understand the repo it is working in. Install it.
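A typical config entry, assuming you run the Python reference server on demand via uvx (the repository path is a placeholder for your own local repo):

```json
{
  "mcpServers": {
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git", "--repository", "/path/to/your/repo"]
    }
  }
}
```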
A test and demo server that exercises every MCP capability: tools, resources, prompts, sampling, logging, completions. It exists so developers building MCP clients can verify their implementation handles all protocol features. You get synthetic endpoints for every MCP feature type. Useful for integration testing and learning the protocol shape. Zero setup, zero config. Completely free with no underlying costs. Maintained by the official Model Context Protocol team. This is not a production tool. It is a reference implementation for MCP client developers. If you are building or debugging an MCP client, grab it. Otherwise, skip it.
Connects your AI assistant to the open web. Point it at any URL and it pulls back clean, readable content, converting HTML to markdown so the model can actually process it. You get web fetching with automatic content extraction, robots.txt compliance, and configurable request headers. Setup is one line in your MCP config. No API keys, no dependencies beyond the server itself. Completely free, no underlying service costs. Maintained by the official Model Context Protocol team at Anthropic. This is the reference implementation for web access. If your AI workflow touches URLs at all, install this first. It is the baseline.
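That one line looks roughly like this, assuming you run the Python reference server via uvx:

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    }
  }
}
```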
GitHub's official MCP server gives your AI assistant direct access to repos, issues, PRs, code search, and file operations. No scraping, no workarounds. You talk to your assistant, it talks to GitHub. Setup is one config block plus a personal access token. Maintained by GitHub themselves, so it stays current with API changes. Covers the operations you actually use daily: creating branches, reviewing diffs, searching code across repos, managing issues. The scope is broad enough that it replaces most of your tab-switching. Worth installing immediately if you use GitHub. This is the one MCP server that pays for itself on day one. The catch: it can only do what the GitHub API allows, so anything requiring the web UI (like editing Actions workflows visually) is still manual.
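A sketch of that config block plus token, assuming the Docker distribution of github-mcp-server; the token value is a placeholder for your own personal access token:

```json
{
  "mcpServers": {
    "github": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
        "ghcr.io/github/github-mcp-server"
      ],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-pat>"
      }
    }
  }
}
```

Scope the token to only the repos and permissions you want the assistant touching.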
Connects your AI coding agent directly to Figma designs. Instead of screenshots or manual descriptions, the model reads your actual design files, including layout structure, styles, and component hierarchies. It bridges the gap between design and implementation. You get read access to Figma files, frames, and components with structured context the model can act on. Requires a Figma access token. The MCP server is free, Figma itself requires a paid plan for most team usage. Maintained by the community (GLips). This is the go-to Figma integration for AI coding workflows. If you are translating Figma designs into code, this saves serious time. Worth installing.
Connects your AI assistant to Amazon ECS for container management. You can check service health, inspect task definitions, view running tasks, and troubleshoot deployment issues. Covers both Fargate and EC2 launch types. The MCP server is free and open source. ECS itself is free (you pay for underlying Fargate or EC2 compute). Setup requires AWS credentials with ECS permissions. Be cautious with write operations: stopping tasks or updating services in production needs careful IAM scoping. Maintained by AWS Labs. If you manage ECS services and spend time checking deployments or debugging task failures, this puts that workflow in your editor. The ability to inspect stopped task reasons conversationally is worth the setup alone.
Memcached MCP connects your AI assistant to Amazon ElastiCache Memcached clusters. Get, set, and inspect cache keys directly from your editor. Config is your cluster endpoint plus AWS credentials. ElastiCache bills per node-hour. The MCP is free. AWS Labs maintains it alongside the rest of their MCP suite. Useful for debugging cache issues without writing throwaway scripts. The catch: Memcached is simpler than Redis but also less capable. If you are already on Valkey or Redis, this is not the server you want.
Connects your AI assistant to Amazon EKS for Kubernetes cluster management. You can check cluster status, inspect node groups, review workloads, and troubleshoot pod issues. Bridges the gap between kubectl and your AI assistant. The MCP server is free and open source. EKS charges $0.10/hour per cluster, plus EC2 or Fargate compute costs. Setup requires AWS credentials, a configured kubeconfig, and the right RBAC permissions. More setup than most MCP servers. Maintained by AWS Labs. Kubernetes debugging is notoriously painful. Having an AI assistant that can inspect pod logs, check events, and correlate cluster state is valuable. The setup overhead is worth it if you're running EKS daily.
Connects your AI assistant to Amazon Keyspaces, AWS's managed Cassandra-compatible database. Create tables, run CQL queries, and manage schemas through natural language instead of wrestling with the Cassandra CLI. Setup needs AWS credentials and a Keyspaces endpoint. If you already use Keyspaces, this slots right in. The MCP server is free. Keyspaces pricing is pay-per-use based on reads, writes, and storage, with a modest free tier. Maintained by AWS Labs. Useful if you're already running Keyspaces workloads. For everyone else, there are simpler database MCP servers to start with.
Lets your AI assistant build, test, and manage Xcode projects directly. Run builds, launch simulators, install on devices, and validate macros, all through MCP tool calls from Claude or Cursor. Install via Homebrew or npm, point it at your Xcode workspace, and it runs as a per-workspace daemon. Setup takes under five minutes. Maintained by Sentry. Free and open source. If you do iOS development with an AI assistant, this removes the constant terminal switching. Genuinely useful for the Xcode workflow.
MCP server for automating iOS and Android devices. Your AI assistant can tap buttons, read screen content, take screenshots, and navigate apps using accessibility data instead of computer vision. Requires Node.js 22+, plus Xcode for iOS work or the Android SDK for Android. One npm install, then connect to your MCP client. Works with real devices and emulators. Community maintained. Free, open source. Fast and lightweight because it uses native accessibility APIs, not image recognition. The best option if you need your AI to interact with mobile apps directly.
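A minimal client config sketch; the npm package name here is an assumption based on the project's published distribution, so check the repo README before copying:

```json
{
  "mcpServers": {
    "mobile-mcp": {
      "command": "npx",
      "args": ["-y", "@mobilenext/mobile-mcp@latest"]
    }
  }
}
```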
Connects your AI assistant to Exa's neural search API. Run semantic web searches, find similar pages, and get cleaned content from URLs. Think of it as giving your AI a research assistant that actually understands context. Setup is simple: get an Exa API key, add it to your config, and you're searching. The MCP server is free. Exa's API uses credit-based pricing, so your costs scale with usage. Maintained by Exa Labs (the company behind the API). Worth installing if you want your AI to pull real-time web data during conversations. One of the more practical MCP servers out there.
The official C# SDK for building MCP servers and clients. If you're in the .NET ecosystem and want to expose tools, resources, or prompts to AI assistants, this is the foundation you build on. Add the NuGet package, define your tools as methods, and the SDK handles protocol negotiation and transport. Clean API design that feels native to C#. No external service dependencies. Maintained by the Model Context Protocol organization, the same team behind the spec itself. Essential if you're building MCP integrations in C# or .NET. Skip it if you're consuming servers, not building them.
Brings spec-driven development into your AI workflow. Define requirements as structured specs, then let the MCP server guide implementation step by step. Turns vague feature requests into concrete, trackable development tasks. Setup is straightforward. Point it at your project, write specs in the expected format, and your AI assistant becomes a disciplined developer that follows the plan instead of freestyling. Community-maintained by Pimzino. Interesting concept for teams that want more structure in AI-assisted coding. Worth trying if you find AI assistants too eager to write code before understanding requirements.
Google's MCP frontend library for building client-side MCP integrations. Provides the tooling to connect applications to MCP servers, handle protocol negotiation, and manage tool discovery from the client side. This is a library, not a standalone server. You integrate it into your own frontend application to consume MCP servers. Setup complexity depends on your application architecture. Maintained by Google as part of their MCP ecosystem. Useful if you're building a custom AI interface that needs to talk to MCP servers. Most developers will use existing clients like Claude Code instead of building their own.
Google's official MCP server for Google Cloud services. Connects your AI assistant to BigQuery, Cloud SQL, Spanner, Cloud Storage, and other GCP resources. Run queries, manage infrastructure, and inspect services through natural conversation. Requires Google Cloud credentials and appropriate IAM permissions. The MCP server is free. The underlying Google Cloud services have their own pricing, ranging from generous free tiers (BigQuery's 1TB/month free queries) to significant enterprise costs. Maintained by Google. Essential if your stack runs on GCP. The breadth of service coverage makes this one of the more ambitious cloud provider MCP servers.
Connects your AI assistant to Excel and spreadsheet files. Read, write, and manipulate .xlsx files through natural language. Create sheets, update cells, apply formulas, and extract data without opening Excel. Setup is minimal. Install the server, point it at your files, and start asking questions about your spreadsheets. Works with local files, no cloud service needed. Community-maintained. A practical pick for anyone who deals with spreadsheets regularly and wants AI to handle the tedious parts. Particularly useful for data extraction and report generation workflows.
FastMCP is a TypeScript framework for building MCP servers. Instead of wiring up JSON-RPC handlers and transport layers yourself, you define tools and resources declaratively and let FastMCP handle the protocol plumbing. Think Express.js but for MCP. Setup is npm install plus a few lines of code. You get type-safe tool definitions, automatic schema generation, and built-in validation. The API is clean enough that a working MCP server takes under 50 lines. Use this if you're building MCP servers in TypeScript. It cuts the boilerplate dramatically. The catch: you're adding a dependency layer between your code and the protocol, so breaking changes in FastMCP mean updating your servers.
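A sketch of what "under 50 lines" looks like, assuming fastmcp's `addTool`/`start` API with zod parameter schemas; treat it as illustrative rather than version-pinned:

```typescript
import { FastMCP } from "fastmcp";
import { z } from "zod";

const server = new FastMCP({ name: "demo", version: "1.0.0" });

// Tool definitions are plain objects; FastMCP derives the MCP tool
// schema from the zod shape and validates incoming calls against it.
server.addTool({
  name: "add",
  description: "Add two numbers",
  parameters: z.object({ a: z.number(), b: z.number() }),
  execute: async ({ a, b }) => String(a + b),
});

// stdio transport: the MCP client spawns this process and speaks
// JSON-RPC over stdin/stdout.
server.start({ transportType: "stdio" });
```

Register it in your client config with a command that runs the file (e.g. `npx tsx server.ts`), the same way the official servers are wired up.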
Runs cspell (Code Spell Checker) across your repository through MCP. Catches typos in code, comments, documentation, and configuration files. Uses the same dictionary infrastructure that powers the VS Code spell checker extension. Setup is minimal. The server uses existing cspell configuration if you have one, or works with sensible defaults. No external service needed. Maintained by Microsoft as part of their MCP monorepo. Handy for CI-style spell checking during AI-assisted code reviews. A small but useful addition to your MCP toolkit.
Microsoft's MCP integration for VS Code. Connects your AI assistant to the editor itself, enabling workspace management, file operations, and editor commands through the MCP protocol. Works within the VS Code ecosystem. Setup depends on your VS Code configuration, but the integration is designed to be seamless with Microsoft's AI tooling. Maintained by Microsoft. Relevant if you use VS Code as your primary editor and want deeper AI integration beyond what's built into Copilot. Part of Microsoft's broader MCP strategy.
Tools for packaging and managing VS Code extensions (.vsix files) through MCP. Build, validate, and inspect extension packages without memorizing the vsce CLI flags. This is a narrow-purpose server aimed at VS Code extension developers. Setup is straightforward if you're already in the extension development workflow. Maintained by Microsoft as part of their MCP monorepo. Only worth installing if you actively build and publish VS Code extensions. Everyone else can skip it.
Supabase MCP connects your AI assistant to your Supabase project, covering database queries, auth management, storage buckets, and edge functions. One config file, one API key, and your assistant can run SQL, manage users, and check storage without you opening the dashboard. Community-maintained, not official Supabase. That said, it tracks the Supabase API closely and covers the core operations. Self-hosted Supabase instances work too. Setup takes about two minutes if you already have your project URL and service key. Install this if Supabase is your backend. Skips the dashboard for 80% of what you do. The catch: community project, so new Supabase features may lag behind.
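A config sketch, assuming the current npm distribution of the community server, which authenticates with a Supabase personal access token; the package name and env variable may drift between releases:

```json
{
  "mcpServers": {
    "supabase": {
      "command": "npx",
      "args": ["-y", "@supabase/mcp-server-supabase@latest"],
      "env": {
        "SUPABASE_ACCESS_TOKEN": "<your-access-token>"
      }
    }
  }
}
```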
AgentGateway sits between your AI agents and your MCP servers as a proxy layer. Route requests, add auth, enforce rate limits, and monitor what your agents are doing across multiple MCP connections from one control point. Supports both MCP and A2A protocols. Setup requires running the gateway process and configuring your MCP servers to route through it. More moving parts than a single MCP server, but that is the point: it manages the complexity when you have many. Skip this if you run fewer than five MCP servers. Install it when your MCP setup gets complex enough that you need visibility and control. The catch: adds a hop between your agent and every MCP server, which means another thing to debug when something breaks.
Tavily's MCP server gives your AI assistant real-time web search, page extraction, site mapping, and crawling. Ask your assistant to research something and it pulls live results instead of relying on training data. Official, maintained by Tavily. Setup requires an API key from tavily.com. Free tier gives you 1,000 searches per month. Paid plans start at $100/mo for higher volume. The search quality is solid for research and fact-checking tasks. Install this if your AI workflow involves regular web research. The free tier is enough to evaluate it properly. The catch: API key required, and heavy usage means paying. If you only need occasional searches, a free alternative might be enough.
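A config sketch assuming the npm-published tavily-mcp package; the key placeholder comes from your tavily.com dashboard:

```json
{
  "mcpServers": {
    "tavily": {
      "command": "npx",
      "args": ["-y", "tavily-mcp@latest"],
      "env": {
        "TAVILY_API_KEY": "<your-tavily-api-key>"
      }
    }
  }
}
```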
The core platform component of Azure's MCP ecosystem. Manages Azure resource operations, subscription handling, and service discovery. Think of it as the backbone that connects your AI assistant to Azure's control plane. Requires Azure credentials with appropriate permissions. The MCP server is free. You pay for whatever Azure resources you provision or query through it. Maintained by Microsoft's Azure team. Essential if you're using Azure MCP servers, as other components depend on this platform layer. Not a standalone install, part of the broader Azure MCP project.
Spell checking component within Azure's MCP monorepo. Runs cspell across the codebase to catch typos in code, docs, and configuration. Same approach as Microsoft's MCP spelling server, applied specifically to the Azure MCP project. This is a development tool for the Azure MCP project itself, not something end users install separately. It ensures code quality within the repo. Maintained by Microsoft's Azure team. You don't need this unless you're contributing to the Azure MCP project. It's internal tooling that happened to ship in the monorepo.
A wrapper layer in Azure's MCP ecosystem that provides standardized access patterns across Azure services. Handles authentication, request formatting, and response normalization so individual service integrations don't reinvent the wheel. This is infrastructure code, not a standalone server you install directly. It exists to support other Azure MCP components. Setup happens through the parent Azure MCP project. Maintained by Microsoft's Azure team. You won't interact with this directly. It's plumbing that makes the other Azure MCP servers work consistently.
This MCP server gives your AI assistant direct access to Git operations: commits, branches, diffs, logs, and file history. Your assistant can inspect repo state, compare branches, and read commit history without you running commands. Community-maintained and works with any Git repo on your local machine. No API keys, no remote service, just point it at a repo. Setup is one config entry with a path to your repository. Install this if you want your AI assistant to understand your repo's history and current state. Useful for code review workflows and understanding unfamiliar codebases. The catch: local repos only, and write operations (commits, merges) need careful permission scoping to avoid accidents.