Model Context Protocol (MCP)
An open standard that gives AI models a universal interface to connect with external tools, data sources, and services.
Last updated: 2026-04-12
Overview
Without MCP, an LLM is a “brain in a jar” — capable of reasoning but unable to act on external systems. It cannot access files, query databases, check email, search the web, or interact with APIs unless the application hard-codes each integration.
MCP solves this by defining one universal protocol. Build a connection once (an MCP server) and any AI model that supports MCP can use it — without custom integration work per model or per app.
The key distinction:
- Skills = teach the model HOW to do things (procedures, domain knowledge, judgment)
- MCP = give the model ACCESS to external systems (data, tools, services)
Skills and MCP are complementary. A skill might describe how to do a DCF valuation (the procedure); an MCP server provides access to the financial data the skill needs (the raw material).
How It Works
MCP servers expose tools that the model can call. Each server wraps a specific system (a database, an API, a file system) and presents it as a set of callable functions. The model decides which tools to call based on the task; the MCP server executes and returns results.
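On the wire, MCP is a JSON-RPC 2.0 protocol: the client discovers tools via `tools/list`, then invokes one with `tools/call`. A rough sketch of one exchange, with the tool name and arguments invented for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_database",
    "arguments": { "sql": "SELECT count(*) FROM orders" }
  }
}
```

The server executes the call and replies with a result whose `content` array carries the tool output back to the model, e.g. `{"jsonrpc": "2.0", "id": 1, "result": {"content": [{"type": "text", "text": "42"}]}}`. Exact field names follow the MCP specification; the payload values here are placeholders.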
Servers are installed separately from the model. Most follow the same setup pattern: install the package (npm or pip), add an entry with any required API keys to Claude's configuration, and restart the client.
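For Claude Desktop, that configuration entry lives under an `mcpServers` key. A sketch using the official filesystem server (the directory path is a placeholder, and the `env` block is only there to show where an API key would go for servers that need one):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
      "env": { "EXAMPLE_API_KEY": "placeholder" }
    }
  }
}
```

Each named entry tells the client how to launch one server as a subprocess; the client then speaks MCP to it over stdio.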
Ecosystem Categories
The MCP server ecosystem breaks into broad functional categories:
| Category | What it provides |
|---|---|
| Search & Web | AI-optimized search, web crawling, URL fetching |
| File System & Local Data | Read/write local files, SQLite/Postgres queries, Excel manipulation |
| Developer Tools | GitHub, Git, browser automation, Docker, Sentry, codebase memory |
| Productivity & Communication | Google Drive, Gmail, Calendar, Slack, Notion, Linear |
| Data & Analytics | Snowflake, BigQuery, Supabase, MongoDB |
| Infrastructure & DevOps | AWS, Cloudflare, Kubernetes, Vercel |
| AI & Models | ElevenLabs TTS, Hugging Face, Replicate |
| Utility | Time/timezone, persistent memory, task management |
Key Points
- Start small: install a role-appropriate starter pack of 4–5 servers; don’t install all at once
- Official Anthropic servers are the simplest to set up — usually one command
- fastmcp is the fastest way to build custom servers when nothing existing fits (Python, afternoon build time)
- MCPHub manages multiple servers via a single dashboard — useful once you have 5+ running
- Context7 deserves special mention: injects up-to-date library documentation into context, eliminating hallucinated APIs and deprecated method calls
Starter Packs by Role
| Role | Recommended servers |
|---|---|
| Developers | Filesystem + GitHub + Context7 + Codebase Memory + Sentry |
| Knowledge Workers | Filesystem + Google Drive + Gmail + Google Calendar + Notion |
| Data Analysts | Filesystem + SQLite + PostgreSQL + Excel + Tavily |
| Content Creators | Filesystem + Tavily + Obsidian + markdownify + Slack |
| DevOps | Filesystem + Docker + GitHub + AWS + Kubernetes |
Connections
- claude-code-skills — skills and MCP are complementary: skills = HOW, MCP = ACCESS
- thin-harness-fat-skills — MCP servers are part of the “fat” extension layer the harness delegates to
- coding-agent — MCP servers are one of the tool categories available to a coding agent
- mcp-servers — curated reference list of 40 notable servers