The Model Context Protocol Registry is an open, standards-driven catalog and API for MCP servers. If you are building or running AI tools that speak MCP, the registry is the connective tissue that turns isolated servers into a discoverable ecosystem, so developers can find, trust, and run the right servers. It is developed in the open at modelcontextprotocol/registry and is already live in preview at registry.modelcontextprotocol.io (MCP Blog, 2025).
A registry built for the way MCP actually works
AI applications increasingly rely on MCP servers to connect models to tools and data. The problem is fragmentation: servers are published across NPM, PyPI, NuGet, Docker registries, or as raw MCPB artifacts, and clients have little uniformity for discovery or validation.
The MCP Registry addresses this by providing a single source of truth with an opinionated but permissive moderation policy and a simple, stable read API for clients. Server maintainers publish once, and downstream subregistries and client marketplaces can mirror, filter, and enrich data for their users.
Key features you can use today
- Open REST API for discovery: Read-only endpoints such as `GET /v0/servers` and `GET /v0/servers/{id}` let clients fetch server catalogs with pagination, optional search, and an `updated_since` filter for incremental sync. See use-rest-api.md.
- Publisher CLI with ownership verification: A cross-platform `mcp-publisher` tool helps authors generate and validate `server.json`, authenticate via GitHub OAuth/OIDC or domain-based DNS/HTTP challenges, and publish in one flow. See publish-server.md.
- Multi-ecosystem package validation: Checks for an NPM `mcpName` field, PyPI/NuGet README mentions, OCI image labels, and MCPB file hashes ensure the published name maps to real, verifiable artifacts. See the server.json reference.
- Subregistries by design: Public marketplaces and private enterprise registries can mirror the official dataset, add metadata, enforce policy, or run security scanning, all while staying API-compatible. See building a subregistry.
- Permissive but clear moderation: The official registry removes illegal content, malware, spam, and non-functioning servers while leaving most curation to subregistries. See moderation-guidelines.md.
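To illustrate the read API, here is a minimal Python sketch of an incremental-sync client. The `/v0/servers` path and `updated_since` parameter come from the documentation above; the catalog-merge logic and the assumption that entries are keyed by `name` are illustrative.

```python
import urllib.parse

REGISTRY = "https://registry.modelcontextprotocol.io"

def build_servers_url(base=REGISTRY, updated_since=None):
    """Build a /v0/servers URL, optionally asking only for entries
    changed since an ISO 8601 timestamp (incremental sync)."""
    params = {}
    if updated_since:
        params["updated_since"] = updated_since
    query = ("?" + urllib.parse.urlencode(params)) if params else ""
    return f"{base}/v0/servers{query}"

def merge_catalog(local, fetched):
    """Merge freshly fetched entries into a local catalog keyed by
    server name, letting newer entries overwrite stale ones."""
    catalog = dict(local)
    for server in fetched:
        catalog[server["name"]] = server
    return catalog
```

A sync loop would fetch `build_servers_url(updated_since=last_sync)` with any HTTP client, feed the decoded results to `merge_catalog`, and record the new sync timestamp.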
Why this project stands out
What makes the MCP Registry particularly compelling is its thoughtful balance of simplicity and flexibility. Instead of reinventing authentication or package distribution, it leverages existing infrastructure: GitHub OAuth for developer identity, DNS verification for domain ownership, and established package registries (NPM, PyPI, NuGet, Docker Hub) for actual code hosting. This "metaregistry" approach means lower operational overhead and higher reliability.
The ownership verification system is elegantly practical. Rather than complex certificate chains or custom cryptography, it uses simple signals already embedded in package ecosystems: an `mcpName` field in your NPM package.json, a line in your PyPI README, or a Docker label. These lightweight proofs prevent namespace squatting while remaining accessible to any developer.
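For the NPM case, the proof is a single field in package.json. A minimal sketch, with placeholder package and server names:

```json
{
  "name": "weather-mcp-server",
  "version": "1.0.0",
  "mcpName": "io.github.username/weather-server"
}
```

The registry can then check that the published package really claims the MCP name being registered.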
Most importantly, the registry treats subregistries as first-class citizens rather than competitors. The official registry focuses on being a reliable, vendor-neutral foundation that others can build upon.
Want enterprise security scanning? Curated quality ratings? Community-specific filtering? Build a subregistry that mirrors the official data and adds your own value. This federated approach prevents vendor lock-in while enabling specialization, a rare design that scales both technically and organizationally.
Under the hood: the tech and structure
The repo is primarily Go and organized cleanly for service development. The README.md outlines a structure that includes internal/api for HTTP routing, internal/database for PostgreSQL interactions (with seed data in data/), internal/auth for GitHub OAuth, GitHub OIDC, DNS, and HTTP verification, and pkg/api/v0 for public types.
The go.mod lists dependencies like `huma` for API scaffolding, `pgx` for Postgres, and OpenTelemetry/Prometheus for metrics.
A docker-compose.yml spins up the service with Postgres and seed data for local development.
The Official Registry API is described in official-registry-api.md with interactive docs at /docs and a machine-readable OpenAPI spec (OpenAPI Initiative, 2025).
Auth endpoints cover DNS, HTTP, GitHub OAuth, and OIDC, and admin surfaces include `/metrics` for Prometheus scraping (Prometheus, 2025) and `/v0/health` for liveness checks. For identity flows, the design aligns with OpenID Connect Core 1.0 (OpenID Foundation, 2014) and GitHub OAuth (GitHub Docs, 2025).
```
├── cmd/            # Application entry points
│   └── publisher/  # Server publishing tool
├── data/           # Seed data
├── deploy/         # Deployment configuration (Pulumi)
├── docs/           # Documentation
├── internal/       # Private application code
│   ├── api/        # HTTP handlers and routing
│   ├── auth/       # Authentication (GitHub OAuth, JWT, namespace blocking)
│   ├── config/     # Configuration management
│   ├── database/   # Data persistence (PostgreSQL, in-memory)
│   ├── service/    # Business logic
│   ├── telemetry/  # Metrics and monitoring
│   └── validators/ # Input validation
├── pkg/            # Public packages
│   ├── api/        # API types and structures
│   │   └── v0/     # Version 0 API types
│   └── model/      # Data models for server.json
├── scripts/        # Development and testing scripts
├── tests/          # Integration tests
└── tools/          # CLI tools and utilities
    └── validate-*.sh # Schema validation tools
```
Server.json, in practice
Servers are described by a standard JSON schema and can include package pointers across ecosystems plus optional metadata for subregistries to enrich. Here is a concise example:
```json
{
  "$schema": "https://static.modelcontextprotocol.io/schemas/2025-07-09/server.schema.json",
  "name": "io.github.username/weather-server",
  "description": "MCP server for weather data",
  "status": "active",
  "version": "1.0.0",
  "packages": [
    { "registry_type": "npm", "identifier": "weather-mcp-server", "version": "1.0.0" }
  ],
  "_meta": {
    "com.example.subregistry/custom": {
      "user_rating": 4.7,
      "security_scan": { "last_scanned": "2025-08-01T00:00:00Z", "vulnerabilities_found": 0 }
    }
  }
}
```
Client authors can fetch `GET /v0/servers`, filter for `status=active`, and transform `packages` into whatever install configuration their client expects. Subregistries can append `_meta` to add ratings, scan results, or compatibility notes without breaking clients that ignore unknown fields.
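A sketch of that transformation in Python, assuming a hypothetical client that launches npm-packaged servers via `npx` (the output config shape is illustrative, not any particular client's format):

```python
def active_servers(servers):
    """Keep only entries a client should surface for installation."""
    return [s for s in servers if s.get("status") == "active"]

def to_client_config(server):
    """Turn a registry server entry into a hypothetical client config.

    Only handles registry_type == "npm" here; other package types
    (PyPI, NuGet, OCI, MCPB) would each need their own handler.
    """
    configs = []
    for pkg in server.get("packages", []):
        if pkg["registry_type"] == "npm":
            configs.append({
                "name": server["name"],
                "command": "npx",
                "args": ["-y", f'{pkg["identifier"]}@{pkg["version"]}'],
            })
    return configs
```

Because clients ignore `_meta` namespaces they don't recognize, the same transform works unchanged against a subregistry's enriched entries.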
Where it fits: real-world use cases
For AI Application Developers: Instead of manually curating MCP servers or requiring users to find and configure each one separately, your client can fetch the registry's server catalog and present users with a searchable, categorized marketplace. Tools like Claude Desktop, Cline, and other MCP-enabled applications can offer one-click server installation by consuming registry metadata and transforming package references into their specific configuration formats.
For Enterprise Teams: Create private subregistries that mirror the official registry data, then layer on internal compliance, security scanning, and approval workflows. Your organization can maintain an allowlist of vetted servers while staying compatible with the broader ecosystem. Since the registry uses standard authentication (GitHub OAuth, DNS verification), it integrates naturally with existing identity and approval systems.
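The allowlist layer can be a thin filter over the mirrored catalog. A minimal sketch, where the `com.example.corp/approval` `_meta` namespace is invented for illustration (the reverse-DNS key convention is the registry's):

```python
def apply_allowlist(servers, allowed_names):
    """Filter a mirrored catalog to an approved set, annotating each
    kept entry under a subregistry-specific _meta namespace."""
    vetted = []
    for server in servers:
        if server["name"] in allowed_names:
            enriched = dict(server)  # shallow copy; leave the mirror untouched
            meta = dict(enriched.get("_meta", {}))
            meta["com.example.corp/approval"] = {"approved": True}
            enriched["_meta"] = meta
            vetted.append(enriched)
    return vetted
```

Serving the vetted list from the same `GET /v0/servers` shape keeps internal clients compatible with tools built for the official API.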
For Platform Builders: Companies like PulseMCP, Smithery, and VSCode are already building specialized registries that add value on top of the official dataset, such as community ratings, enhanced search, compatibility matrices, or editorial curation. The API's incremental sync capabilities (`updated_since` filters) and extensible `_meta` fields make it efficient to build and maintain these enhanced views.
For DevOps and CI/CD: Use GitHub Actions with OIDC to automatically publish server updates as part of your release pipeline. The `mcp-publisher` CLI can validate server.json files, verify package ownership, and publish to the registry without manual intervention. This keeps server catalogs fresh and reduces the friction between server development and distribution.
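A workflow along these lines could automate the last step. Note that the `mcp-publisher` invocations below are illustrative placeholders, not the tool's documented subcommands; consult publish-server.md for the actual commands and OIDC setup:

```yaml
# Hypothetical release workflow; exact mcp-publisher commands may differ.
name: publish-mcp-server
on:
  release:
    types: [published]
permissions:
  id-token: write   # required for GitHub OIDC authentication
  contents: read
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Authenticate and publish (command names are illustrative)
      - run: mcp-publisher login github-oidc
      - run: mcp-publisher publish server.json
```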
Community, contribution, and moderation
The project is maintained by a cross-org working group with maintainers from Anthropic, PulseMCP, and GitHub, among others, and grew from community discussions before the preview launch (MCP Blog, 2025).
Collaboration flows through Discord, GitHub Discussions, Issues, and PRs, all linked from the README.
Moderation for the official registry is intentionally light-touch to preserve openness, with clear removal criteria and a transparent appeals path via issues in this repo. See moderation-guidelines.md.
Usage and license
The code is MIT-licensed, granting broad rights to use, modify, distribute, and sell the software with attribution and without warranty. See LICENSE for the full text. Read APIs are open and unauthenticated, while publishing requires namespace-appropriate authentication.
For local development, the repo includes a docker-compose.yml and Make targets; for production integration, prefer consuming the hosted API or operating a subregistry that mirrors the official dataset.
Impact and what comes next
The registry turns MCP from a protocol with many islands into a network with discoverable ports. With subregistries as first-class citizens, ecosystems like IDEs, agents, and enterprise platforms can offer curated experiences without vendor lock-in. Expect more filters for synchronization, richer metadata conventions, and tighter tooling around provenance and security scanning as the spec and community mature (MCP Blog, 2025).
About the MCP project
The Model Context Protocol is a community-driven open standard and suite of SDKs for connecting models to external tools, data, and processes. You can learn more at modelcontextprotocol.io/docs and explore the broader GitHub org at github.com/modelcontextprotocol. The registry team credits contributors from PulseMCP and Block/Goose alongside maintainers from Anthropic and GitHub in the preview announcement (MCP Blog, 2025).
Conclusion
If you build MCP servers, publish them once and let the ecosystem discover them everywhere. If you build clients, consume the official API or mirror it into your own opinionated subregistry. Either way, the MCP Registry gives the protocol a backbone for discovery, provenance, and trust. Start with the publishing guide and the API usage guide, and read the full announcement to understand the philosophy behind it (MCP Blog, 2025).
The Model Context Protocol Registry: Building the Backbone for AI Server Discovery