Architecture
MCP Hub Platform is composed of four components that work together as a pipeline. Each component has a distinct role and communicates with the others through well-defined interfaces: REST APIs, AMQP message queues, and S3-compatible object storage.
The Big Picture
MCP Hub Platform
=====================================================================
Developer
|
| mcp push / git commit / webhook / upload
v
+------------------+ AMQP: ANALYZE job +------------------+
| | -------------------------------------> | |
| MCP Hub | | MCP Scanner |
| (Control Plane) | <------------------------------------- | (Analysis Engine)|
| | AMQP: ANALYZE_COMPLETE | |
| - Web Dashboard | | - 46+ Detectors |
| - Cert Engine | +--- S3 (MinIO) ---+ | - 14 Vuln Classes|
| - Org Mgmt | ----> | Tarballs | <-------- | - Taint Analysis |
| - Billing | | Analysis Results | | - AI Detection |
+--------+---------+ +------------------+ +------------------+
|
| publishes certified artifact (REST + JWT)
v
+------------------+
| |
| MCP Registry |
| (Data Plane) |
| |
| - Manifests |
| - Bundles |
| - SHA-256 CAS |
+--------+---------+
^
| resolve / download / verify (REST + JWT)
|
+------------------+
| |
| MCP Client |
| (Exec Plane) |
| |
| - Sandbox |
| - Policy Check |
| - Resource Lim. |
+------------------+
|
| executes locally with isolation
v
MCP Server (sandboxed process)
Components
MCP Hub – Control Plane
The Hub is the brain of the platform. It provides the web dashboard where developers and administrators manage MCP servers, organizations, and certification policies. When source code is submitted, the Hub orchestrates the entire certification pipeline.
Key responsibilities:
- Ingest MCP server source from Git repositories, webhooks, or direct uploads
- Dispatch analysis jobs to scanner workers via AMQP
- Receive and process analysis results
- Compute deterministic security scores (0-100)
- Map scores to certification levels (0-3)
- Publish certified artifacts to the registry
- Manage organizations, users, RBAC, and billing
Runs as two processes:
| Process | Purpose | Command |
|---|---|---|
| Hub Web | Dashboard and REST API | make dev or make dev-hub |
| Hub Worker | Job orchestration (AMQP consumer/producer) | make dev-worker |
Both processes must be running for the full pipeline to function.
MCP Scanner – Analysis Engine
The Scanner is a specialized static security analyzer for MCP servers. It runs as a worker process that consumes ANALYZE jobs from the AMQP queue, performs deep code analysis, and returns structured results.
Key responsibilities:
- Download source tarballs from S3
- Run 46+ security detectors across 14 vulnerability classes (A through N)
- Perform pattern matching, taint analysis, and optional AI-powered detection
- Support Python, TypeScript, JavaScript, and Go codebases
- Upload analysis results to S3
- Publish ANALYZE_COMPLETE notifications to AMQP
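The pattern-matching stage can be pictured with a toy detector. This is a hypothetical sketch: the rule IDs, vulnerability-class letters, and `Finding` shape are illustrative, not the scanner's actual detector set.

```python
import re
from dataclasses import dataclass

@dataclass
class Finding:
    rule_id: str
    vuln_class: str  # one of the A..N classes (illustrative assignment)
    line: int
    snippet: str

# Two illustrative rules: dynamic evaluation, and shell commands built
# from f-strings. Real detectors are far more sophisticated than this.
RULES = [
    ("PY-EVAL", "A", re.compile(r"\beval\s*\(")),
    ("PY-SHELL-FSTRING", "B", re.compile(r"subprocess\.\w+\(\s*f[\"']")),
]

def scan_source(source: str) -> list[Finding]:
    """Run every rule against each line and collect matches as findings."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for rule_id, vuln_class, pattern in RULES:
            if pattern.search(line):
                findings.append(Finding(rule_id, vuln_class, lineno, line.strip()))
    return findings
```

Taint analysis goes further than this line-by-line matching: it tracks untrusted values from sources to dangerous sinks across function boundaries.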
Supported languages:
| Language | Detection Coverage |
|---|---|
| Python | Full (all 14 classes) |
| TypeScript | Full (all 14 classes) |
| JavaScript | Full (all 14 classes) |
| Go | Full (all 14 classes) |
MCP Registry – Data Plane
The Registry is the distribution layer. It stores and serves certified MCP bundles and manifests using content-addressed storage (SHA-256 digests). Every artifact is immutable once published.
Key responsibilities:
- Store certified bundles and manifests
- Serve the publish / resolve / download protocol
- Enforce JWT authentication and scope-based authorization
- Verify content integrity via SHA-256 digests
- Provide version resolution and package metadata
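Content addressing itself is nearly a one-liner. A minimal sketch, assuming a `sha256:<hex>` key layout (the exact key format is an assumption):

```python
import hashlib

def content_address(bundle: bytes) -> str:
    """Return the content-addressed storage key for an artifact.

    Identical bytes always map to the same key, and any change to the
    bundle yields a different key -- which is what makes published
    artifacts effectively immutable.
    """
    return "sha256:" + hashlib.sha256(bundle).hexdigest()
```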
API Protocol:
| Endpoint | Method | Description |
|---|---|---|
| /v1/publish | POST | Publish a certified artifact |
| /v1/resolve/{name}/{version} | GET | Resolve package metadata |
| /v1/download/{name}/{version} | GET | Download the artifact bundle |
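A client of this protocol needs only the paths above plus a bearer token. A hedged sketch using the standard library; the base URL and the JSON response shape are assumptions beyond what the table states (port 8081 matches the Registry port listed under Communication Protocols):

```python
import json
import urllib.request
from typing import Optional

REGISTRY = "http://localhost:8081"  # assumed base URL for a local deployment

def registry_request(path: str, token: str,
                     data: Optional[bytes] = None) -> urllib.request.Request:
    """Build an authenticated request for a registry endpoint (JWT bearer token)."""
    req = urllib.request.Request(REGISTRY + path, data=data,
                                 method="POST" if data is not None else "GET")
    req.add_header("Authorization", f"Bearer {token}")
    if data is not None:
        req.add_header("Content-Type", "application/json")
    return req

def resolve(name: str, version: str, token: str) -> dict:
    """GET /v1/resolve/{name}/{version} and parse the metadata JSON."""
    req = registry_request(f"/v1/resolve/{name}/{version}", token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```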
MCP Client – Execution Plane
The Client is the end-user tool. It resolves packages from the registry, validates their integrity, enforces security policies, and executes them inside sandboxed environments with resource limits.
Key responsibilities:
- Resolve and download packages from the registry
- Validate SHA-256 digests against published manifests
- Enforce configurable security policies (minimum cert level, allowed origins)
- Apply resource limits (CPU, memory, network)
- Sandbox execution using platform-specific isolation (namespaces on Linux, sandbox-exec on macOS)
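Two of those checks, digest validation and the policy gate, can be sketched directly. The `sha256:` prefix, field names, and policy defaults here are assumptions:

```python
import hashlib

def digest_matches(bundle: bytes, manifest_digest: str) -> bool:
    """Check a downloaded bundle against the manifest's sha256:<hex> digest."""
    return manifest_digest == "sha256:" + hashlib.sha256(bundle).hexdigest()

def policy_allows(cert_level: int, origin: str,
                  min_level: int = 2,
                  allowed_origins: tuple = ("registry.example",)) -> bool:
    """Gate execution on a minimum certification level and an origin allowlist.

    Defaults are illustrative; in practice these come from the client's
    configurable security policy.
    """
    return cert_level >= min_level and origin in allowed_origins
```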
Communication Protocols
The components communicate through three distinct channels:
AMQP (LavinMQ)
Used for asynchronous job distribution between the Hub worker and Scanner workers.
Hub Worker ──publish──> [ANALYZE queue] ──consume──> Scan Worker
Hub Worker <──consume── [ANALYZE_COMPLETE queue] <──publish── Scan Worker
Message flow:
- Hub Worker publishes an ANALYZE job containing the S3 key for the source tarball
- Scan Worker consumes the job, downloads the tarball, runs analysis, and uploads results
- Scan Worker publishes an ANALYZE_COMPLETE message with the S3 key for results
- Hub Worker consumes the completion message and proceeds to scoring
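Because the messages carry S3 keys rather than the data itself, queue payloads stay tiny regardless of repository size. A sketch of what the two payloads might look like; the field names are assumptions, only the job types and the S3-key indirection come from the flow above:

```python
import json

def analyze_job(server_id: str, source_key: str) -> bytes:
    """Payload the Hub Worker publishes to the ANALYZE queue."""
    return json.dumps({"type": "ANALYZE",
                       "server_id": server_id,
                       "source_s3_key": source_key}).encode()

def analyze_complete(server_id: str, results_key: str) -> bytes:
    """Payload the Scan Worker publishes to the ANALYZE_COMPLETE queue."""
    return json.dumps({"type": "ANALYZE_COMPLETE",
                       "server_id": server_id,
                       "results_s3_key": results_key}).encode()
```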
REST APIs (HTTP + JWT)
Used for synchronous operations between components and external clients.
| From | To | Protocol | Purpose |
|---|---|---|---|
| Browser / CLI | Hub | HTTP (8080) | Dashboard, management API |
| Hub | Registry | HTTP (8081) | Publish certified artifacts |
| Client | Registry | HTTP (8081) | Resolve and download packages |
S3 (MinIO)
Used for transferring large binary data between components.
| Bucket | Contents | Writers | Readers |
|---|---|---|---|
| Source tarballs | Uploaded MCP server source code | Hub Worker | Scan Worker |
| Analysis results | Structured JSON findings | Scan Worker | Hub Worker |
| Certified artifacts | Published bundles and manifests | Hub | Registry |
Data Flow Through the Pipeline
Here is the complete sequence of events when an MCP server is submitted for certification:
1. Ingest
The developer submits source code via the web dashboard, a Git repository URL, a webhook trigger, or a CLI push. The Hub validates the input, creates a database record, and transitions the server to QUEUED status.
2. Dispatch
The Hub Worker picks up the queued server, compresses the source into a tarball, uploads it to S3, and publishes an ANALYZE message to the AMQP queue.
3. Analyze
A Scan Worker consumes the message, downloads the tarball from S3, extracts it, and runs the full suite of security detectors. The analysis produces a structured JSON report with findings, severity levels, affected files, and remediation guidance. The worker uploads the results to S3 and publishes ANALYZE_COMPLETE.
4. Post-Analyze
The Hub Worker receives the completion notification, downloads the analysis results from S3, and performs:
- Controls mapping: Maps findings to security controls
- Scoring: Computes a deterministic score from 0 to 100
- Snapshot building: Assembles the immutable security snapshot (findings, SBOM, attestation)
5. Certify
The score determines the certification level; a server receives the highest level whose threshold its score meets:
| Score | Level | Name |
|---|---|---|
| Any | 0 | Integrity Verified |
| >= 60 | 1 | Static Verified |
| >= 80 | 2 | Security Certified |
| >= 90 | 3 | Runtime Certified |
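The table above reduces to a simple threshold function; a minimal sketch:

```python
def certification_level(score: int) -> int:
    """Map a 0-100 security score to a certification level (0-3)."""
    if score >= 90:
        return 3  # Runtime Certified
    if score >= 80:
        return 2  # Security Certified
    if score >= 60:
        return 1  # Static Verified
    return 0      # Integrity Verified (any score)
```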
6. Publish
The Hub publishes the certified artifact – including the bundle, manifest, and security snapshot – to the Registry via its REST API. The Registry assigns a content-addressed SHA-256 digest.
7. Execute
End users run mcp-client pull and mcp-client run to download and execute the package. The client validates the digest, checks security policies, and launches the server in a sandboxed process.
Infrastructure Services
All components rely on shared infrastructure services managed through Docker Compose:
| Service | Port | Technology | Purpose |
|---|---|---|---|
| postgres | 15432 | PostgreSQL 16 | Primary database (two DBs: mcphub + mcp_registry) |
| redis | 6390 | Redis 7 | Caching, rate limiting, session storage |
| minio | 9000 / 9001 | MinIO | S3-compatible object storage for tarballs and artifacts |
| lavinmq | 5672 / 15672 | LavinMQ | AMQP message broker for async job distribution |
Deployment Topology
In development, all components run on a single machine. In production, the architecture supports horizontal scaling:
Production Topology
===================
Load Balancer
|
+--- Hub Web (N replicas)
|
+--- Registry (N replicas)
Hub Workers (N replicas) <--- AMQP ---> Scan Workers (N replicas)
Shared Services:
PostgreSQL (primary + replica)
Redis (cluster or sentinel)
MinIO / S3
LavinMQ / RabbitMQ (clustered)
- Hub Web and Registry scale horizontally behind a load balancer
- Hub Workers and Scan Workers scale independently based on analysis queue depth
- Infrastructure services should be deployed with replication and failover for production use
Next Steps
- Quick Start – Get the platform running locally in 5 minutes
- Your First Certification – Walk through the complete pipeline
- Self-Hosted Deployment – Deploy to production infrastructure