
Disclaimer: This series is a personal, educational reference architecture. All diagrams, opinions, and frameworks are my own and are not affiliated with, sponsored by, or representative of my employer. I’m publishing it on my own time and without using any confidential information.
© 2026 Sean Miller. All rights reserved.
The Lattice series describes an enterprise AI platform organized into five planes, each containing named components: AI Gateway, Orchestration Engine, Tool Gateway, Context Builder, Model Gateway, Control Plane. Every deep-dive post explains what a component does and how it’s governed. But a common question keeps coming up: what is each component in practice? The answer varies. Some are microservices. Some are libraries. Some are frontends.
This post maps every component to its concrete implementation artifact. When the series says “Tool Gateway,” it means a TypeScript microservice running on port 3104 with three REST endpoints and a set of middleware. When it says “Core,” it means a shared npm package that every service depends on for type definitions and validation schemas.
This is the decoder ring for the rest of the series. Each deep dive will make more sense once you know exactly what kind of artifact you’re reading about.
Lattice is a TypeScript monorepo. All services live under services/packages/. Each service is its own directory with its own package.json, its own Dockerfile, and its own entry point.
Every backend service is built with Hono, a lightweight HTTP framework, and @hono/node-server for the runtime. Each deploys as a Docker container using a multi-stage build on node:20-alpine. The build stage compiles TypeScript and installs dependencies. The production stage copies the compiled output into a minimal image with no dev tooling.
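A multi-stage build of the shape described might look like the sketch below. The paths, script names, and exposed port are illustrative, not the actual repo layout:

```dockerfile
# Build stage: install all dependencies and compile TypeScript.
FROM node:20-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Production stage: production dependencies and compiled output only,
# no dev tooling in the final image.
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
EXPOSE 3104
CMD ["node", "dist/index.js"]
```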
A shared library called @lattice/core provides TypeScript types, Zod validation schemas, database utilities, and shared HTTP clients. Every service depends on it. Core is an npm package, not a deployed service. It defines the contracts that services validate against at their boundaries.
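To make "contracts validated at the boundary" concrete, here is a simplified stand-in for one such contract. The real package uses Zod schemas; this sketch shows the same idea with a plain type and a hand-rolled validator, and the `ToolCallRequest` field names are illustrative:

```typescript
// Stand-in for a @lattice/core contract. In the actual library this
// would be a Zod schema with an inferred TypeScript type.
export interface ToolCallRequest {
  toolId: string;
  input: Record<string, unknown>;
  correlationId: string;
}

// Validate an unknown payload at a service boundary, throwing on any
// shape mismatch so bad requests never reach business logic.
export function parseToolCallRequest(payload: unknown): ToolCallRequest {
  const p = payload as Partial<ToolCallRequest> | null;
  if (typeof p?.toolId !== "string" || p.toolId.length === 0) {
    throw new Error("toolId must be a non-empty string");
  }
  if (typeof p.input !== "object" || p.input === null) {
    throw new Error("input must be an object");
  }
  if (typeof p.correlationId !== "string") {
    throw new Error("correlationId must be a string");
  }
  return {
    toolId: p.toolId,
    input: p.input as Record<string, unknown>,
    correlationId: p.correlationId,
  };
}
```

Because both sender and receiver import the same definition, a payload that parses on one side is guaranteed to parse on the other.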
The port convention is straightforward: 3101 through 3106 for backend services, 3110 for the dashboard. Each service owns a single port and exposes versioned REST endpoints under /v1/. This makes local development predictable. You always know where each service is running.
Below is every Lattice component mapped to its implementation artifact. For each one: what it is, how it deploys, what port it owns, and what it’s responsible for.
What it is: Hono microservice
Deploys as: Docker container
Port: 3101
The AI Gateway exposes four endpoints: /v1/ai/turn for processing a conversational turn, /v1/ai/workflow for workflow operations, /v1/ai/session for session management, and /v1/ai/feedback for feedback collection. It owns authentication and authorization, request validation, session lifecycle, and routing requests to the correct workflow in the Orchestrator. This is the single entry point for all client-facing AI interactions. Nothing reaches the backend without passing through it first.
What it is: Hono microservice
Deploys as: Docker container
Port: 3102
The Orchestration Engine exposes three endpoints: /v1/execute for running a workflow, /v1/workflows for listing and inspecting workflow definitions, and /v1/runs for managing workflow runs. It owns workflow execution, step sequencing, planner decisions, state management, and trace emission. The Orchestrator is the hub of the entire system. It calls every other backend service via typed HTTP clients: Tool Gateway for tool execution, Model Gateway for inference, Context Builder for retrieval, and Control Plane for policy checks.
What it is: Hono microservice
Deploys as: Docker container
Port: 3103
The Model Gateway exposes three endpoints: /v1/infer for text generation and structured output, /v1/classify for text classification, and /v1/embed for embedding generation. It owns model routing by class (fast, balanced, quality, embed, classify), structured output schema enforcement, cost and token budgets, and provider abstraction. It uses the OpenAI SDK as the provider interface, with extensibility for other providers.
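Routing by model class can be sketched as a simple lookup from class to provider model. The class names come from the post; the concrete model identifiers here are hypothetical placeholders:

```typescript
// The five model classes the Model Gateway routes on.
type ModelClass = "fast" | "balanced" | "quality" | "embed" | "classify";

// Hypothetical model identifiers; the real mapping would come from
// configuration and vary by provider.
const MODEL_TABLE: Record<ModelClass, string> = {
  fast: "provider/small-model",      // low latency, low cost
  balanced: "provider/mid-model",    // default tradeoff
  quality: "provider/large-model",   // highest quality, highest cost
  embed: "provider/embedding-model",
  classify: "provider/classifier-model",
};

// Callers request a class, never a concrete model, so the provider
// can be swapped without touching call sites.
export function resolveModel(cls: ModelClass): string {
  return MODEL_TABLE[cls];
}
```

Keeping callers on the class abstraction is what makes the "provider abstraction" point above work: swapping providers is a table change, not a call-site change.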
What it is: Hono microservice
Deploys as: Docker container
Port: 3104
The Tool Gateway exposes three endpoints: /v1/tools for listing registered tools, /v1/tools/:toolId for tool details, and /v1/tools/:toolId/execute for executing a tool with RBAC, idempotency, and audit. It owns the tool registry and discovery, two-layer RBAC enforcement (role check plus contextual access), idempotency via HTTP cache for reads and a database action log for writes, and a structured audit trail for every execution.
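The write-side idempotency mechanism can be sketched as follows. An in-memory Map stands in for the database action log, and the key and result shapes are illustrative:

```typescript
// Stand-in for the database action log keyed by idempotency key.
const actionLog = new Map<string, unknown>();

// Run a write action at most once per idempotency key. A retry with
// the same key replays the recorded result instead of re-executing
// the side effect.
export function executeWriteOnce(
  idempotencyKey: string,
  action: () => unknown,
): unknown {
  if (actionLog.has(idempotencyKey)) {
    return actionLog.get(idempotencyKey);
  }
  const result = action();
  actionLog.set(idempotencyKey, result);
  return result;
}
```

This is why a client can safely retry a timed-out write: the second attempt hits the log, not the downstream system.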
What it is: Hono microservice
Deploys as: Docker container
Port: 3105
The Context Builder exposes a single endpoint: /v1/context/build for assembling a context package for a model call. It owns scope-aware document retrieval using both keyword and semantic search, PII redaction based on user role, citation and provenance tracking, and token budget management. It uses SQLite for its local document store.
What it is: Hono microservice
Deploys as: Docker container
Port: 3106
The Control Plane exposes four endpoints: /v1/policy/evaluate for runtime policy checks, /v1/workflows for the workflow registry, /v1/sessions for session state, and /v1/audit for audit log queries. It owns the policy engine with built-in policies covering rate limits, RBAC enforcement, human-in-the-loop gates for high-risk actions, and budget control. It also manages workflow version management, session persistence, and an immutable audit log. It uses SQLite for state persistence. Policies are loaded at startup and evaluated on every request that passes through the Orchestrator.
What it is: Shared npm package (@lattice/core)
Deploys as: Not deployed independently
Port: None
Core has no API surface. It is a library. It owns TypeScript type definitions for every service contract, Zod validation schemas, custom error classes, HTTP clients for audit and policy, and PostgreSQL utilities. Every service in the monorepo depends on it. Core defines the contracts that services validate against at their boundaries.
What it is: React + Vite single-page application
Deploys as: Docker container
Port: 3110
The Dashboard is frontend only. It calls backend services via their REST APIs and provides visibility into workflow runs, audit logs, and system state.
Protocol. All inter-service communication is synchronous HTTP/REST. There are no message queues in the reference implementation. Services call each other via typed HTTP clients defined in @lattice/core. Each client wraps fetch with proper error handling, timeout management, and header propagation.
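A typed client of the kind described might look like this sketch: a thin wrapper over the built-in fetch with a timeout, error translation, and a hook for propagated headers. The class name and defaults are illustrative, not the actual @lattice/core API:

```typescript
// Minimal typed HTTP client sketch: timeout via AbortController,
// non-2xx responses turned into errors, caller-supplied headers
// merged in for identity and correlation propagation.
export class ServiceClient {
  constructor(
    private baseUrl: string,
    private timeoutMs: number = 5_000,
  ) {}

  async post<T>(
    path: string,
    body: unknown,
    headers: Record<string, string> = {},
  ): Promise<T> {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), this.timeoutMs);
    try {
      const res = await fetch(`${this.baseUrl}${path}`, {
        method: "POST",
        headers: { "content-type": "application/json", ...headers },
        body: JSON.stringify(body),
        signal: controller.signal,
      });
      if (!res.ok) {
        throw new Error(`${path} failed: HTTP ${res.status}`);
      }
      return (await res.json()) as T;
    } finally {
      clearTimeout(timer);
    }
  }
}
```

The generic return type is what makes the client "typed": callers declare the response shape they expect, typically the inferred type of a shared schema.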
Identity propagation. The AI Gateway authenticates the user and attaches identity headers: X-User-Id, X-User-Role, X-User-Department. The Orchestrator propagates these headers on every downstream call. The Tool Gateway uses them for RBAC enforcement. The Context Builder uses them for PII redaction scoping. The original human identity flows through the entire request chain, from the gateway to every tool call and model call. There is no point in the system where the human actor is unknown.
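The propagation step can be sketched as a small helper that copies the three identity headers from an incoming request onto a downstream call. The header names are the ones described above; the helper itself is illustrative:

```typescript
const IDENTITY_HEADERS = ["X-User-Id", "X-User-Role", "X-User-Department"];

// Copy identity headers from an incoming request's headers onto the
// header set for a downstream call, so the original human identity
// survives every hop.
export function propagateIdentity(incoming: Headers): Record<string, string> {
  const out: Record<string, string> = {};
  for (const name of IDENTITY_HEADERS) {
    const value = incoming.get(name);
    if (value !== null) out[name] = value;
  }
  return out;
}
```

Calling this on every downstream request is cheap insurance: a hop that forgets identity would otherwise silently break RBAC and redaction further down the chain.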
Correlation IDs. Every request gets an X-Correlation-ID header at the AI Gateway. This ID propagates through every inter-service call. It links the gateway request to the orchestration run, to individual tool calls, and to model calls. Audit events, trace logs, and error reports all reference the same correlation ID. When something goes wrong, you search for one ID and get the full picture.
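The assignment rule at the edge can be sketched in a few lines: honor a client-supplied ID if present, otherwise mint one. The function name is illustrative:

```typescript
import { randomUUID } from "node:crypto";

// At the AI Gateway: reuse the client's correlation ID if one was
// sent, otherwise generate a fresh UUID. The same value is then
// attached to every downstream call, audit event, and trace log.
export function ensureCorrelationId(incoming: Headers): string {
  return incoming.get("X-Correlation-ID") ?? randomUUID();
}
```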
Shared contracts. All request and response shapes are defined as Zod schemas in @lattice/core. Services validate incoming requests against these schemas at their boundaries. A ToolCallRequest validated by the Orchestrator uses the same schema the Tool Gateway validates on receipt. This eliminates an entire class of integration bugs where services disagree on the shape of a payload.
Service discovery. Environment variables point each service to its dependencies: TOOL_GATEWAY_URL, MODEL_GATEWAY_URL, and so on. In local development, these resolve to localhost with the port convention described above. In production, they point to service mesh endpoints or container DNS names.
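A resolution helper that falls back to the local port convention might look like this. TOOL_GATEWAY_URL and MODEL_GATEWAY_URL are named in the post; the remaining variable names are assumed to follow the same pattern:

```typescript
// Local defaults follow the port convention: 3101-3106 for backend
// services. Env vars override them in deployed environments.
const LOCAL_DEFAULTS: Record<string, string> = {
  TOOL_GATEWAY_URL: "http://localhost:3104",
  MODEL_GATEWAY_URL: "http://localhost:3103",
  CONTEXT_BUILDER_URL: "http://localhost:3105", // assumed name
  CONTROL_PLANE_URL: "http://localhost:3106",   // assumed name
};

// Resolve a dependency URL: env var if set, local default otherwise.
export function serviceUrl(name: keyof typeof LOCAL_DEFAULTS): string {
  return process.env[name] ?? LOCAL_DEFAULTS[name];
}
```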
This post is the decoder ring. The deep dives start with Part 1, which introduces the Five Planes architecture and explains why Lattice is structured the way it is.
This series will explore each component of the Lattice architecture in depth.
This series documents architectural patterns for enterprise AI platforms. Diagrams and frameworks are provided for educational purposes.