
Architecture

BrewedSys edited this page Mar 30, 2026 · 1 revision


Http-native is a hybrid Rust + JavaScript HTTP framework. The performance-critical HTTP layer lives in Rust, while application logic is written in familiar Express-style JavaScript.

High-Level Overview

```mermaid
graph TD
  Client["Client"] -->|TCP/TLS| Rust["Rust HTTP Layer<br/>(monoio + httparse)"]
  Rust -->|Route Match| Router["Router<br/>(Exact Map + Radix Tree)"]
  Router -->|Static Hit| Cache["Response Cache"]
  Cache -->|Cached| Rust
  Router -->|Dynamic| Bridge["Binary Bridge<br/>(NAPI-rs)"]
  Bridge -->|Invoke| JS["JavaScript Handler"]
  JS -->|Result| Bridge
  Bridge -->|Encode| Rust
  Rust -->|HTTP Response| Client

  style Client fill:#F1EFE8,stroke:#5F5E5A,stroke-width:2px,color:#2C2C2A
  style Rust fill:#FAECE7,stroke:#993C1D,stroke-width:2px,color:#4A1B0C
  style JS fill:#FAEEDA,stroke:#854F0B,stroke-width:2px,color:#412402
  style Router fill:#E8F1EF,stroke:#1D6B52,stroke-width:2px,color:#0A3D2E
  style Cache fill:#E8EBF1,stroke:#1D3B99,stroke-width:2px,color:#0C1F4A
  style Bridge fill:#F1E8F1,stroke:#6B1D6B,stroke-width:2px,color:#3D0A3D
```

Request Lifecycle

  1. TCP Accept — The Rust layer (powered by monoio) accepts incoming TCP connections with keep-alive support.
  2. HTTP Parsing — Raw bytes are parsed using httparse, extracting method, path, headers, and body.
  3. Route Matching — The router attempts an O(1) exact-match lookup first. If that fails, it falls back to a radix-tree scan for parameterized routes.
  4. Cache Check — If a response has been cached via res.ncache(), Rust serves it directly without invoking JavaScript.
  5. Bridge Dispatch — For uncached routes, the request is serialized into a compact binary envelope and dispatched to the JavaScript handler via NAPI-rs.
  6. Handler Execution — Your JavaScript handler runs with req and res objects. Middleware runs in order via next().
  7. Response Return — The response flows back through the bridge. Rust encodes the HTTP response and writes it to the socket.
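The middleware dispatch in step 6 can be sketched in plain JavaScript. This is an illustrative model of the `next()` chaining behavior described above, not Http-native's actual internals; `runChain`, `logger`, `auth`, and `handler` are hypothetical names.

```javascript
// Minimal sketch of a next()-based middleware chain (illustrative only).
function runChain(middlewares, req, res) {
  let i = 0;
  function next() {
    const mw = middlewares[i++];
    if (mw) mw(req, res, next); // each middleware decides whether to continue
  }
  next();
}

// Example middleware and final handler: record the order of execution.
const order = [];
const logger = (req, res, next) => { order.push('logger'); next(); };
const auth = (req, res, next) => { order.push('auth'); next(); };
const handler = (req, res) => { order.push('handler'); res.body = `hello ${req.path}`; };

const req = { path: '/users' };
const res = {};
runChain([logger, auth, handler], req, res);
// order is now ['logger', 'auth', 'handler']; res.body is "hello /users"
```

A middleware that never calls `next()` short-circuits the chain, which is how early responses (auth failures, rate limits) are typically produced in this style.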

Key Components

Rust Layer (rsrc/src/)

| File | Responsibility |
| --- | --- |
| lib.rs | Main event loop, TCP listener, HTTP parsing, response writing |
| router.rs | Route registration and matching (exact hash map + radix tree) |
| analyzer.rs | Runtime analysis of route patterns for optimization |
| manifest.rs | Configuration models and server manifest |
| session.rs | In-memory session store with sharded RwLock |
| websocket.rs | WebSocket upgrade and frame handling |

JavaScript Layer (src/)

| File | Responsibility |
| --- | --- |
| index.js | createApp() factory — registers routes, middleware, starts server |
| bridge.js | Binary serialization/deserialization for JS-Rust communication |
| native.js | Loads the compiled Rust .node binary |
| cors.js | CORS middleware |
| session.js | Session middleware (wraps Rust session store) |
| validate.js | Schema-agnostic request validation |
| hot.js | Hot-reload middleware for development |

The Binary Bridge

Communication between Rust and JavaScript uses a custom binary protocol (not JSON) for minimal overhead. The bridge:

  • Encodes request metadata (method, path, headers) as length-prefixed binary fields
  • Avoids string allocation on the Rust side when possible
  • Uses thread-local buffer pools to reduce memory allocation pressure
  • Leverages NAPI-rs for zero-copy data transfer where supported
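To make the first bullet concrete, here is a toy sketch of length-prefixed field framing, assuming a `[u32 length][utf8 bytes]` layout per field. The actual bridge protocol in bridge.js may use a different layout; `encodeFields` and `decodeFields` are hypothetical names.

```javascript
// Illustrative length-prefixed framing: each field is [u32 LE length][utf8 bytes].
function encodeFields(fields) {
  const parts = [];
  for (const f of fields) {
    const bytes = Buffer.from(f, 'utf8');
    const len = Buffer.alloc(4);
    len.writeUInt32LE(bytes.length, 0); // 4-byte little-endian length prefix
    parts.push(len, bytes);
  }
  return Buffer.concat(parts);
}

function decodeFields(buf) {
  const fields = [];
  let off = 0;
  while (off < buf.length) {
    const len = buf.readUInt32LE(off);
    off += 4;
    fields.push(buf.toString('utf8', off, off + len));
    off += len;
  }
  return fields;
}

const envelope = encodeFields(['GET', '/users/42', 'accept: */*']);
const decoded = decodeFields(envelope);
// decoded → ['GET', '/users/42', 'accept: */*']
```

Length prefixes let the receiver slice fields without scanning for delimiters or parsing JSON, which is what keeps per-request overhead low.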

Router Design

The router uses a two-tier strategy:

```
Incoming Path
     |
     +-- Exact Match (HashMap) ---- O(1) lookup
     |    e.g. "/users", "/health"
     |
     +-- Radix Tree ---------------- O(M) lookup (M = path segments)
          e.g. "/users/:id", "/posts/:id/comments/:cid"
```

Static routes (no parameters) are stored in a hash map for constant-time lookup. Parameterized routes use a radix tree that walks path segments to extract named parameters.
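The two-tier strategy can be modeled in a few lines of JavaScript. This toy `Router` class is illustrative only; the real implementation in router.rs uses a proper radix tree rather than a linear scan over dynamic patterns.

```javascript
// Toy two-tier router: Map for static routes, segment walk for ":param" routes.
class Router {
  constructor() {
    this.exact = new Map(); // "/users" -> handler
    this.dynamic = [];      // [segments, handler], e.g. [["users", ":id"], h]
  }
  add(path, handler) {
    if (path.includes(':')) {
      this.dynamic.push([path.split('/').filter(Boolean), handler]);
    } else {
      this.exact.set(path, handler);
    }
  }
  match(path) {
    const hit = this.exact.get(path); // tier 1: O(1) hash lookup
    if (hit) return { handler: hit, params: {} };
    const segs = path.split('/').filter(Boolean);
    outer: for (const [pattern, handler] of this.dynamic) { // tier 2: walk segments
      if (pattern.length !== segs.length) continue;
      const params = {};
      for (let i = 0; i < pattern.length; i++) {
        if (pattern[i].startsWith(':')) params[pattern[i].slice(1)] = segs[i];
        else if (pattern[i] !== segs[i]) continue outer;
      }
      return { handler, params };
    }
    return null;
  }
}

const r = new Router();
r.add('/users', () => 'list');
r.add('/users/:id', () => 'one');
// r.match('/users')    → exact hit, params {}
// r.match('/users/42') → dynamic hit, params { id: '42' }
```

A radix tree replaces the linear scan with a single walk down shared path prefixes, so lookup cost depends on the path's segment count rather than the number of registered routes.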

Thread Model

Http-native uses monoio's io_uring-based (Linux) or kqueue-based (macOS) async runtime. The Rust server runs on a single thread with cooperative multitasking — no thread pool for request handling. JavaScript handlers are invoked on the main V8/JSC thread via NAPI callbacks.

This single-threaded model avoids lock contention and context-switch overhead, which is a key contributor to the framework's high throughput.
