Merged
1,854 changes: 1,061 additions & 793 deletions Cargo.lock

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion Cargo.toml
@@ -115,7 +115,7 @@ members = [
resolver = "2"

[workspace.package]
version = "2.0.4"
version = "2.0.5"
edition = "2021"
rust-version = "1.77"
license = "MIT"
16 changes: 8 additions & 8 deletions README.md
@@ -113,7 +113,7 @@ Most vector databases are static — they store embeddings and search them. That
| 44 | **Drop into Postgres** | pgvector-compatible extension with SIMD acceleration |
| 45 | **MCP integration** | Model Context Protocol server for AI assistant tools |
| 46 | **Cloud deployment** | One-click deploy to Cloud Run, Kubernetes |
| 47 | **13 Rust crates + 4 npm packages** | [RVF SDK](./crates/rvf/README.md) published on [crates.io](https://crates.io/crates/rvf-runtime) and [npm](https://www.npmjs.com/package/@ruvector/rvf) |
| 47 | **22 Rust crates + 4 npm packages** | [RVF SDK](./crates/rvf/README.md) published on [crates.io](https://crates.io/crates/rvf-runtime) and [npm](https://www.npmjs.com/package/@ruvector/rvf) |

**Self-Learning & Adaptation**
| # | Capability | What It Does |
@@ -238,7 +238,7 @@ npx @ruvector/rvf-mcp-server --transport stdio # MCP server for AI agents
| DNA-style lineage | Parent/child derivation chains with cryptographic verification |
| 24 segment types | VEC, INDEX, KERNEL, EBPF, WASM, COW_MAP, WITNESS, CRYPTO, and 16 more |

**Rust crates** (13): [`rvf-types`](https://crates.io/crates/rvf-types) `rvf-wire` `rvf-manifest` `rvf-quant` `rvf-index` `rvf-crypto` [`rvf-runtime`](https://crates.io/crates/rvf-runtime) `rvf-kernel` `rvf-ebpf` `rvf-launch` `rvf-server` `rvf-import` [`rvf-cli`](https://crates.io/crates/rvf-cli)
**Rust crates** (22): [`rvf-types`](https://crates.io/crates/rvf-types) `rvf-wire` `rvf-manifest` `rvf-quant` `rvf-index` `rvf-crypto` [`rvf-runtime`](https://crates.io/crates/rvf-runtime) `rvf-kernel` `rvf-ebpf` `rvf-launch` `rvf-server` `rvf-import` [`rvf-cli`](https://crates.io/crates/rvf-cli) `rvf-wasm` `rvf-solver-wasm` `rvf-node` + 6 adapters (claude-flow, agentdb, ospipe, agentic-flow, rvlite, sona)

**npm packages** (4): [`@ruvector/rvf`](https://www.npmjs.com/package/@ruvector/rvf) [`@ruvector/rvf-node`](https://www.npmjs.com/package/@ruvector/rvf-node) [`@ruvector/rvf-wasm`](https://www.npmjs.com/package/@ruvector/rvf-wasm) [`@ruvector/rvf-mcp-server`](https://www.npmjs.com/package/@ruvector/rvf-mcp-server)

@@ -247,7 +247,7 @@ npx @ruvector/rvf-mcp-server --transport stdio # MCP server for AI agents
- **ADR-030**: [Cognitive Container Architecture](./docs/adr/ADR-030-rvf-cognitive-container.md)
- **ADR-031**: [COW Branching & Real Containers](./docs/adr/ADR-031-rvcow-branching-and-real-cognitive-containers.md)
- **ADR-042**: [Security RVF — AIDefence + TEE](./docs/adr/ADR-042-Security-RVF-AIDefence-TEE.md)
- **46 runnable examples**: [examples/rvf/examples/](./examples/rvf/examples/)
- **56 runnable examples**: [examples/rvf/examples/](./examples/rvf/examples/)

</details>

@@ -355,7 +355,7 @@ npx ruvector
| **Self-Learning (GNN)** | ✅ | ❌ | ❌ | ❌ | ❌ |
| **Runtime Adaptation (SONA)** | ✅ LoRA+EWC++ | ❌ | ❌ | ❌ | ❌ |
| **AI Agent Routing** | ✅ Tiny Dancer | ❌ | ❌ | ❌ | ❌ |
| **Attention Mechanisms** | ✅ 40 types | ❌ | ❌ | ❌ | ❌ |
| **Attention Mechanisms** | ✅ 46 types | ❌ | ❌ | ❌ | ❌ |
| **Coherence Gate** | ✅ Prime-Radiant | ❌ | ❌ | ❌ | ❌ |
| **Hyperbolic Embeddings** | ✅ Poincaré+Lorentz | ❌ | ❌ | ❌ | ❌ |
| **Local Embeddings** | ✅ 8+ models | ❌ | ❌ | ❌ | ❌ |
@@ -518,7 +518,7 @@ npx @ruvector/cli hooks install # Configure for Claude Code

| Feature | What It Does | Why It Matters |
|---------|--------------|----------------|
| **40 Mechanisms** | Dot-product, multi-head, flash, linear, sparse, cross-attention, CGT sheaf | Cover all transformer and GNN use cases |
| **46 Mechanisms** | Dot-product, multi-head, flash, linear, sparse, cross-attention, CGT sheaf | Cover all transformer and GNN use cases |
| **Graph Attention** | RoPE, edge-featured, local-global, neighborhood | Purpose-built for graph neural networks |
| **Hyperbolic Attention** | Poincaré ball operations, curved-space math | Better embeddings for hierarchical data |
| **SIMD Optimized** | Native Rust with AVX2/NEON acceleration | 2-10x faster than pure JS |
@@ -1187,7 +1187,7 @@ await dag.execute();
|---------|-------------|---------|-----------|
| [@ruvector/tiny-dancer](https://www.npmjs.com/package/@ruvector/tiny-dancer) | FastGRNN neural routing | [![npm](https://img.shields.io/npm/v/@ruvector/tiny-dancer.svg)](https://www.npmjs.com/package/@ruvector/tiny-dancer) | [![downloads](https://img.shields.io/npm/dt/@ruvector/tiny-dancer.svg)](https://www.npmjs.com/package/@ruvector/tiny-dancer) |
| [@ruvector/router](https://www.npmjs.com/package/@ruvector/router) | Semantic router + HNSW | [![npm](https://img.shields.io/npm/v/@ruvector/router.svg)](https://www.npmjs.com/package/@ruvector/router) | [![downloads](https://img.shields.io/npm/dt/@ruvector/router.svg)](https://www.npmjs.com/package/@ruvector/router) |
| [@ruvector/attention](https://www.npmjs.com/package/@ruvector/attention) | 40+ attention mechanisms | [![npm](https://img.shields.io/npm/v/@ruvector/attention.svg)](https://www.npmjs.com/package/@ruvector/attention) | [![downloads](https://img.shields.io/npm/dt/@ruvector/attention.svg)](https://www.npmjs.com/package/@ruvector/attention) |
| [@ruvector/attention](https://www.npmjs.com/package/@ruvector/attention) | 46 attention mechanisms | [![npm](https://img.shields.io/npm/v/@ruvector/attention.svg)](https://www.npmjs.com/package/@ruvector/attention) | [![downloads](https://img.shields.io/npm/dt/@ruvector/attention.svg)](https://www.npmjs.com/package/@ruvector/attention) |

#### Learning & Neural

@@ -1260,7 +1260,7 @@ await dag.execute();
| [@ruvector/wasm-unified](https://www.npmjs.com/package/@ruvector/wasm-unified) | Unified TypeScript API | [![npm](https://img.shields.io/npm/v/@ruvector/wasm-unified.svg)](https://www.npmjs.com/package/@ruvector/wasm-unified) | [![downloads](https://img.shields.io/npm/dt/@ruvector/wasm-unified.svg)](https://www.npmjs.com/package/@ruvector/wasm-unified) |
| [@ruvector/gnn-wasm](https://www.npmjs.com/package/@ruvector/gnn-wasm) | GNN WASM bindings | [![npm](https://img.shields.io/npm/v/@ruvector/gnn-wasm.svg)](https://www.npmjs.com/package/@ruvector/gnn-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/gnn-wasm.svg)](https://www.npmjs.com/package/@ruvector/gnn-wasm) |
| [@ruvector/attention-wasm](https://www.npmjs.com/package/@ruvector/attention-wasm) | Attention WASM bindings | [![npm](https://img.shields.io/npm/v/@ruvector/attention-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/attention-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-wasm) |
| [@ruvector/attention-unified-wasm](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) | All 40+ attention mechanisms | [![npm](https://img.shields.io/npm/v/@ruvector/attention-unified-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/attention-unified-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) |
| [@ruvector/attention-unified-wasm](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) | All 46 attention mechanisms | [![npm](https://img.shields.io/npm/v/@ruvector/attention-unified-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/attention-unified-wasm.svg)](https://www.npmjs.com/package/@ruvector/attention-unified-wasm) |
| [@ruvector/tiny-dancer-wasm](https://www.npmjs.com/package/@ruvector/tiny-dancer-wasm) | AI routing WASM | [![npm](https://img.shields.io/npm/v/@ruvector/tiny-dancer-wasm.svg)](https://www.npmjs.com/package/@ruvector/tiny-dancer-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/tiny-dancer-wasm.svg)](https://www.npmjs.com/package/@ruvector/tiny-dancer-wasm) |
| [@ruvector/router-wasm](https://www.npmjs.com/package/@ruvector/router-wasm) | Semantic router WASM | [![npm](https://img.shields.io/npm/v/@ruvector/router-wasm.svg)](https://www.npmjs.com/package/@ruvector/router-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/router-wasm.svg)](https://www.npmjs.com/package/@ruvector/router-wasm) |
| [@ruvector/learning-wasm](https://www.npmjs.com/package/@ruvector/learning-wasm) | Learning module WASM | [![npm](https://img.shields.io/npm/v/@ruvector/learning-wasm.svg)](https://www.npmjs.com/package/@ruvector/learning-wasm) | [![downloads](https://img.shields.io/npm/dt/@ruvector/learning-wasm.svg)](https://www.npmjs.com/package/@ruvector/learning-wasm) |
@@ -1305,7 +1305,7 @@ All crates are published to [crates.io](https://crates.io) under the `ruvector-*

| Crate | Description | crates.io |
|-------|-------------|-----------|
| [ruvector-attention](./crates/ruvector-attention) | 40+ attention mechanisms (Flash, Hyperbolic, MoE, Graph) | [![crates.io](https://img.shields.io/crates/v/ruvector-attention.svg)](https://crates.io/crates/ruvector-attention) |
| [ruvector-attention](./crates/ruvector-attention) | 46 attention mechanisms (Flash, Hyperbolic, MoE, Graph) | [![crates.io](https://img.shields.io/crates/v/ruvector-attention.svg)](https://crates.io/crates/ruvector-attention) |
| [ruvector-attention-node](./crates/ruvector-attention-node) | Node.js bindings for attention mechanisms | [![crates.io](https://img.shields.io/crates/v/ruvector-attention-node.svg)](https://crates.io/crates/ruvector-attention-node) |
| [ruvector-attention-wasm](./crates/ruvector-attention-wasm) | WASM bindings for browser attention | [![crates.io](https://img.shields.io/crates/v/ruvector-attention-wasm.svg)](https://crates.io/crates/ruvector-attention-wasm) |
| [ruvector-attention-cli](./crates/ruvector-attention-cli) | CLI for attention testing and benchmarking | [![crates.io](https://img.shields.io/crates/v/ruvector-attention-cli.svg)](https://crates.io/crates/ruvector-attention-cli) |
16 changes: 9 additions & 7 deletions crates/ruvector-attention-unified-wasm/src/graph.rs
@@ -43,14 +43,10 @@ impl WasmGNNLayer {
heads: usize,
dropout: f32,
) -> Result<WasmGNNLayer, JsError> {
if dropout < 0.0 || dropout > 1.0 {
return Err(JsError::new("Dropout must be between 0.0 and 1.0"));
}
let inner = RuvectorLayer::new(input_dim, hidden_dim, heads, dropout)
.map_err(|e| JsError::new(&e.to_string()))?;

Ok(WasmGNNLayer {
inner: RuvectorLayer::new(input_dim, hidden_dim, heads, dropout),
hidden_dim,
})
Ok(WasmGNNLayer { inner, hidden_dim })
}
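The per-binding dropout check above is gone because the core `RuvectorLayer::new` is now fallible, and it appears to also reject a `heads` count that does not divide `hidden_dim` (the new `test_gnn_layer_invalid_heads` test below exercises exactly that). A rough standalone sketch of that validation; the error type and function name here are hypothetical illustrations, not the crate's real API:

```rust
// Hypothetical sketch of the checks the core constructor now performs.
// These names are illustrative only; the real crate's error type differs.
#[derive(Debug, PartialEq)]
enum LayerError {
    InvalidDropout(f32),
    HeadsMismatch { hidden_dim: usize, heads: usize },
}

fn validate_layer(hidden_dim: usize, heads: usize, dropout: f32) -> Result<(), LayerError> {
    // Dropout is a probability, so it must lie in [0.0, 1.0].
    if !(0.0..=1.0).contains(&dropout) {
        return Err(LayerError::InvalidDropout(dropout));
    }
    // Multi-head attention splits hidden_dim evenly across heads.
    if heads == 0 || hidden_dim % heads != 0 {
        return Err(LayerError::HeadsMismatch { hidden_dim, heads });
    }
    Ok(())
}

fn main() {
    assert!(validate_layer(256, 4, 0.1).is_ok());
    assert!(validate_layer(7, 3, 0.1).is_err()); // heads must divide hidden_dim
    assert!(validate_layer(256, 4, 1.5).is_err()); // dropout out of range
    println!("validation sketch ok");
}
```

Centralizing the checks in the core constructor means every binding (WASM, Node, CLI) surfaces the same error instead of each re-implementing a partial subset.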

/// Forward pass through the GNN layer
@@ -378,6 +374,12 @@ mod tests {
assert!(layer.is_err());
}

#[wasm_bindgen_test]
fn test_gnn_layer_invalid_heads() {
let layer = WasmGNNLayer::new(4, 7, 3, 0.1);
assert!(layer.is_err());
}

#[wasm_bindgen_test]
fn test_tensor_compress_creation() {
let compressor = WasmTensorCompress::new();
3 changes: 2 additions & 1 deletion crates/ruvector-cli/src/mcp/gnn_cache.rs
@@ -189,7 +189,8 @@ impl GnnCache {
}

// Create new layer
let layer = RuvectorLayer::new(input_dim, hidden_dim, heads, dropout);
let layer = RuvectorLayer::new(input_dim, hidden_dim, heads, dropout)
.expect("GNN layer cache: invalid layer configuration");

// Cache it
{
12 changes: 6 additions & 6 deletions crates/ruvector-cli/tests/gnn_performance_test.rs
@@ -25,7 +25,7 @@ mod gnn_cache_tests {
#[test]
fn test_layer_creation_latency() {
let start = Instant::now();
let _layer = RuvectorLayer::new(128, 256, 4, 0.1);
let _layer = RuvectorLayer::new(128, 256, 4, 0.1).unwrap();
let elapsed = start.elapsed();

// Layer creation: <100ms in release, ~2000ms in debug
@@ -48,7 +48,7 @@
/// Test that forward pass has acceptable latency
#[test]
fn test_forward_pass_latency() {
let layer = RuvectorLayer::new(128, 256, 4, 0.1);
let layer = RuvectorLayer::new(128, 256, 4, 0.1).unwrap();
let node = vec![0.5f32; 128];
let neighbors = vec![vec![0.3f32; 128], vec![0.7f32; 128]];
let weights = vec![0.5f32, 0.5f32];
@@ -83,7 +83,7 @@
/// Test batch operations performance
#[test]
fn test_batch_operations_performance() {
let layer = RuvectorLayer::new(64, 128, 2, 0.1);
let layer = RuvectorLayer::new(64, 128, 2, 0.1).unwrap();

// Create batch of operations
let batch_size = 100;
@@ -139,7 +139,7 @@
for (input, hidden, heads) in sizes {
// Measure creation
let start = Instant::now();
let layer = RuvectorLayer::new(input, hidden, heads, 0.1);
let layer = RuvectorLayer::new(input, hidden, heads, 0.1).unwrap();
let create_ms = start.elapsed().as_secs_f64() * 1000.0;

// Measure forward
@@ -216,7 +216,7 @@ mod gnn_cache_integration {

// First: measure time including layer creation
let start_cold = Instant::now();
let layer = RuvectorLayer::new(128, 256, 4, 0.1);
let layer = RuvectorLayer::new(128, 256, 4, 0.1).unwrap();
let node = vec![0.5f32; 128];
let neighbors = vec![vec![0.3f32; 128], vec![0.7f32; 128]];
let weights = vec![0.5f32, 0.5f32];
@@ -262,7 +262,7 @@

// Create layer once
let start = Instant::now();
let layer = RuvectorLayer::new(64, 128, 2, 0.1);
let layer = RuvectorLayer::new(64, 128, 2, 0.1).unwrap();
let creation_time = start.elapsed();

let node = vec![0.5f32; 64];
4 changes: 3 additions & 1 deletion crates/ruvector-crv/src/stage_iii.rs
@@ -31,7 +31,9 @@ impl StageIIIEncoder {
pub fn new(config: &CrvConfig) -> Self {
let dim = config.dimensions;
// Single GNN layer: input_dim -> hidden_dim, 1 head
let gnn_layer = RuvectorLayer::new(dim, dim, 1, 0.0);
// heads=1 always divides any dim, and dropout=0.0 is always valid
let gnn_layer = RuvectorLayer::new(dim, dim, 1, 0.0)
.expect("dim is always divisible by 1 head");

Self { dim, gnn_layer }
}
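The `.expect` here is the "infallible by construction" pattern: `heads = 1` divides every `dim` and `dropout = 0.0` is always in range, so the error path is unreachable and the expect message documents why. A small sketch of the same pattern with a hypothetical fallible constructor (not the crate's real API):

```rust
// Hypothetical fallible constructor, mirroring the shape of the real one.
fn fallible_new(hidden_dim: usize, heads: usize) -> Result<usize, String> {
    if heads == 0 || hidden_dim % heads != 0 {
        return Err(format!("{heads} heads do not divide hidden_dim {hidden_dim}"));
    }
    Ok(hidden_dim / heads) // per-head dimension
}

fn main() {
    // heads = 1 divides every dim, so this expect can never fire;
    // the message records the invariant for future readers.
    let per_head = fallible_new(512, 1).expect("dim is always divisible by 1 head");
    assert_eq!(per_head, 512);
    println!("infallible-by-construction sketch ok");
}
```

When the arguments are statically known to satisfy the constructor's invariants, a documented `expect` is idiomatic; propagating a `Result` from `StageIIIEncoder::new` would push an impossible error onto every caller.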
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/darwin-arm64/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-darwin-arm64",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"darwin"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/darwin-x64/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-darwin-x64",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"darwin"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/linux-arm64-gnu/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-linux-arm64-gnu",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"linux"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/linux-arm64-musl/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-linux-arm64-musl",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"linux"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/linux-x64-gnu/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-linux-x64-gnu",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"linux"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/linux-x64-musl/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-linux-x64-musl",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"linux"
],
2 changes: 1 addition & 1 deletion crates/ruvector-gnn-node/npm/win32-x64-msvc/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn-win32-x64-msvc",
"version": "0.1.24",
"version": "0.1.25",
"os": [
"win32"
],
16 changes: 8 additions & 8 deletions crates/ruvector-gnn-node/package.json
@@ -1,6 +1,6 @@
{
"name": "@ruvector/gnn",
"version": "0.1.24",
"version": "0.1.25",
"description": "Graph Neural Network capabilities for Ruvector - Node.js bindings",
"main": "index.js",
"types": "index.d.ts",
@@ -53,12 +53,12 @@
"access": "public"
},
"optionalDependencies": {
"@ruvector/gnn-win32-x64-msvc": "0.1.24",
"@ruvector/gnn-darwin-x64": "0.1.24",
"@ruvector/gnn-linux-x64-gnu": "0.1.24",
"@ruvector/gnn-linux-x64-musl": "0.1.24",
"@ruvector/gnn-linux-arm64-gnu": "0.1.24",
"@ruvector/gnn-linux-arm64-musl": "0.1.24",
"@ruvector/gnn-darwin-arm64": "0.1.24"
"@ruvector/gnn-linux-x64-gnu": "0.1.25",
"@ruvector/gnn-linux-x64-musl": "0.1.25",
"@ruvector/gnn-linux-arm64-gnu": "0.1.25",
"@ruvector/gnn-linux-arm64-musl": "0.1.25",
"@ruvector/gnn-darwin-x64": "0.1.25",
"@ruvector/gnn-darwin-arm64": "0.1.25",
"@ruvector/gnn-win32-x64-msvc": "0.1.25"
}
}
22 changes: 8 additions & 14 deletions crates/ruvector-gnn-node/src/lib.rs
@@ -44,21 +44,15 @@ impl RuvectorLayer {
/// ```
#[napi(constructor)]
pub fn new(input_dim: u32, hidden_dim: u32, heads: u32, dropout: f64) -> Result<Self> {
if dropout < 0.0 || dropout > 1.0 {
return Err(Error::new(
Status::InvalidArg,
"Dropout must be between 0.0 and 1.0".to_string(),
));
}
let inner = RustRuvectorLayer::new(
input_dim as usize,
hidden_dim as usize,
heads as usize,
dropout as f32,
)
.map_err(|e| Error::new(Status::InvalidArg, e.to_string()))?;

Ok(Self {
inner: RustRuvectorLayer::new(
input_dim as usize,
hidden_dim as usize,
heads as usize,
dropout as f32,
),
})
Ok(Self { inner })
}

/// Forward pass through the GNN layer
10 changes: 3 additions & 7 deletions crates/ruvector-gnn-wasm/src/lib.rs
@@ -81,14 +81,10 @@ impl JsRuvectorLayer {
heads: usize,
dropout: f32,
) -> Result<JsRuvectorLayer, JsValue> {
if dropout < 0.0 || dropout > 1.0 {
return Err(JsValue::from_str("Dropout must be between 0.0 and 1.0"));
}
let inner = RuvectorLayer::new(input_dim, hidden_dim, heads, dropout)
.map_err(|e| JsValue::from_str(&e.to_string()))?;

Ok(JsRuvectorLayer {
inner: RuvectorLayer::new(input_dim, hidden_dim, heads, dropout),
hidden_dim,
})
Ok(JsRuvectorLayer { inner, hidden_dim })
}

/// Forward pass through the GNN layer