Releases: ruvnet/RuVector
v2.1.0 — State-of-the-Art Gap Implementations
13 new modules across 3 crates close gaps between RuVector and 2024-2026 research from Google, Meta, DeepSeek, and Microsoft: 8,577 lines of new code, 859 tests passing, zero regressions.
Highlights
Advanced Search & Retrieval (ruvector-core)
- Hybrid Search (RRF) — Sparse + dense vector fusion with Reciprocal Rank Fusion, SPLADE-compatible scoring. 20-49% retrieval improvement.
- Graph RAG — Knowledge graph + Leiden community detection + local/global/hybrid search. 30-60% improvement on complex multi-hop queries.
- DiskANN / Vamana — SSD-backed billion-scale ANN with alpha-RNG pruning and LRU page cache. <10ms latency.
- ColBERT Multi-Vector — Per-token late interaction retrieval with MaxSim, AvgSim, SumMax scoring.
- Matryoshka Embeddings — Adaptive-dimension search with funnel and cascade modes for speed with minimal recall loss.
- OPQ — Optimized Product Quantization with learned rotation matrix. 10-30% error reduction vs standard PQ.
- LSM Compaction — Log-Structured Merge-tree for write-heavy workloads with bloom filters.
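The Reciprocal Rank Fusion behind the hybrid search is simple enough to sketch. This is not the ruvector-core API, just a minimal standalone version of the standard formula score(d) = Σ 1/(k + rank_i(d)); the function name and the `k = 60` convention are illustrative:

```rust
use std::collections::HashMap;

/// Fuse ranked result lists (each a Vec of doc ids, best first) with RRF.
/// `k` dampens the influence of top ranks; 60 is the commonly used default.
fn rrf_fuse(rankings: &[Vec<u32>], k: f64) -> Vec<(u32, f64)> {
    let mut scores: HashMap<u32, f64> = HashMap::new();
    for list in rankings {
        for (rank, &doc) in list.iter().enumerate() {
            // `rank` is 0-based; RRF uses 1-based ranks.
            *scores.entry(doc).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    let mut fused: Vec<(u32, f64)> = scores.into_iter().collect();
    // total_cmp gives a NaN-safe total order, unlike partial_cmp().unwrap().
    fused.sort_by(|a, b| b.1.total_cmp(&a.1));
    fused
}
```

Because RRF only looks at ranks, a sparse (SPLADE-style) list and a dense list can be fused without normalizing their incompatible score scales.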
Attention & Inference (ruvector-attention)
- FlashAttention-3 — IO-aware tiled attention reducing memory from O(N²) to O(N). Configurable block sizes, causal masking, dropout.
- Multi-Head Latent Attention (MLA) — DeepSeek-V2/V3 style KV-cache compression (~93% reduction).
- KV-Cache Compression — 3-4 bit asymmetric per-channel quantization (TurboQuant-inspired). H2O, Sliding Window, PyramidKV eviction. 6-8x memory reduction.
- Selective State Space Models (Mamba) — Linear-time sequence processing with selective scan and discretization.
- Speculative Decoding — Draft-verify pipeline with Medusa multi-head and tree attention for 2-3x generation speedup.
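A minimal sketch of the asymmetric per-channel quantization idea behind the KV-cache compression. This assumes nothing about the actual ruvector-attention types: the struct and function names are hypothetical, and a real 3-4 bit implementation would bit-pack codes rather than store one per byte:

```rust
/// Quantized form of one KV-cache channel (illustrative layout).
struct QuantChannel {
    scale: f32,      // step size between adjacent levels
    zero_point: f32, // channel minimum; asymmetric range [min, max]
    codes: Vec<u8>,  // one n-bit code per value, unpacked for clarity
}

/// Quantize one channel of values to `bits` levels (e.g. 3 or 4 bits).
fn quantize_channel(values: &[f32], bits: u32) -> QuantChannel {
    let levels = ((1u32 << bits) - 1) as f32; // e.g. 15 for 4-bit
    let min = values.iter().cloned().fold(f32::INFINITY, f32::min);
    let max = values.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let scale = if max > min { (max - min) / levels } else { 1.0 };
    let codes = values
        .iter()
        .map(|&v| (((v - min) / scale).round() as u8).min(levels as u8))
        .collect();
    QuantChannel { scale, zero_point: min, codes }
}

/// Dequantize back to f32; per-value error is bounded by scale / 2.
fn dequantize(q: &QuantChannel) -> Vec<f32> {
    q.codes.iter().map(|&c| q.zero_point + c as f32 * q.scale).collect()
}
```

Computing `scale`/`zero_point` per channel rather than per tensor is what keeps outlier channels from blowing up the error budget at 3-4 bits.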
Graph Learning (ruvector-gnn)
- GraphMAE — Graph Masked Autoencoder with GAT encoder, SCE loss, degree-centrality masking, re-masking regularization.
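GraphMAE's SCE objective is compact enough to show inline. A hedged sketch, not the ruvector-gnn API: it computes the scaled cosine error, mean over masked nodes of (1 - cos(x, x̂))^γ, where `gamma >= 1` down-weights easy reconstructions:

```rust
/// Scaled cosine error between target features and reconstructions.
/// `targets` and `recon` hold one feature vector per masked node.
fn sce_loss(targets: &[Vec<f32>], recon: &[Vec<f32>], gamma: f32) -> f32 {
    let cos = |a: &[f32], b: &[f32]| {
        let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
        let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
        let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
        dot / (na * nb + 1e-8) // epsilon guards against zero vectors
    };
    let total: f32 = targets
        .iter()
        .zip(recon)
        .map(|(t, r)| (1.0 - cos(t, r)).max(0.0).powf(gamma))
        .sum();
    total / targets.len() as f32
}
```

Unlike MSE, this loss is invariant to the magnitude of the reconstructed features, which matters when node feature norms vary widely across a graph.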
Quality
- 859 Rust tests — 423 (core) + 210 (attention) + 226 (gnn), all passing
- Zero regressions from v2.0.6
- No unsafe code in any new module
- Security fixes: NaN-safe sort comparisons, quantization input validation
Published Packages
crates.io:
| Crate | Version |
|---|---|
| ruvector-core | 2.1.0 |
| ruvector-attention | 2.1.0 |
| ruvector-gnn | 2.1.0 |
| ruvector-attention-wasm | 2.1.0 |
| ruvector-gnn-wasm | 2.1.0 |
| ruvllm | 2.1.0 |
npm:
| Package | Version |
|---|---|
| ruvector | 0.2.19 |
| ruvector-wasm | 2.1.0 |
| ruvector-attention-wasm | 2.1.0 |
| ruvector-gnn-wasm | 2.1.0 |
| ruvector-attention-unified-wasm | 0.1.0 |
| @ruvector/ruvllm | 2.5.4 |
CLI Fix
- Fixed `npx ruvector create` and `benchmark` commands — `dimension` → `dimensions` field name mismatch (#307)
Documentation
- Updated root README with all new SOTA modules
- Updated npm README with v2.1 features and TurboQuant section
- Updated @ruvector/ruvllm README with TurboQuant KV-cache compression docs
- ADR-128: SOTA gap analysis and implementation documentation
Full Changelog: v2.0.6...v2.1.0
Training Pipeline (ADR-129)
Added complete GCloud training infrastructure for continuous model improvement:
- Release gate automation — 7 ship/no-ship criteria (G1-G7) with automated checker
- Dataset governance — Schema validation, dedup, contamination checks, quality scoring
- Nightly training — Incremental LoRA from pi.ruv.io brain learnings → validate → push to HF
- TurboQuant sidecar — `.turboquant.json` per-layer KV-cache config profiles
- Cloud Run Jobs — 4 GPU jobs (calibration, SFT, benchmark, nightly) + 2 schedulers
- Ablation matrix — 5-run isolation testing (baseline → imatrix → SFT → DPO → TQ)
Deploy: `./scripts/training/deploy_training.sh`
v0.5.0-dragnes: DrAgnes + Common Crawl + Gemini Grounding
DrAgnes Dermatology AI + Common Crawl WET Pipeline + Gemini Grounding Agents
New Features
- DrAgnes (`examples/dragnes/`): Standalone AI dermatology intelligence platform with CNN classification, HIPAA compliance, DermLite integration
- Common Crawl WET Pipeline: 178-domain automated import (medical + CS + physics)
- Gemini Grounding Agents: 4 autonomous agents — fact verification, relation generation, cross-domain discovery, research director
- Brain: 2,064 memories, 943K edges, 57x sparsifier, Gemini 2.5 Flash optimizer
Bug Fixes
- SONA `getStats().trajectoriesRecorded` always 0 (#273)
- SONA state persistence across restarts (#274)
- MCP SSE auto-reconnect on stale sessions
- RuVocal icon 404, FoundationBackground crash
Published Packages
- `ruvector-sona@0.1.7` (crates.io)
- `ruvector@0.2.17` (npm)
ADRs
117 (DrAgnes), 118 (Cost Strategy), 119 (Historical Crawl), 120 (WET Pipeline), 121 (Gemini Grounding), 122 (Grounding Agents)
Infrastructure
- Cloud Run: 4 CPU / 4 GiB / 20 instances
- 19 scheduler jobs (train, drift, transfer, graph, WET, Gemini agents)
- 3 Cloud Run Jobs (WET import, Gemini agents, rvagent learning)
v0.88.0 — RuVix Cognition Kernel + Neural Trader WASM
Highlights
RuVix Cognition Kernel (ADR-087)
A foundational architecture for RuVix — a cognition kernel for the Agentic Age. Not a Linux clone, but a purpose-built OS with 6 primitives, 12 syscalls, and RVF as native boot object.
Core primitives: task, capability, region, queue, timer, proof
Key innovations:
- Proof-gated mutation as kernel invariant — no proof, no mutation
- RuVector kernel-resident — vectors and graphs are native kernel resources
- Coherence-aware scheduler — deadline + novelty + structural risk
- RVF as boot object — signed packages are complete cognitive units
- 12 syscalls total
Build path: Phase A (Linux-hosted nucleus, 9 crates) → Phase B (bare metal AArch64)
8 demo applications: proof-gated vector journal, edge ML inference, drone swarm, self-healing knowledge graph, collective intelligence mesh, quantum-coherent memory replay, biological signal processor, adversarial reasoning arena
Security Hardening (Post-Audit)
6 specification clarifications added after security audit:
- Root task privilege attenuation — drops capabilities after boot
- Capability delegation depth limit — max depth 8, GRANT_ONCE right
- Boot RVF proof bootstrap — signed boot is the single trusted path
- Reflex proof cache scoping — per-(mutation_hash, nonce), single-use
- Zero-copy IPC TOCTOU — rejects descriptors into Slab regions
- Boot signature failure — kernel panic, no fallback
Neural Trader WASM Bindings (ADR-086)
4 new crates: core (2 tests), coherence (7 tests), replay (3 tests), wasm (10 + 43 JS tests)
Visual Updates
Animated dot-matrix grid and billboard squares on pi.ruv.io
PRs: #244, #248 + security hardening commit
🤖 Generated with claude-flow
Release v2.0.5
RuVector Release v2.0.5
Published Packages
crates.io
- `ruvector-math`: Advanced math primitives
- `ruvector-attention`: 7-theory attention mechanisms
- `ruvector-math-wasm`: WASM bindings for math
- `ruvector-attention-wasm`: WASM bindings for attention
npm
- `@ruvector/math-wasm`: Browser WASM package
- `@ruvector/attention`: Main Node.js package (auto-selects platform)
- `@ruvector/attention-wasm`: Browser WASM package
- Platform-specific: linux-x64, linux-arm64, darwin-x64, darwin-arm64, win32-x64
Installation
```shell
# Rust
cargo add ruvector-math ruvector-attention

# Node.js (auto-selects correct binary)
npm install @ruvector/attention

# Browser (WASM)
npm install @ruvector/math-wasm @ruvector/attention-wasm
```
Release v2.0.4
RuVector Release v2.0.4
Published Packages
crates.io
- `ruvector-math`: Advanced math primitives
- `ruvector-attention`: 7-theory attention mechanisms
- `ruvector-math-wasm`: WASM bindings for math
- `ruvector-attention-wasm`: WASM bindings for attention
npm
- `@ruvector/math-wasm`: Browser WASM package
- `@ruvector/attention`: Main Node.js package (auto-selects platform)
- `@ruvector/attention-wasm`: Browser WASM package
- Platform-specific: linux-x64, linux-arm64, darwin-x64, darwin-arm64, win32-x64
Installation
```shell
# Rust
cargo add ruvector-math ruvector-attention

# Node.js (auto-selects correct binary)
npm install @ruvector/attention

# Browser (WASM)
npm install @ruvector/math-wasm @ruvector/attention-wasm
```
Release v0.3.0
RuVector Release v0.3.0
Published Packages
crates.io
- `ruvector-math`: Advanced math primitives
- `ruvector-attention`: 7-theory attention mechanisms
- `ruvector-math-wasm`: WASM bindings for math
- `ruvector-attention-wasm`: WASM bindings for attention
npm
- `@ruvector/math-wasm`: Browser WASM package
- `@ruvector/attention`: Main Node.js package (auto-selects platform)
- `@ruvector/attention-wasm`: Browser WASM package
- Platform-specific: linux-x64, linux-arm64, darwin-x64, darwin-arm64, win32-x64
Installation
```shell
# Rust
cargo add ruvector-math ruvector-attention

# Node.js (auto-selects correct binary)
npm install @ruvector/attention

# Browser (WASM)
npm install @ruvector/math-wasm @ruvector/attention-wasm
```
RVF CLI rvf-v0.1.0
RVF CLI Release
Standalone vector database CLI for creating, querying, and managing RVF stores.
Download
| Platform | Binary |
|---|---|
| Linux x64 | rvf-linux-x64 |
| Linux ARM64 | rvf-linux-arm64 |
| macOS x64 (Intel) | rvf-darwin-x64 |
| macOS ARM64 (Apple Silicon) | rvf-darwin-arm64 |
| Windows x64 | rvf-windows-x64.exe |
Quick start
```shell
# Download (macOS ARM64 example)
curl -L -o rvf https://github.com/ruvnet/ruvector/releases/download/rvf-v0.1.0/rvf-darwin-arm64
chmod +x rvf

# Create a store and query
./rvf create mydb.rvf --dimension 128 --metric cosine
./rvf status mydb.rvf
```
See the CLI README for full documentation.
Release v0.1.32
RuVector Release v0.1.32
Published Packages
crates.io
- `ruvector-math`: Advanced math primitives
- `ruvector-attention`: 7-theory attention mechanisms
- `ruvector-math-wasm`: WASM bindings for math
- `ruvector-attention-wasm`: WASM bindings for attention
npm
- `@ruvector/math-wasm`: Browser WASM package
- `@ruvector/attention`: Main Node.js package (auto-selects platform)
- `@ruvector/attention-wasm`: Browser WASM package
- Platform-specific: linux-x64, linux-arm64, darwin-x64, darwin-arm64, win32-x64
Installation
```shell
# Rust
cargo add ruvector-math ruvector-attention

# Node.js (auto-selects correct binary)
npm install @ruvector/attention

# Browser (WASM)
npm install @ruvector/math-wasm @ruvector/attention-wasm
```
Release v0.1.31
RuVector Release v0.1.31
Published Packages
crates.io
- `ruvector-math`: Advanced math primitives
- `ruvector-attention`: 7-theory attention mechanisms
- `ruvector-math-wasm`: WASM bindings for math
- `ruvector-attention-wasm`: WASM bindings for attention
npm
- `ruvector-math-wasm`: Browser WASM package
- `@ruvector/attention`: Main Node.js package (auto-selects platform)
- `@ruvector/attention-wasm`: Browser WASM package
- Platform-specific: linux-x64, linux-arm64, darwin-x64, darwin-arm64, win32-x64
Installation
```shell
# Rust
cargo add ruvector-math ruvector-attention

# Node.js (auto-selects correct binary)
npm install @ruvector/attention

# Browser (WASM)
npm install ruvector-math-wasm @ruvector/attention-wasm
```
v0.1.4: chore: Update NAPI-RS binaries for all platforms
Built from commit b5b4858a26debd612e5ddbc80964d048dba401ff
Platforms updated:
- linux-x64-gnu
- linux-arm64-gnu
- darwin-x64
- darwin-arm64
- win32-x64-msvc
🤖 Generated by GitHub Actions