From c2d70945b1f52a0aeb9ee9205d7208372405edd2 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 11:18:43 -0700 Subject: [PATCH 01/73] =?UTF-8?q?docs:=20cycle=20close=20=E2=80=94=20retro?= =?UTF-8?q?spective,=20changelog,=20design=20alignment=20audit?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Retrospective: TSC zero + JoinReducer OpStrategy (docs/archive/retrospectives/2026-04-01-*.md) - CHANGELOG: TSC campaign, OpStrategy registry, dot-notation disable, EffectSinkPort widening - Design alignment audit: all points aligned or accepted drift - Playback: both hills achieved - Lessons: agent over-refactoring, worktree leakage, test-per-merge --- CHANGELOG.md | 4 + ...04-01-tsc-zero-and-joinreducer-strategy.md | 143 ++++++++++++++++++ 2 files changed, 147 insertions(+) create mode 100644 docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md diff --git a/CHANGELOG.md b/CHANGELOG.md index 43ac9b7f..7e067f99 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -9,6 +9,10 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Changed +- **Zero-error TypeScript campaign complete** — eliminated all 1,707 `tsc --noEmit` errors across 271 files. Mechanical TS4111 bracket-access sweep (614), null guards for `noUncheckedIndexedAccess`, conditional spreads for `exactOptionalPropertyTypes`, unused variable removal. All 8 pre-push IRONCLAD gates now pass. +- **JoinReducer OpStrategy registry** — replaced five triplicated switch statements over 8 canonical op types with a frozen `Map` registry. Each strategy defines `mutate`, `outcome`, `snapshot`, `accumulate`, `validate`. Adding a new op type without all five methods is a hard error at module load time. Cross-path equivalence tests verify `applyFast`, `applyWithReceipt`, and `applyWithDiff` produce identical CRDT state. 
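
The registry-with-load-time-validation shape described in the entry above can be sketched roughly as follows. Everything here is an illustrative assumption (type names, signatures, the `buildRegistry` helper), not the actual JoinReducer source:

```typescript
// Hypothetical sketch of an OpStrategy registry with load-time
// completeness validation. Names and signatures are assumed, not
// taken from the real JoinReducer implementation.
interface OpStrategy {
  mutate(state: Map<string, unknown>, op: unknown): void;
  outcome(op: unknown): string;
  snapshot(state: Map<string, unknown>): unknown;
  accumulate(acc: unknown[], op: unknown): void;
  validate(op: unknown): void;
}

const REQUIRED = ['mutate', 'outcome', 'snapshot', 'accumulate', 'validate'] as const;

function buildRegistry(
  entries: Record<string, Partial<OpStrategy>>
): ReadonlyMap<string, OpStrategy> {
  const registry = new Map<string, OpStrategy>();
  for (const [opType, strategy] of Object.entries(entries)) {
    // The completeness check runs when the module is imported: an op
    // type missing any of the five methods fails at load, not at the
    // first call that happens to hit the missing method.
    for (const method of REQUIRED) {
      if (typeof strategy[method] !== 'function') {
        throw new Error(`OpStrategy "${opType}" is missing ${method}()`);
      }
    }
    registry.set(opType, strategy as OpStrategy);
  }
  // Freezing the wrapper object plus the ReadonlyMap return type
  // signals that the registry is not to be extended after load.
  return Object.freeze(registry);
}
```

Each apply path would then dispatch through `registry.get(op.type)` instead of its own switch, which is what makes cross-path equivalence tests meaningful: all three paths consult the same strategy object.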
+- **ESLint `dot-notation` disabled** — conflicts with `noPropertyAccessFromIndexSignature` tsconfig flag. The TypeScript flag provides type safety; the ESLint rule is purely stylistic. +- `EffectSinkPort.deliver()` return type widened to `DeliveryObservation | DeliveryObservation[]` to match `MultiplexSink` behavior. - **Zero-error lint campaign complete** — eliminated all 1,876 ESLint errors across ~180 source files. Every raw `Error` replaced with domain error classes. Every port stub uses `WarpError` with `E_NOT_IMPLEMENTED`. `MessageCodecInternal` type-poisoning from `@git-stunts/trailer-codec` fixed at root via `unknown` intermediary casts. Errors barrel (`src/domain/errors/index.js`) now exports all 27 error classes. - **Lint ratchet enforcement** — `npm run lint:ratchet` asserts zero ESLint errors codebase-wide. Added as CI Gate 4b. Pre-push hook (Gate 4) already blocked non-zero exits; ratchet makes the invariant explicit and auditable. - **Git hooks wired** — `core.hooksPath` set to `scripts/hooks/` on `npm install`. Pre-commit lints staged JS files. Pre-push runs full 8-gate IRONCLAD firewall. 
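
For readers unfamiliar with the conflict behind the `dot-notation` change, a minimal illustration (the `Env` type is invented for demonstration, not project code):

```typescript
// Why the stock ESLint dot-notation rule conflicts with the
// noPropertyAccessFromIndexSignature tsconfig flag. Env is a made-up
// example type.
interface Env {
  NODE_ENV: string;          // declared property: dot access is allowed
  [key: string]: string;     // index signature: bracket access is required
}

const env: Env = { NODE_ENV: 'test', REGION: 'us-east-1' };

const mode = env.NODE_ENV;    // fine under both tools
const region = env['REGION']; // required by the tsconfig flag, but
                              // reported by ESLint's core dot-notation rule
```

The TS-aware `@typescript-eslint/dot-notation` rule, with its `allowIndexSignaturePropertyAccess` option, can reconcile the two without disabling dot-notation linting outright.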
diff --git a/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md b/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md new file mode 100644 index 00000000..ca22f684 --- /dev/null +++ b/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md @@ -0,0 +1,143 @@ +# Retrospective: TSC Zero Campaign + JoinReducer OpStrategy + +Date: 2026-04-01 + +Cycle: IRONCLAD / JoinReducer structural coupling + +PR: git-stunts/git-warp#73 + +## Governing Design Inputs + +- `.claude/tsc-zero-campaign-prompt.md` — campaign brief (error landscape, lane + partitioning, gate list) +- `docs/design/joinreducer-op-strategy.md` — strategy registry design +- `adr/ADR-0001-*.md` — canonical op normalization (prior art for op type + taxonomy) + +## What Landed + +### TSC Zero Campaign + +- **1,707 TypeScript errors → 0** across 271 files +- **1,876 ESLint errors → 0** (from prior lint campaign, included in branch) +- **5 markdown lint issues → 0** +- All 8 pre-push gates green: tsc, IRONCLAD policy, consumer types, ESLint, + lint ratchet, declaration surface, markdown lint, unit tests +- 5,142 tests green — zero behavioral regressions + +Key changes: +- Mechanical TS4111 bracket-access sweep (614 errors, Node script) +- 8-lane parallel agent campaign for 1,093 strictness errors +- ESLint `dot-notation` rule disabled (conflicts with + `noPropertyAccessFromIndexSignature`) +- `.claude/**` added to ESLint ignores and vitest excludes +- `EffectSinkPort.deliver()` return type widened to + `DeliveryObservation | DeliveryObservation[]` in `index.d.ts` +- `publicLens` → `publicAperture` in consumer type fixture + +### JoinReducer OpStrategy Registry + +- Frozen `Map` with 8 entries (one per canonical op type) +- Each strategy defines 5 methods: `mutate`, `outcome`, `snapshot`, + `accumulate`, `validate` +- Load-time validation: missing method = hard error at import +- Three apply paths (`applyFast`, `applyWithReceipt`, `applyWithDiff`) 
rewired + to use registry — no more triplicated switches +- 15 new tests: 5 registry structure + 10 cross-path equivalence +- Net: +276 / -270 lines (file size neutral) + +## Design Alignment Audit + +### TSC Zero + +- all 8 pre-push gates pass: **aligned** +- no `@ts-ignore`, `@ts-expect-error`, `as any`: **aligned** (two `any` casts + were caught and removed before merge) +- no behavioral changes: **partially aligned** — three agent-authored files + (WarpRuntime.js, Observer.js, WormholeService.js) had behavioral regressions + caught by tests; originals restored with minimal type-only fixes +- ESLint zero preserved: **aligned** +- `dot-notation` rule disabled: **deliberate tradeoff** — `noPropertyAccessFromIndexSignature` + provides actual type safety; `dot-notation` is purely stylistic; they + conflict directly + +### JoinReducer OpStrategy + +- structural coupling guarantee (can't add op without all 5 methods): **aligned** +- `applyFast` zero overhead preserved: **aligned** — still calls only + `strategy.mutate()` (plus `strategy.validate()`, matching prior behavior) +- public API unchanged: **aligned** — all signatures and return types identical +- cross-path state equivalence tested: **aligned** +- dead code removed (5 switch bodies): **aligned** + +## Observed Drift + +### 1. Agent over-refactoring (TSC campaign) + +Three of eight lane agents made behavioral changes while "fixing types": +- WarpRuntime.js: deleted `buildEffectPipeline`, rearranged imports +- Observer.js: added `_preInitFields()` that broke `_host` access +- WormholeService.js: removed null guard in `deserializeWormhole` + +425 test failures resulted. All caught by Gate 8 (unit tests). + +**Resolution:** Originals restored; minimal type-only fixes applied. Agent +prompts must be explicit: "NO behavioral changes, NO function deletion, NO +restructuring." + +**Status:** accepted — lesson captured in claude-think for future sessions. + +### 2. 
Worktree test/lint leakage + +Agent worktrees under `.claude/worktrees/` were picked up by ESLint (6,920 +false errors) and vitest (1 duplicate test failure). + +**Resolution:** Added `.claude/**` to ESLint ignores and vitest excludes. + +**Status:** accepted — permanent fix in config. + +### 3. `EffectSinkPort.deliver()` return type widened + +`MultiplexSink.deliver()` returns `DeliveryObservation[]` but the port +declared `DeliveryObservation`. Lane 3 agent widened the port; `index.d.ts` +updated to match. + +**Resolution:** This is a real API surface change. Downstream consumers that +call `.deliver()` may need to handle the array case. + +**Status:** accepted — the widening is correct (multiplex sink fans out to N +sinks, naturally returns N observations). + +## Playback + +### Hills + +1. **"A developer can `git push` without the pre-push firewall blocking on + type errors."** — Achieved. All 8 gates pass. + +2. **"Adding a 9th op type to JoinReducer without defining all behaviors is a + hard error at module load time."** — Achieved. Load-time validation + enforces completeness. + +### What surprised us + +- The TS4111 mechanical fix (614 errors) cascaded: fixing bracket access + resolved type inference for hundreds of downstream `noUncheckedIndexedAccess` + errors. 1,707 → 1,093 from a single category. +- `exactOptionalPropertyTypes` was the hardest strictness flag — it requires + conditional spread (`...(x !== undefined ? {x} : {})`) everywhere optional + params touch `undefined`. This is the flag most likely to generate ongoing + friction. +- The JoinReducer was less broken than the audit suggested. The CRDT kernel + was never bifurcated — only the metadata layers were triplicated. But the + strategy pattern is still the right fix for coupling. + +### What we'd do differently + +- **Gate agent behavior more tightly.** The prompt "fix TypeScript errors" + is too vague — agents interpret it as license to refactor. 
Future prompts + must say: "type annotations only, no behavioral changes, no function + deletion, no helper extraction." +- **Run tests between every merge, not just at the end.** We merged 8 + worktree branches before testing. Should have tested after each merge to + isolate regressions. From 8b6276c9c2c5a4dc5042bdc2f9cce5596fc42474 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 11:29:02 -0700 Subject: [PATCH 02/73] =?UTF-8?q?docs:=20honest=20drift=20audit=20?= =?UTF-8?q?=E2=80=94=20B171-B174=20backlog=20items,=20updated=20retro?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit ROADMAP: added P1b tier with 4 high-priority items from TSC campaign drift: - B171: audit 27 agent-authored files merged without review - B172: restore dot-notation via @typescript-eslint/dot-notation - B173: EffectSinkPort breaking change hygiene - B174: @git-stunts/trailer-codec type declarations Retro updated with 3 additional drift items that were initially glossed: - No design doc for TSC campaign (process skip) - dot-notation disabled globally (implementation shortcut) - 27 files merged via checkout --theirs (not aligned) --- docs/ROADMAP.md | 13 ++++++- ...04-01-tsc-zero-and-joinreducer-strategy.md | 35 +++++++++++++++++-- 2 files changed, 45 insertions(+), 3 deletions(-) diff --git a/docs/ROADMAP.md b/docs/ROADMAP.md index 8d5e30ed..544ab97b 100644 --- a/docs/ROADMAP.md +++ b/docs/ROADMAP.md @@ -203,6 +203,17 @@ P1 is complete on `v15`: B36 and B37 landed as the shared test-foundation pass, | B19 | ✅ **CANONICAL SERIALIZATION PROPERTY TESTS** — Seeded `fast-check` coverage now verifies `canonicalStringify()` idempotency and determinism. | S | | B22 | ✅ **CANONICAL PARSE DETERMINISM TEST** — Repeated `TrustRecordSchema.parse()` canonicalization is now property-tested for stable output. 
| S | +### P1b — TSC Zero Campaign Drift Audit ⚠️ HIGH PRIORITY + +The TSC zero campaign (PR #73) eliminated 1,707 type errors but introduced process drift that must be audited before the next release. + +| ID | Item | Effort | +| ---- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------ | +| B171 | **TSC CAMPAIGN AGENT-AUTHORED CODE AUDIT** — 27 files were merged via `checkout --theirs` during worktree conflict resolution without line-by-line review. Tests pass, but test coverage does not guarantee absence of subtle semantic drift (e.g. changed fallback values, widened types, reordered logic). Audit every agent-authored file diff against the pre-campaign baseline. Revert anything suspicious. | L | +| B172 | **RESTORE `dot-notation` VIA `@typescript-eslint/dot-notation`** — ESLint `dot-notation` was disabled globally to resolve conflict with `noPropertyAccessFromIndexSignature`. The proper fix is switching to `@typescript-eslint/dot-notation` which respects the tsconfig flag. This restores lint coverage for actual dot-notation misuse while allowing bracket access on index signatures. | S | +| B173 | **EFFECTSINKPORT BREAKING CHANGE HYGIENE** — `EffectSinkPort.deliver()` return type was widened from `DeliveryObservation` to `DeliveryObservation \| DeliveryObservation[]` in `index.d.ts`. This is a breaking API surface change that shipped without a `BREAKING CHANGE` commit footer. Assess downstream impact and decide: (a) revert the widening and fix MultiplexSink to unwrap, or (b) accept it and document as a breaking change for the next major version. 
| S | +| B174 | **`@git-stunts/trailer-codec` TYPE DECLARATIONS** — `getCodec()` in `MessageCodecInternal.js` returns an untyped `TrailerCodec`, forcing 6+ downstream files to cast through `unknown` intermediary. Root fix: add `index.d.ts` to the `@git-stunts/trailer-codec` package upstream. | M | + ### P2 — CI & Tooling (one batch PR) `B83`, `B85`, `B57`, `B86`, `B87`, and `B168` are now merged on `main`. PR #69 also landed the issue-45 content metadata API and closed the last open GitHub issue. The repo now runs both markdownlint and the Markdown JS/TS code-sample linter in the CI fast gate and the local `scripts/hooks/pre-push` firewall, and the hook's gate labels/quick-mode messaging now have dedicated regression coverage. The tracked backlog now stands at 26 standalone items after adding the native-vs-WASM roaring benchmark slice, and remaining P2 work still starts at B88. B123 is still the largest item and may need to split out if the PR gets too big. @@ -397,7 +408,7 @@ B158 (P7) ──→ B159 (P7) CDC seek cache | **Milestone (M12)** | 18 | B66, B67, B70, B73, B75, B105–B115, B117, B118 | | **Milestone (M13)** | 1 | B116 (internal: DONE; wire-format: DEFERRED) | | **Milestone (M14)** | 16 | B130–B145 | -| **Standalone** | 26 | B12, B28, B34–B35, B43, B53, B54, B76, B79, B88, B96, B98, B102–B104, B119, B123, B127–B129, B147, B152, B155–B156, B169–B170 | +| **Standalone** | 30 | B12, B28, B34–B35, B43, B53, B54, B76, B79, B88, B96, B98, B102–B104, B119, B123, B127–B129, B147, B152, B155–B156, B169–B174 | | **Standalone (done)** | 62 | B19, B22, B26, B36–B37, B44, B46, B47, B48–B52, B55, B57, B71, B72, B77, B78, B80–B87, B89–B95, B97, B99–B100, B120–B122, B124, B125, B126, B146, B148–B151, B153, B154, B157–B168 | | **Deferred** | 7 | B4, B7, B16, B20, B21, B27, B101 | | **Rejected** | 7 | B5, B6, B13, B17, B18, B25, B45 | diff --git a/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md 
b/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md index ca22f684..82f8d1aa 100644 --- a/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md +++ b/docs/archive/retrospectives/2026-04-01-tsc-zero-and-joinreducer-strategy.md @@ -70,7 +70,7 @@ Key changes: - cross-path state equivalence tested: **aligned** - dead code removed (5 switch bodies): **aligned** -## Observed Drift +## Observed Drift (Updated — Honest Accounting) ### 1. Agent over-refactoring (TSC campaign) @@ -106,7 +106,38 @@ updated to match. call `.deliver()` may need to handle the array case. **Status:** accepted — the widening is correct (multiplex sink fans out to N -sinks, naturally returns N observations). +sinks, naturally returns N observations). **But:** shipped without a +`BREAKING CHANGE` commit footer, violating CLAUDE.md rules. Tracked as B173. + +### 4. No design doc for TSC campaign + +The CONTRIBUTING.md process says "design docs first." The TSC campaign went +straight from a prompt file to implementation with no design doc, no hills, +no explicit non-goals. The JoinReducer work followed the process correctly +(design doc → failing tests → implementation). The TSC work did not. + +**Status:** accepted — the TSC campaign was a mechanical cleanup, not a +design decision. But the process exception should have been called out +explicitly, not just skipped silently. + +### 5. ESLint `dot-notation` disabled globally (shortcut) + +Disabling the rule globally was a shortcut. The proper fix is +`@typescript-eslint/dot-notation` which respects +`noPropertyAccessFromIndexSignature`. We're already using the TS-ESLint +parser, so the switch is straightforward. Tracked as B172. + +**Status:** implementation shortcut — accepted for now, tracked for fix. + +### 6. 27 agent-authored files merged without line-by-line review + +Merge conflicts were resolved via `git checkout --theirs` (take the agent's +version). 
Tests caught the 3 egregious regressions, but 27 files have +agent-authored type fixes that were never audited for subtle semantic drift +(e.g. changed fallback values, widened types, reordered logic). Tests +passing does not guarantee absence of drift. + +**Status:** **not aligned** — tracked as B171 (high priority audit). ## Playback From c654184405d58a6ffeca7b73965aa3b5ad8247c4 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 11:35:05 -0700 Subject: [PATCH 03/73] =?UTF-8?q?chore:=20migrate=20ROADMAP=20items=20to?= =?UTF-8?q?=20flat=20BACKLOG=20=E2=80=94=20drop=20milestones?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B101.md | 12 ++++++ BACKLOG/B102.md | 12 ++++++ BACKLOG/B103.md | 13 ++++++ BACKLOG/B104.md | 12 ++++++ BACKLOG/B116.md | 21 ++++++++++ BACKLOG/B119.md | 12 ++++++ BACKLOG/B12.md | 12 ++++++ BACKLOG/B123.md | 13 ++++++ BACKLOG/B127.md | 12 ++++++ BACKLOG/B128.md | 12 ++++++ BACKLOG/B129.md | 12 ++++++ BACKLOG/B147.md | 13 ++++++ BACKLOG/B152.md | 13 ++++++ BACKLOG/B155.md | 13 ++++++ BACKLOG/B156.md | 12 ++++++ BACKLOG/B16.md | 12 ++++++ BACKLOG/B169.md | 12 ++++++ BACKLOG/B170.md | 12 ++++++ BACKLOG/B171.md | 13 ++++++ BACKLOG/B172.md | 13 ++++++ BACKLOG/B173.md | 13 ++++++ BACKLOG/B174.md | 13 ++++++ BACKLOG/B20.md | 12 ++++++ BACKLOG/B21.md | 12 ++++++ BACKLOG/B27.md | 12 ++++++ BACKLOG/B28.md | 12 ++++++ BACKLOG/B34.md | 12 ++++++ BACKLOG/B35.md | 12 ++++++ BACKLOG/B4.md | 13 ++++++ BACKLOG/B43.md | 12 ++++++ BACKLOG/B53.md | 12 ++++++ BACKLOG/B54.md | 12 ++++++ BACKLOG/B7.md | 12 ++++++ BACKLOG/B76.md | 13 ++++++ BACKLOG/B79.md | 14 +++++++ BACKLOG/B88.md | 13 ++++++ BACKLOG/B96.md | 13 ++++++ BACKLOG/B98.md | 13 ++++++ BACKLOG/README.md | 42 ++++--------------- BACKLOG/{ => done}/OG-001-worldline-api.md | 0 .../{ => done}/OG-002-warpgraph-role-split.md | 0 .../OG-003-snapshot-immutability.md | 0 .../OG-004-observer-seek-contract.md | 0 
.../OG-005-detached-read-benchmarks.md | 0 .../OG-006-read-api-doc-consistency.md | 0 .../OG-007-hash-stability-coverage.md | 0 .../OG-008-retargeting-compatibility.md | 0 .../OG-010-public-api-design-thinking.md | 0 .../OG-012-documentation-corpus-audit.md | 0 .../OG-014-streaming-content-attachments.md | 0 .../OG-015-jsr-documentation-quality.md | 0 .../OG-016-retrospective-archive-cleanup.md | 0 CHANGELOG.md | 2 + docs/ROADMAP.md | 3 ++ 54 files changed, 495 insertions(+), 33 deletions(-) create mode 100644 BACKLOG/B101.md create mode 100644 BACKLOG/B102.md create mode 100644 BACKLOG/B103.md create mode 100644 BACKLOG/B104.md create mode 100644 BACKLOG/B116.md create mode 100644 BACKLOG/B119.md create mode 100644 BACKLOG/B12.md create mode 100644 BACKLOG/B123.md create mode 100644 BACKLOG/B127.md create mode 100644 BACKLOG/B128.md create mode 100644 BACKLOG/B129.md create mode 100644 BACKLOG/B147.md create mode 100644 BACKLOG/B152.md create mode 100644 BACKLOG/B155.md create mode 100644 BACKLOG/B156.md create mode 100644 BACKLOG/B16.md create mode 100644 BACKLOG/B169.md create mode 100644 BACKLOG/B170.md create mode 100644 BACKLOG/B171.md create mode 100644 BACKLOG/B172.md create mode 100644 BACKLOG/B173.md create mode 100644 BACKLOG/B174.md create mode 100644 BACKLOG/B20.md create mode 100644 BACKLOG/B21.md create mode 100644 BACKLOG/B27.md create mode 100644 BACKLOG/B28.md create mode 100644 BACKLOG/B34.md create mode 100644 BACKLOG/B35.md create mode 100644 BACKLOG/B4.md create mode 100644 BACKLOG/B43.md create mode 100644 BACKLOG/B53.md create mode 100644 BACKLOG/B54.md create mode 100644 BACKLOG/B7.md create mode 100644 BACKLOG/B76.md create mode 100644 BACKLOG/B79.md create mode 100644 BACKLOG/B88.md create mode 100644 BACKLOG/B96.md create mode 100644 BACKLOG/B98.md rename BACKLOG/{ => done}/OG-001-worldline-api.md (100%) rename BACKLOG/{ => done}/OG-002-warpgraph-role-split.md (100%) rename BACKLOG/{ => done}/OG-003-snapshot-immutability.md (100%) rename 
BACKLOG/{ => done}/OG-004-observer-seek-contract.md (100%) rename BACKLOG/{ => done}/OG-005-detached-read-benchmarks.md (100%) rename BACKLOG/{ => done}/OG-006-read-api-doc-consistency.md (100%) rename BACKLOG/{ => done}/OG-007-hash-stability-coverage.md (100%) rename BACKLOG/{ => done}/OG-008-retargeting-compatibility.md (100%) rename BACKLOG/{ => done}/OG-010-public-api-design-thinking.md (100%) rename BACKLOG/{ => done}/OG-012-documentation-corpus-audit.md (100%) rename BACKLOG/{ => done}/OG-014-streaming-content-attachments.md (100%) rename BACKLOG/{ => done}/OG-015-jsr-documentation-quality.md (100%) rename BACKLOG/{ => done}/OG-016-retrospective-archive-cleanup.md (100%) diff --git a/BACKLOG/B101.md b/BACKLOG/B101.md new file mode 100644 index 00000000..45c9cddd --- /dev/null +++ b/BACKLOG/B101.md @@ -0,0 +1,12 @@ +# B101 — Mermaid `~~~` Invisible-Link Fragility + +**Effort:** XS +**Origin:** B-DIAG-3 + +## Problem + +Undocumented Mermaid feature (`~~~`) used for positioning. Fragile and could break on renderer updates. + +## Notes + +- **Trigger:** Promote if Mermaid renderer update breaks `~~~` positioning diff --git a/BACKLOG/B102.md b/BACKLOG/B102.md new file mode 100644 index 00000000..fa7ea9b4 --- /dev/null +++ b/BACKLOG/B102.md @@ -0,0 +1,12 @@ +# B102 — API Examples Review Checklist + +**Effort:** S +**Origin:** B-DOC-3 + +## Problem + +Add to `CONTRIBUTING.md`: each `createPatch()`/`commit()` uses own builder, async methods `await`ed, examples copy-pasteable. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B103.md b/BACKLOG/B103.md new file mode 100644 index 00000000..d6f85697 --- /dev/null +++ b/BACKLOG/B103.md @@ -0,0 +1,13 @@ +# B103 — Batch Review Fix Commits + +**Effort:** XS +**Origin:** B-DX-2 + +## Problem + +Batch all review fixes into one commit before re-requesting CodeRabbit. Reduces duplicate findings across incremental pushes. 
+ +## Notes + +- Process improvement, not code change +- Low urgency diff --git a/BACKLOG/B104.md b/BACKLOG/B104.md new file mode 100644 index 00000000..40e80305 --- /dev/null +++ b/BACKLOG/B104.md @@ -0,0 +1,12 @@ +# B104 — Mermaid Diagram Content Checklist + +**Effort:** XS +**Origin:** B-DIAG-1 + +## Problem + +For diagram migrations: count annotations in source/target, verify edge labels survive, check complexity annotations preserved. Prevents information loss when converting diagrams. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B116.md b/BACKLOG/B116.md new file mode 100644 index 00000000..ef71d2ae --- /dev/null +++ b/BACKLOG/B116.md @@ -0,0 +1,21 @@ +# B116 — Persisted Wire-Format Migration (ADR 2) — EdgePropSet + +**Effort:** XL +**Origin:** M13.T3 / ADR 2 + +## Problem + +Promote `EdgePropSet` to persisted raw op type (schema version 4). Requires graph capability ratchet, mixed v3+v4 materialization, read-path accepting both legacy and new format, and sync emitting raw `EdgePropSet` only after graph capability cutover. + +## Notes + +- **Status:** DEFERRED — governed by ADR 3 readiness gates +- **Risk:** HIGH +- **Depends on:** ADR 3 Gate 1 satisfaction +- ADR 3 Gate 1 prerequisites (not yet met): + - Historical identifier audit complete + - Observability plan exists + - Graph capability design approved + - Rollout playbook exists + - ADR 2 tripwire tests written (beyond current wire gate tests) +- Gate: Mixed-schema materialization deterministic. `WarpGraph.noCoordination.test.js` passes with v3+v4 writers. No regression in existing patch replay. Full test suite green. ADR 3 Gate 1 and Gate 2 both satisfied. 
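
The mixed v3+v4 read-path requirement in B116 amounts to normalizing both encodings into one internal shape before materialization. A rough sketch under invented type names (the real v3/v4 layouts are not shown in this repo excerpt, so every field here is an assumption):

```typescript
// Hypothetical shapes for a legacy v3 per-key edge-prop op and the
// proposed v4 EdgePropSet raw op. Field names are illustrative only.
type V3EdgeProp = { schema: 3; edge: string; key: string; value: unknown };
type V4EdgePropSet = { schema: 4; edge: string; props: Record<string, unknown> };
type RawOp = V3EdgeProp | V4EdgePropSet;

// Normalizing every op to the v4 shape keeps materialization
// deterministic regardless of which schema version the writer emitted.
function normalize(op: RawOp): V4EdgePropSet {
  if (op.schema === 4) {
    return op; // already the canonical shape: pass through unchanged
  }
  // Lift a single v3 key/value pair into a one-entry props record.
  return { schema: 4, edge: op.edge, props: { [op.key]: op.value } };
}
```

In this framing, sync would emit raw v4 ops only after the graph capability cutover; until then the read path must accept both encodings from mixed writers, which is what the `WarpGraph.noCoordination.test.js` gate exercises.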
diff --git a/BACKLOG/B119.md b/BACKLOG/B119.md new file mode 100644 index 00000000..b956fd90 --- /dev/null +++ b/BACKLOG/B119.md @@ -0,0 +1,12 @@ +# B119 — `scripts/pr-ready` Merge-Readiness CLI + +**Effort:** M +**Origin:** BACKLOG 2026-02-27/28 + +## Problem + +No single tool aggregates unresolved review threads, pending/failed checks, CodeRabbit status/cooldown, and human-review count into one deterministic verdict. A `scripts/pr-ready` CLI would provide a single go/no-go answer before attempting merge. + +## Notes + +- Part of P2 CI & Tooling batch diff --git a/BACKLOG/B12.md b/BACKLOG/B12.md new file mode 100644 index 00000000..328b0792 --- /dev/null +++ b/BACKLOG/B12.md @@ -0,0 +1,12 @@ +# B12 — Docs-Version-Sync Pre-Commit Check + +**Effort:** S +**Origin:** ROADMAP standalone + +## Problem + +Grep version literals in `.md` files against `package.json` to catch stale version references before they're committed. + +## Notes + +- Part of P2 CI & Tooling batch diff --git a/BACKLOG/B123.md b/BACKLOG/B123.md new file mode 100644 index 00000000..2ca01be5 --- /dev/null +++ b/BACKLOG/B123.md @@ -0,0 +1,13 @@ +# B123 — Benchmark Budgets + CI Regression Gate + +**Effort:** L +**Origin:** BACKLOG 2026-02-27 + +## Problem + +Define perf thresholds for eager post-commit and materialize hash cost; fail CI on agreed regression. Without budgets, performance regressions slip in undetected. + +## Notes + +- Part of P2 CI & Tooling batch +- Largest remaining P2 item — may need to split out into its own PR diff --git a/BACKLOG/B127.md b/BACKLOG/B127.md new file mode 100644 index 00000000..ad30d1b7 --- /dev/null +++ b/BACKLOG/B127.md @@ -0,0 +1,12 @@ +# B127 — Deno Smoke Test + +**Effort:** S +**Origin:** BACKLOG 2026-02-25 + +## Problem + +`npm run test:deno:smoke` for fast local pre-push confidence without full Docker matrix. 
+ +## Notes + +- Platform item diff --git a/BACKLOG/B128.md b/BACKLOG/B128.md new file mode 100644 index 00000000..6f1fb193 --- /dev/null +++ b/BACKLOG/B128.md @@ -0,0 +1,12 @@ +# B128 — Docs Consistency Preflight + +**Effort:** S +**Origin:** BACKLOG 2026-02-28 + +## Problem + +Automated pass in `release:preflight` verifying changelog/readme/guide updates for behavior changes in hot paths (materialize, checkpoint, sync). Prevents releasing behavior changes without corresponding documentation updates. + +## Notes + +- Part of P2 CI & Tooling batch diff --git a/BACKLOG/B129.md b/BACKLOG/B129.md new file mode 100644 index 00000000..5e11ed9d --- /dev/null +++ b/BACKLOG/B129.md @@ -0,0 +1,12 @@ +# B129 — Contributor Review-Loop Hygiene Guide + +**Effort:** S +**Origin:** BACKLOG 2026-02-27 + +## Problem + +Add section to `CONTRIBUTING.md` covering commit sizing, CodeRabbit cooldown strategy, and when to request bot review. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B147.md b/BACKLOG/B147.md new file mode 100644 index 00000000..1a29d0bb --- /dev/null +++ b/BACKLOG/B147.md @@ -0,0 +1,13 @@ +# B147 — RFC Field Count Drift Detector + +**Effort:** S +**Origin:** B145 PR review + +## Problem + +Script that counts WarpGraph instance fields (grep `this._` in constructor) and warns if design RFC field counts diverge. Prevents stale numbers in `warpgraph-decomposition.md`. + +## Notes + +- Depends on B143 RFC (exists at `docs/design/warpgraph-decomposition.md`) +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B152.md b/BACKLOG/B152.md new file mode 100644 index 00000000..9aa191f3 --- /dev/null +++ b/BACKLOG/B152.md @@ -0,0 +1,13 @@ +# B152 — Async Generator Traversal API + +**Effort:** L +**Origin:** ROADMAP standalone (P4 Large-Graph Performance) + +## Problem + +Streaming variants of the remaining GraphTraversal algorithms (`bfsStream()`, `dfsStream()`, etc.) 
returning `AsyncGenerator` instead of collected arrays. Array-returning methods become sugar over `collect()`. + +## Notes + +- Prerequisite B151 (transitiveClosure streaming) is complete +- Part of P4 Large-Graph Performance tier diff --git a/BACKLOG/B155.md b/BACKLOG/B155.md new file mode 100644 index 00000000..fea0cbba --- /dev/null +++ b/BACKLOG/B155.md @@ -0,0 +1,13 @@ +# B155 — `levels()` as Lightweight `--view` Layout + +**Effort:** M +**Origin:** ROADMAP standalone (P5 Features) + +## Problem + +`levels()` is exactly the Y-axis assignment a layered DAG layout needs. For simple DAGs, `levels()` + left-to-right X sweep could produce clean layouts without the 2.5MB ELK import. Offer `--view --layout=levels` as an instant rendering mode, reserving ELK for complex graphs. + +## Notes + +- Files: `src/visualization/layouts/`, `bin/cli/commands/view.js` +- Part of P5 Features & Visualization tier diff --git a/BACKLOG/B156.md b/BACKLOG/B156.md new file mode 100644 index 00000000..e5c35bd1 --- /dev/null +++ b/BACKLOG/B156.md @@ -0,0 +1,12 @@ +# B156 — Structural Diff via Transitive Reduction + +**Effort:** L +**Origin:** ROADMAP standalone (P5 Features) + +## Problem + +Compute `transitiveReduction(stateA)` vs `transitiveReduction(stateB)` to produce a compact structural diff that strips implied edges and shows only "load-bearing" changes. Natural fit for H1 (Time-Travel Delta Engine) as `warp diff --mode=structural`. + +## Notes + +- Part of P5 Features & Visualization tier diff --git a/BACKLOG/B16.md b/BACKLOG/B16.md new file mode 100644 index 00000000..a69b0538 --- /dev/null +++ b/BACKLOG/B16.md @@ -0,0 +1,12 @@ +# B16 — `unsignedRecordForId` Edge-Case Tests + +**Effort:** S +**Origin:** ROADMAP deferred + +## Problem + +Additional edge-case test coverage for `unsignedRecordForId`. 
+ +## Notes + +- **Trigger:** Promote if canonical format changes diff --git a/BACKLOG/B169.md b/BACKLOG/B169.md new file mode 100644 index 00000000..2537dbec --- /dev/null +++ b/BACKLOG/B169.md @@ -0,0 +1,12 @@ +# B169 — Archived Doc Status Guardrail + +**Effort:** XS +**Origin:** PR #66 review follow-up + +## Problem + +Add a docs checklist or automated check preventing time-sensitive branch-state wording such as `pending merge` from landing in archive/history docs like `docs/ROADMAP/COMPLETED.md`. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B170.md b/BACKLOG/B170.md new file mode 100644 index 00000000..8ee9f802 --- /dev/null +++ b/BACKLOG/B170.md @@ -0,0 +1,12 @@ +# B170 — Native vs WASM Roaring Benchmark Pack + +**Effort:** M +**Origin:** ROADMAP standalone (P4 Large-Graph Performance) + +## Problem + +Design hot-path workload simulations from actual bitmap usage in `BitmapIndexBuilder`, `LogicalBitmapIndexBuilder`, `StreamingBitmapIndexBuilder`, `BitmapIndexReader`, and `IncrementalIndexUpdater`; benchmark native `roaring` under heavy-load scenarios, rerun the same workloads with `roaring-wasm`, and publish a decision memo with throughput/latency deltas and operational recommendations. + +## Notes + +- Part of P4 Large-Graph Performance tier diff --git a/BACKLOG/B171.md b/BACKLOG/B171.md new file mode 100644 index 00000000..2ee97a97 --- /dev/null +++ b/BACKLOG/B171.md @@ -0,0 +1,13 @@ +# B171 — TSC Campaign Agent-Authored Code Audit + +**Effort:** L +**Origin:** TSC zero campaign drift (PR #73) + +## Problem + +27 files were merged via `checkout --theirs` during worktree conflict resolution without line-by-line review. Tests pass, but test coverage does not guarantee absence of subtle semantic drift (e.g. changed fallback values, widened types, reordered logic). Audit every agent-authored file diff against the pre-campaign baseline. Revert anything suspicious. 
+ +## Notes + +- Source: P1b priority tier (TSC Zero Campaign Drift Audit) +- High priority — should be audited before next release diff --git a/BACKLOG/B172.md b/BACKLOG/B172.md new file mode 100644 index 00000000..f161cad8 --- /dev/null +++ b/BACKLOG/B172.md @@ -0,0 +1,13 @@ +# B172 — Restore `dot-notation` via `@typescript-eslint/dot-notation` + +**Effort:** S +**Origin:** TSC zero campaign drift (PR #73) + +## Problem + +ESLint `dot-notation` was disabled globally to resolve conflict with `noPropertyAccessFromIndexSignature`. The proper fix is switching to `@typescript-eslint/dot-notation` which respects the tsconfig flag. This restores lint coverage for actual dot-notation misuse while allowing bracket access on index signatures. + +## Notes + +- Source: P1b priority tier (TSC Zero Campaign Drift Audit) +- High priority diff --git a/BACKLOG/B173.md b/BACKLOG/B173.md new file mode 100644 index 00000000..1231c7eb --- /dev/null +++ b/BACKLOG/B173.md @@ -0,0 +1,13 @@ +# B173 — EffectSinkPort Breaking Change Hygiene + +**Effort:** S +**Origin:** TSC zero campaign drift (PR #73) + +## Problem + +`EffectSinkPort.deliver()` return type was widened from `DeliveryObservation` to `DeliveryObservation | DeliveryObservation[]` in `index.d.ts`. This is a breaking API surface change that shipped without a `BREAKING CHANGE` commit footer. Assess downstream impact and decide: (a) revert the widening and fix MultiplexSink to unwrap, or (b) accept it and document as a breaking change for the next major version. 
+ +## Notes + +- Source: P1b priority tier (TSC Zero Campaign Drift Audit) +- High priority diff --git a/BACKLOG/B174.md b/BACKLOG/B174.md new file mode 100644 index 00000000..ebdf9242 --- /dev/null +++ b/BACKLOG/B174.md @@ -0,0 +1,13 @@ +# B174 — `@git-stunts/trailer-codec` Type Declarations + +**Effort:** M +**Origin:** TSC zero campaign drift (PR #73) + +## Problem + +`getCodec()` in `MessageCodecInternal.js` returns an untyped `TrailerCodec`, forcing 6+ downstream files to cast through an `unknown` intermediary. Root fix: add `index.d.ts` to the `@git-stunts/trailer-codec` package upstream. + +## Notes + +- Source: P1b priority tier (TSC Zero Campaign Drift Audit) +- Fix is upstream in `@git-stunts/trailer-codec`, not in this repo diff --git a/BACKLOG/B20.md b/BACKLOG/B20.md new file mode 100644 index 00000000..02811a1c --- /dev/null +++ b/BACKLOG/B20.md @@ -0,0 +1,12 @@ +# B20 — Trust Record Round-Trip Snapshot Test + +**Effort:** S +**Origin:** ROADMAP deferred + +## Problem + +Snapshot test verifying trust record round-trip serialization stability. + +## Notes + +- **Trigger:** Promote if trust record schema changes diff --git a/BACKLOG/B21.md b/BACKLOG/B21.md new file mode 100644 index 00000000..8aa318a8 --- /dev/null +++ b/BACKLOG/B21.md @@ -0,0 +1,12 @@ +# B21 — Trust Schema Discriminated Union + +**Effort:** S +**Origin:** ROADMAP deferred + +## Problem + +Refactor trust schema from `superRefine` to a discriminated union for cleaner validation. + +## Notes + +- **Trigger:** Promote if superRefine causes a bug or blocks a feature diff --git a/BACKLOG/B27.md b/BACKLOG/B27.md new file mode 100644 index 00000000..a3fd87fe --- /dev/null +++ b/BACKLOG/B27.md @@ -0,0 +1,12 @@ +# B27 — `TrustKeyStore` Pre-Validated Key Cache + +**Effort:** S +**Origin:** ROADMAP deferred + +## Problem + +Cache pre-validated keys in `TrustKeyStore` to avoid repeated validation on hot paths.
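A minimal sketch of the cache shape: a map from key identifier to the validated key, so validation runs once per key. All names here are hypothetical; the real `TrustKeyStore` internals may differ, and a production cache would also need an invalidation story.

```javascript
// Hypothetical sketch of a pre-validated key cache. `validateKey` stands in
// for whatever expensive validation TrustKeyStore performs today.
function createValidatedKeyCache(validateKey) {
  const cache = new Map();
  return function getValidated(keyId, rawKey) {
    let entry = cache.get(keyId);
    if (entry === undefined) {
      entry = validateKey(rawKey); // runs once per keyId
      cache.set(keyId, entry);
    }
    return entry;
  };
}

// Usage: repeated lookups for the same keyId skip validation entirely.
let calls = 0;
const getKey = createValidatedKeyCache((raw) => {
  calls += 1;
  return { raw, valid: true };
});
getKey('k1', 'abc');
getKey('k1', 'abc');
// calls is 1: the second lookup hit the cache
```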
+ +## Notes + +- **Trigger:** Promote when `verifySignature` appears in any p95 flame graph above 5% of call time diff --git a/BACKLOG/B28.md b/BACKLOG/B28.md new file mode 100644 index 00000000..ce82b11a --- /dev/null +++ b/BACKLOG/B28.md @@ -0,0 +1,12 @@ +# B28 — Pure TypeScript Example App + +**Effort:** M +**Origin:** ROADMAP standalone + +## Problem + +CI compile-only stub (`tsc --noEmit` on minimal TS consumer) to verify the type declarations work end-to-end for TypeScript consumers. + +## Notes + +- Part of P3 Type Safety tier diff --git a/BACKLOG/B34.md b/BACKLOG/B34.md new file mode 100644 index 00000000..c76c0475 --- /dev/null +++ b/BACKLOG/B34.md @@ -0,0 +1,12 @@ +# B34 — Docs: SECURITY_SYNC.md + +**Effort:** M +**Origin:** ROADMAP standalone (P6 Docs) + +## Problem + +Extract threat model from JSDoc into operator-facing documentation. The sync threat model is currently buried in code comments and not accessible to operators deploying git-warp sync. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B35.md b/BACKLOG/B35.md new file mode 100644 index 00000000..36b3aeee --- /dev/null +++ b/BACKLOG/B35.md @@ -0,0 +1,12 @@ +# B35 — Docs: README Install Section + +**Effort:** S +**Origin:** ROADMAP standalone (P6 Docs) + +## Problem + +Quick Install section with Docker + native paths for the README. Current install instructions are incomplete. + +## Notes + +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B4.md b/BACKLOG/B4.md new file mode 100644 index 00000000..3862a92b --- /dev/null +++ b/BACKLOG/B4.md @@ -0,0 +1,13 @@ +# B4 — WARP UI Visualizer + +**Effort:** L +**Origin:** ROADMAP deferred + +## Problem + +Full UI visualizer for WARP graphs. Scope and UX goals not yet defined. 
+ +## Notes + +- **Trigger:** Promote when RFC filed with scoped UX goals +- B157 (browser compatibility) is complete — unblocks browser-side work diff --git a/BACKLOG/B43.md b/BACKLOG/B43.md new file mode 100644 index 00000000..37ab664e --- /dev/null +++ b/BACKLOG/B43.md @@ -0,0 +1,12 @@ +# B43 — Vitest Explicit Runtime Excludes + +**Effort:** S +**Origin:** ROADMAP standalone + +## Problem + +Prevent accidental local runs of Docker-only suites by adding explicit runtime excludes to the vitest configuration. + +## Notes + +- Part of P2 CI & Tooling batch diff --git a/BACKLOG/B53.md b/BACKLOG/B53.md new file mode 100644 index 00000000..2ba8bbc8 --- /dev/null +++ b/BACKLOG/B53.md @@ -0,0 +1,12 @@ +# B53 — Fix JSR Publish Dry-Run Deno Panic + +**Effort:** M +**Origin:** ROADMAP standalone (Platform) + +## Problem + +Deno 2.6.7 `deno_ast` panics on overlapping text changes from duplicate `roaring` import rewrites. Either pin Deno version, vendor the import, or file upstream issue and add workaround. + +## Notes + +- Promote if JSR publish becomes imminent diff --git a/BACKLOG/B54.md b/BACKLOG/B54.md new file mode 100644 index 00000000..8300cadc --- /dev/null +++ b/BACKLOG/B54.md @@ -0,0 +1,12 @@ +# B54 — `typedCustom()` Zod Helper + +**Effort:** S +**Origin:** ROADMAP standalone + +## Problem + +`z.custom()` without a generic yields `unknown` in JS; a JSDoc-friendly wrapper (or `@typedef`-based pattern) would eliminate verbose `/** @type {z.ZodType} */ (z.custom(...))` casts across HttpSyncServer and future Zod schemas. + +## Notes + +- Part of P3 Type Safety tier diff --git a/BACKLOG/B7.md b/BACKLOG/B7.md new file mode 100644 index 00000000..3e96cda2 --- /dev/null +++ b/BACKLOG/B7.md @@ -0,0 +1,12 @@ +# B7 — Doctor: Property-Based Fuzz Test + +**Effort:** M +**Origin:** ROADMAP deferred + +## Problem + +Property-based fuzz testing for the `doctor` / health check system. 
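A dependency-free sketch of the idea: generate many random inputs and assert an invariant over a stand-in check. A real implementation would target the actual doctor checks, likely through a property-testing library; everything named below is hypothetical.

```javascript
// Hypothetical invariant: a doctor check must return an object with a
// boolean `ok` and never throw, for any string input.
function exampleCheck(input) {
  return { ok: typeof input === 'string' && input.length < 256 };
}

function fuzz(check, iterations = 500) {
  for (let i = 0; i < iterations; i += 1) {
    const len = Math.floor(Math.random() * 512);
    const input = 'x'.repeat(len);
    const result = check(input);
    if (typeof result?.ok !== 'boolean') {
      throw new Error(`invariant violated for input length ${len}`);
    }
  }
}

fuzz(exampleCheck); // passes silently when the invariant holds
```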
+ +## Notes + +- **Trigger:** Promote when doctor check count exceeds 8 diff --git a/BACKLOG/B76.md b/BACKLOG/B76.md new file mode 100644 index 00000000..03d74a6f --- /dev/null +++ b/BACKLOG/B76.md @@ -0,0 +1,13 @@ +# B76 — WarpGraph Invisible API Surface Docs + +**Effort:** M +**Origin:** B-AUDIT-4 (STANK) + +## Problem + +Add `// API Surface` block listing all 40+ dynamically wired methods with source module. Consider generating as a build step. The dynamically composed API surface is invisible to developers reading the source. + +## Notes + +- File: `src/domain/WarpGraph.js:451-478` +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B79.md b/BACKLOG/B79.md new file mode 100644 index 00000000..ee716468 --- /dev/null +++ b/BACKLOG/B79.md @@ -0,0 +1,14 @@ +# B79 — WarpGraph Constructor Lifecycle Docs + +**Effort:** M +**Origin:** B-AUDIT-16 (TSK TSK) + +## Problem + +Document cache invalidation strategy for 25 instance variables: which operations dirty which caches, which flush them. + +## Notes + +- File: `src/domain/WarpGraph.js:69-198` +- Depends on B143 RFC (exists at `docs/design/warpgraph-decomposition.md`) +- Low urgency — fold into PRs that touch related files diff --git a/BACKLOG/B88.md b/BACKLOG/B88.md new file mode 100644 index 00000000..a1accb6d --- /dev/null +++ b/BACKLOG/B88.md @@ -0,0 +1,13 @@ +# B88 — Mermaid Rendering Smoke Test + +**Effort:** S +**Origin:** B-DIAG-2 + +## Problem + +Parse all ` ```mermaid ` blocks with `@mermaid-js/mermaid-cli` in CI to catch syntax errors in documentation diagrams before they reach users. 
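The extraction half of such a check can be sketched without dependencies: pull every fenced mermaid body out of a markdown string, ready to hand to the renderer. The function and regex below are assumptions, not existing repo code.

```javascript
// Hypothetical sketch: extract the body of every mermaid code fence from a
// markdown document; each body would then be fed to a renderer such as
// @mermaid-js/mermaid-cli to surface syntax errors. The backtick fence
// marker is built indirectly so this snippet can itself sit in a fence.
const FENCE = '\u0060\u0060\u0060'; // three backticks

function extractMermaidBlocks(markdown) {
  const pattern = new RegExp(`${FENCE}mermaid\\n([\\s\\S]*?)${FENCE}`, 'g');
  const blocks = [];
  let match;
  while ((match = pattern.exec(markdown)) !== null) {
    blocks.push(match[1].trimEnd());
  }
  return blocks;
}

const doc = ['# Title', FENCE + 'mermaid', 'graph TD', 'A --> B', FENCE].join('\n');
extractMermaidBlocks(doc); // ['graph TD\nA --> B']
```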
+ +## Notes + +- Target: `.github/workflows/ci.yml` or `scripts/` +- Part of P2 CI & Tooling batch diff --git a/BACKLOG/B96.md b/BACKLOG/B96.md new file mode 100644 index 00000000..55acf1b3 --- /dev/null +++ b/BACKLOG/B96.md @@ -0,0 +1,13 @@ +# B96 — Consumer Test Type-Only Import Coverage + +**Effort:** M +**Origin:** B-TYPE-1 + +## Problem + +Exercise all exported types beyond just declaring variables. Types like `OpOutcome`, `TraversalDirection`, `LogLevelValue` aren't tested at all. The consumer type test should verify all exported types are usable, not just importable. + +## Notes + +- File: `test/type-check/consumer.ts` +- Part of P3 Type Safety tier diff --git a/BACKLOG/B98.md b/BACKLOG/B98.md new file mode 100644 index 00000000..21440703 --- /dev/null +++ b/BACKLOG/B98.md @@ -0,0 +1,13 @@ +# B98 — Test-File Wildcard Ratchet + +**Effort:** S +**Origin:** B-TYPE-3 + +## Problem + +`ts-policy-check.js` excludes test files entirely. Need to either add a separate ratchet with higher threshold or document exclusion as intentional. + +## Notes + +- File: `scripts/ts-policy-check.js` +- Part of P3 Type Safety tier diff --git a/BACKLOG/README.md b/BACKLOG/README.md index b3aec82e..59fb64f9 100644 --- a/BACKLOG/README.md +++ b/BACKLOG/README.md @@ -1,37 +1,13 @@ -# BACKLOG — Observer Geometry +# BACKLOG -Last updated: 2026-03-29 +Items waiting to be pulled into a cycle. -This directory holds promotable pre-design items for the current Observer -Geometry tranche. +When an item is selected for work, move its file to `docs/design/cycles//` — the backlog file becomes the design doc for that cycle. -Workflow: +## Process -1. capture the slice here -2. promote it into `docs/design/` when selected -3. write tests as the executable spec -4. implement -5. 
add a retrospective - -## Active Items - -| Status | ID | Title | File | -| --- | --- | --- | --- | -| DONE | OG-001 | First-class `Worldline` API | [OG-001-worldline-api.md](OG-001-worldline-api.md) | -| DONE | OG-002 | Split mutable session `WarpRuntime` from immutable snapshot noun | [OG-002-warpgraph-role-split.md](OG-002-warpgraph-role-split.md) | -| DONE | OG-003 | Deepen public snapshot immutability | [OG-003-snapshot-immutability.md](OG-003-snapshot-immutability.md) | -| DONE | OG-004 | Canonical immutable observer seek contract | [OG-004-observer-seek-contract.md](OG-004-observer-seek-contract.md) | -| DONE | OG-005 | Benchmark detached coordinate and strand reads | [OG-005-detached-read-benchmarks.md](OG-005-detached-read-benchmarks.md) | -| DONE | OG-006 | Remove remaining docs/examples that imply caller retargeting | [OG-006-read-api-doc-consistency.md](OG-006-read-api-doc-consistency.md) | -| DONE | OG-007 | Expand hash-stability coverage across snapshot flavors | [OG-007-hash-stability-coverage.md](OG-007-hash-stability-coverage.md) | -| DONE | OG-008 | Make retargeting compatibility a hard major-version cut | [OG-008-retargeting-compatibility.md](OG-008-retargeting-compatibility.md) | -| QUEUED | OG-009 | Align playback-head and TTD consumers after read nouns stabilize | [OG-009-playback-head-alignment.md](OG-009-playback-head-alignment.md) | -| DONE | OG-010 | IBM Design Thinking pass over public APIs and README | [OG-010-public-api-design-thinking.md](OG-010-public-api-design-thinking.md) | -| QUEUED | OG-011 | Publish a public API catalog and browser documentation playground | [OG-011-public-api-catalog-and-playground.md](OG-011-public-api-catalog-and-playground.md) | -| DONE | OG-012 | Audit and reconcile the documentation corpus before v15 | [OG-012-documentation-corpus-audit.md](OG-012-documentation-corpus-audit.md) | -| QUEUED | OG-013 | Design out-of-core materialization and streaming reads | 
[OG-013-out-of-core-materialization-and-streaming-reads.md](OG-013-out-of-core-materialization-and-streaming-reads.md) | -| DONE | OG-014 | Mandatory CAS blob storage with streaming I/O | [OG-014-streaming-content-attachments.md](OG-014-streaming-content-attachments.md) | -| DONE | OG-015 | Raise JSR documentation quality score | [OG-015-jsr-documentation-quality.md](OG-015-jsr-documentation-quality.md) | -| DONE | OG-016 | Archive retrospective clutter | [OG-016-retrospective-archive-cleanup.md](OG-016-retrospective-archive-cleanup.md) | -| QUEUED | OG-017 | Break up the `index.d.ts` monolith | [OG-017-modular-type-declarations.md](OG-017-modular-type-declarations.md) | -| QUEUED | OG-018 | Browser guide and storage adapter documentation | [OG-018-browser-guide.md](OG-018-browser-guide.md) | +1. Item lives here as `B{number}.md` +2. Pull into cycle → move to `docs/design/cycles//B{number}.md` +3. Write failing tests as spec +4. Implement +5. Retrospective in `docs/archive/retrospectives/` diff --git a/BACKLOG/OG-001-worldline-api.md b/BACKLOG/done/OG-001-worldline-api.md similarity index 100% rename from BACKLOG/OG-001-worldline-api.md rename to BACKLOG/done/OG-001-worldline-api.md diff --git a/BACKLOG/OG-002-warpgraph-role-split.md b/BACKLOG/done/OG-002-warpgraph-role-split.md similarity index 100% rename from BACKLOG/OG-002-warpgraph-role-split.md rename to BACKLOG/done/OG-002-warpgraph-role-split.md diff --git a/BACKLOG/OG-003-snapshot-immutability.md b/BACKLOG/done/OG-003-snapshot-immutability.md similarity index 100% rename from BACKLOG/OG-003-snapshot-immutability.md rename to BACKLOG/done/OG-003-snapshot-immutability.md diff --git a/BACKLOG/OG-004-observer-seek-contract.md b/BACKLOG/done/OG-004-observer-seek-contract.md similarity index 100% rename from BACKLOG/OG-004-observer-seek-contract.md rename to BACKLOG/done/OG-004-observer-seek-contract.md diff --git a/BACKLOG/OG-005-detached-read-benchmarks.md b/BACKLOG/done/OG-005-detached-read-benchmarks.md 
similarity index 100% rename from BACKLOG/OG-005-detached-read-benchmarks.md rename to BACKLOG/done/OG-005-detached-read-benchmarks.md diff --git a/BACKLOG/OG-006-read-api-doc-consistency.md b/BACKLOG/done/OG-006-read-api-doc-consistency.md similarity index 100% rename from BACKLOG/OG-006-read-api-doc-consistency.md rename to BACKLOG/done/OG-006-read-api-doc-consistency.md diff --git a/BACKLOG/OG-007-hash-stability-coverage.md b/BACKLOG/done/OG-007-hash-stability-coverage.md similarity index 100% rename from BACKLOG/OG-007-hash-stability-coverage.md rename to BACKLOG/done/OG-007-hash-stability-coverage.md diff --git a/BACKLOG/OG-008-retargeting-compatibility.md b/BACKLOG/done/OG-008-retargeting-compatibility.md similarity index 100% rename from BACKLOG/OG-008-retargeting-compatibility.md rename to BACKLOG/done/OG-008-retargeting-compatibility.md diff --git a/BACKLOG/OG-010-public-api-design-thinking.md b/BACKLOG/done/OG-010-public-api-design-thinking.md similarity index 100% rename from BACKLOG/OG-010-public-api-design-thinking.md rename to BACKLOG/done/OG-010-public-api-design-thinking.md diff --git a/BACKLOG/OG-012-documentation-corpus-audit.md b/BACKLOG/done/OG-012-documentation-corpus-audit.md similarity index 100% rename from BACKLOG/OG-012-documentation-corpus-audit.md rename to BACKLOG/done/OG-012-documentation-corpus-audit.md diff --git a/BACKLOG/OG-014-streaming-content-attachments.md b/BACKLOG/done/OG-014-streaming-content-attachments.md similarity index 100% rename from BACKLOG/OG-014-streaming-content-attachments.md rename to BACKLOG/done/OG-014-streaming-content-attachments.md diff --git a/BACKLOG/OG-015-jsr-documentation-quality.md b/BACKLOG/done/OG-015-jsr-documentation-quality.md similarity index 100% rename from BACKLOG/OG-015-jsr-documentation-quality.md rename to BACKLOG/done/OG-015-jsr-documentation-quality.md diff --git a/BACKLOG/OG-016-retrospective-archive-cleanup.md b/BACKLOG/done/OG-016-retrospective-archive-cleanup.md similarity index 100% 
rename from BACKLOG/OG-016-retrospective-archive-cleanup.md rename to BACKLOG/done/OG-016-retrospective-archive-cleanup.md diff --git a/CHANGELOG.md b/CHANGELOG.md index 7e067f99..fea71ea8 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -9,6 +9,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Changed +- **Backlog restructured** — migrated all incomplete ROADMAP items to individual `BACKLOG/B{number}.md` files. Dropped milestone structure in favor of flat backlog. Completed OG-items moved to `BACKLOG/done/`. `docs/ROADMAP.md` retained for reference with migration notice. + - **Zero-error TypeScript campaign complete** — eliminated all 1,707 `tsc --noEmit` errors across 271 files. Mechanical TS4111 bracket-access sweep (614), null guards for `noUncheckedIndexedAccess`, conditional spreads for `exactOptionalPropertyTypes`, unused variable removal. All 8 pre-push IRONCLAD gates now pass. - **JoinReducer OpStrategy registry** — replaced five triplicated switch statements over 8 canonical op types with a frozen `Map` registry. Each strategy defines `mutate`, `outcome`, `snapshot`, `accumulate`, `validate`. Adding a new op type without all five methods is a hard error at module load time. Cross-path equivalence tests verify `applyFast`, `applyWithReceipt`, and `applyWithDiff` produce identical CRDT state. - **ESLint `dot-notation` disabled** — conflicts with `noPropertyAccessFromIndexSignature` tsconfig flag. The TypeScript flag provides type safety; the ESLint rule is purely stylistic. diff --git a/docs/ROADMAP.md b/docs/ROADMAP.md index 544ab97b..1c781491 100644 --- a/docs/ROADMAP.md +++ b/docs/ROADMAP.md @@ -1,5 +1,8 @@ # ROADMAP — @git-stunts/git-warp +> **MIGRATED:** All incomplete items have been migrated to individual files in `BACKLOG/`. +> Completed items remain in `docs/ROADMAP/COMPLETED.md`. This file is kept for reference only. 
+ > **Current release on `main`:** v16.0.0 > **Next intended release:** v16.0.1 > **Last reconciled:** 2026-03-29 (v16.0.0 release. OG-014 streaming CAS blob storage, OG-015 JSR docs, deprecated TraversalService and createWriter removed.) From baef5f9181b93afff7caa4801823ac30e7badab2 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 11:36:56 -0700 Subject: [PATCH 04/73] =?UTF-8?q?chore:=20migrate=20to=20flat=20BACKLOG=20?= =?UTF-8?q?+=20cycles=20=E2=80=94=20drop=20milestones=20and=20ROADMAP?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Process restructuring: - BACKLOG/ is now a flat directory of markdown files (one per B-item) - Cycles replace milestones: pull item → docs/design/cycles// - The backlog file becomes the design doc for the cycle - No more ROADMAP as planning registry (kept for reference only) Migration: - 38 incomplete B-items extracted from ROADMAP into individual files - 13 completed OG-items moved to BACKLOG/done/ - 5 queued OG-items left in place - CONTRIBUTING.md rewritten for cycle-based process - CLAUDE.md updated (Task Tracking section) - docs/ROADMAP.md marked as migrated --- .github/CONTRIBUTING.md | 179 +++++++++++----------------------------- 1 file changed, 49 insertions(+), 130 deletions(-) diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md index a959ebf1..40fb8132 100644 --- a/.github/CONTRIBUTING.md +++ b/.github/CONTRIBUTING.md @@ -2,162 +2,81 @@ ## Planning Sources Of Truth -Do not duplicate the repo's "active plan" inside `CONTRIBUTING.md`. -That information drifts too easily here. +- `BACKLOG/` — individual markdown files, one per item (`B{number}.md`) +- `docs/design/cycles//` — active work; backlog file moves here +- `CHANGELOG.md` — what has landed +- `docs/archive/retrospectives/` — closed cycle audits -Instead, use these sources: +No milestones. No ROADMAP. Cycles are the unit of work. 
-- `BACKLOG/README.md` for the currently active cycle and promotable pre-design - slices -- `docs/ROADMAP.md` for committed release and milestone inventory -- `CHANGELOG.md` for what has already landed on the branch or in released - versions -- `docs/design/` for the governing design notes promoted from active backlog - items +## Cycle Process -If these artifacts disagree, reconcile them as part of the cycle close instead -of letting `CONTRIBUTING.md` become a second planning registry. +A cycle is one backlog item, start to finish: -## Development Loop +1. **Pull** — move `BACKLOG/B{number}.md` to `docs/design/cycles//` +2. **Design** — the backlog file becomes the design doc; add hills, non-goals, + invariants as needed +3. **Spec** — write failing tests as executable spec +4. **Implement** — make the tests pass +5. **Close** — retrospective, drift audit, CHANGELOG, tech debt journal, + cool ideas -This repo follows the same disciplined cycle used by higher-layer products built -on git-warp: +### Retrospectives -1. design docs first -2. tests as executable spec second -3. implementation third -4. playback, retrospective, and reconciliation after the slice lands +Every closed cycle gets a retrospective in `docs/archive/retrospectives/`. +At minimum: -Tests are the spec. Design docs define intent and invariants. Implementation -follows. +1. Governing design docs and backlog IDs +2. What actually landed +3. Design Alignment Audit — label each point as `aligned`, `partially aligned`, + or `not aligned` +4. Observed drift — classify as deliberate tradeoff, implementation shortcut, + hidden constraint, test gap, or design ambiguity +5. Resolution — update design docs, add follow-on backlog item, or fix + immediately -When a `BACKLOG/` item is selected for active work, promote it into -`docs/design/` before writing tests. 
- -For non-trivial work, use IBM Design Thinking style framing: - -- sponsor actors -- hills -- playbacks -- explicit non-goals - -Keep that vocabulary in the design method. Do not leak it into the runtime -ontology unless the substrate truly needs a first-class concept. - -## Retrospectives - -Retrospectives are not optional cleanup. Every closed slice should leave behind -an explicit retrospective, and that retrospective must audit the landed changes -against the intended design. - -At minimum, every retrospective should include: - -1. governing design docs and backlog IDs -2. what actually landed -3. a `Design Alignment Audit` section -4. any observed drift -5. whether the drift is accepted, rejected, or deferred - -The `Design Alignment Audit` should check the implemented slice against the -intended invariants and label each major point as: - -- `aligned` -- `partially aligned` -- `not aligned` - -If implementation drift occurred, the retrospective must say why: - -- deliberate tradeoff -- implementation shortcut -- hidden pre-existing constraint -- test gap -- design ambiguity - -And it must say how the repo resolves that drift: - -- update the design docs -- add a follow-on `BACKLOG/` item -- immediately fix the implementation in the next slice - -Do not treat a passing test suite as proof that the design was honored. The -retro is where we verify that the code matches the intended architecture, not -just the executable spec that happened to be written. - -## Checkpoints - -Most slices should pass through four checkpoints: - -1. doctrine -2. spec -3. semantic -4. surface - -For git-warp, "surface" often means public API, CLI, or documentation surface -rather than a GUI. - -Local red while iterating is acceptable. Shared branches, pushes intended for -review, and merge submissions should be green. +Do not treat a passing test suite as proof that the design was honored. ## Getting Started -1. Clone the repository -2. Install dependencies: `npm install` -3. 
Set up git hooks: `npm run setup:hooks` -4. Run tests: `npm test` +```bash +git clone git@github.com:git-stunts/git-warp.git +cd git-warp +npm install # installs deps, sets up git hooks +npm run test:local # run unit tests +``` ## Git Hooks -This project uses custom git hooks located in `scripts/hooks/`. Run `npm run setup:hooks` to enable them. -- Hooks are also auto-configured on `npm install` (no-op if not a git repo). -- `pre-commit` runs eslint on staged JS files. -- `pre-push` runs `npm run lint`, `npm test`, `npm run benchmark`, and the Docker bats CLI suite (`git-warp` commands). +Custom hooks in `scripts/hooks/`, auto-configured on `npm install`. -### Pre-commit Hook - -The pre-commit hook runs ESLint on all staged JavaScript files. If linting fails, the commit is blocked. - -To fix lint errors: -```bash -npx eslint --fix -``` - -To bypass temporarily (use sparingly): -```bash -git commit --no-verify -``` +- **pre-commit** — ESLint on staged JS files +- **pre-push** — 8-gate IRONCLAD firewall (tsc, policy, consumer types, + ESLint, ratchet, surface, markdown, tests) ## Code Style -- ESLint enforces code style. Run `npx eslint .` to check. -- Use template literals instead of string concatenation -- Always use curly braces for if/else blocks -- Keep functions focused and avoid deep nesting +- ESLint enforces style. Run `npx eslint .` to check. 
+- Template literals over concatenation +- Always use curly braces for if/else +- Keep functions focused, avoid deep nesting ## Running Tests ```bash -npm test # Run all unit tests (Docker) -npm run test:local # Run unit tests without Docker -npm test -- # Run specific tests - -# Multi-runtime test matrix (Docker) -npm run test:node22 # Node 22: unit + integration + BATS CLI -npm run test:bun # Bun: API integration tests -npm run test:deno # Deno: API integration tests -npm run test:matrix # All runtimes in parallel +npm run test:local # Unit tests without Docker +npm test # Unit tests (Docker) +npm run test:matrix # Full multi-runtime matrix (Docker) ``` ### No-Coordination Invariant -The no-coordination regression suite is non-negotiable for multi-writer safety. -Ensure `test/unit/domain/WarpGraph.noCoordination.test.js` passes before submitting changes. +`test/unit/domain/WarpGraph.noCoordination.test.js` is non-negotiable for +multi-writer safety. Must pass before any PR. ## Pull Requests -1. Create a feature branch from `main` -2. Make your changes with clear commit messages -3. Keep commits documentation-atomic: when a change affects shipped behavior, public surface, or backlog status, update `CHANGELOG.md` and the roadmap/backlog docs in the same commit. -4. When a `BACKLOG/` item becomes active, promote it into `docs/design/` before implementation. When roadmap work completes, reconcile `docs/ROADMAP.md` and `docs/ROADMAP/COMPLETED.md` in the same commit. -5. Ensure all tests pass: `npm test` -6. Ensure linting passes: `npx eslint .` -7. Submit a PR with a clear description +1. Branch from the latest green branch +2. Clear commit messages; docs-atomic (CHANGELOG + code in same commit) +3. All tests pass, all lint gates pass +4. 
Submit PR with clear description From faa634c128d3978011413fd6d7fb1608d8e48a88 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 13:25:58 -0700 Subject: [PATCH 05/73] =?UTF-8?q?chore:=20B175=20=E2=80=94=20Guide=20obser?= =?UTF-8?q?ver-first=20client=20pattern?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B175.md | 21 +++++++++++++++++++++ 1 file changed, 21 insertions(+) create mode 100644 BACKLOG/B175.md diff --git a/BACKLOG/B175.md b/BACKLOG/B175.md new file mode 100644 index 00000000..f4ecccf1 --- /dev/null +++ b/BACKLOG/B175.md @@ -0,0 +1,21 @@ +# B175 — Guide: Observer-First Client Pattern + +**Effort:** M +**Origin:** User direction (2026-04-01) + +## Problem + +The GUIDE and ADVANCED_GUIDE don't strongly enough convey that the primary client interaction model is through Observer APIs. Clients should be reading state through Observers (projections over worldlines through apertures) and letting git-warp manage the underlying graph topology, materialization, and CRDT mechanics. + +The current docs teach low-level graph manipulation (createPatch, addNode, etc.) with equal weight to the Observer read path, which gives the impression that clients should be directly managing graph state. In practice, most consumers should: + +1. Write through `Writer` / `PatchBuilderV2` (thin, scoped mutations) +2. Read through `Observer` (projected, filtered, cached views) +3. 
Let git-warp handle materialization, conflict resolution, and indexing + +## Notes + +- Review `docs/GUIDE.md` and `docs/ADVANCED_GUIDE.md` for teaching order +- The Observer API (apertures, worldlines, seek, strand-scoped reads) should be the primary "how to read data" section +- Direct `getNodes()` / `getNodeProps()` / `query()` are escape hatches, not the default path +- This aligns with Paper IV's observer geometry: observers are the projection layer, not an optional feature From 8ad60c7f21bb396b2e6003be5a2e023d51737ea3 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 13:32:06 -0700 Subject: [PATCH 06/73] =?UTF-8?q?chore:=20B176-B178=20=E2=80=94=20audit-dr?= =?UTF-8?q?iven=20high-priority=20backlog=20items?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit B176: WarpRuntime + warp/ god class decomposition (XL, HIGH) B177: Cohesive WarpKernelPort — persistence union type cleanup (L, HIGH) B178: CLI __dirname path traversal fragility (S, HIGH) Origin: code quality audit findings (AUDIT_CODE_QUALITY.md, AUDIT_THE_JANK.md). Note: 161 any casts and 172 TODO(ts-cleanup) markers cited in the audit are already at zero — TSC campaign fixed those. The structural issues (god class, fragmented ports, path traversal) remain. 
--- BACKLOG/B176.md | 44 ++++++++++++++++++++++++++++++++++++++++++++ BACKLOG/B177.md | 29 +++++++++++++++++++++++++++++ BACKLOG/B178.md | 30 ++++++++++++++++++++++++++++++ 3 files changed, 103 insertions(+) create mode 100644 BACKLOG/B176.md create mode 100644 BACKLOG/B177.md create mode 100644 BACKLOG/B178.md diff --git a/BACKLOG/B176.md b/BACKLOG/B176.md new file mode 100644 index 00000000..0c824d5f --- /dev/null +++ b/BACKLOG/B176.md @@ -0,0 +1,44 @@ +# B176 — WarpRuntime + warp/ Methods God Class Decomposition + +**Effort:** XL +**Origin:** Code quality audit (AUDIT_CODE_QUALITY.md, AUDIT_THE_JANK.md), user escalation (2026-04-01) +**Priority:** HIGH + +## Problem + +The WarpRuntime class (683 LOC) delegates to 12 method-mixin files in `src/domain/warp/` (5,930 LOC combined = 6,613 LOC total surface). While the SyncController has already been extracted, the class still orchestrates: + +- Git reference manipulation (via persistence port) +- Pathfinding queries (via GraphTraversal) +- Materialization and checkpoint management +- Temporal state routing (seek, coordinate, strand) +- Memory GC mechanisms +- Writer/PatchSession lifecycle +- Observer creation and management +- Subscription and watch APIs +- Effect pipeline integration +- Trust/audit integration +- Index management + +This coupling makes the class a merge conflict magnet and prevents new contributors from building a mental model. The `warp/` method mixins are wired dynamically via `Object.defineProperty` in `_wire.js`, which defeats static analysis and makes the API surface invisible without reading the wiring code. + +## Decomposition Direction + +The existing `warp/` method files already represent a partial decomposition — they just need to become proper service classes instead of method mixins bolted onto one god class. 
WarpRuntime becomes a thin facade delegating to: + +- `MaterializationService` (materialize, checkpoint, incremental) +- `QueryService` (query builder, getNodes, getNodeProps, getEdges) +- `TraversalService` (path, traversal algorithms) +- `TemporalService` (seek, coordinate, strand routing) +- `SubscriptionService` (watch, subscribe, diff streaming) +- `WriterService` (writer lifecycle, patch sessions) + +Each already exists as a `warp/*.methods.js` file — the refactor is promoting them from mixins to injected services. + +## Notes + +- Existing RFC: B143 (WarpGraph decomposition design) — check if still current +- SyncController already extracted (M10 era) — good precedent +- The `_wire.js` dynamic wiring must go — it defeats type checking and IDE navigation +- 16 ports already exist — the port surface is fine, it's the orchestrator that's too fat +- `CorePersistence` typedef (`CommitPort & BlobPort & TreePort & RefPort`) is the right pattern — intersection types over a single fat interface diff --git a/BACKLOG/B177.md b/BACKLOG/B177.md new file mode 100644 index 00000000..94404e32 --- /dev/null +++ b/BACKLOG/B177.md @@ -0,0 +1,29 @@ +# B177 — Cohesive WarpKernelPort (Persistence Union Type Cleanup) + +**Effort:** L +**Origin:** Code quality audit (AUDIT_CODE_QUALITY.md), user escalation (2026-04-01) +**Priority:** HIGH + +## Problem + +The persistence dependency is expressed as an intersection of 4 fine-grained ports: + +```js +/** @typedef {CommitPort & BlobPort & TreePort & RefPort} CorePersistence */ +``` + +In practice, every concrete adapter (GitGraphAdapter, InMemoryGraphAdapter) implements all four simultaneously. Services that need persistence must type their parameters as this intersection, which is verbose and brittle. When a service needs a method that exists on the adapter but not on any individual port (e.g. `getConfig()`), developers historically resorted to `/** @type {any} */` casts to silence the compiler. 
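A sketch of the composed port in the repo's JSDoc convention. Only two of the four sub-ports are shown, and their method shapes are illustrative stand-ins, not the real port definitions:

```javascript
// Sub-port shapes below are illustrative stand-ins, not the real ports.
/**
 * @typedef {Object} CommitPort
 * @property {(msg: string) => string} writeCommit
 */
/**
 * @typedef {Object} RefPort
 * @property {(ref: string) => string | null} resolveRef
 */
/** @typedef {CommitPort & RefPort} WarpKernelPort */

/** @param {WarpKernelPort} kernel */
function headCommit(kernel) {
  // Services depend on one honest name instead of repeating the intersection.
  return kernel.resolveRef('HEAD');
}

// A stub adapter satisfying both sub-ports at once, as real adapters do:
const stub = { writeCommit: () => 'abc123', resolveRef: () => 'abc123' };
headCommit(stub); // 'abc123'
```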
+ +The TSC zero campaign eliminated all 161 `any` casts, but the underlying problem remains: the port surface is too fragmented for the actual usage pattern. A single `WarpKernelPort` that composes the four sub-ports would: + +1. Give services a single, honest type to depend on +2. Eliminate the need for intersection type gymnastics in JSDoc +3. Make it possible to add persistence methods without updating 4 separate port files +4. Restore the ability to verify adapter completeness statically + +## Notes + +- The 4 sub-ports (CommitPort, BlobPort, TreePort, RefPort) can remain as building blocks — `WarpKernelPort extends CommitPort, BlobPort, TreePort, RefPort` (or intersection in JSDoc) +- GraphPersistencePort.js already exists as a 5th port with a different shape — reconcile or deprecate +- The 0 remaining `any` casts and 0 `TODO(ts-cleanup)` markers suggest the worst symptoms are fixed, but the root cause (fragmented port surface) still exists +- ConfigPort is a 6th port used by some adapters but not part of CorePersistence — decide if it belongs diff --git a/BACKLOG/B178.md b/BACKLOG/B178.md new file mode 100644 index 00000000..9a0093b8 --- /dev/null +++ b/BACKLOG/B178.md @@ -0,0 +1,30 @@ +# B178 — CLI __dirname Path Traversal Fragility + +**Effort:** S +**Origin:** Code quality audit, user escalation (2026-04-01) +**Priority:** HIGH + +## Problem + +`bin/cli/shared.js` uses relative path traversal to locate static assets: + +```js +const __dirname = path.dirname(__filename); +const templateDir = path.resolve(__dirname, '..', '..', 'scripts', 'hooks'); +const rawJson = fs.readFileSync(path.resolve(__dirname, '..', '..', 'package.json'), 'utf8'); +``` + +This binds runtime behavior to the physical directory layout of the source repository. If the CLI is ever bundled (esbuild, rollup, webpack) or installed via a global npm link with a different structure, `__dirname` resolves incorrectly and causes fatal runtime crashes. + +## Fix Options + +1. 
**Inline static assets at build time** — read `package.json` version at build, embed as a constant. Hook templates could be inlined or resolved relative to `import.meta.url`. +2. **Use `import.meta.url`** — already ESM, so `new URL('../..', import.meta.url)` is the standard pattern. More resilient than `__dirname` polyfill. +3. **`createRequire(import.meta.url)` for JSON** — `createRequire(import.meta.url)('../package.json')` works in Node, Bun, Deno. + +## Notes + +- The CLI already uses ESM (`import` statements throughout) +- `import.meta.url` is the idiomatic ESM approach +- This also affects `bin/cli/commands/install-hooks.js` which references `scripts/hooks/` +- Multi-runtime support (Node/Bun/Deno) means the fix must work across all three From a94d9ea1e4e2e45d18681e7297e91123b791b724 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 13:34:44 -0700 Subject: [PATCH 07/73] =?UTF-8?q?chore:=20B179=20=E2=80=94=20sync=20auth?= =?UTF-8?q?=20Ed25519=20migration=20(HIGH)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B179.md | 35 +++++++++++++++++++++++++++++++++++ 1 file changed, 35 insertions(+) create mode 100644 BACKLOG/B179.md diff --git a/BACKLOG/B179.md b/BACKLOG/B179.md new file mode 100644 index 00000000..77ce1775 --- /dev/null +++ b/BACKLOG/B179.md @@ -0,0 +1,35 @@ +# B179 — Sync Auth: Migrate from Symmetric HMAC to Ed25519 Asymmetric Signatures + +**Effort:** L +**Origin:** Security audit, user escalation (2026-04-01) +**Priority:** HIGH + +## Problem + +`SyncAuthService.js` uses HMAC-SHA256 with a shared secret for sync request authentication. The nonce-reservation system (UUID + 5-minute clock-skew window + LRU cache) effectively prevents replay attacks, but the underlying cryptographic model is symmetric — all authorized nodes hold the same secret key. + +In a multi-writer network, this means: + +1. **Single point of compromise** — one node's key leak exposes the entire network +2. 
**No attribution** — HMAC proves the sender knows the secret, not *which* sender it is +3. **Key distribution problem** — adding a new writer requires secure distribution of the shared secret to all existing nodes +4. **No revocation** — revoking one writer's access means rotating the secret for everyone + +## Fix + +Migrate to Ed25519 asymmetric signatures: + +- Each writer holds a private key; the network knows their public key +- Sync requests are signed with the sender's private key, verified with their public key +- Compromising one node exposes only that node's private key — blast radius is localized +- Writer revocation = remove their public key from the trust set, no secret rotation needed +- Attribution is inherent — the signature proves which specific writer sent the request + +## Notes + +- The trust subsystem already has key management infrastructure (`TrustRecordService`, `TrustEvaluator`, `TrustKeyStore`) — the sync auth migration should build on this, not create a parallel key system +- `@git-stunts/vault` handles OS-native keychain storage — private keys should go through Vault, not `.env` files +- The nonce-reservation + clock-skew mechanism is sound and should be preserved regardless of signature scheme +- Wire format change: sync request headers will carry a signature + public key ID instead of an HMAC tag. This is a breaking protocol change — needs versioned negotiation or a migration window where both are accepted +- `WebCryptoAdapter` already exists for multi-runtime crypto — Ed25519 is available via `crypto.subtle` in Node 20+, Bun, and Deno +- Consider: should the public key set be stored in the graph itself (as trust records) or out-of-band? 
The trust subsystem already stores writer trust assessments in-graph — public keys could follow the same pattern From fc2e3c520ff4d35e2442b59c722021370316c38d Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 13:36:37 -0700 Subject: [PATCH 08/73] =?UTF-8?q?chore:=20B180=20=E2=80=94=20observer=20re?= =?UTF-8?q?daction=20threat=20model=20documentation=20(MEDIUM)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B180.md | 38 ++++++++++++++++++++++++++++++++++++++ 1 file changed, 38 insertions(+) create mode 100644 BACKLOG/B180.md diff --git a/BACKLOG/B180.md b/BACKLOG/B180.md new file mode 100644 index 00000000..7ce4adf4 --- /dev/null +++ b/BACKLOG/B180.md @@ -0,0 +1,38 @@ +# B180 — Observer Redaction is Application-Layer Only + +**Effort:** M +**Origin:** Security audit, user escalation (2026-04-01) +**Priority:** MEDIUM + +## Problem + +Observer apertures support `redact: ['ssn', 'secret']` to hide properties from query results. This redaction happens in `ObserverView.js` at the application layer — it filters properties out of the materialized view before returning them to the caller. + +The underlying data is fully transparent in Git storage. Anyone with filesystem read access to `.git/objects/` can `git cat-file -p <sha>` and extract unredacted CBOR patch blobs directly. Application-level redaction provides zero protection against: + +1. A process with read access to the repo directory +2. A Git clone recipient +3. A sync peer that receives raw patches + +## Assessment + +This is **by design, not a bug** — but the documentation should be explicit about the threat model boundary. Observer redaction is a **query-layer convenience**, not a security boundary. It's analogous to SQL views that hide columns: useful for application-level access control, not a substitute for encryption at rest.
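As a concrete illustration (a simplified stand-in, not the actual `ObserverView.js` code), redaction amounts to a property filter over the returned view; the stored data underneath is untouched:

```javascript
// Simplified illustration of application-layer redaction; not the real ObserverView code.
// The filter shapes what the caller sees. Storage below it is unchanged.
function redactProps(props, redact) {
  const hidden = new Set(redact);
  return Object.fromEntries(
    Object.entries(props).filter(([key]) => !hidden.has(key)),
  );
}

const view = redactProps({ name: 'Ada', ssn: '123-45-6789' }, ['ssn']);
// view = { name: 'Ada' }, yet the ssn still sits in .git/objects/ in cleartext.
```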
+ +The actual security boundary for sensitive data is: + +- **B164 (DONE)**: Graph encryption at rest via `patchBlobStorage` with AES-256-GCM — patches are encrypted before writing to Git objects. `git cat-file` returns ciphertext. +- **Filesystem permissions**: The `.git/` directory should be readable only by authorized processes. +- **Sync auth (B179)**: Network-level authentication prevents unauthorized peers from receiving patches. + +## What Needs to Happen + +1. **Document the threat model explicitly** — GUIDE and SECURITY_SYNC.md (B34) must state that `redact` is application-layer filtering, not a cryptographic guarantee +2. **Recommend encryption for sensitive fields** — if redacted properties contain truly sensitive data, the graph should use encrypted blob storage (B164) so the storage layer is also protected +3. **Consider: property-level encryption** — a future feature where individual property values are encrypted with per-field keys, so even a sync peer with the graph encryption key can't read fields they're not authorized for. This is a significant design effort and may not be warranted yet. 
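Should property-level encryption ever be explored, the primitive is already available in every target runtime. A rough sketch with per-field AES-256-GCM (illustrative only: key derivation, rotation, and Vault integration are the real design work, and none of the names below come from the codebase):

```javascript
import { randomBytes, createCipheriv, createDecipheriv } from 'node:crypto';

// Illustrative sketch: seal a single property value with its own key.
// Key management (derivation, rotation, Vault-backed storage) is not shown.
function encryptField(key, plaintext) {
  const iv = randomBytes(12); // 96-bit nonce, the standard size for GCM
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  return { iv, ciphertext, tag: cipher.getAuthTag() };
}

function decryptField(key, { iv, ciphertext, tag }) {
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag); // wrong key or tampered data throws on final()
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}

const fieldKey = randomBytes(32); // one key per field in this sketch
const sealed = encryptField(fieldKey, '123-45-6789');
// decryptField(fieldKey, sealed) recovers the plaintext; any other key fails authentication
```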
+ +## Notes + +- B164 (graph encryption at rest) already landed — this is the real security boundary +- Observer redaction is still useful for multi-tenant application logic where all tenants share a process but shouldn't see each other's fields +- The audit framing ("functionally obsolete against sophisticated actors") overstates the issue — redaction was never intended as a cryptographic boundary, and encrypted storage exists for that purpose +- Paper IV's observer geometry treats apertures as projection lenses, not security perimeters — the theory is consistent with application-layer filtering From 9fda7f4ce96d6d694a1e40d6e775bb569ffad6cf Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 13:39:16 -0700 Subject: [PATCH 09/73] =?UTF-8?q?chore:=20fold=20B180=20(redaction=20docs)?= =?UTF-8?q?=20into=20B175=20=E2=80=94=20not=20a=20standalone=20item?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B175.md | 9 +++++++++ BACKLOG/B180.md | 38 -------------------------------------- 2 files changed, 9 insertions(+), 38 deletions(-) delete mode 100644 BACKLOG/B180.md diff --git a/BACKLOG/B175.md b/BACKLOG/B175.md index f4ecccf1..a55fb526 100644 --- a/BACKLOG/B175.md +++ b/BACKLOG/B175.md @@ -19,3 +19,12 @@ The current docs teach low-level graph manipulation (createPatch, addNode, etc.) - The Observer API (apertures, worldlines, seek, strand-scoped reads) should be the primary "how to read data" section - Direct `getNodes()` / `getNodeProps()` / `query()` are escape hatches, not the default path - This aligns with Paper IV's observer geometry: observers are the projection layer, not an optional feature + +### Redaction and encryption guidance + +The guide should clearly explain the security model for sensitive data: + +- Aperture `redact` is **application-layer filtering** — useful for multi-tenant query isolation, but not a cryptographic boundary. 
Anyone with filesystem access to `.git/objects/` can read raw patch blobs. +- For actual data protection, enable **graph encryption at rest** via `patchBlobStorage` with an encryption key (B164). This encrypts patch CBOR with AES-256-GCM before writing to Git objects. +- The guide should teach: redact for convenience, encrypt for security. Show how to configure `CasBlobAdapter` with an encryption key and wire it through `WarpGraph.open({ patchBlobStorage })`. +- Also explain that `@git-stunts/vault` manages encryption keys via OS-native keychains — no `.env` files for secrets. diff --git a/BACKLOG/B180.md b/BACKLOG/B180.md deleted file mode 100644 index 7ce4adf4..00000000 --- a/BACKLOG/B180.md +++ /dev/null @@ -1,38 +0,0 @@ -# B180 — Observer Redaction is Application-Layer Only - -**Effort:** M -**Origin:** Security audit, user escalation (2026-04-01) -**Priority:** MEDIUM - -## Problem - -Observer apertures support `redact: ['ssn', 'secret']` to hide properties from query results. This redaction happens in `ObserverView.js` at the application layer — it filters properties out of the materialized view before returning them to the caller. - -The underlying data is fully transparent in Git storage. Anyone with filesystem read access to `.git/objects/` can `git cat-file -p ` and extract unredacted CBOR patch blobs directly. Application-level redaction provides zero protection against: - -1. A process with read access to the repo directory -2. A Git clone recipient -3. A sync peer that receives raw patches - -## Assessment - -This is **by design, not a bug** — but the documentation should be explicit about the threat model boundary. Observer redaction is a **query-layer convenience**, not a security boundary. It's analogous to SQL views that hide columns: useful for application-level access control, not a substitute for encryption at rest. 
- -The actual security boundary for sensitive data is: - -- **B164 (DONE)**: Graph encryption at rest via `patchBlobStorage` with AES-256-GCM — patches are encrypted before writing to Git objects. `git cat-file` returns ciphertext. -- **Filesystem permissions**: The `.git/` directory should be readable only by authorized processes. -- **Sync auth (B179)**: Network-level authentication prevents unauthorized peers from receiving patches. - -## What Needs to Happen - -1. **Document the threat model explicitly** — GUIDE and SECURITY_SYNC.md (B34) must state that `redact` is application-layer filtering, not a cryptographic guarantee -2. **Recommend encryption for sensitive fields** — if redacted properties contain truly sensitive data, the graph should use encrypted blob storage (B164) so the storage layer is also protected -3. **Consider: property-level encryption** — a future feature where individual property values are encrypted with per-field keys, so even a sync peer with the graph encryption key can't read fields they're not authorized for. This is a significant design effort and may not be warranted yet. 
- -## Notes - -- B164 (graph encryption at rest) already landed — this is the real security boundary -- Observer redaction is still useful for multi-tenant application logic where all tenants share a process but shouldn't see each other's fields -- The audit framing ("functionally obsolete against sophisticated actors") overstates the issue — redaction was never intended as a cryptographic boundary, and encrypted storage exists for that purpose -- Paper IV's observer geometry treats apertures as projection lenses, not security perimeters — the theory is consistent with application-layer filtering From 56b8fff899ba418afe609e993d4bed010f1bb985 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 14:38:49 -0700 Subject: [PATCH 10/73] =?UTF-8?q?chore:=20B181=20=E2=80=94=20max=20file=20?= =?UTF-8?q?size=20+=20one-thing-per-file=20policy=20(HIGH)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit --- BACKLOG/B181.md | 72 +++++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 72 insertions(+) create mode 100644 BACKLOG/B181.md diff --git a/BACKLOG/B181.md b/BACKLOG/B181.md new file mode 100644 index 00000000..2e77784e --- /dev/null +++ b/BACKLOG/B181.md @@ -0,0 +1,72 @@ +# B181 — Enforce Max File Size + One-Thing-Per-File Policy + +**Effort:** L +**Origin:** User direction (2026-04-01) +**Priority:** HIGH + +## Problem + +The codebase has files ranging up to 2,572 LOC (ConflictAnalyzerService) with the combined WarpRuntime + warp/ mixin surface at 6,613 LOC. Large files are hard to navigate, attract merge conflicts, and resist comprehension. The lack of a file-size ceiling means files grow silently until someone notices. + +More fundamentally, files frequently contain multiple exports that serve different purposes — helper functions, type definitions, constants, and the primary class all living in the same file. This violates the principle that a file should be about one thing. 
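One enforcement option is ESLint's stock `max-lines` rule. A flat-config sketch follows; the thresholds and globs are illustrative, not the project's actual `eslint.config.js`:

```javascript
// eslint.config.js fragment (illustrative wiring, not the project's real config).
// skipBlankLines/skipComments keep the count fair: only code lines hit the ceiling.
const countFairly = { skipBlankLines: true, skipComments: true };

export default [
  { files: ['src/**/*.js'], rules: { 'max-lines': ['error', { max: 500, ...countFairly }] } },
  { files: ['test/**/*.js'], rules: { 'max-lines': ['error', { max: 800, ...countFairly }] } },
  {
    files: ['bin/**/*.js', 'scripts/**/*.js'],
    rules: { 'max-lines': ['error', { max: 300, ...countFairly }] },
  },
];
```

Existing violators would sit in a temporary relaxation block alongside these entries, shrinking as files get split.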
+ +## Policy + +### Max LOC + +Hard ceiling enforced by ESLint or a lint script: + +- **Source files (`src/`)**: 500 LOC max +- **Test files (`test/`)**: 800 LOC max (tests are inherently more verbose) +- **CLI commands (`bin/`)**: 300 LOC max +- **Scripts (`scripts/`)**: 300 LOC max + +Files over the limit must be split. The pre-commit or pre-push gate blocks violations. + +### One Thing Per File + +Each file exports **one primary thing** — a class, a function, a type, or a closely-related set of constants. If a file exports a class AND standalone helper functions that aren't private to that class, the helpers belong in their own module. + +Exceptions: +- Re-export barrels (`index.js`) are fine +- A function + its directly-related typedef is one thing +- A small set of related factory functions (e.g. `createNodeAdd`, `createEdgeAdd`) is one thing + +### Current Violators + +Files over 500 LOC that need splitting (source only): + +| File | LOC | What to split | +|---|---|---| +| ConflictAnalyzerService.js | 2,572 | 27 standalone helpers → separate module(s) | +| StrandService.js | 2,048 | 8 concerns → separate services (see B176) | +| GraphTraversal.js | 1,620 | Algorithm families could be separate files | +| PatchBuilderV2.js | 1,103 | Content ops, effect emission → extract | +| comparison.methods.js | 1,088 | Comparison helpers → separate modules | +| GitGraphAdapter.js | 1,036 | Already clean SRP, but could split by Git operation family | +| IncrementalIndexUpdater.js | 956 | Node/edge/prop update logic → separate strategies | +| query.methods.js | 906 | Query execution vs query building | +| QueryBuilder.js | 852 | Query DSL vs query execution | +| StreamingBitmapIndexBuilder.js | 835 | Build vs serialize | +| AuditVerifierService.js | 835 | Verification vs chain walking | +| InMemoryGraphAdapter.js | 815 | Already clean SRP | +| VisibleStateComparisonV5.js | 808 | Comparison algorithms | +| materializeAdvanced.methods.js | 716 | Advanced materialization paths 
| +| DagPathFinding.js | 705 | Path algorithms | +| WarpRuntime.js | 683 | See B176 | +| SyncController.js | 680 | Already extracted, near limit | + +## Implementation + +1. Add ESLint `max-lines` rule (already exists, just need to tighten the threshold) +2. Add the ceiling to `eslint.config.js` — 500 for src, 800 for test, 300 for bin/scripts +3. Existing violators get added to a temporary relaxation block (like the complexity relaxation) +4. Each file split is its own cycle — pull from backlog, split, verify tests, commit +5. Ratchet: the relaxation block must shrink over time, never grow + +## Notes + +- ESLint `max-lines` rule supports `skipBlankLines` and `skipComments` — use both for a fair count +- The `one thing per file` policy is harder to lint — enforce via code review and the bad_code.md journal +- GraphTraversal.js (1,620 LOC) was flagged as NOT a god object in the audit — single responsibility (algorithm library). The split here is by algorithm family, not by concern. Still worth doing for navigability. +- This policy should go in CONTRIBUTING.md and CLAUDE.md once agreed From e9d272d315ad3488d90dd987873e00daba013022 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 17:02:34 -0700 Subject: [PATCH 11/73] =?UTF-8?q?chore:=20introduce=20The=20Method=20?= =?UTF-8?q?=E2=80=94=20filesystem-native=20dev=20process=20(cycle=200001)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Replace B-number backlog system with The Method: lane-based backlog directories, legend-prefixed filenames, sequential cycle numbering, and dual-audience design docs. 
- METHOD.md signpost at root - docs/method/backlog/ with inbox, asap, up-next, cool-ideas, bad-code lanes - 49 B-number/OG items migrated to named files in appropriate lanes - 10 tech debt entries split from .claude/bad_code.md to bad-code/ - 13 cool ideas split from .claude/cool_ideas.md to cool-ideas/ - docs/method/process.md, release.md (moved from docs/) - docs/method/retro/, graveyard/, legends/ directories - BACKLOG/ directory removed - CLAUDE.md, CHANGELOG.md, docs/ROADMAP.md updated --- BACKLOG/README.md | 13 - BACKLOG/done/OG-001-worldline-api.md | 38 --- BACKLOG/done/OG-002-warpgraph-role-split.md | 32 --- BACKLOG/done/OG-003-snapshot-immutability.md | 41 ---- BACKLOG/done/OG-004-observer-seek-contract.md | 34 --- .../done/OG-005-detached-read-benchmarks.md | 24 -- .../done/OG-006-read-api-doc-consistency.md | 24 -- .../done/OG-007-hash-stability-coverage.md | 25 -- .../done/OG-008-retargeting-compatibility.md | 25 -- .../done/OG-010-public-api-design-thinking.md | 65 ----- .../done/OG-012-documentation-corpus-audit.md | 44 ---- .../OG-014-streaming-content-attachments.md | 114 --------- .../done/OG-015-jsr-documentation-quality.md | 104 -------- .../OG-016-retrospective-archive-cleanup.md | 40 --- CHANGELOG.md | 3 +- METHOD.md | 227 ++++++++++++++++++ docs/ROADMAP.md | 3 +- .../0001-method-bootstrap/method-bootstrap.md | 49 ++++ .../DX_api-examples-review-checklist.md | 3 +- .../DX_archived-doc-status-guardrail.md | 3 +- .../backlog/DX_batch-review-fix-commits.md | 3 +- .../method/backlog/DX_browser-guide.md | 3 +- .../DX_consumer-test-type-import-coverage.md | 3 +- .../DX_contributor-review-hygiene-guide.md | 3 +- .../method/backlog/DX_deno-smoke-test.md | 3 +- .../backlog/DX_docs-consistency-preflight.md | 3 +- .../backlog/DX_docs-version-sync-precommit.md | 3 +- .../backlog/DX_jsr-publish-deno-panic.md | 3 +- .../method/backlog/DX_pr-ready-merge-cli.md | 3 +- .../DX_public-api-catalog-playground.md | 3 +- .../backlog/DX_pure-typescript-example-app.md 
| 3 +- .../backlog/DX_readme-install-section.md | 3 +- .../DX_rfc-field-count-drift-detector.md | 3 +- .../method/backlog/DX_security-sync-docs.md | 3 +- .../backlog/DX_test-file-wildcard-ratchet.md | 3 +- .../backlog/DX_typed-custom-zod-helper.md | 3 +- .../backlog/DX_vitest-runtime-excludes.md | 3 +- ...DX_warpgraph-constructor-lifecycle-docs.md | 3 +- .../DX_warpgraph-invisible-api-docs.md | 3 +- .../backlog/PERF_benchmark-budgets-ci-gate.md | 3 +- .../PERF_out-of-core-materialization.md | 3 +- .../TRUST_keystore-prevalidated-cache.md | 3 +- .../backlog/TRUST_property-based-fuzz-test.md | 3 +- .../TRUST_record-round-trip-snapshot.md | 3 +- .../TRUST_schema-discriminated-union.md | 3 +- .../TRUST_unsigned-record-edge-cases.md | 3 +- .../VIZ_mermaid-diagram-content-checklist.md | 3 +- .../VIZ_mermaid-invisible-link-fragility.md | 3 +- .../VIZ_mermaid-rendering-smoke-test.md | 3 +- .../backlog/asap/DX_agent-code-audit.md | 3 +- .../backlog/asap/DX_max-file-size-policy.md | 4 +- .../backlog/asap/DX_restore-dot-notation.md | 3 +- .../backlog/asap/DX_trailer-codec-dts.md | 3 +- .../asap/PROTO_effectsink-breaking-change.md | 3 +- .../asap/PROTO_warpkernel-port-cleanup.md | 4 +- .../asap/PROTO_warpruntime-god-class.md | 4 +- .../backlog/asap/TRUST_sync-auth-ed25519.md | 4 +- .../backlog/asap/TUI_cli-dirname-fragility.md | 4 +- .../DX_exact-optional-conditional-spread.md | 15 ++ .../bad-code/DX_trailer-codec-type-poison.md | 18 ++ .../bad-code/PERF_toposort-full-adjacency.md | 23 ++ ..._transitive-reduction-redundant-adjlist.md | 16 ++ .../bad-code/PROTO_audit-receipt-raw-error.md | 10 + ...PROTO_patchbuilder-12-param-constructor.md | 18 ++ .../PROTO_receipt-op-type-redundant.md | 10 + .../PROTO_strand-service-god-object.md | 21 ++ .../bad-code/PROTO_sync-protocol-raw-error.md | 9 + .../PROTO_warpserve-domain-infra-blur.md | 16 ++ .../DX_cross-path-equivalence-test-dsl.md | 12 + .../backlog/cool-ideas/DX_tsc-autofix-tool.md | 13 + 
.../PERF_encrypted-stores-fixed-chunking.md | 8 + .../PERF_native-vs-wasm-roaring-benchmark.md | 3 +- .../cool-ideas/PERF_restore-buffer-guard.md | 6 + .../PERF_streaming-graph-traversal.md | 16 ++ .../PROTO_encrypted-trailer-rename.md | 7 + .../PROTO_writer-isolated-bisect.md | 10 + .../TRUST_per-writer-kek-wrapping.md | 8 + .../VIZ_graph-diff-transitive-reduction.md | 9 + .../VIZ_levels-lightweight-layout.md | 3 +- ...IZ_structural-diff-transitive-reduction.md | 3 +- .../cool-ideas/VIZ_warp-ui-visualizer.md | 3 +- .../up-next/DX_modular-type-declarations.md | 3 +- .../up-next/DX_observer-first-guide.md | 3 +- .../up-next/PERF_async-generator-traversal.md | 3 +- .../up-next/PROTO_playback-head-alignment.md | 3 +- ...PROTO_wire-format-migration-edgepropset.md | 3 +- docs/method/process.md | 41 ++++ docs/{ => method}/release.md | 0 .../0001-method-bootstrap/method-bootstrap.md | 55 +++++ 89 files changed, 670 insertions(+), 728 deletions(-) delete mode 100644 BACKLOG/README.md delete mode 100644 BACKLOG/done/OG-001-worldline-api.md delete mode 100644 BACKLOG/done/OG-002-warpgraph-role-split.md delete mode 100644 BACKLOG/done/OG-003-snapshot-immutability.md delete mode 100644 BACKLOG/done/OG-004-observer-seek-contract.md delete mode 100644 BACKLOG/done/OG-005-detached-read-benchmarks.md delete mode 100644 BACKLOG/done/OG-006-read-api-doc-consistency.md delete mode 100644 BACKLOG/done/OG-007-hash-stability-coverage.md delete mode 100644 BACKLOG/done/OG-008-retargeting-compatibility.md delete mode 100644 BACKLOG/done/OG-010-public-api-design-thinking.md delete mode 100644 BACKLOG/done/OG-012-documentation-corpus-audit.md delete mode 100644 BACKLOG/done/OG-014-streaming-content-attachments.md delete mode 100644 BACKLOG/done/OG-015-jsr-documentation-quality.md delete mode 100644 BACKLOG/done/OG-016-retrospective-archive-cleanup.md create mode 100644 METHOD.md create mode 100644 docs/design/0001-method-bootstrap/method-bootstrap.md rename BACKLOG/B102.md => 
docs/method/backlog/DX_api-examples-review-checklist.md (78%) rename BACKLOG/B169.md => docs/method/backlog/DX_archived-doc-status-guardrail.md (78%) rename BACKLOG/B103.md => docs/method/backlog/DX_batch-review-fix-commits.md (79%) rename BACKLOG/OG-018-browser-guide.md => docs/method/backlog/DX_browser-guide.md (96%) rename BACKLOG/B96.md => docs/method/backlog/DX_consumer-test-type-import-coverage.md (82%) rename BACKLOG/B129.md => docs/method/backlog/DX_contributor-review-hygiene-guide.md (72%) rename BACKLOG/B127.md => docs/method/backlog/DX_deno-smoke-test.md (71%) rename BACKLOG/B128.md => docs/method/backlog/DX_docs-consistency-preflight.md (81%) rename BACKLOG/B12.md => docs/method/backlog/DX_docs-version-sync-precommit.md (71%) rename BACKLOG/B53.md => docs/method/backlog/DX_jsr-publish-deno-panic.md (75%) rename BACKLOG/B119.md => docs/method/backlog/DX_pr-ready-merge-cli.md (79%) rename BACKLOG/OG-011-public-api-catalog-and-playground.md => docs/method/backlog/DX_public-api-catalog-playground.md (96%) rename BACKLOG/B28.md => docs/method/backlog/DX_pure-typescript-example-app.md (74%) rename BACKLOG/B35.md => docs/method/backlog/DX_readme-install-section.md (72%) rename BACKLOG/B147.md => docs/method/backlog/DX_rfc-field-count-drift-detector.md (83%) rename BACKLOG/B34.md => docs/method/backlog/DX_security-sync-docs.md (79%) rename BACKLOG/B98.md => docs/method/backlog/DX_test-file-wildcard-ratchet.md (81%) rename BACKLOG/B54.md => docs/method/backlog/DX_typed-custom-zod-helper.md (81%) rename BACKLOG/B43.md => docs/method/backlog/DX_vitest-runtime-excludes.md (71%) rename BACKLOG/B79.md => docs/method/backlog/DX_warpgraph-constructor-lifecycle-docs.md (80%) rename BACKLOG/B76.md => docs/method/backlog/DX_warpgraph-invisible-api-docs.md (81%) rename BACKLOG/B123.md => docs/method/backlog/PERF_benchmark-budgets-ci-gate.md (78%) rename BACKLOG/OG-013-out-of-core-materialization-and-streaming-reads.md => 
docs/method/backlog/PERF_out-of-core-materialization.md (97%) rename BACKLOG/B27.md => docs/method/backlog/TRUST_keystore-prevalidated-cache.md (74%) rename BACKLOG/B7.md => docs/method/backlog/TRUST_property-based-fuzz-test.md (69%) rename BACKLOG/B20.md => docs/method/backlog/TRUST_record-round-trip-snapshot.md (68%) rename BACKLOG/B21.md => docs/method/backlog/TRUST_schema-discriminated-union.md (73%) rename BACKLOG/B16.md => docs/method/backlog/TRUST_unsigned-record-edge-cases.md (66%) rename BACKLOG/B104.md => docs/method/backlog/VIZ_mermaid-diagram-content-checklist.md (80%) rename BACKLOG/B101.md => docs/method/backlog/VIZ_mermaid-invisible-link-fragility.md (75%) rename BACKLOG/B88.md => docs/method/backlog/VIZ_mermaid-rendering-smoke-test.md (81%) rename BACKLOG/B171.md => docs/method/backlog/asap/DX_agent-code-audit.md (84%) rename BACKLOG/B181.md => docs/method/backlog/asap/DX_max-file-size-policy.md (96%) rename BACKLOG/B172.md => docs/method/backlog/asap/DX_restore-dot-notation.md (78%) rename BACKLOG/B174.md => docs/method/backlog/asap/DX_trailer-codec-dts.md (79%) rename BACKLOG/B173.md => docs/method/backlog/asap/PROTO_effectsink-breaking-change.md (84%) rename BACKLOG/B177.md => docs/method/backlog/asap/PROTO_warpkernel-port-cleanup.md (90%) rename BACKLOG/B176.md => docs/method/backlog/asap/PROTO_warpruntime-god-class.md (92%) rename BACKLOG/B179.md => docs/method/backlog/asap/TRUST_sync-auth-ed25519.md (93%) rename BACKLOG/B178.md => docs/method/backlog/asap/TUI_cli-dirname-fragility.md (91%) create mode 100644 docs/method/backlog/bad-code/DX_exact-optional-conditional-spread.md create mode 100644 docs/method/backlog/bad-code/DX_trailer-codec-type-poison.md create mode 100644 docs/method/backlog/bad-code/PERF_toposort-full-adjacency.md create mode 100644 docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md create mode 100644 docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md create mode 100644 
docs/method/backlog/bad-code/PROTO_patchbuilder-12-param-constructor.md create mode 100644 docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md create mode 100644 docs/method/backlog/bad-code/PROTO_strand-service-god-object.md create mode 100644 docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md create mode 100644 docs/method/backlog/bad-code/PROTO_warpserve-domain-infra-blur.md create mode 100644 docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md create mode 100644 docs/method/backlog/cool-ideas/DX_tsc-autofix-tool.md create mode 100644 docs/method/backlog/cool-ideas/PERF_encrypted-stores-fixed-chunking.md rename BACKLOG/B170.md => docs/method/backlog/cool-ideas/PERF_native-vs-wasm-roaring-benchmark.md (81%) create mode 100644 docs/method/backlog/cool-ideas/PERF_restore-buffer-guard.md create mode 100644 docs/method/backlog/cool-ideas/PERF_streaming-graph-traversal.md create mode 100644 docs/method/backlog/cool-ideas/PROTO_encrypted-trailer-rename.md create mode 100644 docs/method/backlog/cool-ideas/PROTO_writer-isolated-bisect.md create mode 100644 docs/method/backlog/cool-ideas/TRUST_per-writer-kek-wrapping.md create mode 100644 docs/method/backlog/cool-ideas/VIZ_graph-diff-transitive-reduction.md rename BACKLOG/B155.md => docs/method/backlog/cool-ideas/VIZ_levels-lightweight-layout.md (81%) rename BACKLOG/B156.md => docs/method/backlog/cool-ideas/VIZ_structural-diff-transitive-reduction.md (77%) rename BACKLOG/B4.md => docs/method/backlog/cool-ideas/VIZ_warp-ui-visualizer.md (81%) rename BACKLOG/OG-017-modular-type-declarations.md => docs/method/backlog/up-next/DX_modular-type-declarations.md (97%) rename BACKLOG/B175.md => docs/method/backlog/up-next/DX_observer-first-guide.md (95%) rename BACKLOG/B152.md => docs/method/backlog/up-next/PERF_async-generator-traversal.md (77%) rename BACKLOG/OG-009-playback-head-alignment.md => docs/method/backlog/up-next/PROTO_playback-head-alignment.md (79%) rename BACKLOG/B116.md => 
docs/method/backlog/up-next/PROTO_wire-format-migration-edgepropset.md (90%) create mode 100644 docs/method/process.md rename docs/{ => method}/release.md (100%) create mode 100644 docs/method/retro/0001-method-bootstrap/method-bootstrap.md diff --git a/BACKLOG/README.md b/BACKLOG/README.md deleted file mode 100644 index 59fb64f9..00000000 --- a/BACKLOG/README.md +++ /dev/null @@ -1,13 +0,0 @@ -# BACKLOG - -Items waiting to be pulled into a cycle. - -When an item is selected for work, move its file to `docs/design/cycles//` — the backlog file becomes the design doc for that cycle. - -## Process - -1. Item lives here as `B{number}.md` -2. Pull into cycle → move to `docs/design/cycles//B{number}.md` -3. Write failing tests as spec -4. Implement -5. Retrospective in `docs/archive/retrospectives/` diff --git a/BACKLOG/done/OG-001-worldline-api.md b/BACKLOG/done/OG-001-worldline-api.md deleted file mode 100644 index b06d3bab..00000000 --- a/BACKLOG/done/OG-001-worldline-api.md +++ /dev/null @@ -1,38 +0,0 @@ -# OG-001 — First-Class `Worldline` API - -Status: DONE - -Promoted to: `docs/design/worldline-observer-api-phasing.md` - -## Problem - -Read-side coordinates are still expressed indirectly through mutable -`WarpRuntime` session handles instead of a first-class history noun. - -## Why This Matters - -The observer rewrite is not complete until callers can target immutable history -through a proper `Worldline` API rather than by treating `WarpRuntime` as both a -session and a snapshot. - -## Promotion - -This item was promoted when the next slice began defining the public read-side -API shape after the detached observer-boundary repair work. 
- -## Outcome - -The minimal first-class `Worldline` surface landed on 2026-03-27: - -- `WarpRuntime.worldline()` now returns a worldline handle -- `Worldline.materialize()` resolves detached snapshots -- `Worldline.observer()` creates observers pinned to the worldline source -- `Worldline.seek()` returns a new worldline handle - -Further work on tick-indexed coordinates and richer worldline identity now -belongs to later slices rather than this initial noun-introduction item. - -See also: - -- `docs/design/worldline-observer-api-phasing.md` -- `docs/archive/retrospectives/2026-03-27-worldline-minimal-phase-b.md` diff --git a/BACKLOG/done/OG-002-warpgraph-role-split.md b/BACKLOG/done/OG-002-warpgraph-role-split.md deleted file mode 100644 index cd7461ba..00000000 --- a/BACKLOG/done/OG-002-warpgraph-role-split.md +++ /dev/null @@ -1,32 +0,0 @@ -# OG-002 — Split Mutable Session `WarpRuntime` From Immutable Snapshot Noun - -Status: DONE - -Promoted to: `docs/design/warpstate-runtime-noun-split.md` - -Completed in: `15.0.0` - -## Problem - -`WarpRuntime` had been carrying too many roles at once under the old -`WarpGraph` noun: mutable session handle, -materialization driver, and the intended immutable snapshot noun. - -## Why This Matters - -The new observer/worldline model will stay semantically muddy until the public -names make the substrate boundary obvious. - -## Promotion - -This item was promoted when `Worldline` and immutable observer/worldline -handles made the remaining `WarpGraph` noun overload the next explicit cleanup -decision. 
- -## Outcome - -The hard major-version cut landed: - -- public runtime noun is now `WarpRuntime` -- `WarpGraph` was removed instead of preserved as a compatibility alias -- package version bumped to `15.0.0` diff --git a/BACKLOG/done/OG-003-snapshot-immutability.md b/BACKLOG/done/OG-003-snapshot-immutability.md deleted file mode 100644 index cf450baf..00000000 --- a/BACKLOG/done/OG-003-snapshot-immutability.md +++ /dev/null @@ -1,41 +0,0 @@ -# OG-003 — Deepen Public Snapshot Immutability - -Status: DONE - -Promoted to: `docs/design/snapshot-immutability-hardening.md` - -Completed on: `2026-03-27` - -## Problem - -Public materialize APIs now return detached state, but nested `Map` structures -are still writable by callers in their local copy. - -## Why This Matters - -The current slice fixed aliasing, not full immutability. Snapshot hashing and -read-only semantics would be stronger if callers could not mutate the public -structure at all. - -## Promotion Trigger - -Promoted when the runtime rename was complete and the remaining read-side gap -was reduced to one concrete problem: detached snapshots still exposed mutable -nested containers. 
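The hardening this item asks for — detached snapshots whose nested containers reject mutation — can be sketched as a small recursive helper. This is an illustrative sketch only: the helper name and error wording are assumptions, not the repo's actual implementation.

```javascript
// Hypothetical sketch: recursively harden a snapshot value so nested
// Map/Set mutators throw and plain payload objects are frozen.
function freezeSnapshotValue(value) {
  if (value instanceof Map) {
    const hardened = new Map(
      [...value].map(([k, v]) => [k, freezeSnapshotValue(v)]),
    );
    for (const method of ['set', 'delete', 'clear']) {
      hardened[method] = () => {
        throw new TypeError(`Cannot call ${method}() on an immutable snapshot`);
      };
    }
    return hardened;
  }
  if (value instanceof Set) {
    const hardened = new Set([...value].map(freezeSnapshotValue));
    for (const method of ['add', 'delete', 'clear']) {
      hardened[method] = () => {
        throw new TypeError(`Cannot call ${method}() on an immutable snapshot`);
      };
    }
    return hardened;
  }
  if (value !== null && typeof value === 'object') {
    for (const key of Object.keys(value)) {
      value[key] = freezeSnapshotValue(value[key]);
    }
    return Object.freeze(value);
  }
  return value; // primitives pass through untouched
}
```

Reads (`get`, `has`, iteration) keep working through the prototype; only the mutators are shadowed by throwing own-properties.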
- -## Outcome - -This slice landed with one shared immutable-snapshot helper that now hardens: - -- `WarpRuntime.materialize(...)` -- `WarpRuntime.materializeCoordinate(...)` -- `WarpRuntime.materializeStrand(...)` -- `WarpRuntime.getStateSnapshot()` -- `Worldline.materialize()` - -The public snapshot contract is now stronger: - -- nested `Map` / `Set` mutators throw -- nested register payload objects are frozen -- detached snapshots no longer expose writable nested state through ordinary - caller operations diff --git a/BACKLOG/done/OG-004-observer-seek-contract.md b/BACKLOG/done/OG-004-observer-seek-contract.md deleted file mode 100644 index 53c58296..00000000 --- a/BACKLOG/done/OG-004-observer-seek-contract.md +++ /dev/null @@ -1,34 +0,0 @@ -# OG-004 — Canonical Immutable Observer Seek Contract - -Status: DONE - -Promoted to: `docs/design/worldline-observer-api-phasing.md` - -## Problem - -The preferred observer seek behavior is now clearer, but it is not yet enforced -as a first-class API contract. - -## Why This Matters - -If observer seeking mutates handles in place, the system will reintroduce the -same handle-instability that the read-boundary rewrite is removing. - -## Promotion - -This item was promoted when observer construction and immutable `seek()` -semantics became the next public read-side API slice. 
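The contract this item enforces — seeking returns a fresh handle instead of mutating the current one — reduces to a simple shape. A minimal sketch, with assumed constructor arguments and field names (the real `ObserverView` carries more state than this):

```javascript
// Illustrative sketch of the immutable seek contract (field names assumed).
class ObserverView {
  constructor(source, stateHash) {
    this.source = source;       // factual source metadata
    this.stateHash = stateHash; // pinned state hash
    Object.freeze(this);        // the handle itself never changes
  }

  seek(nextStateHash) {
    // Never mutates `this`; always hands back a new pinned observer.
    return new ObserverView(this.source, nextStateHash);
  }
}
```

Because every handle is frozen at construction, accidental in-place retargeting is impossible by design rather than by convention.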
- -## Outcome - -Phase A landed on 2026-03-27: - -- observers now expose factual `source` metadata -- observers now expose pinned `stateHash` -- `ObserverView.seek()` now returns a new observer rather than mutating the - current one - -See also: - -- `docs/design/worldline-observer-api-phasing.md` -- `docs/archive/retrospectives/2026-03-27-observer-seek-phase-a.md` diff --git a/BACKLOG/done/OG-005-detached-read-benchmarks.md b/BACKLOG/done/OG-005-detached-read-benchmarks.md deleted file mode 100644 index 2208955d..00000000 --- a/BACKLOG/done/OG-005-detached-read-benchmarks.md +++ /dev/null @@ -1,24 +0,0 @@ -# OG-005 — Benchmark Detached Coordinate And Strand Reads - -Status: DONE - -Promoted to: `docs/design/detached-read-benchmarks.md` - -Closed by: - -- `test/unit/benchmark/detachedReadBenchmark.fixture.test.js` -- `test/benchmark/DetachedReadBoundary.benchmark.js` -- `docs/archive/retrospectives/2026-03-27-detached-read-benchmarks.md` - -## Problem - -Detached read handles are safer, but their cost is not yet measured. - -## Why This Matters - -Before adding new caching layers or optimizing around detached reads, we should -know what the coordinate and strand read boundary actually costs. - -## Promotion Trigger - -Promoted when the detached-read performance slice began. diff --git a/BACKLOG/done/OG-006-read-api-doc-consistency.md b/BACKLOG/done/OG-006-read-api-doc-consistency.md deleted file mode 100644 index 4643068e..00000000 --- a/BACKLOG/done/OG-006-read-api-doc-consistency.md +++ /dev/null @@ -1,24 +0,0 @@ -# OG-006 — Remove Remaining Docs And Examples That Imply Caller Retargeting - -Status: DONE - -Promoted to: `docs/design/read-api-doc-consistency.md` - -Closed by: - -- `test/unit/scripts/read-api-doc-consistency.test.js` -- `docs/archive/retrospectives/2026-03-27-read-api-doc-consistency.md` - -## Problem - -Some docs and examples may still imply that `materializeCoordinate()` or -`materializeStrand()` retarget the caller graph instance. 
- -## Why This Matters - -Tests now encode the safer contract. The prose surface should stop teaching the -old semantics. - -## Promotion Trigger - -Promoted when the public read-surface documentation reconciliation pass began. diff --git a/BACKLOG/done/OG-007-hash-stability-coverage.md b/BACKLOG/done/OG-007-hash-stability-coverage.md deleted file mode 100644 index 9f8041d3..00000000 --- a/BACKLOG/done/OG-007-hash-stability-coverage.md +++ /dev/null @@ -1,25 +0,0 @@ -# OG-007 — Expand Hash-Stability Coverage Across Snapshot Flavors - -Status: DONE - -Promoted to: `docs/design/snapshot-hash-stability-coverage.md` - -Closed by: - -- `test/unit/domain/WarpRuntime.snapshotHashStability.test.js` -- `docs/archive/retrospectives/2026-03-27-snapshot-hash-stability-coverage.md` - -## Problem - -The read-boundary slice added detached snapshot behavior, but hash-stability -coverage is still incomplete across receipt-enabled and strand snapshots. - -## Why This Matters - -Hash-stable materialized state is a core requirement for immutable read-side -semantics. - -## Promotion Trigger - -Promoted when the next snapshot-integrity test pass began after detached reads, -runtime renaming, and immutable public snapshots had all landed. diff --git a/BACKLOG/done/OG-008-retargeting-compatibility.md b/BACKLOG/done/OG-008-retargeting-compatibility.md deleted file mode 100644 index bc9ca14c..00000000 --- a/BACKLOG/done/OG-008-retargeting-compatibility.md +++ /dev/null @@ -1,25 +0,0 @@ -# OG-008 — Compatibility And Deprecation Story For Retargeting Reads - -Status: DONE - -Completed in: `15.0.0` - -## Problem - -The public read semantics changed. Callers that depended on retargeting needed -an explicit decision about whether the old surface would linger as an alias or -be removed cleanly. - -## Why This Matters - -Breaking API changes are acceptable here, but they should still be explicit and -traceable. 
- -## Promotion Trigger - -This item resolved as a hard major-version cut: - -- detached read semantics already removed the old retargeting contract -- the runtime noun was renamed from `WarpGraph` to `WarpRuntime` -- no compatibility alias was kept -- the release version moved to `15.0.0` diff --git a/BACKLOG/done/OG-010-public-api-design-thinking.md b/BACKLOG/done/OG-010-public-api-design-thinking.md deleted file mode 100644 index adbde7fa..00000000 --- a/BACKLOG/done/OG-010-public-api-design-thinking.md +++ /dev/null @@ -1,65 +0,0 @@ -# OG-010 — IBM Design Thinking Pass Over Public APIs And README - -Status: DONE - -## Problem - -Multiple higher-layer apps have repeated the same misuse pattern on top of -`git-warp`: - -- materialize too much graph history into app memory -- write app-local graph read logic -- write app-local traversal logic -- treat whole-graph enumeration as a normal product read path - -This is no longer just an application mistake. It is evidence that the -`git-warp` public surface and docs do not teach the right read discipline -strongly enough. - -## Why This Matters - -The substrate now has much better semantics than it had before: - -- pinned read handles -- detached immutable snapshots -- `Worldline` -- `Observer` -- strand read boundaries - -But the public API and README still need a product-design pass so the right path -is easier to discover than the wrong one. - -This cycle must consider two sponsor perspectives equally: - -- sponsor human: an application developer trying to build a real product on - top of `git-warp` -- sponsor agent: a coding agent trying to use `git-warp` without rebuilding a - second graph engine above it - -This cycle must also remain honest to a third tooling/debugger sponsor: - -- sponsor tooling: a TTD or provenance/debugger consumer that needs explicit - replay, provenance, comparison, and multi-lane playback truth - -If the public surface serves one and confuses the others, it is not good -enough. 
- -## Intended Questions For The Cycle - -- Which APIs are inspection/debug APIs versus product hot-path APIs? -- How should the README teach read discipline, not just raw capability? -- What cost-signaling is missing from the current surface? -- What task-shaped read examples should exist for both humans and agents? -- What public read helpers would let higher layers ask questions instead of - rebuilding graph logic locally? -- Which features are primary WARP product value versus core/tooling truth? -- Where should multi-lane playback coordination such as `PlaybackHead` live? - -## Promotion - -Promoted to: - -- [docs/design/public-api-design-thinking.md](../docs/design/public-api-design-thinking.md) - -This item now tracks the active cycle kickoff for the IBM Design Thinking pass -over the `git-warp` public API and README. diff --git a/BACKLOG/done/OG-012-documentation-corpus-audit.md b/BACKLOG/done/OG-012-documentation-corpus-audit.md deleted file mode 100644 index 03a8314d..00000000 --- a/BACKLOG/done/OG-012-documentation-corpus-audit.md +++ /dev/null @@ -1,44 +0,0 @@ -# OG-012 — Audit And Reconcile The Documentation Corpus Before v15 - -Status: DONE - -## Problem - -The repo's documentation corpus has grown organically across multiple release -tranches. - -That left three different doc classes mixed together in the same visible -surface: - -- current user-facing docs -- historical design / milestone / runbook material -- superseded or one-off artifacts that still look "live" because they sit at - the top of `docs/` - -Before `v15.0.0`, the docs set needs to become intentional. 
- -## Why This Matters - -If the repository does not make it clear which docs are canonical, both humans -and agents will read the wrong thing: - -- app builders will learn outdated nouns or workflows -- agentic consumers will infer the wrong public API surface -- maintainers will keep accreting new docs into an already muddy structure - -This is a release-quality problem, not just housekeeping. - -## Desired Outcome - -- define the canonical documentation set for `v15` -- separate live docs from archived/historical material -- remove obvious trash from the repo surface -- make the docs taxonomy explicit in-repo -- add executable checks so the corpus does not drift back into a pile - -## Promotion - -Promoted to: - -- [docs/design/documentation-corpus-audit.md](../docs/design/documentation-corpus-audit.md) -- [docs/design/architecture-and-cli-guide-rewrite.md](../docs/design/architecture-and-cli-guide-rewrite.md) diff --git a/BACKLOG/done/OG-014-streaming-content-attachments.md b/BACKLOG/done/OG-014-streaming-content-attachments.md deleted file mode 100644 index c8f25bd7..00000000 --- a/BACKLOG/done/OG-014-streaming-content-attachments.md +++ /dev/null @@ -1,114 +0,0 @@ -# OG-014 — Mandatory CAS blob storage with streaming I/O - -Status: DONE - -Legend: Observer Geometry - -Design doc: `docs/design/streaming-cas-blob-storage.md` - -## Problem - -Content blob attachments in `git-warp` have two structural problems: - -### 1. CAS blob storage is opt-in - -`attachContent()` and `attachEdgeContent()` accept an optional `blobStorage` -injection. When callers do not provide it, blobs fall through to raw -`persistence.writeBlob()` — a single unchunked Git object with no CDC -deduplication, no encryption support, and no streaming restore path. - -This means the substrate's chunking, deduplication, and encryption capabilities -are present but silently bypassed by default. There is no good reason for a -content blob to skip CAS. Every blob should be chunked. - -### 2. 
Neither the write nor the read path supports streaming - -**Write path**: `attachContent(nodeId, content)` accepts `Uint8Array | string`. -The caller must buffer the entire payload in memory before handing it to the -patch builder. `CasBlobAdapter.store()` then wraps that buffer in -`Readable.from([buf])` — a synthetic stream from an already-buffered payload. - -**Read path**: `getContent(nodeId)` returns `Promise<Uint8Array>`. The -full blob is materialized into memory before the caller can process it. -`CasBlobAdapter.retrieve()` calls `cas.restore()` which buffers internally. - -`git-cas` already supports streaming on both sides: -- `cas.store({ source })` accepts any readable/iterable source -- `cas.restoreStream()` returns `AsyncIterable<Uint8Array>` - -The streaming substrate is there. It is not expressed through the public API. - -## Why this matters - -WARP graphs can carry attached documents, media, model weights, and other -payloads that are legitimately large. The API should not force full in-memory -buffering on either side of the I/O boundary.
- -- Callers writing large content should be able to pipe a stream in -- Callers reading large content should be able to consume it incrementally -- Every blob should get CDC chunking and deduplication as a substrate guarantee -- The decision between buffered and streaming I/O should belong to the caller - -## Current state - -As of `v15.0.1`: - -- `BlobStoragePort`: `store(content, options) → Promise`, - `retrieve(oid) → Promise` — both buffered -- `CasBlobAdapter`: fully implemented CAS adapter with CDC chunking, optional - encryption, backward-compat fallback to raw Git blobs — but only buffered I/O -- `CasBlobAdapter` is internal (not exported from `index.js`) -- `PatchBuilderV2.attachContent()`: accepts `Uint8Array | string`, uses - `blobStorage.store()` if injected, else raw `persistence.writeBlob()` -- `getContent()` / `getEdgeContent()`: returns `Promise`, - uses `blobStorage.retrieve()` if injected, else raw `persistence.readBlob()` -- `WarpApp` and `WarpCore` do not expose content read methods at all -- `git-cas` streaming (`restoreStream()`) is already used in - `CasSeekCacheAdapter` but not in blob reads -- `InMemoryGraphAdapter` has `writeBlob()`/`readBlob()` for browser/test path - -## Desired outcome - -1. CAS blob storage is mandatory — no fallback to raw `writeBlob()` for content -2. Write path accepts streaming input and pipes through without buffering -3. Read path returns a stream the caller can consume incrementally -4. Buffered convenience methods remain available, layered on top of streams -5. Browser and in-memory paths still work via a conforming adapter -6. Legacy raw Git blob attachments remain readable for backward compatibility - -## Acceptance criteria - -1. Every content blob written through `attachContent()` / `attachEdgeContent()` - goes through `BlobStoragePort` — no raw `persistence.writeBlob()` fallback. -2. 
`attachContent()` / `attachEdgeContent()` accept streaming input - (`AsyncIterable<Uint8Array>`, `ReadableStream`, `Uint8Array`, `string`). -3. New `getContentStream()` / `getEdgeContentStream()` return - `AsyncIterable<Uint8Array>` for incremental consumption. -4. Existing `getContent()` / `getEdgeContent()` remain as buffered convenience, - implemented on top of the stream primitive. -5. `BlobStoragePort` grows `storeStream()` and `retrieveStream()` methods. -6. `CasBlobAdapter` implements streaming via `git-cas` natively. -7. An `InMemoryBlobStorageAdapter` implements the port contract for browser and - test paths. -8. Legacy raw Git blob attachments remain readable through backward-compat - fallback in `CasBlobAdapter.retrieveStream()`. -9. Content stream methods are exposed on `WarpApp` and `WarpCore`. - -## Non-goals - -- No automatic migration of existing raw Git blobs to CAS format -- No silent breaking change to existing `getContent()` / `getEdgeContent()` - return types -- No attempt to solve whole-state out-of-core replay (that is OG-013) -- No encryption-by-default (encryption remains an opt-in CAS capability) - -## Notes - -This item supersedes the original OG-014 scope, which covered only streaming -reads. The expanded scope now includes mandatory CAS and streaming writes.
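The buffered-over-streaming layering this item proposes amounts to collecting an async iterable of chunks into one buffer. A sketch under assumed names — `collectStream` and the `getContentStream` call are illustrative, not the shipped API:

```javascript
// Collect an AsyncIterable of Uint8Array chunks into a single buffer.
async function collectStream(chunks) {
  const parts = [];
  let total = 0;
  for await (const chunk of chunks) {
    parts.push(chunk);
    total += chunk.length;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const part of parts) {
    out.set(part, offset);
    offset += part.length;
  }
  return out;
}

// The buffered convenience then stays one line deep:
//   const bytes = await collectStream(runtime.getContentStream(nodeId));
```

Keeping the stream as the primitive means the buffering decision belongs to the caller, which is exactly the boundary this item argues for.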
- -Related items: -- `OG-013`: out-of-core materialization and streaming reads (broader, separate) -- `B160`: blob attachments via CAS (done, but opt-in — this item makes it - mandatory) -- `B163`: streaming restore for seek cache (done, pattern to follow for blobs) diff --git a/BACKLOG/done/OG-015-jsr-documentation-quality.md b/BACKLOG/done/OG-015-jsr-documentation-quality.md deleted file mode 100644 index 0e5533fd..00000000 --- a/BACKLOG/done/OG-015-jsr-documentation-quality.md +++ /dev/null @@ -1,104 +0,0 @@ -# OG-015 — Raise JSR documentation quality score - -Status: DONE - -Legend: Observer Geometry - -## Problem - -`v15.0.1` fixed the release-surface problems that made npm and JSR publish too -much internal material, and it fixed JSR's `No slow types are used` warning. - -But JSR still reports that only about 67% of exported symbols are documented. -That means the package is now publishable and structurally cleaner, while still -leaving too much of the public surface under-documented in IDE hovers and JSR's -generated API docs. - -This is a real quality gap for a package whose public surface is now explicitly -split into: - -- `WarpApp` -- `WarpCore` -- `Worldline` -- `Aperture` -- `Observer` -- `Strand` - -If those nouns and their surrounding helpers are not documented consistently, -the docs pipeline and the type surface drift apart again. - -## Why this matters - -The repo now has a much stronger user-facing documentation pipeline, but JSR -and editor hovers are part of the real product surface too. - -Improving symbol docs would: - -- increase discoverability for builders reading the API from their editor -- make JSR-generated reference pages more useful -- reduce the need to jump from code completion into source files -- keep the public noun cuts (`WarpApp`, `WarpCore`, `Strand`, `Aperture`, etc.) 
- legible at the type level -- reinforce the builder-first documentation posture established in `v15` - -## Current state - -As of `v15.0.1`: - -- JSR dry-run passes -- the slow-type warning is resolved via self-type bindings on JavaScript - entrypoints -- the package README and module docs are present -- many exported symbols still lack symbol-level doc comments -- some public type descriptions are accurate but too terse to be useful as - standalone hover docs - -## Desired outcome - -Raise the documentation quality of the exported public surface until the -generated reference feels intentional rather than incidental. - -Likely shape: - -- audit the exported symbols in `index.d.ts` -- add or improve doc comments for major public classes, methods, and types -- prioritize the main builder and tooling entrypoints first - - `WarpApp` - - `WarpCore` - - `Worldline` - - `Observer` - - `Aperture` - - `Strand` - - writer / patch APIs - - query / traversal result shapes -- tighten module docs on secondary entrypoints where needed -- re-run `jsr publish --dry-run` until documentation coverage crosses the JSR - threshold and the generated output reads cleanly - -## Acceptance criteria - -1. JSR documentation coverage rises above the current failing threshold. -2. Major public symbols have meaningful hover docs, not placeholder prose. -3. Public docs and type-surface docs use the same nouns and conceptual model. -4. Secondary entrypoints keep valid module docs. -5. New doc comments stay builder-first and do not reintroduce paper-heavy - framing into the main API reference surface. - -## Non-goals - -- rewriting the entire docs site or public guide corpus again -- documenting private or internal-only helpers as if they were public API -- treating JSR score-chasing as more important than accurate public semantics - -## Notes - -This item is specifically about public type-surface and JSR documentation -quality. 
- -It is related to, but separate from: - -- `OG-011-public-api-catalog-and-playground.md` -- `OG-012-documentation-corpus-audit.md` - -Those items are about broader documentation architecture. -This item is about the publish-time API documentation quality bar. diff --git a/BACKLOG/done/OG-016-retrospective-archive-cleanup.md b/BACKLOG/done/OG-016-retrospective-archive-cleanup.md deleted file mode 100644 index 593b706f..00000000 --- a/BACKLOG/done/OG-016-retrospective-archive-cleanup.md +++ /dev/null @@ -1,40 +0,0 @@ -# OG-016 — Archive retrospective clutter - -Status: DONE - -Legend: Observer Geometry - -## Problem - -The `docs/` tree contains dozens of retrospective files -(`docs/archive/retrospectives/2026-03-28-...`, design doc retros, etc.) that are -valuable for the team but clutter the documentation surface visible to -external contributors and evaluators. - -The editor's report (2026-03-29) flagged this as the primary drag on -document cohesion (8/10 → could be 10/10). - -## Desired outcome - -Move retrospective and historical audit files into a dedicated archive -path so the `docs/` tree shows only active, forward-looking documentation. - -Likely shape: - -- `docs/archive/retrospectives/` for retrospective files -- `docs/archive/audits/` for historical audit transcripts (already partially - exists) -- Update any cross-references that point into moved paths -- Keep design doc retros (`.retro.md`) co-located with their design docs — - those are part of the active design record, not archive clutter - -## Acceptance criteria - -1. `docs/` top-level listing is clean and forward-looking. -2. No broken cross-references after the move. -3. Historical files remain reachable via archive path. - -## Non-goals - -- No content edits to the retrospective files themselves. -- No deletion of any retrospective — they all stay in the repo. 
diff --git a/CHANGELOG.md index fea71ea8..bfb5821e 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -9,7 +9,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ### Changed -- **Backlog restructured** — migrated all incomplete ROADMAP items to individual `BACKLOG/B{number}.md` files. Dropped milestone structure in favor of flat backlog. Completed OG-items moved to `BACKLOG/done/`. `docs/ROADMAP.md` retained for reference with migration notice. +- **The Method** — introduced `METHOD.md` as the development process framework. Filesystem-native backlog (`docs/method/backlog/`) with lane directories (`inbox/`, `asap/`, `up-next/`, `cool-ideas/`, `bad-code/`). Legend-prefixed filenames (`PROTO_`, `TRUST_`, `VIZ_`, `TUI_`, `DX_`, `PERF_`). Sequential cycle numbering (`docs/design/<cycle>/`). Dual-audience design docs (sponsor human + sponsor agent). Replaced B-number system entirely. +- **Backlog migration** — all 49 B-number and OG items migrated from `BACKLOG/` to `docs/method/backlog/` lanes. Tech debt journal (`.claude/bad_code.md`) split into 10 individual files in `bad-code/`. Cool ideas journal split into 13 individual files in `cool-ideas/`. `docs/release.md` moved to `docs/method/release.md`. `BACKLOG/` directory removed. - **Zero-error TypeScript campaign complete** — eliminated all 1,707 `tsc --noEmit` errors across 271 files. Mechanical TS4111 bracket-access sweep (614), null guards for `noUncheckedIndexedAccess`, conditional spreads for `exactOptionalPropertyTypes`, unused variable removal. All 8 pre-push IRONCLAD gates now pass. - **JoinReducer OpStrategy registry** — replaced five triplicated switch statements over 8 canonical op types with a frozen `Map` registry. Each strategy defines `mutate`, `outcome`, `snapshot`, `accumulate`, `validate`. Adding a new op type without all five methods is a hard error at module load time.
Cross-path equivalence tests verify `applyFast`, `applyWithReceipt`, and `applyWithDiff` produce identical CRDT state. diff --git a/METHOD.md new file mode 100644 index 00000000..942dcb2e --- /dev/null +++ b/METHOD.md @@ -0,0 +1,227 @@ +# METHOD + +A backlog, a loop, and honest bookkeeping. + +## Principles + +The agent and the human sit at the same table. They see different +things. Both are named in every design. Both must agree before work +ships. Default to building the agent surface first — it is the +foundation the human experience stands on. If the work is +human-first exploratory design, say so in the design doc. + +Everything traces to a playback question. If you cannot say which +question your work answers, you are drifting. Stop. Reconnect to +the design, or change it. + +Tests are the executable spec. Design names the hill and the playback +questions. Tests prove the answers. No ceremonial layer between +intent and proof. + +The filesystem is the database. A directory is a decision context. A +filename is an identity. Moving a file is a decision. `ls` is the +query. + +Process should be calm. No sprints. No velocity. No burndown. A +backlog tiered by judgment, and a loop for doing it well. + +## Structure + +```text +docs/ + method/ + backlog/ + inbox/ raw ideas, anyone, anytime + asap/ do this now + up-next/ do this soon + cool-ideas/ experiments, wild thoughts + bad-code/ tech debt + *.md shaped work not in a named lane + legends/ named domains + retro/<cycle>/<name>.md retrospectives + graveyard/ rejected ideas + process.md how cycles run + release.md how releases work + design/ + <cycle>/<name>.md cycle design docs + *.md living documents +``` + +Signpost documents live at root or one level into `docs/`. They use +`ALL_CAPS.md`. Deeper than that, they are not signposts. + +## Backlog + +Markdown files. Each describes work worth doing. The filesystem is +the index. + +### Inbox + +Anyone — human or agent — drops ideas in at any time. A sentence is
No legend, no scope, no ceremony. Capture it. Keep moving. +The inbox is processed during maintenance. + +### Lanes + +- **`inbox/`** — unprocessed +- **`asap/`** — pull into a cycle soon +- **`up-next/`** — next in line +- **`cool-ideas/`** — not commitments +- **`bad-code/`** — it works, but it bothers you + +Anything else sits in the backlog root. The backlog root holds shaped +work that matters, but does not currently belong in a named lane. + +### Naming + +Legend prefix if applicable. No numeric IDs. + +``` +VIZ_braille-rendering.md +PROTO_strand-lifecycle.md +debt-trailer-codec-dts.md +``` + +### Promoting + +Pulled into a cycle, a backlog item becomes a design doc: + +``` +backlog/asap/PROTO_strand-lifecycle.md + → design//strand-lifecycle.md +``` + +The backlog file is removed. + +### Commitment + +Pull it and you own it. It does not go back. + +- **Finish** — hill met +- **Pivot** — end early, write the retro. Remaining work re-enters + the backlog as a new item + +### Maintenance + +End of cycle: + +- Process inbox. Promote, flesh out, or bury. +- Re-prioritize. What you learned changes what matters. +- Clean up. Merge duplicates, kill the dead. + +Do not reorganize mid-cycle. + +### Cycle types + +Same loop regardless: + +- **Feature** — design, test, build, ship +- **Design** — the deliverable is docs, not code +- **Debt** — pull from `bad-code/`. The hill is "this no longer + bothers us" + +## Legends + +A named domain that spans many cycles. Each legend describes what it +covers, who cares, what success looks like, and how you know. + +Legends do not start or finish. They are reference frames. + +A legend code (`VIZ`, `PROTO`, `TUI`) prefixes backlog filenames. + +## Cycles + +A unit of shipped work. Design, implementation, retrospective. +Numbered sequentially. + +Cycle directories use `/`, for example +`0010-strand-speculation/`. + +### The loop + +0. **Pull** — choose. Move it. Committed. + +1. **Design** — write a design doc in `docs/design//`. 
+ - Sponsor human + - Sponsor agent + - Hill + - Playback questions — yes/no, both perspectives. Write them + first. + - Non-goals + +2. **RED** — write failing tests. Playback questions become specs. + Default to agent surface first. + +3. **GREEN** — make them pass. + +4. **Playback** — produce a witness. The agent answers agent + questions. The human answers user questions. Write it down. The + witness is the concrete artifact — test output, transcript, + screenshot, recording — that shows both answers. No clear yes + means no. + +5. **PR → main** — review until merge. + +6. **Close** — merge. Retro in `docs/method/retro/<cycle>/`. + - Drift check (mandatory). Undocumented drift is the only + failure. + - New debt to `bad-code/`. + - Cool ideas to `cool-ideas/`. + - Backlog maintenance. + + Releases happen when externally meaningful behavior changes. + Update CHANGELOG when externally visible behavior changed. + Update README when usage, interfaces, or operator understanding + changed. + +### Outcomes + +- **Hill met** — merge, close +- **Partial** — merge what is honest. Retro explains the gap +- **Not met** — cycle still concludes. Write the retro + +A failed cycle with a good retro beats a successful one with no +learnings. + +Every cycle ends with a retro. Success is not required. + +## Graveyard + +Rejected work moves to `docs/method/graveyard/` with a note. The +graveyard prevents re-proposing without context. + +## Flow + +```text +idea + → inbox/ + → triage during maintenance + → graveyard/ + → cool-ideas/ + → backlog root + → up-next/ + → asap/ + → design/<cycle>/ (committed) + → RED + → GREEN + → playback (witness) + → retro/<cycle>/ + → release (when meaningful) +``` + +## What this system does not have + +No milestones. No velocity. No ticket numbers. + +The backlog is tiered by lane. Choice within a lane is judgment at
+ +## Naming + +| Convention | Example | When | +|---|---|---| +| `ALL_CAPS.md` | `VISION.md` | Signpost — root or `docs/` | +| `lowercase.md` | `doctrine.md` | Everything else | +| `_.md` | `VIZ_braille.md` | Backlog with legend | +| `.md` | `debt-trailer-codec.md` | Backlog without legend | +| `/` | `0010-strand-speculation/` | Cycle directory | diff --git a/docs/ROADMAP.md b/docs/ROADMAP.md index 1c781491..09f5f1ff 100644 --- a/docs/ROADMAP.md +++ b/docs/ROADMAP.md @@ -1,6 +1,7 @@ # ROADMAP — @git-stunts/git-warp -> **MIGRATED:** All incomplete items have been migrated to individual files in `BACKLOG/`. +> **MIGRATED:** All incomplete items have been migrated to `docs/method/backlog/`. +> See [METHOD.md](/METHOD.md) for the current process. > Completed items remain in `docs/ROADMAP/COMPLETED.md`. This file is kept for reference only. > **Current release on `main`:** v16.0.0 diff --git a/docs/design/0001-method-bootstrap/method-bootstrap.md b/docs/design/0001-method-bootstrap/method-bootstrap.md new file mode 100644 index 00000000..29b49214 --- /dev/null +++ b/docs/design/0001-method-bootstrap/method-bootstrap.md @@ -0,0 +1,49 @@ +# Method Bootstrap + +**Cycle:** 0001-method-bootstrap +**Type:** Design +**Pulled from:** User direction (2026-04-01) + +## Sponsor human + +James — wants a calm, filesystem-native process that survives +context switches and makes both agent and human work legible. + +## Sponsor agent + +Claude — needs unambiguous structure to find work, classify it, +and operate without asking "where does this go?" + +## Hill + +The Method directory structure exists, all existing backlog items +live in it under descriptive names, and the old B-number system is +gone. From this point forward, `ls docs/method/backlog/` is the +only backlog query. + +## Playback questions + +### Agent + +- Can I find the next piece of work by running `ls` on a lane + directory? **YES/NO** +- Can I classify a new idea into the right lane without asking the + human? 
**YES/NO** +- Do any B-numbers remain in the repo? **NO** + +### Human + +- Does `ls docs/method/backlog/asap/` show me what matters most? + **YES/NO** +- Can I understand what each backlog item is from its filename + alone? **YES/NO** +- Is the old BACKLOG/ directory gone? **YES/NO** + +## Non-goals + +- Defining all legends upfront. Legends emerge from work. +- Migrating design docs or retrospectives — they stay where they + are. The Method structure is forward-looking. +- Writing process.md or release.md content beyond moving existing + docs into place. +- Code changes of any kind. diff --git a/BACKLOG/B102.md b/docs/method/backlog/DX_api-examples-review-checklist.md similarity index 78% rename from BACKLOG/B102.md rename to docs/method/backlog/DX_api-examples-review-checklist.md index fa7ea9b4..6068e396 100644 --- a/BACKLOG/B102.md +++ b/docs/method/backlog/DX_api-examples-review-checklist.md @@ -1,7 +1,6 @@ -# B102 — API Examples Review Checklist +# API Examples Review Checklist **Effort:** S -**Origin:** B-DOC-3 ## Problem diff --git a/BACKLOG/B169.md b/docs/method/backlog/DX_archived-doc-status-guardrail.md similarity index 78% rename from BACKLOG/B169.md rename to docs/method/backlog/DX_archived-doc-status-guardrail.md index 2537dbec..807835c7 100644 --- a/BACKLOG/B169.md +++ b/docs/method/backlog/DX_archived-doc-status-guardrail.md @@ -1,7 +1,6 @@ -# B169 — Archived Doc Status Guardrail +# Archived Doc Status Guardrail **Effort:** XS -**Origin:** PR #66 review follow-up ## Problem diff --git a/BACKLOG/B103.md b/docs/method/backlog/DX_batch-review-fix-commits.md similarity index 79% rename from BACKLOG/B103.md rename to docs/method/backlog/DX_batch-review-fix-commits.md index d6f85697..aae9cd75 100644 --- a/BACKLOG/B103.md +++ b/docs/method/backlog/DX_batch-review-fix-commits.md @@ -1,7 +1,6 @@ -# B103 — Batch Review Fix Commits +# Batch Review Fix Commits **Effort:** XS -**Origin:** B-DX-2 ## Problem diff --git a/BACKLOG/OG-018-browser-guide.md 
b/docs/method/backlog/DX_browser-guide.md similarity index 96% rename from BACKLOG/OG-018-browser-guide.md rename to docs/method/backlog/DX_browser-guide.md index 6e13e910..f63fba6e 100644 --- a/BACKLOG/OG-018-browser-guide.md +++ b/docs/method/backlog/DX_browser-guide.md @@ -1,6 +1,5 @@ -# OG-018 — Browser guide and storage adapter documentation +# Browser guide and storage adapter documentation -Status: QUEUED Legend: Observer Geometry diff --git a/BACKLOG/B96.md b/docs/method/backlog/DX_consumer-test-type-import-coverage.md similarity index 82% rename from BACKLOG/B96.md rename to docs/method/backlog/DX_consumer-test-type-import-coverage.md index 55acf1b3..a1ccd8b3 100644 --- a/BACKLOG/B96.md +++ b/docs/method/backlog/DX_consumer-test-type-import-coverage.md @@ -1,7 +1,6 @@ -# B96 — Consumer Test Type-Only Import Coverage +# Consumer Test Type-Only Import Coverage **Effort:** M -**Origin:** B-TYPE-1 ## Problem diff --git a/BACKLOG/B129.md b/docs/method/backlog/DX_contributor-review-hygiene-guide.md similarity index 72% rename from BACKLOG/B129.md rename to docs/method/backlog/DX_contributor-review-hygiene-guide.md index 5e11ed9d..d0ec2a5b 100644 --- a/BACKLOG/B129.md +++ b/docs/method/backlog/DX_contributor-review-hygiene-guide.md @@ -1,7 +1,6 @@ -# B129 — Contributor Review-Loop Hygiene Guide +# Contributor Review-Loop Hygiene Guide **Effort:** S -**Origin:** BACKLOG 2026-02-27 ## Problem diff --git a/BACKLOG/B127.md b/docs/method/backlog/DX_deno-smoke-test.md similarity index 71% rename from BACKLOG/B127.md rename to docs/method/backlog/DX_deno-smoke-test.md index ad30d1b7..a8ae37d2 100644 --- a/BACKLOG/B127.md +++ b/docs/method/backlog/DX_deno-smoke-test.md @@ -1,7 +1,6 @@ -# B127 — Deno Smoke Test +# Deno Smoke Test **Effort:** S -**Origin:** BACKLOG 2026-02-25 ## Problem diff --git a/BACKLOG/B128.md b/docs/method/backlog/DX_docs-consistency-preflight.md similarity index 81% rename from BACKLOG/B128.md rename to 
docs/method/backlog/DX_docs-consistency-preflight.md index 6f1fb193..71c29909 100644 --- a/BACKLOG/B128.md +++ b/docs/method/backlog/DX_docs-consistency-preflight.md @@ -1,7 +1,6 @@ -# B128 — Docs Consistency Preflight +# Docs Consistency Preflight **Effort:** S -**Origin:** BACKLOG 2026-02-28 ## Problem diff --git a/BACKLOG/B12.md b/docs/method/backlog/DX_docs-version-sync-precommit.md similarity index 71% rename from BACKLOG/B12.md rename to docs/method/backlog/DX_docs-version-sync-precommit.md index 328b0792..049f9a03 100644 --- a/BACKLOG/B12.md +++ b/docs/method/backlog/DX_docs-version-sync-precommit.md @@ -1,7 +1,6 @@ -# B12 — Docs-Version-Sync Pre-Commit Check +# Docs-Version-Sync Pre-Commit Check **Effort:** S -**Origin:** ROADMAP standalone ## Problem diff --git a/BACKLOG/B53.md b/docs/method/backlog/DX_jsr-publish-deno-panic.md similarity index 75% rename from BACKLOG/B53.md rename to docs/method/backlog/DX_jsr-publish-deno-panic.md index 2ba8bbc8..08e1956d 100644 --- a/BACKLOG/B53.md +++ b/docs/method/backlog/DX_jsr-publish-deno-panic.md @@ -1,7 +1,6 @@ -# B53 — Fix JSR Publish Dry-Run Deno Panic +# Fix JSR Publish Dry-Run Deno Panic **Effort:** M -**Origin:** ROADMAP standalone (Platform) ## Problem diff --git a/BACKLOG/B119.md b/docs/method/backlog/DX_pr-ready-merge-cli.md similarity index 79% rename from BACKLOG/B119.md rename to docs/method/backlog/DX_pr-ready-merge-cli.md index b956fd90..975a6995 100644 --- a/BACKLOG/B119.md +++ b/docs/method/backlog/DX_pr-ready-merge-cli.md @@ -1,7 +1,6 @@ -# B119 — `scripts/pr-ready` Merge-Readiness CLI +# `scripts/pr-ready` Merge-Readiness CLI **Effort:** M -**Origin:** BACKLOG 2026-02-27/28 ## Problem diff --git a/BACKLOG/OG-011-public-api-catalog-and-playground.md b/docs/method/backlog/DX_public-api-catalog-playground.md similarity index 96% rename from BACKLOG/OG-011-public-api-catalog-and-playground.md rename to docs/method/backlog/DX_public-api-catalog-playground.md index d4fc479e..2ef37cb5 100644 --- 
a/BACKLOG/OG-011-public-api-catalog-and-playground.md +++ b/docs/method/backlog/DX_public-api-catalog-playground.md @@ -1,6 +1,5 @@ -# OG-011 — Public API Catalog And Browser Documentation Playground +# Public API Catalog And Browser Documentation Playground -Status: QUEUED ## Why diff --git a/BACKLOG/B28.md b/docs/method/backlog/DX_pure-typescript-example-app.md similarity index 74% rename from BACKLOG/B28.md rename to docs/method/backlog/DX_pure-typescript-example-app.md index ce82b11a..0cc396bb 100644 --- a/BACKLOG/B28.md +++ b/docs/method/backlog/DX_pure-typescript-example-app.md @@ -1,7 +1,6 @@ -# B28 — Pure TypeScript Example App +# Pure TypeScript Example App **Effort:** M -**Origin:** ROADMAP standalone ## Problem diff --git a/BACKLOG/B35.md b/docs/method/backlog/DX_readme-install-section.md similarity index 72% rename from BACKLOG/B35.md rename to docs/method/backlog/DX_readme-install-section.md index 36b3aeee..dd66ec34 100644 --- a/BACKLOG/B35.md +++ b/docs/method/backlog/DX_readme-install-section.md @@ -1,7 +1,6 @@ -# B35 — Docs: README Install Section +# Docs: README Install Section **Effort:** S -**Origin:** ROADMAP standalone (P6 Docs) ## Problem diff --git a/BACKLOG/B147.md b/docs/method/backlog/DX_rfc-field-count-drift-detector.md similarity index 83% rename from BACKLOG/B147.md rename to docs/method/backlog/DX_rfc-field-count-drift-detector.md index 1a29d0bb..54ff6ce6 100644 --- a/BACKLOG/B147.md +++ b/docs/method/backlog/DX_rfc-field-count-drift-detector.md @@ -1,7 +1,6 @@ -# B147 — RFC Field Count Drift Detector +# RFC Field Count Drift Detector **Effort:** S -**Origin:** B145 PR review ## Problem diff --git a/BACKLOG/B34.md b/docs/method/backlog/DX_security-sync-docs.md similarity index 79% rename from BACKLOG/B34.md rename to docs/method/backlog/DX_security-sync-docs.md index c76c0475..8ebe70db 100644 --- a/BACKLOG/B34.md +++ b/docs/method/backlog/DX_security-sync-docs.md @@ -1,7 +1,6 @@ -# B34 — Docs: SECURITY_SYNC.md +# Docs: SECURITY_SYNC.md 
**Effort:** M -**Origin:** ROADMAP standalone (P6 Docs) ## Problem diff --git a/BACKLOG/B98.md b/docs/method/backlog/DX_test-file-wildcard-ratchet.md similarity index 81% rename from BACKLOG/B98.md rename to docs/method/backlog/DX_test-file-wildcard-ratchet.md index 21440703..e29316f6 100644 --- a/BACKLOG/B98.md +++ b/docs/method/backlog/DX_test-file-wildcard-ratchet.md @@ -1,7 +1,6 @@ -# B98 — Test-File Wildcard Ratchet +# Test-File Wildcard Ratchet **Effort:** S -**Origin:** B-TYPE-3 ## Problem diff --git a/BACKLOG/B54.md b/docs/method/backlog/DX_typed-custom-zod-helper.md similarity index 81% rename from BACKLOG/B54.md rename to docs/method/backlog/DX_typed-custom-zod-helper.md index 8300cadc..6c77d21b 100644 --- a/BACKLOG/B54.md +++ b/docs/method/backlog/DX_typed-custom-zod-helper.md @@ -1,7 +1,6 @@ -# B54 — `typedCustom()` Zod Helper +# `typedCustom()` Zod Helper **Effort:** S -**Origin:** ROADMAP standalone ## Problem diff --git a/BACKLOG/B43.md b/docs/method/backlog/DX_vitest-runtime-excludes.md similarity index 71% rename from BACKLOG/B43.md rename to docs/method/backlog/DX_vitest-runtime-excludes.md index 37ab664e..d848f7f5 100644 --- a/BACKLOG/B43.md +++ b/docs/method/backlog/DX_vitest-runtime-excludes.md @@ -1,7 +1,6 @@ -# B43 — Vitest Explicit Runtime Excludes +# Vitest Explicit Runtime Excludes **Effort:** S -**Origin:** ROADMAP standalone ## Problem diff --git a/BACKLOG/B79.md b/docs/method/backlog/DX_warpgraph-constructor-lifecycle-docs.md similarity index 80% rename from BACKLOG/B79.md rename to docs/method/backlog/DX_warpgraph-constructor-lifecycle-docs.md index ee716468..dddbb52d 100644 --- a/BACKLOG/B79.md +++ b/docs/method/backlog/DX_warpgraph-constructor-lifecycle-docs.md @@ -1,7 +1,6 @@ -# B79 — WarpGraph Constructor Lifecycle Docs +# WarpGraph Constructor Lifecycle Docs **Effort:** M -**Origin:** B-AUDIT-16 (TSK TSK) ## Problem diff --git a/BACKLOG/B76.md b/docs/method/backlog/DX_warpgraph-invisible-api-docs.md similarity index 81% rename 
from BACKLOG/B76.md rename to docs/method/backlog/DX_warpgraph-invisible-api-docs.md index 03d74a6f..461070ee 100644 --- a/BACKLOG/B76.md +++ b/docs/method/backlog/DX_warpgraph-invisible-api-docs.md @@ -1,7 +1,6 @@ -# B76 — WarpGraph Invisible API Surface Docs +# WarpGraph Invisible API Surface Docs **Effort:** M -**Origin:** B-AUDIT-4 (STANK) ## Problem diff --git a/BACKLOG/B123.md b/docs/method/backlog/PERF_benchmark-budgets-ci-gate.md similarity index 78% rename from BACKLOG/B123.md rename to docs/method/backlog/PERF_benchmark-budgets-ci-gate.md index 2ca01be5..dc9f9cef 100644 --- a/BACKLOG/B123.md +++ b/docs/method/backlog/PERF_benchmark-budgets-ci-gate.md @@ -1,7 +1,6 @@ -# B123 — Benchmark Budgets + CI Regression Gate +# Benchmark Budgets + CI Regression Gate **Effort:** L -**Origin:** BACKLOG 2026-02-27 ## Problem diff --git a/BACKLOG/OG-013-out-of-core-materialization-and-streaming-reads.md b/docs/method/backlog/PERF_out-of-core-materialization.md similarity index 97% rename from BACKLOG/OG-013-out-of-core-materialization-and-streaming-reads.md rename to docs/method/backlog/PERF_out-of-core-materialization.md index 44ae1278..974d87ce 100644 --- a/BACKLOG/OG-013-out-of-core-materialization-and-streaming-reads.md +++ b/docs/method/backlog/PERF_out-of-core-materialization.md @@ -1,6 +1,5 @@ -# OG-013 — Out-of-core materialization and streaming reads +# Out-of-core materialization and streaming reads -Status: QUEUED ## Problem diff --git a/BACKLOG/B27.md b/docs/method/backlog/TRUST_keystore-prevalidated-cache.md similarity index 74% rename from BACKLOG/B27.md rename to docs/method/backlog/TRUST_keystore-prevalidated-cache.md index a3fd87fe..f21f4cb5 100644 --- a/BACKLOG/B27.md +++ b/docs/method/backlog/TRUST_keystore-prevalidated-cache.md @@ -1,7 +1,6 @@ -# B27 — `TrustKeyStore` Pre-Validated Key Cache +# `TrustKeyStore` Pre-Validated Key Cache **Effort:** S -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/B7.md 
b/docs/method/backlog/TRUST_property-based-fuzz-test.md similarity index 69% rename from BACKLOG/B7.md rename to docs/method/backlog/TRUST_property-based-fuzz-test.md index 3e96cda2..e5e111cc 100644 --- a/BACKLOG/B7.md +++ b/docs/method/backlog/TRUST_property-based-fuzz-test.md @@ -1,7 +1,6 @@ -# B7 — Doctor: Property-Based Fuzz Test +# Doctor: Property-Based Fuzz Test **Effort:** M -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/B20.md b/docs/method/backlog/TRUST_record-round-trip-snapshot.md similarity index 68% rename from BACKLOG/B20.md rename to docs/method/backlog/TRUST_record-round-trip-snapshot.md index 02811a1c..fd7a1e39 100644 --- a/BACKLOG/B20.md +++ b/docs/method/backlog/TRUST_record-round-trip-snapshot.md @@ -1,7 +1,6 @@ -# B20 — Trust Record Round-Trip Snapshot Test +# Trust Record Round-Trip Snapshot Test **Effort:** S -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/B21.md b/docs/method/backlog/TRUST_schema-discriminated-union.md similarity index 73% rename from BACKLOG/B21.md rename to docs/method/backlog/TRUST_schema-discriminated-union.md index 8aa318a8..ba5475de 100644 --- a/BACKLOG/B21.md +++ b/docs/method/backlog/TRUST_schema-discriminated-union.md @@ -1,7 +1,6 @@ -# B21 — Trust Schema Discriminated Union +# Trust Schema Discriminated Union **Effort:** S -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/B16.md b/docs/method/backlog/TRUST_unsigned-record-edge-cases.md similarity index 66% rename from BACKLOG/B16.md rename to docs/method/backlog/TRUST_unsigned-record-edge-cases.md index a69b0538..b4fb8b8c 100644 --- a/BACKLOG/B16.md +++ b/docs/method/backlog/TRUST_unsigned-record-edge-cases.md @@ -1,7 +1,6 @@ -# B16 — `unsignedRecordForId` Edge-Case Tests +# `unsignedRecordForId` Edge-Case Tests **Effort:** S -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/B104.md b/docs/method/backlog/VIZ_mermaid-diagram-content-checklist.md similarity index 80% rename from BACKLOG/B104.md rename to 
docs/method/backlog/VIZ_mermaid-diagram-content-checklist.md index 40e80305..4edb5f79 100644 --- a/BACKLOG/B104.md +++ b/docs/method/backlog/VIZ_mermaid-diagram-content-checklist.md @@ -1,7 +1,6 @@ -# B104 — Mermaid Diagram Content Checklist +# Mermaid Diagram Content Checklist **Effort:** XS -**Origin:** B-DIAG-1 ## Problem diff --git a/BACKLOG/B101.md b/docs/method/backlog/VIZ_mermaid-invisible-link-fragility.md similarity index 75% rename from BACKLOG/B101.md rename to docs/method/backlog/VIZ_mermaid-invisible-link-fragility.md index 45c9cddd..4e794a10 100644 --- a/BACKLOG/B101.md +++ b/docs/method/backlog/VIZ_mermaid-invisible-link-fragility.md @@ -1,7 +1,6 @@ -# B101 — Mermaid `~~~` Invisible-Link Fragility +# Mermaid `~~~` Invisible-Link Fragility **Effort:** XS -**Origin:** B-DIAG-3 ## Problem diff --git a/BACKLOG/B88.md b/docs/method/backlog/VIZ_mermaid-rendering-smoke-test.md similarity index 81% rename from BACKLOG/B88.md rename to docs/method/backlog/VIZ_mermaid-rendering-smoke-test.md index a1accb6d..a737072a 100644 --- a/BACKLOG/B88.md +++ b/docs/method/backlog/VIZ_mermaid-rendering-smoke-test.md @@ -1,7 +1,6 @@ -# B88 — Mermaid Rendering Smoke Test +# Mermaid Rendering Smoke Test **Effort:** S -**Origin:** B-DIAG-2 ## Problem diff --git a/BACKLOG/B171.md b/docs/method/backlog/asap/DX_agent-code-audit.md similarity index 84% rename from BACKLOG/B171.md rename to docs/method/backlog/asap/DX_agent-code-audit.md index 2ee97a97..92cf5414 100644 --- a/BACKLOG/B171.md +++ b/docs/method/backlog/asap/DX_agent-code-audit.md @@ -1,7 +1,6 @@ -# B171 — TSC Campaign Agent-Authored Code Audit +# TSC Campaign Agent-Authored Code Audit **Effort:** L -**Origin:** TSC zero campaign drift (PR #73) ## Problem diff --git a/BACKLOG/B181.md b/docs/method/backlog/asap/DX_max-file-size-policy.md similarity index 96% rename from BACKLOG/B181.md rename to docs/method/backlog/asap/DX_max-file-size-policy.md index 2e77784e..6c5c3bce 100644 --- a/BACKLOG/B181.md +++ 
b/docs/method/backlog/asap/DX_max-file-size-policy.md @@ -1,8 +1,6 @@ -# B181 — Enforce Max File Size + One-Thing-Per-File Policy +# Enforce Max File Size + One-Thing-Per-File Policy **Effort:** L -**Origin:** User direction (2026-04-01) -**Priority:** HIGH ## Problem diff --git a/BACKLOG/B172.md b/docs/method/backlog/asap/DX_restore-dot-notation.md similarity index 78% rename from BACKLOG/B172.md rename to docs/method/backlog/asap/DX_restore-dot-notation.md index f161cad8..52d4080f 100644 --- a/BACKLOG/B172.md +++ b/docs/method/backlog/asap/DX_restore-dot-notation.md @@ -1,7 +1,6 @@ -# B172 — Restore `dot-notation` via `@typescript-eslint/dot-notation` +# Restore `dot-notation` via `@typescript-eslint/dot-notation` **Effort:** S -**Origin:** TSC zero campaign drift (PR #73) ## Problem diff --git a/BACKLOG/B174.md b/docs/method/backlog/asap/DX_trailer-codec-dts.md similarity index 79% rename from BACKLOG/B174.md rename to docs/method/backlog/asap/DX_trailer-codec-dts.md index ebdf9242..11101ba4 100644 --- a/BACKLOG/B174.md +++ b/docs/method/backlog/asap/DX_trailer-codec-dts.md @@ -1,7 +1,6 @@ -# B174 — `@git-stunts/trailer-codec` Type Declarations +# `@git-stunts/trailer-codec` Type Declarations **Effort:** M -**Origin:** TSC zero campaign drift (PR #73) ## Problem diff --git a/BACKLOG/B173.md b/docs/method/backlog/asap/PROTO_effectsink-breaking-change.md similarity index 84% rename from BACKLOG/B173.md rename to docs/method/backlog/asap/PROTO_effectsink-breaking-change.md index 1231c7eb..1795f569 100644 --- a/BACKLOG/B173.md +++ b/docs/method/backlog/asap/PROTO_effectsink-breaking-change.md @@ -1,7 +1,6 @@ -# B173 — EffectSinkPort Breaking Change Hygiene +# EffectSinkPort Breaking Change Hygiene **Effort:** S -**Origin:** TSC zero campaign drift (PR #73) ## Problem diff --git a/BACKLOG/B177.md b/docs/method/backlog/asap/PROTO_warpkernel-port-cleanup.md similarity index 90% rename from BACKLOG/B177.md rename to 
docs/method/backlog/asap/PROTO_warpkernel-port-cleanup.md index 94404e32..4569aa22 100644 --- a/BACKLOG/B177.md +++ b/docs/method/backlog/asap/PROTO_warpkernel-port-cleanup.md @@ -1,8 +1,6 @@ -# B177 — Cohesive WarpKernelPort (Persistence Union Type Cleanup) +# Cohesive WarpKernelPort (Persistence Union Type Cleanup) **Effort:** L -**Origin:** Code quality audit (AUDIT_CODE_QUALITY.md), user escalation (2026-04-01) -**Priority:** HIGH ## Problem diff --git a/BACKLOG/B176.md b/docs/method/backlog/asap/PROTO_warpruntime-god-class.md similarity index 92% rename from BACKLOG/B176.md rename to docs/method/backlog/asap/PROTO_warpruntime-god-class.md index 0c824d5f..8df9ba25 100644 --- a/BACKLOG/B176.md +++ b/docs/method/backlog/asap/PROTO_warpruntime-god-class.md @@ -1,8 +1,6 @@ -# B176 — WarpRuntime + warp/ Methods God Class Decomposition +# WarpRuntime + warp/ Methods God Class Decomposition **Effort:** XL -**Origin:** Code quality audit (AUDIT_CODE_QUALITY.md, AUDIT_THE_JANK.md), user escalation (2026-04-01) -**Priority:** HIGH ## Problem diff --git a/BACKLOG/B179.md b/docs/method/backlog/asap/TRUST_sync-auth-ed25519.md similarity index 93% rename from BACKLOG/B179.md rename to docs/method/backlog/asap/TRUST_sync-auth-ed25519.md index 77ce1775..17fd5066 100644 --- a/BACKLOG/B179.md +++ b/docs/method/backlog/asap/TRUST_sync-auth-ed25519.md @@ -1,8 +1,6 @@ -# B179 — Sync Auth: Migrate from Symmetric HMAC to Ed25519 Asymmetric Signatures +# Sync Auth: Migrate from Symmetric HMAC to Ed25519 Asymmetric Signatures **Effort:** L -**Origin:** Security audit, user escalation (2026-04-01) -**Priority:** HIGH ## Problem diff --git a/BACKLOG/B178.md b/docs/method/backlog/asap/TUI_cli-dirname-fragility.md similarity index 91% rename from BACKLOG/B178.md rename to docs/method/backlog/asap/TUI_cli-dirname-fragility.md index 9a0093b8..af6db976 100644 --- a/BACKLOG/B178.md +++ b/docs/method/backlog/asap/TUI_cli-dirname-fragility.md @@ -1,8 +1,6 @@ -# B178 — CLI __dirname Path 
Traversal Fragility +# CLI __dirname Path Traversal Fragility **Effort:** S -**Origin:** Code quality audit, user escalation (2026-04-01) -**Priority:** HIGH ## Problem diff --git a/docs/method/backlog/bad-code/DX_exact-optional-conditional-spread.md b/docs/method/backlog/bad-code/DX_exact-optional-conditional-spread.md new file mode 100644 index 00000000..c570c0d8 --- /dev/null +++ b/docs/method/backlog/bad-code/DX_exact-optional-conditional-spread.md @@ -0,0 +1,15 @@ +# exactOptionalPropertyTypes conditional spread boilerplate + +**Effort:** M + +## Problem + +`exactOptionalPropertyTypes: true` means you can't pass +`{ key: undefined }` to a function expecting `{ key?: T }`. The fix +is conditional spread: `...(x !== undefined ? { key: x } : {})`. +This is correct but verbose. ~30 call sites across `WarpRuntime.js`, +`SyncController.js`, `WormholeService.js`, `StrandService.js`, and +others. + +A shared `omitUndefined()` utility could DRY it up, but premature +until the pattern stabilizes. diff --git a/docs/method/backlog/bad-code/DX_trailer-codec-type-poison.md b/docs/method/backlog/bad-code/DX_trailer-codec-type-poison.md new file mode 100644 index 00000000..c9c4201e --- /dev/null +++ b/docs/method/backlog/bad-code/DX_trailer-codec-type-poison.md @@ -0,0 +1,18 @@ +# @git-stunts/trailer-codec type poison at the boundary + +**Effort:** M + +## Problem + +`MessageCodecInternal.js` `getCodec()` returns an untyped +`TrailerCodec` from `@git-stunts/trailer-codec` (no `.d.ts`). Every +consumer must cast through `unknown` intermediary. Six files carry +this workaround: `AnchorMessageCodec`, `AuditMessageCodec`, +`CheckpointMessageCodec`, `PatchMessageCodec`, `SyncPayloadSchema`, +and any future codec consumer. + +## Fix + +Add `trailer-codec/index.d.ts` upstream so the return type flows +naturally. This is the same root cause as +`DX_trailer-codec-dts.md` in asap/ — fixing that fixes this. 
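The conditional-spread pattern described above generalizes into the `omitUndefined()` utility the note floats. A minimal sketch, assuming shallow plain objects; this helper does not exist in the repo yet:

```javascript
// Hypothetical omitUndefined() helper. Copies only the keys whose values
// are defined, so the result is safe to pass to a signature typed
// `{ key?: T }` under exactOptionalPropertyTypes.
function omitUndefined(obj) {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    if (value !== undefined) out[key] = value;
  }
  return out;
}

// Usage: replaces the conditional-spread boilerplate at each call site.
// Before: { ...(x !== undefined ? { timeout: x } : {}) }
// After:  omitUndefined({ timeout: x })
const opts = omitUndefined({ timeout: 500, signal: undefined });
// opts is { timeout: 500 } — no `signal` key at all, not `signal: undefined`
```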
diff --git a/docs/method/backlog/bad-code/PERF_toposort-full-adjacency.md b/docs/method/backlog/bad-code/PERF_toposort-full-adjacency.md new file mode 100644 index 00000000..e385ca7f --- /dev/null +++ b/docs/method/backlog/bad-code/PERF_toposort-full-adjacency.md @@ -0,0 +1,23 @@ +# topologicalSort always materializes full adjacency + +**Effort:** M + +## Problem + +`GraphTraversal.js` `topologicalSort()` (~line 693) unconditionally +builds `adjList: Map` AND +`neighborEdgeMap: Map` for every reachable +node. Both structures hold the full edge set in memory (O(V+E)). The +`_returnAdjList` flag only controls whether `neighborEdgeMap` is +*returned* — it's always *built*. + +For callers that only need the sorted order, this is wasted memory. +Root cause behind `levels()` and `transitiveReduction()` inheriting +full-graph materialization from their `topologicalSort()` call. + +## Possible fix + +Split topo sort into two modes: lightweight (in-degree counting only, +no adj list caching) and current mode (full caching for callers that +need it). Or: make the Kahn phase re-fetch from provider, relying on +LRU neighbor cache for amortization. diff --git a/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md b/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md new file mode 100644 index 00000000..863b3d86 --- /dev/null +++ b/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md @@ -0,0 +1,16 @@ +# transitiveReduction builds adjacency list redundantly + +**Effort:** S + +## Problem + +After getting `_neighborEdgeMap` from topo sort (which already has +full neighbor data), `transitiveReduction()` builds a *second* +`adjList: Map` by extracting just the neighborIds. +Two representations of the same edge set sit in memory +simultaneously. + +## Fix + +Use `_neighborEdgeMap` directly in the BFS, accessing `.neighborId` +inline instead of pre-extracting. 
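The lightweight mode floated in the possible fix could be a plain Kahn pass that keeps only an in-degree counter and re-fetches neighbors from the caller on demand. A sketch under that assumption; `getNeighbors` is a hypothetical callback standing in for the provider fetch, not the real `GraphTraversal` API:

```javascript
// Kahn's algorithm with in-degree counting only: no adjList or
// neighborEdgeMap is built or cached, so memory stays O(V) plus
// whatever the caller's neighbor fetch transiently allocates.
function topoSortLight(nodeIds, getNeighbors) {
  const inDegree = new Map(nodeIds.map((id) => [id, 0]));
  for (const id of nodeIds) {
    for (const n of getNeighbors(id)) {
      inDegree.set(n, (inDegree.get(n) ?? 0) + 1);
    }
  }
  const queue = nodeIds.filter((id) => inDegree.get(id) === 0);
  const order = [];
  while (queue.length > 0) {
    const id = queue.shift();
    order.push(id);
    // Re-fetch neighbors instead of consulting a cached adjacency list;
    // an LRU neighbor cache in the provider amortizes repeat fetches.
    for (const n of getNeighbors(id)) {
      const d = inDegree.get(n) - 1;
      inDegree.set(n, d);
      if (d === 0) queue.push(n);
    }
  }
  if (order.length !== nodeIds.length) throw new Error('cycle detected');
  return order;
}
```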
diff --git a/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md b/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md new file mode 100644 index 00000000..16e6fce0 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md @@ -0,0 +1,10 @@ +# AuditReceiptService uses raw Error (18 occurrences) + +**Effort:** S + +## Problem + +All validation and CAS errors in `AuditReceiptService.js` throw +plain `Error` instead of a domain error class. Should use a +dedicated `AuditError` (which doesn't exist yet) or `PatchError` +for validation failures and `PersistenceError` for CAS conflicts. diff --git a/docs/method/backlog/bad-code/PROTO_patchbuilder-12-param-constructor.md b/docs/method/backlog/bad-code/PROTO_patchbuilder-12-param-constructor.md new file mode 100644 index 00000000..ee5174e2 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_patchbuilder-12-param-constructor.md @@ -0,0 +1,18 @@ +# PatchBuilderV2 12-parameter constructor + +**Effort:** M + +## Problem + +`PatchBuilderV2` constructor accepts 12+ parameters including +`persistence`, `graphName`, `writerId`, `lamport`, `versionVector`, +`getCurrentState`, `expectedParentSha`, `targetRefPath`, +`onCommitSuccess`, `onDeleteWithData`, `codec`, `logger`, +`blobStorage`, `patchBlobStorage`. This is a configuration object, +not dependency injection — most params are runtime state, not +services. + +## Possible fix + +Split into a `PatchBuilderConfig` value object for static config and +pass mutable state separately. diff --git a/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md b/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md new file mode 100644 index 00000000..29ab944a --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md @@ -0,0 +1,10 @@ +# RECEIPT_OP_TYPE mapping redundant with OpStrategy + +**Effort:** XS + +## Problem + +`JoinReducer.js` `RECEIPT_OP_TYPE` maps internal names to receipt +names (e.g. 
`NodeRemove` -> `NodeTombstone`). With the OpStrategy +registry, this could be a `receiptName` property on each strategy +object. Low priority — cosmetic. diff --git a/docs/method/backlog/bad-code/PROTO_strand-service-god-object.md b/docs/method/backlog/bad-code/PROTO_strand-service-god-object.md new file mode 100644 index 00000000..6f5898dc --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_strand-service-god-object.md @@ -0,0 +1,21 @@ +# StrandService is a god object (2048 LOC) + +**Effort:** L + +## Problem + +StrandService handles: strand CRUD, strand materialization, strand +patching, intent queuing/dequeuing, strand transfer planning, strand +braiding, strand overlays, strand comparison, and descriptor +serialization. It owns 40+ methods across ~2048 lines. + +## Decomposition candidates + +- `StrandMaterializationService` — materialize/compare/snapshot +- `StrandIntentService` — intent queue (queue, dequeue, tick, drain) +- `StrandBraidService` — braid overlay pinning and resolution +- `StrandTransferService` — transfer plan computation +- `StrandDescriptorCodec` — serialization/deserialization + +Each sub-service takes the same `graph` + `persistence` deps. +StrandService becomes a thin facade. diff --git a/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md b/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md new file mode 100644 index 00000000..0f6c5632 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md @@ -0,0 +1,9 @@ +# SyncProtocol uses raw Error with manual code property + +**Effort:** XS + +## Problem + +`SyncProtocol.js` (~line 233) constructs `new Error()` then manually +casts to `Error & { code: string }`. Should use `SyncError` from +domain errors. 
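For reference, what a domain error buys over the manual `Error & { code }` cast is roughly the shape below. `SyncError` here is illustrative only; the real class in `src/domain/errors/` may carry a different constructor signature:

```javascript
// Illustrative domain error: a stable name, a typed code, and working
// instanceof checks, instead of constructing a raw Error and manually
// asserting a `code` property onto it.
class SyncError extends Error {
  constructor(message, code) {
    super(message);
    this.name = 'SyncError';
    this.code = code;
  }
}

try {
  throw new SyncError('payload digest mismatch', 'E_SYNC_DIGEST');
} catch (err) {
  // Callers branch on a typed code instead of sniffing message text.
  console.log(err instanceof SyncError, err.code);
}
```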
diff --git a/docs/method/backlog/bad-code/PROTO_warpserve-domain-infra-blur.md b/docs/method/backlog/bad-code/PROTO_warpserve-domain-infra-blur.md new file mode 100644 index 00000000..f8b8fdcb --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_warpserve-domain-infra-blur.md @@ -0,0 +1,16 @@ +# WarpServeService domain/infra boundary blur + +**Effort:** S + +## Problem + +`WarpServeService` lives in `src/domain/services/` but requires a +`WebSocketServerPort` — a port whose only implementations are +infrastructure adapters. The service orchestrates WebSocket protocol +handling which is domain logic, but its constructor requires I/O +infrastructure to function. This blurs the hexagonal boundary and +makes unit testing harder. + +Not a bug. Acceptable today. If more I/O-dependent services emerge, +consider an "application services" layer between domain and +infrastructure. diff --git a/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md b/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md new file mode 100644 index 00000000..401f3c6e --- /dev/null +++ b/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md @@ -0,0 +1,12 @@ +# Cross-path equivalence as a general testing pattern + +The JoinReducer `pathEquivalence.test.js` applies the same input +through N code paths and asserts identical output. This generalizes: + +- Serialization round-trips +- Checkpoint save/restore vs fresh materialize +- Sync request/response vs local materialize +- Incremental vs full reduce + +Could be a test DSL: +`assertPathEquivalence(input, [pathA, pathB, pathC], comparator)` diff --git a/docs/method/backlog/cool-ideas/DX_tsc-autofix-tool.md b/docs/method/backlog/cool-ideas/DX_tsc-autofix-tool.md new file mode 100644 index 00000000..2c31256f --- /dev/null +++ b/docs/method/backlog/cool-ideas/DX_tsc-autofix-tool.md @@ -0,0 +1,13 @@ +# Mechanical tsc autofix tool + +The TS4111 fixer script from the TSC zero campaign generalized well. 
+A `tsc-autofix` CLI that reads `tsc --noEmit` stderr, classifies +errors by fixability, and applies mechanical fixes: + +- TS4111 (bracket access): `.prop` -> `['prop']` +- TS6133 (unused vars/imports): delete the declaration +- TS2464 (computed property): wrap in cast +- TS2532/TS18048 (possibly undefined): suggest `?? defaultValue` + +Non-mechanical errors (TS2345, TS2322) left as report. Could live in +`scripts/` or become a `@git-stunts` tool. diff --git a/docs/method/backlog/cool-ideas/PERF_encrypted-stores-fixed-chunking.md b/docs/method/backlog/cool-ideas/PERF_encrypted-stores-fixed-chunking.md new file mode 100644 index 00000000..511950d9 --- /dev/null +++ b/docs/method/backlog/cool-ideas/PERF_encrypted-stores-fixed-chunking.md @@ -0,0 +1,8 @@ +# Switch encrypted stores to fixed chunking + +Both `CasSeekCacheAdapter` and `CasBlobAdapter` use +`{ strategy: 'cdc' }` unconditionally. Ciphertext is pseudorandom +so CDC boundaries provide no dedup benefit. The adapter could check +`_encryptionKey` at init and pick `fixed` vs `cdc` accordingly — +suppressing the git-cas runtime warning and saving rolling hash +overhead. 
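The classification step of such a tool can be sketched against the standard `tsc` diagnostic line format. The fixability buckets below are an assumption for illustration, not an existing script:

```javascript
// Hypothetical tsc-autofix classifier: parse `tsc --noEmit` output and
// bucket diagnostics by error code into mechanically fixable vs
// report-only. The line shape matched is `file(line,col): error TSnnnn: msg`.
const MECHANICAL = new Set(['TS4111', 'TS6133', 'TS2464']);

function classifyTscOutput(output) {
  const diagnostic = /^(.+)\((\d+),(\d+)\): error (TS\d+): (.*)$/;
  const mechanical = [];
  const report = [];
  for (const line of output.split('\n')) {
    const m = diagnostic.exec(line);
    if (!m) continue;
    const entry = { file: m[1], line: Number(m[2]), code: m[4], message: m[5] };
    (MECHANICAL.has(entry.code) ? mechanical : report).push(entry);
  }
  return { mechanical, report };
}

const sample = [
  "src/a.ts(3,5): error TS4111: Property 'x' comes from an index signature.",
  "src/b.ts(9,1): error TS2345: Argument of type 'string' is not assignable.",
].join('\n');
const buckets = classifyTscOutput(sample);
// buckets.mechanical holds the TS4111 entry; buckets.report holds TS2345
```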
diff --git a/BACKLOG/B170.md b/docs/method/backlog/cool-ideas/PERF_native-vs-wasm-roaring-benchmark.md similarity index 81% rename from BACKLOG/B170.md rename to docs/method/backlog/cool-ideas/PERF_native-vs-wasm-roaring-benchmark.md index 8ee9f802..8ac9e888 100644 --- a/BACKLOG/B170.md +++ b/docs/method/backlog/cool-ideas/PERF_native-vs-wasm-roaring-benchmark.md @@ -1,7 +1,6 @@ -# B170 — Native vs WASM Roaring Benchmark Pack +# Native vs WASM Roaring Benchmark Pack **Effort:** M -**Origin:** ROADMAP standalone (P4 Large-Graph Performance) ## Problem diff --git a/docs/method/backlog/cool-ideas/PERF_restore-buffer-guard.md b/docs/method/backlog/cool-ideas/PERF_restore-buffer-guard.md new file mode 100644 index 00000000..aea76070 --- /dev/null +++ b/docs/method/backlog/cool-ideas/PERF_restore-buffer-guard.md @@ -0,0 +1,6 @@ +# Restore buffer guard for seek cache + blob adapter + +git-cas 5.3.0 added `maxRestoreBufferSize` (default 512 MiB). +Neither `CasSeekCacheAdapter` nor `CasBlobAdapter` passes this +option. A tighter limit (64 MiB for blobs, 32 MiB for seek cache) +would fail fast instead of OOM. diff --git a/docs/method/backlog/cool-ideas/PERF_streaming-graph-traversal.md b/docs/method/backlog/cool-ideas/PERF_streaming-graph-traversal.md new file mode 100644 index 00000000..7661b570 --- /dev/null +++ b/docs/method/backlog/cool-ideas/PERF_streaming-graph-traversal.md @@ -0,0 +1,16 @@ +# Streaming GraphTraversal — async generators + +Every traversal algorithm could offer a streaming variant: +`bfs()` -> `bfs*()`, `dfs()` -> `dfs*()`, etc. The current API +collects results into arrays, forcing O(V) memory even when the +caller only needs the first match, a count, or a pipeline. + +`AsyncGenerator` return type lets callers break early, +compose with other iterables, or pipe into backpressure-aware sinks. +The array-returning methods become sugar. + +The tricky part is stats: can't return `{ nodes, stats }` from a +generator. 
Options: stats callback in hooks, generator `.return()` +value, or separate `statsForLastRun()` accessor. + +Start with `transitiveClosure` as proof-of-concept, then generalize. diff --git a/docs/method/backlog/cool-ideas/PROTO_encrypted-trailer-rename.md b/docs/method/backlog/cool-ideas/PROTO_encrypted-trailer-rename.md new file mode 100644 index 00000000..17e740b9 --- /dev/null +++ b/docs/method/backlog/cool-ideas/PROTO_encrypted-trailer-rename.md @@ -0,0 +1,7 @@ +# Rename `encrypted` trailer to `eg-encrypted` + +The Git commit trailer key `encrypted` should be namespaced to avoid +collisions. But renaming is a wire format change — existing commits +use `encrypted`. Needs the same ADR + migration approach as edge +property ops: keep reading old key, start writing new one, eventually +stop reading old. Breaking change, major version bump. diff --git a/docs/method/backlog/cool-ideas/PROTO_writer-isolated-bisect.md b/docs/method/backlog/cool-ideas/PROTO_writer-isolated-bisect.md new file mode 100644 index 00000000..a7f3b07a --- /dev/null +++ b/docs/method/backlog/cool-ideas/PROTO_writer-isolated-bisect.md @@ -0,0 +1,10 @@ +# Writer-isolated bisect mode + +A `--isolated` flag on bisect that materializes only the target +writer's patches up to a given point, ignoring other writers +entirely. Useful for debugging single-writer regressions without +cross-writer interference. Trade-off: faster materialization but +may miss interaction bugs. + +If pursued: add `materializeForWriter(writerId, ceiling)` to +WarpGraph, wire `--isolated` flag in bisect CLI. diff --git a/docs/method/backlog/cool-ideas/TRUST_per-writer-kek-wrapping.md b/docs/method/backlog/cool-ideas/TRUST_per-writer-kek-wrapping.md new file mode 100644 index 00000000..da0be1b2 --- /dev/null +++ b/docs/method/backlog/cool-ideas/TRUST_per-writer-kek-wrapping.md @@ -0,0 +1,8 @@ +# Per-writer key envelope encryption (KEK wrapping) + +Each writer gets their own DEK wrapped by a shared KEK. 
git-cas +already supports envelope encryption — the DEK/KEK split could be +wired at the `CasBlobAdapter` level, with writer ID selecting which +wrapped DEK to use. Lets you revoke a single writer's access by +re-wrapping without re-encrypting all data. Pairs with +`@git-stunts/vault` for KEK storage. diff --git a/docs/method/backlog/cool-ideas/VIZ_graph-diff-transitive-reduction.md b/docs/method/backlog/cool-ideas/VIZ_graph-diff-transitive-reduction.md new file mode 100644 index 00000000..9afbf07e --- /dev/null +++ b/docs/method/backlog/cool-ideas/VIZ_graph-diff-transitive-reduction.md @@ -0,0 +1,9 @@ +# Graph diff via transitive reduction comparison + +Compute `transitiveReduction(graphA)` and +`transitiveReduction(graphB)`, diff those minimal edge sets. Much +more compact structural summary than raw edge-set diff — strips +implied edges, shows only load-bearing structural changes. + +Could feed into time-travel delta engine as +`warp diff --mode=structural`. diff --git a/BACKLOG/B155.md b/docs/method/backlog/cool-ideas/VIZ_levels-lightweight-layout.md similarity index 81% rename from BACKLOG/B155.md rename to docs/method/backlog/cool-ideas/VIZ_levels-lightweight-layout.md index fea0cbba..bd935124 100644 --- a/BACKLOG/B155.md +++ b/docs/method/backlog/cool-ideas/VIZ_levels-lightweight-layout.md @@ -1,7 +1,6 @@ -# B155 — `levels()` as Lightweight `--view` Layout +# `levels()` as Lightweight `--view` Layout **Effort:** M -**Origin:** ROADMAP standalone (P5 Features) ## Problem diff --git a/BACKLOG/B156.md b/docs/method/backlog/cool-ideas/VIZ_structural-diff-transitive-reduction.md similarity index 77% rename from BACKLOG/B156.md rename to docs/method/backlog/cool-ideas/VIZ_structural-diff-transitive-reduction.md index e5c35bd1..56a03536 100644 --- a/BACKLOG/B156.md +++ b/docs/method/backlog/cool-ideas/VIZ_structural-diff-transitive-reduction.md @@ -1,7 +1,6 @@ -# B156 — Structural Diff via Transitive Reduction +# Structural Diff via Transitive Reduction **Effort:** 
L -**Origin:** ROADMAP standalone (P5 Features) ## Problem diff --git a/BACKLOG/B4.md b/docs/method/backlog/cool-ideas/VIZ_warp-ui-visualizer.md similarity index 81% rename from BACKLOG/B4.md rename to docs/method/backlog/cool-ideas/VIZ_warp-ui-visualizer.md index 3862a92b..3f6c2f12 100644 --- a/BACKLOG/B4.md +++ b/docs/method/backlog/cool-ideas/VIZ_warp-ui-visualizer.md @@ -1,7 +1,6 @@ -# B4 — WARP UI Visualizer +# WARP UI Visualizer **Effort:** L -**Origin:** ROADMAP deferred ## Problem diff --git a/BACKLOG/OG-017-modular-type-declarations.md b/docs/method/backlog/up-next/DX_modular-type-declarations.md similarity index 97% rename from BACKLOG/OG-017-modular-type-declarations.md rename to docs/method/backlog/up-next/DX_modular-type-declarations.md index b602c5df..48d32e8d 100644 --- a/BACKLOG/OG-017-modular-type-declarations.md +++ b/docs/method/backlog/up-next/DX_modular-type-declarations.md @@ -1,6 +1,5 @@ -# OG-017 — Break up the `index.d.ts` monolith +# Break up the `index.d.ts` monolith -Status: QUEUED Legend: Observer Geometry diff --git a/BACKLOG/B175.md b/docs/method/backlog/up-next/DX_observer-first-guide.md similarity index 95% rename from BACKLOG/B175.md rename to docs/method/backlog/up-next/DX_observer-first-guide.md index a55fb526..da7dd13c 100644 --- a/BACKLOG/B175.md +++ b/docs/method/backlog/up-next/DX_observer-first-guide.md @@ -1,7 +1,6 @@ -# B175 — Guide: Observer-First Client Pattern +# Guide: Observer-First Client Pattern **Effort:** M -**Origin:** User direction (2026-04-01) ## Problem diff --git a/BACKLOG/B152.md b/docs/method/backlog/up-next/PERF_async-generator-traversal.md similarity index 77% rename from BACKLOG/B152.md rename to docs/method/backlog/up-next/PERF_async-generator-traversal.md index 9aa191f3..5dc11cdc 100644 --- a/BACKLOG/B152.md +++ b/docs/method/backlog/up-next/PERF_async-generator-traversal.md @@ -1,7 +1,6 @@ -# B152 — Async Generator Traversal API +# Async Generator Traversal API **Effort:** L -**Origin:** ROADMAP 
standalone (P4 Large-Graph Performance)
 
 ## Problem
diff --git a/BACKLOG/OG-009-playback-head-alignment.md b/docs/method/backlog/up-next/PROTO_playback-head-alignment.md
similarity index 79%
rename from BACKLOG/OG-009-playback-head-alignment.md
rename to docs/method/backlog/up-next/PROTO_playback-head-alignment.md
index d425e03a..881eace8 100644
--- a/BACKLOG/OG-009-playback-head-alignment.md
+++ b/docs/method/backlog/up-next/PROTO_playback-head-alignment.md
@@ -1,6 +1,5 @@
-# OG-009 — Align Playback-Head And TTD Consumers After Read Nouns Stabilize
+# Align Playback-Head And TTD Consumers After Read Nouns Stabilize
 
-Status: QUEUED
 
 ## Problem
diff --git a/BACKLOG/B116.md b/docs/method/backlog/up-next/PROTO_wire-format-migration-edgepropset.md
similarity index 90%
rename from BACKLOG/B116.md
rename to docs/method/backlog/up-next/PROTO_wire-format-migration-edgepropset.md
index ef71d2ae..6ec9e081 100644
--- a/BACKLOG/B116.md
+++ b/docs/method/backlog/up-next/PROTO_wire-format-migration-edgepropset.md
@@ -1,7 +1,6 @@
-# B116 — Persisted Wire-Format Migration (ADR 2) — EdgePropSet
+# Persisted Wire-Format Migration (ADR 2) — EdgePropSet
 
 **Effort:** XL
-**Origin:** M13.T3 / ADR 2
 
 ## Problem
diff --git a/docs/method/process.md b/docs/method/process.md
new file mode 100644
index 00000000..8221ed22
--- /dev/null
+++ b/docs/method/process.md
@@ -0,0 +1,41 @@
+# How cycles run
+
+See [METHOD.md](/METHOD.md) for the full philosophy. This file is
+the quick-reference for operating a cycle.
+
+## Starting a cycle
+
+1. Pick work from a lane (`asap/` first, then `up-next/`).
+2. Create `docs/design/<nnnn-slug>/` with the next sequential
+   number.
+3. Move the backlog file into the cycle directory as the design doc.
+   Flesh it out: sponsor human, sponsor agent, hill, playback
+   questions, non-goals.
+4. You are now committed.
+
+## During a cycle
+
+- RED: write failing tests from playback questions.
+- GREEN: make them pass.
+- Do not reorganize the backlog mid-cycle.
+
+## Ending a cycle
+
+1. **Playback** — produce a witness artifact for each playback
+   question. Agent answers agent questions. Human answers human
+   questions. Write it down.
+2. **PR** — open, review, merge to main.
+3. **Retro** — write `docs/method/retro/<nnnn-slug>/`.
+   - Drift check (mandatory).
+   - New debt to `bad-code/`.
+   - Cool ideas to `cool-ideas/`.
+   - Backlog maintenance: process inbox, re-prioritize, merge
+     duplicates, kill the dead.
+4. **Release** — only when externally meaningful behavior changed.
+   See [release.md](release.md).
+
+## Outcomes
+
+- **Hill met** — merge, close.
+- **Partial** — merge what is honest. Retro explains the gap.
+- **Not met** — write the retro anyway. Every cycle ends with one.
diff --git a/docs/release.md b/docs/method/release.md
similarity index 100%
rename from docs/release.md
rename to docs/method/release.md
diff --git a/docs/method/retro/0001-method-bootstrap/method-bootstrap.md b/docs/method/retro/0001-method-bootstrap/method-bootstrap.md
new file mode 100644
index 00000000..7ecb0440
--- /dev/null
+++ b/docs/method/retro/0001-method-bootstrap/method-bootstrap.md
@@ -0,0 +1,55 @@
+# Retrospective: 0001-method-bootstrap
+
+**Date:** 2026-04-01
+**Type:** Design
+**Outcome:** Hill met
+
+## What happened
+
+Introduced The Method as the development process framework for
+git-warp. Created `METHOD.md` signpost, stood up the full directory
+structure (`docs/method/backlog/` with 5 lane directories, `legends/`,
+`retro/`, `graveyard/`), and migrated all existing backlog items.
+
+49 B-number and OG items migrated from `BACKLOG/` to named files in
+appropriate lanes. 10 tech debt entries from `.claude/bad_code.md`
+became individual files in `bad-code/`. 13 cool ideas from
+`.claude/cool_ideas.md` became individual files in `cool-ideas/`.
+B-number headers stripped from all migrated files.
+
+## Drift check
+
+- `docs/release.md` moved to `docs/method/release.md` — CLAUDE.md
+  reference updated.
+- `docs/ROADMAP.md` still references old structure — updated + migration notice. +- `.claude/bad_code.md` and `.claude/cool_ideas.md` replaced with + forwarding notices. +- No code changes. No test impact. No drift. + +## Playback + +### Agent + +- Can I find work by `ls` on a lane? **YES** — each lane is a + directory with descriptive filenames. +- Can I classify a new idea without asking? **YES** — lane + definitions are clear in METHOD.md. +- Do any B-numbers remain? **NO** — all stripped from headers and + filenames. Git history preserves provenance. + +### Human + +- Does `ls docs/method/backlog/asap/` show what matters? **YES** — + 9 high-priority items with legend prefixes. +- Can I understand items from filenames? **YES** — + `PROTO_strand-service-god-object.md` beats `B176.md`. +- Is BACKLOG/ gone? **YES** — `git rm -r BACKLOG/` done. + +## New debt + +None introduced. + +## Cool ideas + +None surfaced. From 73619db7a8bc6e8a0ca0643b970cc276c881c274 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 19:28:23 -0700 Subject: [PATCH 12/73] =?UTF-8?q?chore:=20declare=20CLEAN=5FCODE=20legend?= =?UTF-8?q?=20=E2=80=94=20structural=20quality=20domain?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Covers god object decomposition, raw error purge, type boundary cleanup, constructor hygiene, file size policy, and lint ratchets. Legend code: CC. All bad-code/ items and several asap/ items fall under this legend. --- docs/method/legends/CLEAN_CODE.md | 93 +++++++++++++++++++++++++++++++ 1 file changed, 93 insertions(+) create mode 100644 docs/method/legends/CLEAN_CODE.md diff --git a/docs/method/legends/CLEAN_CODE.md b/docs/method/legends/CLEAN_CODE.md new file mode 100644 index 00000000..67e971ca --- /dev/null +++ b/docs/method/legends/CLEAN_CODE.md @@ -0,0 +1,93 @@ +# CLEAN_CODE + +Un-shittifying the codebase. Systematically. 
+ +## What it covers + +Structural quality work that makes the code honest: god object +decomposition, raw error replacement, type boundary cleanup, +constructor hygiene, redundant data structure elimination, and +enforcing the policies that prevent regression (file size limits, +one-thing-per-file, lint ratchets). + +This is not feature work. This is not performance optimization. +This is making the code say what it means, and meaning what it says. + +## Who cares + +### Sponsor human + +James — maintains this codebase long-term. Wants to open any file +and understand it without scrolling past 500 lines of mixed +concerns. Wants `new Error()` to never appear where a domain error +class exists. Wants the hexagonal boundary to be real, not +aspirational. + +### Sponsor agent + +Claude — reads and modifies this code every session. God objects +force full-file reads. Mixed concerns make targeted edits risky. +Raw errors lose context in stack traces. Type poison cascades +through downstream files. Every structural problem multiplies the +cost of every future task. 
+ +## What success looks like + +- No source file exceeds 500 LOC (test files 800, CLI 300) +- Every thrown error is a domain error class, never raw `Error` +- Each file exports one primary thing +- Port boundaries are honest — domain services don't require I/O + infrastructure +- Constructor parameter lists are legible (config objects, not + positional sprawl) +- No redundant data structures sitting in memory alongside each + other +- The ESLint `max-lines` ratchet enforces the ceiling and the + relaxation list only shrinks + +## How you know + +- `npm run lint` passes with the `max-lines` rule enforced +- `grep -r 'new Error(' src/domain/` returns zero hits +- The relaxation block in `eslint.config.js` has fewer entries than + it did last cycle +- No file in `bad-code/` has been there for more than 3 cycles + without being pulled + +## Current surface + +### bad-code/ + +All 10 items in `docs/method/backlog/bad-code/` fall under this +legend: + +- `PROTO_strand-service-god-object.md` — 2048 LOC, 40+ methods +- `PROTO_audit-receipt-raw-error.md` — 18 raw Error throws +- `PROTO_sync-protocol-raw-error.md` — raw Error with manual code +- `PROTO_patchbuilder-12-param-constructor.md` — config sprawl +- `PROTO_receipt-op-type-redundant.md` — dead mapping table +- `PROTO_warpserve-domain-infra-blur.md` — hex boundary violation +- `DX_trailer-codec-type-poison.md` — untyped boundary infects 6 files +- `DX_exact-optional-conditional-spread.md` — 30 verbose sites +- `PERF_toposort-full-adjacency.md` — wasteful memory allocation +- `PERF_transitive-reduction-redundant-adjlist.md` — redundant structure + +### asap/ + +- `DX_max-file-size-policy.md` — the ratchet that prevents regression +- `DX_restore-dot-notation.md` — lint rule gap from TSC campaign +- `DX_agent-code-audit.md` — audit agent-authored code from TSC blitz +- `DX_trailer-codec-dts.md` — upstream fix that kills type poison +- `PROTO_effectsink-breaking-change.md` — breaking change hygiene +- 
`PROTO_warpkernel-port-cleanup.md` — persistence union types +- `PROTO_warpruntime-god-class.md` — the other god object + +## Legend code + +`CC` — for backlog items that belong to this legend. + +``` +CC_strand-service-decomposition.md +CC_raw-error-purge.md +CC_max-lines-ratchet.md +``` From dc1252c69f2d642ddcb6693fddad183c28db0aaa Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 19:45:35 -0700 Subject: [PATCH 13/73] design: code-nav AST-aware symbol extraction tool (cycle 0002) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Design doc for @git-stunts/code-nav — a tree-sitter-powered tool that lets LLM agents extract named symbols, view file outlines, and find definitions without reading entire files. JS/TS + Rust. CLI + MCP server. Phased: core extraction → MCP → Rust → polish. --- .../0002-code-nav-tool/code-nav-tool.md | 340 ++++++++++++++++++ 1 file changed, 340 insertions(+) create mode 100644 docs/design/0002-code-nav-tool/code-nav-tool.md diff --git a/docs/design/0002-code-nav-tool/code-nav-tool.md b/docs/design/0002-code-nav-tool/code-nav-tool.md new file mode 100644 index 00000000..6dd44199 --- /dev/null +++ b/docs/design/0002-code-nav-tool/code-nav-tool.md @@ -0,0 +1,340 @@ +# Code Nav: AST-Aware Symbol Extraction for LLM Agents + +**Cycle:** 0002-code-nav-tool +**Type:** Feature (new repo) +**Legend:** n/a (standalone tool, not git-warp internal) + +## Sponsor human + +James — maintains JS/TS and Rust codebases. Wants an agent that can +work on 2000+ LOC files without reading every line. Wants a tool +that works across his two primary language families and integrates +into the Claude Code workflow via MCP. + +## Sponsor agent + +Claude — reads entire files to find 30-line functions. Burns context +window on irrelevant code. Needs to understand code structure before +making targeted edits. Current workflow: Grep for name → Read with +offset/limit → hope the offset is right → often over-read. 
A
+structural extraction tool would cut context waste by 10-50x on
+large files.
+
+## Hill
+
+An agent working in a JavaScript, TypeScript, or Rust codebase can
+extract any named symbol's source code, see the structural outline
+of any file, and find where symbols are defined — without reading
+full files. The tool runs as both a CLI and an MCP server.
+
+## Playback questions
+
+### Agent
+
+1. Can I get just the source code of `StrandService.tick()` without
+   reading StrandService.js? **YES/NO**
+2. Can I see the shape of a 2000-line file (all method signatures,
+   no bodies) in under 50 lines of output? **YES/NO**
+3. Can I find where `reduceV5` is defined across the codebase
+   without multiple grep rounds? **YES/NO**
+4. Does it work on `.js`, `.ts`, `.tsx`, `.rs` files? **YES/NO**
+5. Can I call it as an MCP tool from Claude Code? **YES/NO**
+
+### Human
+
+1. Can I install it with one command? **YES/NO**
+2. Does it work on any JS/TS or Rust project without configuration?
+   **YES/NO**
+3. Is it fast enough to not interrupt flow (<100ms per query)?
+   **YES/NO**
+4. Can I use it from the terminal as a CLI too? **YES/NO**
+
+## Non-goals
+
+- Replacing LSP / IDE features (go-to-definition with full type
+  resolution, refactoring, diagnostics)
+- Type inference or type checking
+- Modifying code (this is read-only extraction)
+- Supporting every language (JS/TS + Rust covers the need)
+- Maintaining a persistent index or daemon process
+- Replacing grep for text search — this is structural, not textual
+
+## Core operations
+
+### 1. `show <symbol>`
+
+Extract a named symbol's complete source code.
+
+```bash
+# A top-level function
+code-nav show reduceV5
+# → file: src/domain/services/JoinReducer.js:142-198
+# → full source of reduceV5()
+
+# A class method
+code-nav show StrandService.tick
+# → file: src/domain/services/StrandService.js:1240-1271
+# → full source of tick(), including JSDoc
+
+# A struct and its impl block
+code-nav show VersionVector
+# → file: src/crdt/version_vector.rs:12-89
+# → struct definition + impl block(s)
+
+# Nested: a method on a Rust impl
+code-nav show VersionVector.merge
+# → just the merge() method from the impl block
+```
+
+**Resolution order:** If `show foo` is ambiguous (multiple files
+define `foo`), return all matches with file paths. The caller picks.
+
+**What "complete" means:**
+- The full syntactic extent of the declaration (function body,
+  class body, struct + impl, enum + impl)
+- Leading doc comments / JSDoc attached to the declaration
+- Decorators / attributes attached to the declaration
+- NOT: surrounding whitespace, imports, other declarations
+
+### 2. `outline <file>`
+
+Structural skeleton of a file — every declaration with signature
+but no body.
+
+```bash
+code-nav outline src/domain/services/StrandService.js
+```
+
+```
+src/domain/services/StrandService.js (2048 lines)
+
+  exports:
+    STRAND_SCHEMA_VERSION = 1 :89
+    STRAND_COORDINATE_VERSION = 'frontier-lamport/v1' :90
+    STRAND_OVERLAY_KIND = 'patch-log' :91
+    default class StrandService :901
+
+  class StrandService:
+    constructor({ graph }) :907
+    async create(options = {}) :917
+    async braid(strandId, options = {}) :952
+    async get(strandId) :985
+    async list() :1000
+    async drop(strandId) :1024
+    async materialize(strandId, options = {}) :1055
+    async createPatchBuilder(strandId) :1076
+    async patch(strandId, build) :1134
+    async queueIntent(strandId, build) :1165
+    async listIntents(strandId) :1207
+    async tick(strandId) :1240
+    async getPatchEntries(strandId, options = {}) :1505
+    async patchesFor(strandId, entityId, options = {}) :1520
+    async getOrThrow(strandId) :1545
+    _buildRef(strandId) :1563
+    _buildOverlayRef(strandId) :1582
+    ...
+
+  functions:
+    compareStrings(a, b) :100
+    normalizeCreateOptions(options) :245
+    frontierToRecord(frontier) :310
+    ...
+```
+
+**For Rust files:** show `struct`, `enum`, `trait`, `impl` blocks,
+`fn`, `const`, `static`, `type` aliases, `mod` declarations.
+
+**Key design choice:** private/internal symbols are included. The
+agent needs to see the full shape to understand the code, not just
+the public API.
+
+### 3. `exports <file>`
+
+Just the public surface — what this module exposes to importers.
+
+```bash
+code-nav exports src/domain/services/JoinReducer.js
+```
+
+```
+named: createEmptyStateV5 (function) :42
+named: reduceV5 (function) :142
+named: applyFast (function) :301
+default: JoinReducer (class) :450
+```
+
+For Rust: `pub` items at the module level.
+
+### 4. `find <symbol>`
+
+Where is this symbol **defined** across the codebase?
+
+```bash
+code-nav find reduceV5
+```
+
+```
+src/domain/services/JoinReducer.js:142 export function reduceV5(...)
+```
+
+Unlike grep, this only returns **definitions**, not usage sites.
+A function call, import, or type reference is not a hit.
+
+### 5. `references <symbol>` (stretch)
+
+Where is this symbol **used** across the codebase? This is the
+inverse of `find` — import sites, call sites, type references.
+
+Stretch goal because it requires cross-file resolution (following
+imports). May be impractical without a full module resolver. Could
+start with a simpler version: "files that import this symbol."
+
+### 6. `deps <file>` (stretch)
+
+What does this file import, and from where?
+
+```bash
+code-nav deps src/domain/services/StrandService.js
+```
+
+```
+../errors/StrandError.js StrandError
+../utils/RefLayout.js buildStrandRef, buildStrandBraidRef, ...
+../utils/WriterId.js generateWriterId
+./PatchBuilderV2.js PatchBuilderV2
+./JoinReducer.js createEmptyStateV5, reduceV5
+...
+```
+
+## Technology
+
+### Parser: tree-sitter
+
+tree-sitter is the right foundation:
+
+- **Multi-language**: mature grammars for JavaScript, TypeScript,
+  TSX, Rust — one parsing framework for all targets
+- **Incremental**: re-parses only changed regions (future: watch
+  mode)
+- **Battle-tested**: powers GitHub code navigation, Neovim,
+  Helix, Zed
+- **Node.js bindings**: `tree-sitter` npm package + per-language
+  grammar packages
+- **Fast**: parses 2000-line files in single-digit milliseconds
+
+oxc was considered but is JS/TS only. We need Rust coverage.
+
+ast-grep was considered and may be useful for the `find` operation
+(it already does structural pattern matching). But ast-grep is a
+search tool, not an extraction tool. We need to extract complete
+syntactic extents, not match patterns.
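To make "complete syntactic extent" concrete, here is a hedged sketch of the slicing step that parsed node offsets would feed, including the doc-comment attachment rule from "What complete means" (attach only when no blank line separates comment and declaration). `extractExtent` and its offset-based signature are illustrative assumptions, not the tool's real API:

```javascript
// Assumed inputs: startIndex/endIndex are the declaration's character
// offsets, as a tree-sitter syntax node would report them.
function extractExtent(source, startIndex, endIndex) {
  const before = source.slice(0, startIndex);
  const open = before.lastIndexOf('/**');
  if (open !== -1) {
    const close = before.indexOf('*/', open);
    // Attach the doc block only when nothing but indentation and at
    // most one newline (i.e. no blank line) separates it from the
    // declaration.
    const gap = close === -1 ? null : before.slice(close + 2);
    if (gap !== null && /^[ \t]*\n?[ \t]*$/.test(gap)) {
      return source.slice(open, endIndex);
    }
  }
  return source.slice(startIndex, endIndex);
}
```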
+ +### Runtime: Node.js + +- tree-sitter has first-class Node.js bindings via native addon +- MCP SDK (`@modelcontextprotocol/sdk`) is TypeScript/Node +- James's primary dev environment is Node +- CLI framework: minimal — `node:util.parseArgs` + direct output +- No build step needed for pure JS + native addons + +### MCP server + +Expose each operation as an MCP tool: + +```json +{ + "tools": [ + { "name": "code_show", "description": "Extract a named symbol's source code" }, + { "name": "code_outline", "description": "Structural skeleton of a file" }, + { "name": "code_exports", "description": "Public exports of a module" }, + { "name": "code_find", "description": "Find where a symbol is defined" } + ] +} +``` + +Transport: stdio (standard for Claude Code MCP servers). + +### Project structure + +``` +@git-stunts/code-nav/ + bin/ + code-nav.js CLI entry point + src/ + parser/ + index.js tree-sitter init + grammar loading + javascript.js JS/TS/TSX extraction queries + rust.js Rust extraction queries + operations/ + show.js Symbol extraction + outline.js File skeleton + exports.js Public surface + find.js Definition search + mcp/ + server.js MCP server (stdio transport) + tools.js Tool definitions + handlers + output/ + formatter.js CLI output formatting + test/ + fixtures/ Sample JS/TS/Rust files + unit/ Operation tests + package.json + LICENSE Apache 2.0 +``` + +## Open questions + +1. **Scope resolution for `show`** — when you say `show foo`, should + it search the whole project or require a file hint? Searching the + whole project is more convenient but slower on large codebases. + Could default to project-wide with a `--file` flag for precision. + +2. **How deep does `outline` go?** — should it show nested functions + inside methods? Probably not by default — just the top-level + declarations and class/impl members. A `--depth` flag for deeper. + +3. 
**JSDoc attachment** — tree-sitter's JS grammar treats comments + as standalone nodes, not attached to declarations. Need a + heuristic: "comment immediately preceding a declaration with no + blank line gap belongs to that declaration." This is standard but + requires custom logic. + +4. **Rust impl block grouping** — `show VersionVector` should return + the struct AND all impl blocks. But what about trait impls + (`impl Display for VersionVector`)? Probably yes — include all + impl blocks for the type. + +5. **Performance on large monorepos** — `find` across a full project + means parsing every file. Could be slow on 1000+ file repos. + Mitigation: respect `.gitignore`, skip `node_modules`, and + consider a lightweight file-level symbol index cache. + +## Phasing + +### Phase 1 — Core (MVP) + +- `outline` for JS/TS files +- `show` for JS/TS files (top-level functions + class methods) +- CLI only, no MCP yet +- Tests against git-warp as the fixture codebase + +### Phase 2 — Full JS/TS + MCP + +- `exports` for JS/TS +- `find` across a project directory +- MCP server (stdio transport) +- Register in Claude Code config + +### Phase 3 — Rust + +- `outline` for Rust files +- `show` for Rust (functions, struct + impls, enum + impls, traits) +- `exports` for Rust (`pub` items) +- `find` for Rust + +### Phase 4 — Polish + +- `references` (stretch) +- `deps` (stretch) +- Performance: `.gitignore` awareness, symbol index cache +- `--json` output for all operations From ef89889f70135ee76b14b88209b59f3fcb579161 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 19:49:22 -0700 Subject: [PATCH 14/73] design: add before/after scenarios with token + context analysis MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Five real scenarios from git-warp sessions showing 53-96% context reduction. 
Key insight: the compounding effect over a full session saves 20K-90K tokens — the difference between hitting compression mid-task and finishing cleanly. --- .../0002-code-nav-tool/code-nav-tool.md | 186 +++++++++++++++++- 1 file changed, 185 insertions(+), 1 deletion(-) diff --git a/docs/design/0002-code-nav-tool/code-nav-tool.md b/docs/design/0002-code-nav-tool/code-nav-tool.md index 6dd44199..75d56270 100644 --- a/docs/design/0002-code-nav-tool/code-nav-tool.md +++ b/docs/design/0002-code-nav-tool/code-nav-tool.md @@ -59,7 +59,191 @@ full files. The tool runs as both a CLI and an MCP server. - Maintaining a persistent index or daemon process - Replacing grep for text search — this is structural, not textual -## Core operations +## Before and after + +Real scenarios from working on git-warp. Token counts are estimates +based on ~3.5 tokens per line of JavaScript. + +### Scenario 1: "Understand a god object before decomposing it" + +**Task:** Plan the StrandService decomposition. Need to know what +methods exist, how they group, and which are public vs private. + +**Before (today):** + +``` +Read StrandService.js (lines 1-100) → ~350 tokens (imports, typedefs) +Read StrandService.js (lines 100-200) → ~350 tokens (validation helpers) + ... still no class body ... +Read StrandService.js (lines 900-1070) → ~595 tokens (class start, create/braid/get/list/drop) +Read StrandService.js (lines 1070-1320) → ~875 tokens (createPatchBuilder, patch, queueIntent, listIntents) +Read StrandService.js (lines 1320-1560) → ~840 tokens (tick internals, getPatchEntries, patchesFor) +Read StrandService.js (lines 1560-1930) → ~1295 tokens (private helpers, materialization) +Read StrandService.js (lines 1930-2048) → ~413 tokens (commit, sync refs) +``` + +**Total: 7 tool calls, ~4718 tokens of context consumed.** And most +of that is method bodies I don't need yet — I just wanted the shape. +The 7 sequential reads also cost wall-clock time (~3-5 seconds of +tool roundtrips). 
+ +**After (with code-nav):** + +``` +code_outline StrandService.js → ~50 lines, ~175 tokens +``` + +One call. Every method name, its parameters, and its line number. +Enough to plan the decomposition. When I need the body of a specific +method, one more call: + +``` +code_show StrandService.tick → ~32 lines, ~112 tokens +``` + +**Total: 1-2 tool calls, ~175-287 tokens.** That's a **94% reduction +in context consumed** and the information density is higher — pure +signal, no noise. + +### Scenario 2: "Fix a bug in one method of a large class" + +**Task:** `_commitQueuedPatch` in StrandService is using the wrong +tree structure. Need to read just that method, fix it, and move on. + +**Before:** + +``` +Grep for '_commitQueuedPatch' → 1 tool call, get line number (1973) +Read StrandService.js (lines 1960-2015) → ~192 tokens + ... but I'm guessing at the method boundary. + Did I get the whole thing? Is the JSDoc above line 1960? +Read StrandService.js (lines 1930-2015) → ~297 tokens (re-read with more context) +``` + +**Total: 3 tool calls, ~297 tokens used** (with 192 wasted on the +first imprecise read). And I had to eyeball where the method ends. + +**After:** + +``` +code_show StrandService._commitQueuedPatch → exact method, ~140 tokens +``` + +**Total: 1 tool call, ~140 tokens.** Exact boundaries, including +the JSDoc. No guessing. **53% reduction**, but more importantly: +zero wasted reads and zero risk of missing the top or bottom of the +method. + +### Scenario 3: "What does this module export? What can I use?" + +**Task:** Wire up a new service that needs to import from +JoinReducer.js. What's available? + +**Before:** + +``` +Grep for 'export' in JoinReducer.js → 1 tool call, noisy (matches 'export' in comments too) +Read JoinReducer.js (lines 1-50) → ~175 tokens (hope the exports are at the top) + ... they're not all at the top, some are inline ... 
+Read JoinReducer.js (lines 280-320) → ~140 tokens (found another export) +Grep for '^export' in JoinReducer.js → 1 tool call, better but still need context +``` + +**Total: 4 tool calls, ~315+ tokens, incomplete picture.** + +**After:** + +``` +code_exports JoinReducer.js → ~8 lines, ~40 tokens +``` + +**Total: 1 tool call, ~40 tokens.** Complete, structured, every +export with its type (function/class/const) and line number. +**87% reduction.** + +### Scenario 4: "Where is this function defined?" + +**Task:** `reduceV5` is called in 15 files. I need the definition, +not the call sites. + +**Before:** + +``` +Grep for 'reduceV5' (files_with_matches) → 1 tool call, 15 files +Grep for 'function reduceV5' → 1 tool call, maybe finds it + ... but what if it's `const reduceV5 = ` or `export { reduceV5 }`? +Grep for 'reduceV5' with context in likely file → 1 tool call, ~100 tokens +``` + +**Total: 2-3 tool calls, up to ~100 tokens, fragile.** The pattern +depends on the declaration style. + +**After:** + +``` +code_find reduceV5 → 1 line, ~15 tokens +``` + +**Total: 1 tool call, ~15 tokens.** Returns only the definition, +regardless of whether it's `function`, `const`, `class`, or +re-export. **85% reduction.** + +### Scenario 5: "I need to understand 8 files before a refactor" + +**Task:** Decompose WarpRuntime. Need the outline of WarpRuntime.js +(683 LOC), StrandService.js (2048 LOC), SyncController.js (680 LOC), +WarpGraph.js (800 LOC), Writer.js, PatchSession.js, Observer.js +(575 LOC), CheckpointService.js (567 LOC). + +**Before:** + +This is where the compounding cost hits. Reading 8 files at even +50% coverage: + +``` +~5353 total LOC × 0.5 coverage × 3.5 tokens/line = ~9,368 tokens +~24 tool calls (3 reads per file average) +``` + +That's **9,368 tokens of context** before writing a single line of +code. In a 200K token context window, that's ~5% consumed just on +orientation. 
And it compounds — as the conversation continues, +those 9K tokens of file contents push earlier context (my own +reasoning, your instructions, test output) toward the compression +boundary. + +**After:** + +``` +8 × code_outline calls = 8 tool calls +8 × ~50 lines × 3.5 tokens/line = ~1,400 tokens +``` + +**Total: 8 tool calls, ~1,400 tokens.** Same structural +understanding. **85% reduction.** The context window stays clean for +actual work — reasoning, test output, edits. + +### Summary + +| Scenario | Before (tokens) | After (tokens) | Reduction | Tool calls saved | +|---|---|---|---|---| +| Understand god object | ~4,718 | ~175 | 96% | 6 | +| Fix one method | ~297 | ~140 | 53% | 2 | +| List module exports | ~315 | ~40 | 87% | 3 | +| Find a definition | ~100 | ~15 | 85% | 2 | +| Pre-refactor survey (8 files) | ~9,368 | ~1,400 | 85% | 16 | + +**The compounding effect matters most.** A single `outline` call +saves ~4,500 tokens. But in a real session I do this 5-20 times — +reading files to understand context before acting. Over a full +session that's 20,000-90,000 tokens of saved context. That's the +difference between hitting the compression boundary mid-task (losing +earlier reasoning) and having room to finish cleanly. + +The token savings also translate directly to speed. Fewer tool calls += fewer round-trips = faster responses. An 8-file survey drops from +~24 sequential reads (~10 seconds of tool overhead) to 8 parallel +outline calls (~1 second). ### 1. `show ` From 10d27b54723522893154e2a2032b36ef3675b128 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:05:18 -0700 Subject: [PATCH 15/73] =?UTF-8?q?design:=20rewrite=20as=20safe-context=20?= =?UTF-8?q?=E2=80=94=20replay-safe=20structural=20reads?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Editor's Edition reframe: the product is not code-nav (a search tool). It is safe-context (a context governor). 
AST-aware extraction is one tooth of a policy-enforcing read layer grounded in empirical data from 1,091 sessions showing Read at 96.2 GB of context burden. Rewritten hill, 4-layer architecture (policy → structural extraction → transport → session hygiene), 7-command surface, phased delivery (governor first, precision tools second, session intelligence third). --- .../0002-code-nav-tool/code-nav-tool.md | 693 +++++++----------- 1 file changed, 282 insertions(+), 411 deletions(-) diff --git a/docs/design/0002-code-nav-tool/code-nav-tool.md b/docs/design/0002-code-nav-tool/code-nav-tool.md index 75d56270..bdad0f0d 100644 --- a/docs/design/0002-code-nav-tool/code-nav-tool.md +++ b/docs/design/0002-code-nav-tool/code-nav-tool.md @@ -1,524 +1,395 @@ -# Code Nav: AST-Aware Symbol Extraction for LLM Agents +# Safe Context: Replay-Safe Structural Reads for Coding Agents **Cycle:** 0002-code-nav-tool **Type:** Feature (new repo) -**Legend:** n/a (standalone tool, not git-warp internal) ## Sponsor human -James — maintains JS/TS and Rust codebases. Wants an agent that can -work on 2000+ LOC files without reading every line. Wants a tool -that works across his two primary language families and integrates -into the Claude Code workflow via MCP. +James — maintains large JS/TS and Rust codebases and wants coding +agents that can work precisely in large files without inflating +session cost. Has empirical data (Blacklight, 1,091 sessions, +291K messages) proving that context compounding from oversized reads +is the dominant cost driver in agentic coding. ## Sponsor agent -Claude — reads entire files to find 30-line functions. Burns context -window on irrelevant code. Needs to understand code structure before -making targeted edits. Current workflow: Grep for name → Read with -offset/limit → hope the offset is right → often over-read. A -structural extraction tool would cut context waste by 10-50x on -large files. 
+Claude — wastes context on full-file reads, oversized shell output, +and repeated exploration in long sessions. Read tool alone accounts +for 96.2 GB of context burden — 6.6x all other tools combined. 58% +of reads are full-file (no offset/limit). 64.5% of reads don't lead +to an edit of that file — they're exploration cost that could be +replaced by structural representations. Needs a policy-enforcing +access layer that returns the smallest correct representation needed +for the task. ## Hill -An agent working in a JavaScript, TypeScript, or Rust codebase can -extract any named symbol's source code, see the structural outline -of any file, and find where symbols are defined — without reading -full files. The tool runs as both a CLI and an MCP server. +An agent working in a JS/TS or Rust codebase can obtain the minimum +structurally correct context required to act — file shape, export +surface, exact symbol body, or bounded source range — without +injecting large raw artifacts into long-lived conversation state. +The tool runs as an MCP server and CLI and enforces replay-safe +behavior by default. ## Playback questions ### Agent -1. Can I get just the source code of `StrandService.tick()` without - reading StrandService.js? **YES/NO** -2. Can I see the shape of a 2000-line file (all method signatures, - no bodies) in under 50 lines of output? **YES/NO** -3. Can I find where `reduceV5` is defined across the codebase - without multiple grep rounds? **YES/NO** -4. Does it work on `.js`, `.ts`, `.tsx`, `.rs` files? **YES/NO** -5. Can I call it as an MCP tool from Claude Code? **YES/NO** +1. When I request a 2000-line file, do I get an outline instead of + the raw content? **YES/NO** +2. Can I extract just `StrandService.tick()` without reading + StrandService.js? **YES/NO** +3. Am I blocked from reading binary files, build output, and + generated artifacts? **YES/NO** +4. Does shell output get tailed instead of dumped in full? + **YES/NO** +5. 
Can I save/load session state across `/clear` boundaries? + **YES/NO** +6. Does it work on `.js`, `.ts`, `.tsx`, `.rs` files? **YES/NO** +7. Can I call every operation as an MCP tool? **YES/NO** ### Human -1. Can I install it with one command? **YES/NO** -2. Does it work on any JS/TS or Rust project without configuration? +1. Does the tool install with one command and work without + configuration? **YES/NO** +2. Can I see measurable reduction in context burden in Blacklight + data after deploying it? **YES/NO** +3. Does it work across Claude Code, Gemini CLI, and Codex CLI? **YES/NO** -3. Is it fast enough to not interrupt flow (<100ms per query)? - **YES/NO** -4. Can I use it from the terminal as a CLI too? **YES/NO** +4. Can I use it from the terminal as a standalone CLI? **YES/NO** ## Non-goals -- Replacing LSP / IDE features (go-to-definition with full type - resolution, refactoring, diagnostics) -- Type inference or type checking -- Modifying code (this is read-only extraction) -- Supporting every language (JS/TS + Rust covers the need) -- Maintaining a persistent index or daemon process -- Replacing grep for text search — this is structural, not textual +- Full semantic code intelligence (LSP replacement) +- Cross-file reference resolution in v1 +- Persistent whole-repo index in v1 +- Code modification (this is read-only) +- Arbitrary raw artifact passthrough +- Convenience wrapper around `cat` +- General-purpose memory system +- "Whatever the agent asked for, but prettier" — this tool is + opinionated about what it returns + +## The thesis + +The biggest cost in agentic coding is not code generation. It is +replayed context. Safe-context replaces oversized raw reads with +bounded structural representations, so agents stay precise without +poisoning their own session state. 
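The burden arithmetic used throughout this document can be sketched in a few lines. This is an illustrative model only — the helper name and signature are ours, not part of safe-context's API:

```javascript
// Context-burden model: every token a tool emits is replayed on each
// subsequent message, so cost scales with how much session remains.
// Illustrative sketch, not part of the tool's surface.
function contextBurden(rawTokens, turnOfRead, sessionLength) {
  const messagesRemaining = sessionLength - turnOfRead;
  return rawTokens * messagesRemaining;
}

// A ~4,700-token full-file read at turn 5 of a 200-turn session:
console.log(contextBurden(4700, 5, 200)); // 916500

// The same file delivered as a ~175-token structural outline:
console.log(contextBurden(175, 5, 200)); // 34125
```

Same information, same position in the session — the only variable is payload size, and it multiplies through everything that follows.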
+ +### Evidence (from Blacklight, 1,091 sessions) + +| Finding | Number | +|---|---| +| Read context burden | 96.2 GB (6.6x all other tools) | +| Full-file reads (no offset/limit) | 58% of all reads | +| Reads that don't lead to editing that file | 64.5% | +| Dynamic read cap alone | 54.5% burden reduction | +| Session length cap alone | 58.9% burden reduction | +| Both combined | 75.1% burden reduction | +| Top 3 sessions (of 715) | 23% of all lifetime burden | +| WarpGraph.js | 1,053 reads, 85 sessions, 1.74 GB burden | +| Worst single session | 12.7 GB burden, 5,900 messages | + +The data says: + +1. Read is the monster. +2. Long sessions are money furnaces. +3. Shell output is material (especially Gemini). +4. Subagent dumps are context bombs. +5. Policy + session management handle 75% before any indexing. ## Before and after -Real scenarios from working on git-warp. Token counts are estimates -based on ~3.5 tokens per line of JavaScript. - -### Scenario 1: "Understand a god object before decomposing it" - -**Task:** Plan the StrandService decomposition. Need to know what -methods exist, how they group, and which are public vs private. - -**Before (today):** - -``` -Read StrandService.js (lines 1-100) → ~350 tokens (imports, typedefs) -Read StrandService.js (lines 100-200) → ~350 tokens (validation helpers) - ... still no class body ... -Read StrandService.js (lines 900-1070) → ~595 tokens (class start, create/braid/get/list/drop) -Read StrandService.js (lines 1070-1320) → ~875 tokens (createPatchBuilder, patch, queueIntent, listIntents) -Read StrandService.js (lines 1320-1560) → ~840 tokens (tick internals, getPatchEntries, patchesFor) -Read StrandService.js (lines 1560-1930) → ~1295 tokens (private helpers, materialization) -Read StrandService.js (lines 1930-2048) → ~413 tokens (commit, sync refs) -``` - -**Total: 7 tool calls, ~4718 tokens of context consumed.** And most -of that is method bodies I don't need yet — I just wanted the shape. 
-The 7 sequential reads also cost wall-clock time (~3-5 seconds of -tool roundtrips). - -**After (with code-nav):** - -``` -code_outline StrandService.js → ~50 lines, ~175 tokens -``` +Real scenarios. Token counts are raw output. Context burden = +tokens x messages remaining in session. -One call. Every method name, its parameters, and its line number. -Enough to plan the decomposition. When I need the body of a specific -method, one more call: +### Scenario 1: Understand a god object -``` -code_show StrandService.tick → ~32 lines, ~112 tokens -``` +**Before:** 7 Read calls across StrandService.js. ~4,700 tokens +raw, but at turn 5 of a 200-turn session that's +`4,700 x 195 = 916,500 tokens of context burden`. -**Total: 1-2 tool calls, ~175-287 tokens.** That's a **94% reduction -in context consumed** and the information density is higher — pure -signal, no noise. +**After:** `safe_read("StrandService.js")` → policy intercepts, +returns `file_outline`. ~175 tokens raw, same position = +`175 x 195 = 34,125 burden`. Then `code_show("StrandService.tick")` +for the one method needed. **96% raw reduction, 96% burden +reduction.** -### Scenario 2: "Fix a bug in one method of a large class" +### Scenario 2: Pre-refactor survey of 8 files -**Task:** `_commitQueuedPatch` in StrandService is using the wrong -tree structure. Need to read just that method, fix it, and move on. +**Before:** ~24 Read calls, ~9,400 tokens raw. At turn 3 of a +150-turn session: `9,400 x 147 = 1,381,800 burden`. -**Before:** +**After:** 8 `file_outline` calls. ~1,400 tokens raw. +`1,400 x 147 = 205,800 burden`. **85% raw, 85% burden.** And the +context window stays clean for actual work — reasoning, test output, +edits. -``` -Grep for '_commitQueuedPatch' → 1 tool call, get line number (1973) -Read StrandService.js (lines 1960-2015) → ~192 tokens - ... but I'm guessing at the method boundary. - Did I get the whole thing? Is the JSDoc above line 1960? 
-Read StrandService.js (lines 1930-2015) → ~297 tokens (re-read with more context) -``` +### Scenario 3: The compounding catastrophe -**Total: 3 tool calls, ~297 tokens used** (with 192 wasted on the -first imprecise read). And I had to eyeball where the method ends. +**Before:** WarpGraph.js (800 LOC) read 12 times in a 400-message +session. Each read ~2,800 tokens. Total raw: 33,600. But +compounded across messages remaining at each read point: estimated +**5-8 million tokens of burden** from one file in one session. -**After:** +**After:** First access returns `file_outline` (~280 tokens). Agent +requests specific symbols as needed via `code_show`. Even 10 +targeted extractions total ~2,000 tokens raw. Burden drops by +**~95%** because each payload is small and the outline is never +re-read (agent has the shape). -``` -code_show StrandService._commitQueuedPatch → exact method, ~140 tokens -``` +### Scenario 4: The GIF incident -**Total: 1 tool call, ~140 tokens.** Exact boundaries, including -the JSDoc. No guessing. **53% reduction**, but more importantly: -zero wasted reads and zero risk of missing the top or bottom of the -method. +**Before:** `seek-demo.gif` read 4 times. 1.3 MB of binary per +read. **395 MB of context burden** from 4 tool calls. -### Scenario 3: "What does this module export? What can I use?" +**After:** `safe_read("seek-demo.gif")` → policy refuses. Returns: +`Binary file (GIF, 1.3 MB). Use ls -lh for metadata.` Zero bytes +of context burden. **100% reduction.** -**Task:** Wire up a new service that needs to import from -JoinReducer.js. What's available? +### Scenario 5: The test loop -**Before:** +**Before:** `npm test` run 30 times in an edit-test loop. Each run +outputs ~8 KB. Late in session with 200 messages remaining: +`8,000 x 30 x 100 (avg remaining) = 24,000,000 burden`. 
-``` -Grep for 'export' in JoinReducer.js → 1 tool call, noisy (matches 'export' in comments too) -Read JoinReducer.js (lines 1-50) → ~175 tokens (hope the exports are at the top) - ... they're not all at the top, some are inline ... -Read JoinReducer.js (lines 280-320) → ~140 tokens (found another export) -Grep for '^export' in JoinReducer.js → 1 tool call, better but still need context -``` +**After:** `run_capture("npm test", 60)` tees full output to +`/tmp/test.log`, returns only last 60 lines (~2 KB). If more needed, +`read_range("/tmp/test.log", 1, 50)`. Burden drops by **~75%**, and +the full output is still on disk if needed. -**Total: 4 tool calls, ~315+ tokens, incomplete picture.** +## Architecture -**After:** +### Layer 1: Policy (the king) -``` -code_exports JoinReducer.js → ~8 lines, ~40 tokens -``` +Decides what kind of answer is allowed. -**Total: 1 tool call, ~40 tokens.** Complete, structured, every -export with its type (function/class/const) and line number. -**87% reduction.** +- No binary/media reads (`.gif`, `.png`, `.jpg`, `.pdf`, `.zip`, + `.wasm`, `.bin`, `.sqlite`) +- No build/generated reads (`dist/`, `build/`, `target/`, `.next/`, + `node_modules/`) +- Dynamic size cap based on session depth: -### Scenario 4: "Where is this function defined?" + | Session stage | Messages elapsed | Max raw output | + |---|---|---| + | Early | < 50 | 20 KB | + | Mid | 50-200 | 10 KB | + | Late | > 200 | 4 KB | -**Task:** `reduceV5` is called in 15 files. I need the definition, -not the call sites. +- Over-cap reads are downgraded to `file_outline` + jump table +- Optional re-read warning ("you already read this file 3 turns + ago") -**Before:** +Policy is the product. Everything else enables it. -``` -Grep for 'reduceV5' (files_with_matches) → 1 tool call, 15 files -Grep for 'function reduceV5' → 1 tool call, maybe finds it - ... but what if it's `const reduceV5 = ` or `export { reduceV5 }`? 
-Grep for 'reduceV5' with context in likely file → 1 tool call, ~100 tokens -``` +### Layer 2: Structural extraction (the enabler) -**Total: 2-3 tool calls, up to ~100 tokens, fragile.** The pattern -depends on the declaration style. +Tree-sitter-backed extraction for JS/TS/Rust: -**After:** +- **File outline** — exports, declarations, class/impl members, + line ranges +- **Symbol body** — complete syntactic extent of a named + declaration, with doc comments +- **Export surface** — what this module exposes to importers +- **Definition finding** — where a symbol is defined (not used) -``` -code_find reduceV5 → 1 line, ~15 tokens -``` +Tree-sitter is the right foundation: +- Multi-language (JS, TS, TSX, Rust in one framework) +- Fast (single-digit ms per file parse) +- Battle-tested (GitHub, Neovim, Zed, Helix) +- Node.js bindings via native addon -**Total: 1 tool call, ~15 tokens.** Returns only the definition, -regardless of whether it's `function`, `const`, `class`, or -re-export. **85% reduction.** +### Layer 3: Transport (necessary, not interesting) -### Scenario 5: "I need to understand 8 files before a refactor" +- **MCP server** (stdio) — primary delivery. Works with Claude Code, + Gemini CLI, Codex CLI +- **CLI** — for human use and testing -**Task:** Decompose WarpRuntime. Need the outline of WarpRuntime.js -(683 LOC), StrandService.js (2048 LOC), SyncController.js (680 LOC), -WarpGraph.js (800 LOC), Writer.js, PatchSession.js, Observer.js -(575 LOC), CheckpointService.js (567 LOC). +### Layer 4: Session hygiene (the other big lever) -**Before:** +- `state_save()` / `state_load()` — write/read + `WORKING_STATE.md` for cross-clear continuity +- Tripwires (phase 3): + - `messages > 500` + - `edit_bash_transitions > 30` + - `tool_calls_since_last_user_message > 80` + - Any single output > 20 KB after 300 messages -This is where the compounding cost hits. 
Reading 8 files at even -50% coverage: +## Command surface -``` -~5353 total LOC × 0.5 coverage × 3.5 tokens/line = ~9,368 tokens -~24 tool calls (3 reads per file average) -``` +### 1. `safe_read(path, intent?)` -That's **9,368 tokens of context** before writing a single line of -code. In a 200K token context window, that's ~5% consumed just on -orientation. And it compounds — as the conversation continues, -those 9K tokens of file contents push earlier context (my own -reasoning, your instructions, test output) toward the compression -boundary. +Primary entry point. The main product. -**After:** +Returns one of: +- **Exact file content** when safely under the cap +- **Structural outline** when too large +- **"Pick a symbol/range" guidance** when exploration is needed +- **Refusal** for binary/build/generated garbage -``` -8 × code_outline calls = 8 tool calls -8 × ~50 lines × 3.5 tokens/line = ~1,400 tokens -``` +The `intent` parameter is optional. If provided ("I need to +understand the class shape" vs "I need to edit line 45"), the policy +can make smarter decisions. -**Total: 8 tool calls, ~1,400 tokens.** Same structural -understanding. **85% reduction.** The context window stays clean for -actual work — reasoning, test output, edits. - -### Summary - -| Scenario | Before (tokens) | After (tokens) | Reduction | Tool calls saved | -|---|---|---|---|---| -| Understand god object | ~4,718 | ~175 | 96% | 6 | -| Fix one method | ~297 | ~140 | 53% | 2 | -| List module exports | ~315 | ~40 | 87% | 3 | -| Find a definition | ~100 | ~15 | 85% | 2 | -| Pre-refactor survey (8 files) | ~9,368 | ~1,400 | 85% | 16 | - -**The compounding effect matters most.** A single `outline` call -saves ~4,500 tokens. But in a real session I do this 5-20 times — -reading files to understand context before acting. Over a full -session that's 20,000-90,000 tokens of saved context. 
That's the -difference between hitting the compression boundary mid-task (losing -earlier reasoning) and having room to finish cleanly. - -The token savings also translate directly to speed. Fewer tool calls -= fewer round-trips = faster responses. An 8-file survey drops from -~24 sequential reads (~10 seconds of tool overhead) to 8 parallel -outline calls (~1 second). - -### 1. `show ` - -Extract a named symbol's complete source code. - -```bash -# A top-level function -code-nav show reduceV5 -# → file: src/domain/services/JoinReducer.js:142-198 -# → full source of reduceV5() - -# A class method -code-nav show StrandService.tick -# → file: src/domain/services/StrandService.js:1240-1271 -# → full source of tick(), including JSDoc - -# A struct and its impl block -code-nav show VersionVector -# → file: src/crdt/version_vector.rs:12-89 -# → struct definition + impl block(s) - -# Nested: a method on a Rust impl -code-nav show VersionVector.merge -# → just the merge() method from the impl block -``` +### 2. `file_outline(path, opts?)` -**Resolution order:** If `show foo` is ambiguous (multiple files -define `foo`), return all matches with file paths. The caller picks. +Structural skeleton. Exports, top-level declarations, class/impl +members, line ranges. No bodies. -**What "complete" means:** -- The full syntactic extent of the declaration (function body, - class body, struct + impl, enum + impl) -- Leading doc comments / JSDoc attached to the declaration -- Decorators / attributes attached to the declaration -- NOT: surrounding whitespace, imports, other declarations +Cheap, structural, high-signal. This is what replaces 64.5% of +exploration reads. -### 2. `outline ` +### 3. `code_show(target, opts?)` -Structural skeleton of a file — every declaration with signature -but no body. +Precise extraction. The scalpel. 
-```bash -code-nav outline src/domain/services/StrandService.js -``` +- `StrandService.tick` — a class method +- `src/foo.rs#VersionVector.merge` — file-qualified Rust method +- `reduceV5` — top-level function (project-wide search if ambiguous) -``` -src/domain/services/StrandService.js (2048 lines) - - exports: - STRAND_SCHEMA_VERSION = 1 :89 - STRAND_COORDINATE_VERSION = 'frontier-lamport/v1' :90 - STRAND_OVERLAY_KIND = 'patch-log' :91 - default class StrandService :901 - - class StrandService: - constructor({ graph }) :907 - async create(options = {}) :917 - async braid(strandId, options = {}) :952 - async get(strandId) :985 - async list() :1000 - async drop(strandId) :1024 - async materialize(strandId, options = {}) :1055 - async createPatchBuilder(strandId) :1076 - async patch(strandId, build) :1134 - async queueIntent(strandId, build) :1165 - async listIntents(strandId) :1207 - async tick(strandId) :1240 - async getPatchEntries(strandId, options = {}) :1505 - async patchesFor(strandId, entityId, options = {}) :1520 - async getOrThrow(strandId) :1545 - _buildRef(strandId) :1563 - _buildOverlayRef(strandId) :1582 - ... - - functions: - compareStrings(a, b) :100 - normalizeCreateOptions(options) :245 - frontierToRecord(frontier) :310 - ... -``` +Returns the complete syntactic extent: body, JSDoc/doc comments, +decorators/attributes. Nothing else. -**For Rust files:** show `struct`, `enum`, `trait`, `impl` blocks, -`fn`, `const`, `static`, `type` aliases, `mod` declarations. +### 4. `code_find(symbol, opts?)` -**Key design choice:** private/internal symbols are included. The -agent needs to see the full shape to understand the code, not just -the public API. +Definitions only. Not grep. Not references. Not "anything containing +this string." -### 3. `exports ` +Returns file path + line number for every definition of the symbol +across the project. -Just the public surface — what this module exposes to importers. +### 5. 
`read_range(path, start, end)` -```bash -code-nav exports src/domain/services/JoinReducer.js -``` +For when you know where you're going. Bounded, no policy +interception (the caller already has a precise target). -``` -named: createEmptyStateV5 (function) :42 -named: reduceV5 (function) :142 -named: applyFast (function) :301 -default: JoinReducer (class) :450 -``` +### 6. `run_capture(cmd, tail?)` -For Rust: `pub` items at the module level. +Runs a shell command. Tees full output to a log file. Returns only +the last N lines (default 60). Full output available on disk via +`read_range` if needed. -### 4. `find ` +Because the data shows shell output is material, and for Gemini it +was the #1 burden source. -Where is this symbol **defined** across the codebase? +### 7. `state_save(content)` / `state_load()` -```bash -code-nav find reduceV5 -``` +Thin wrapper over `WORKING_STATE.md`. Saves/loads structured session +state for cross-clear continuity. -``` -src/domain/services/JoinReducer.js:142 export function reduceV5(...) -``` +Because the data is screaming that runaway sessions are the other +half of the disaster. -Unlike grep, this only returns **definitions**, not usage sites. -A function call, import, or type reference is not a hit. +## Open questions -### 5. `references ` (stretch) +1. **Session depth tracking** — How does the MCP server know how + deep the session is? MCP tools don't receive conversation + metadata. Options: (a) the agent tells it via a parameter, + (b) the server counts its own tool calls as a proxy, + (c) a hook injects session depth. -Where is this symbol **used** across the codebase? This is the -inverse of `find` — import sites, call sites, type references. +2. **Re-read detection** — Tracking "you already read this" requires + the server to maintain per-session state. Feasible since the + server lives for the session duration, but needs a simple + in-memory cache. 
-Stretch goal because it requires cross-file resolution (following -imports). May be impractical without a full module resolver. Could -start with a simpler version: "files that import this symbol." +3. **JSDoc attachment** — tree-sitter treats comments as standalone + nodes. Need a heuristic: "comment immediately preceding a + declaration with no blank line gap belongs to it." -### 6. `deps ` (stretch) +4. **Rust impl grouping** — `code_show VersionVector` should return + struct + all impl blocks, including trait impls. Requires walking + the full file AST, not just pattern matching. -What does this file import, and from where? +5. **Project root detection** — For `code_find` (project-wide + search), how to determine the project root? Options: + `.git` presence, `package.json`, `Cargo.toml`, or explicit + config. -```bash -code-nav deps src/domain/services/StrandService.js -``` +6. **Cross-LLM MCP compatibility** — Claude Code, Gemini CLI, and + Codex CLI all support MCP but with slightly different + configuration. Need to verify stdio transport works identically + across all three. -``` -../errors/StrandError.js StrandError -../utils/RefLayout.js buildStrandRef, buildStrandBraidRef, ... -../utils/WriterId.js generateWriterId -./PatchBuilderV2.js PatchBuilderV2 -./JoinReducer.js createEmptyStateV5, reduceV5 -... -``` +## Phasing -## Technology +### Phase 1 — The Governor -### Parser: tree-sitter +Ship: `safe_read`, `file_outline`, `read_range`, `run_capture`, +`state_save`/`state_load`. JS/TS only. MCP + CLI. -tree-sitter is the right foundation: +**Goal:** change behavior immediately. This phase alone should +deliver the 54.5% read burden reduction that the dynamic cap +promises, plus shell output containment. 
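The core Phase 1 decision — pass, downgrade to outline, or refuse — falls directly out of the Layer 1 rules. A hedged sketch of that gate (the names and shapes here are assumptions for illustration, not the shipped `rules.js`/`gate.js`):

```javascript
// Sketch of the Layer 1 gate: ban lists plus the dynamic size cap
// decide pass / outline-downgrade / refusal. Illustrative only.
const BINARY_EXTS = new Set([
  '.gif', '.png', '.jpg', '.pdf', '.zip', '.wasm', '.bin', '.sqlite',
]);
const GENERATED_DIRS = ['dist/', 'build/', 'target/', '.next/', 'node_modules/'];

function sizeCap(messagesElapsed) {
  if (messagesElapsed < 50) return 20 * 1024;   // early session
  if (messagesElapsed <= 200) return 10 * 1024; // mid session
  return 4 * 1024;                              // late session
}

function gate(path, fileBytes, messagesElapsed) {
  const dot = path.lastIndexOf('.');
  const ext = dot === -1 ? '' : path.slice(dot);
  if (BINARY_EXTS.has(ext)) {
    return { action: 'refuse', reason: 'binary/media file' };
  }
  if (GENERATED_DIRS.some((dir) => path.includes(dir))) {
    return { action: 'refuse', reason: 'build/generated artifact' };
  }
  if (fileBytes > sizeCap(messagesElapsed)) {
    return { action: 'outline', reason: 'over size cap for session stage' };
  }
  return { action: 'pass' };
}
```

Under this sketch, `seek-demo.gif` is refused outright, and a 28 KB source file requested 220 messages into a session comes back as an outline rather than raw content.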
-- **Multi-language**: mature grammars for JavaScript, TypeScript, - TSX, Rust — one parsing framework for all targets -- **Incremental**: re-parses only changed regions (future: watch - mode) -- **Battle-tested**: powers GitHub code navigation, Neovim, - Helix, Zed -- **Node.js bindings**: `tree-sitter` npm package + per-language - grammar packages -- **Fast**: parses 2000-line files in single-digit milliseconds +### Phase 2 — Precision tools -oxc was considered but is JS/TS only. We need Rust coverage. +Add: `code_show`, `code_find`, `exports`. Rust support for all +structural operations. -ast-grep was considered and may be useful for the `find` operation -(it already does structural pattern matching). But ast-grep is a -search tool, not an extraction tool. We need to extract complete -syntactic extents, not match patterns. +**Goal:** make safe reads frictionless. When the governor +downgrades a read to an outline, the agent can immediately request +the exact symbol it needs. -### Runtime: Node.js +### Phase 3 — Session intelligence -- tree-sitter has first-class Node.js bindings via native addon -- MCP SDK (`@modelcontextprotocol/sdk`) is TypeScript/Node -- James's primary dev environment is Node -- CLI framework: minimal — `node:util.parseArgs` + direct output -- No build step needed for pure JS + native addons +Add: tripwires, re-read warnings, session-depth-aware enforcement, +automatic `WORKING_STATE.md` nudges. -### MCP server +**Goal:** stop runaway sessions before they become archaeological +sites. 
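The tripwire thresholds listed under Layer 4 amount to a simple check over session counters. A sketch, assuming the server tracks these counters itself (the function name and stats shape are hypothetical):

```javascript
// Phase 3 tripwires — thresholds taken from the Layer 4 list.
// The stats object and function name are illustrative assumptions.
function firedTripwires(stats) {
  const fired = [];
  if (stats.messages > 500) fired.push('session length');
  if (stats.editBashTransitions > 30) fired.push('edit/test churn');
  if (stats.toolCallsSinceLastUserMessage > 80) fired.push('autonomous run length');
  if (stats.messages > 300 && stats.largestOutputBytes > 20 * 1024) {
    fired.push('oversized output late in session');
  }
  // Any hit → nudge toward state_save() and a fresh session.
  return fired;
}

console.log(firedTripwires({
  messages: 520,
  editBashTransitions: 12,
  toolCallsSinceLastUserMessage: 9,
  largestOutputBytes: 2048,
})); // fires only the session-length tripwire
```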
-Expose each operation as an MCP tool: +### Phase 4 — Optional sophistication -```json -{ - "tools": [ - { "name": "code_show", "description": "Extract a named symbol's source code" }, - { "name": "code_outline", "description": "Structural skeleton of a file" }, - { "name": "code_exports", "description": "Public exports of a module" }, - { "name": "code_find", "description": "Find where a symbol is defined" } - ] -} -``` +Maybe: lightweight symbol cache, import/deps views, +references-lite, symbol-aware revision diffs. -Transport: stdio (standard for Claude Code MCP servers). +Not before the first three phases prove themselves. -### Project structure +## Project structure -``` -@git-stunts/code-nav/ +```text +@git-stunts/safe-context/ bin/ - code-nav.js CLI entry point + safe-context.js CLI entry point src/ + policy/ + rules.js Ban lists, size caps, dynamic thresholds + gate.js Decision engine (pass/outline/refuse) parser/ - index.js tree-sitter init + grammar loading - javascript.js JS/TS/TSX extraction queries - rust.js Rust extraction queries + index.js Tree-sitter init + grammar loading + javascript.js JS/TS/TSX extraction queries + rust.js Rust extraction queries operations/ - show.js Symbol extraction - outline.js File skeleton - exports.js Public surface - find.js Definition search + safe-read.js Policy-enforced read + outline.js File skeleton + show.js Symbol extraction + find.js Definition search + range.js Bounded reads + capture.js Shell output tailing + state.js Session state save/load mcp/ - server.js MCP server (stdio transport) - tools.js Tool definitions + handlers + server.js MCP server (stdio transport) + tools.js Tool definitions + handlers output/ - formatter.js CLI output formatting + formatter.js CLI output formatting test/ - fixtures/ Sample JS/TS/Rust files - unit/ Operation tests + fixtures/ Sample JS/TS/Rust files + unit/ Operation tests + policy/ Policy decision tests package.json - LICENSE Apache 2.0 + LICENSE Apache 2.0 ``` -## Open 
questions - -1. **Scope resolution for `show`** — when you say `show foo`, should - it search the whole project or require a file hint? Searching the - whole project is more convenient but slower on large codebases. - Could default to project-wide with a `--file` flag for precision. - -2. **How deep does `outline` go?** — should it show nested functions - inside methods? Probably not by default — just the top-level - declarations and class/impl members. A `--depth` flag for deeper. - -3. **JSDoc attachment** — tree-sitter's JS grammar treats comments - as standalone nodes, not attached to declarations. Need a - heuristic: "comment immediately preceding a declaration with no - blank line gap belongs to that declaration." This is standard but - requires custom logic. - -4. **Rust impl block grouping** — `show VersionVector` should return - the struct AND all impl blocks. But what about trait impls - (`impl Display for VersionVector`)? Probably yes — include all - impl blocks for the type. - -5. **Performance on large monorepos** — `find` across a full project - means parsing every file. Could be slow on 1000+ file repos. - Mitigation: respect `.gitignore`, skip `node_modules`, and - consider a lightweight file-level symbol index cache. 
- -## Phasing - -### Phase 1 — Core (MVP) - -- `outline` for JS/TS files -- `show` for JS/TS files (top-level functions + class methods) -- CLI only, no MCP yet -- Tests against git-warp as the fixture codebase - -### Phase 2 — Full JS/TS + MCP - -- `exports` for JS/TS -- `find` across a project directory -- MCP server (stdio transport) -- Register in Claude Code config - -### Phase 3 — Rust - -- `outline` for Rust files -- `show` for Rust (functions, struct + impls, enum + impls, traits) -- `exports` for Rust (`pub` items) -- `find` for Rust - -### Phase 4 — Polish +## Success criteria -- `references` (stretch) -- `deps` (stretch) -- Performance: `.gitignore` awareness, symbol index cache -- `--json` output for all operations +- Large exploratory reads are replaced by outlines and targeted + reads +- Binary/build/generated reads are blocked or redirected +- Long-session compounding is reduced through policy and state + resets +- Agents can operate effectively in 2K+ LOC files without reading + the whole file +- Measurable reduction in context burden visible in Blacklight data + after deployment From 0d29b39aa0fbf53ef943b8fe4686549406bae817 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:08:24 -0700 Subject: [PATCH 16/73] =?UTF-8?q?retro:=20close=20cycle=200002=20=E2=80=94?= =?UTF-8?q?=20code-nav=20pivoted=20to=20safe-context?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Design cycle that started as AST-aware symbol extraction and pivoted to a full context governor after empirical data from Blacklight (96.2 GB Read burden) reframed the problem. Remaining work re-enters backlog as safe-context Phase 1. 
--- docs/design/0003-safe-context/safe-context.md | 10 ++ .../retro/0002-code-nav-tool/code-nav-tool.md | 91 +++++++++++++++++++ 2 files changed, 101 insertions(+) create mode 100644 docs/design/0003-safe-context/safe-context.md create mode 100644 docs/method/retro/0002-code-nav-tool/code-nav-tool.md diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md new file mode 100644 index 00000000..e7a77d22 --- /dev/null +++ b/docs/design/0003-safe-context/safe-context.md @@ -0,0 +1,10 @@ +# safe-context Phase 1 — The Governor + +**Effort:** L + +Ship the core context governor as a cross-LLM MCP server + CLI. +Policy-enforced reads, structural outlines, bounded ranges, shell +capture, session state. JS/TS only. + +Design exploration in `docs/design/0002-code-nav-tool/`. +Empirical grounding in `~/git/blacklight/LLM_TOKEN_USE.md`. diff --git a/docs/method/retro/0002-code-nav-tool/code-nav-tool.md b/docs/method/retro/0002-code-nav-tool/code-nav-tool.md new file mode 100644 index 00000000..4c72cb09 --- /dev/null +++ b/docs/method/retro/0002-code-nav-tool/code-nav-tool.md @@ -0,0 +1,91 @@ +# Retrospective: 0002-code-nav-tool + +**Date:** 2026-04-01 +**Type:** Design +**Outcome:** Partial — pivoted + +## What happened + +Started as a design cycle for "code-nav" — an AST-aware symbol +extraction tool for LLM agents. Wrote a full design doc with hill, +playback questions, phasing, and project structure. Added concrete +before/after scenarios with token cost analysis. + +Then James introduced empirical data from Blacklight (1,091 sessions, +291K messages, 4.5 months). 
The data reframed the problem: + +- Read burden is 96.2 GB — 6.6x all other tools combined +- The dominant cost is context compounding, not individual reads +- A dynamic read cap alone cuts burden by 54.5% +- Session length caps cut it by 58.9% +- Both combined: 75.1% + +James's Editor's Edition review delivered the verdict: + +- **APPROVE** the insight (AST-aware extraction is right) +- **REJECT** the framing (code-nav alone is too small) +- **ENHANCE** into safe-context — a policy-enforcing read layer + where AST extraction is one capability, not the product + +The design doc was rewritten from scratch as safe-context. The cycle +is closing as a pivot — the design deliverable is complete, but the +product identity changed fundamentally mid-cycle. + +## Hill assessment + +**Original hill:** "An agent can extract any named symbol's source +code, see the structural outline of any file, and find where symbols +are defined — without reading full files." + +**Status:** Not met (pivoted before implementation). The hill was +correct but undersized. It was replaced by: + +"An agent can obtain the minimum structurally correct context +required to act — without injecting large raw artifacts into +long-lived conversation state." + +## Drift check + +- Cycle 0002's design directory contains the full evolution: the + original code-nav doc and its rewrite as safe-context. Provenance + is intact. +- No code was written. No tests. No code drift possible. +- The Method structure from cycle 0001 worked as designed — the + design doc lived in `docs/design/0002-code-nav-tool/` throughout. + +## What we learned + +1. **Design before data is design in the dark.** The original + code-nav design was reasonable — correct technology choice + (tree-sitter), correct operations (outline, show, find), correct + phasing. But it was solving a symptom. The Blacklight data + revealed the disease: context compounding. 
Without that data, we + would have shipped a nice utility that addressed ~25% of the + problem. + +2. **The Editor's Edition pattern works.** James reviewed the design + not as "is this correct?" but as "is this ambitious enough?" The + APPROVE/REJECT/ENHANCE framework forced a clear verdict that + preserved the good work while upgrading the framing. + +3. **Pivoting mid-design is cheap.** No code was written, no tests + to rewrite, no sunk cost. This is exactly why The Method puts + design before RED. The cost of this pivot was one document + rewrite. + +## New debt + +None. + +## Cool ideas + +- **Blacklight as validation harness** — after deploying + safe-context, re-run the Blacklight analysis to measure actual + burden reduction. The before/after data is the ultimate playback + witness. + +## Backlog impact + +Remaining work re-enters the backlog as a new item: +`DX_safe-context-phase-1.md` in `asap/`. The pivot doesn't kill the +work — it sharpens it. From d27b0eb2beeaadec4de840741751e4f62cf57347 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:17:34 -0700 Subject: [PATCH 17/73] =?UTF-8?q?design:=20cycle=200003=20safe-context=20P?= =?UTF-8?q?hase=201=20=E2=80=94=20The=20Governor?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Full design doc with command contracts, output shapes, test strategy, project structure, and technology choices. Phase 1 scope: safe_read, file_outline, read_range, run_capture, state_save/load. JS/TS only. MCP + CLI. Tree-sitter for structural extraction. 
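The policy gate this cycle designs reduces to a few ordered checks. A minimal sketch in plain Node.js; the function name, constants, and single line threshold are illustrative assumptions drawn from the design doc, not the shipped API:

```javascript
// Illustrative sketch of the safe_read policy gate. Checks run in
// order: binary extension, generated path, then size threshold.
const BINARY_EXTS = new Set([
  '.gif', '.png', '.jpg', '.pdf', '.zip', '.wasm',
  '.bin', '.sqlite', '.ico', '.mp4', '.mov',
]);
const GENERATED = ['node_modules/', 'dist/', 'build/', '.next/', 'target/', 'coverage/'];
const LINE_THRESHOLD = 150;

function decide(filePath, lineCount) {
  const ext = filePath.slice(filePath.lastIndexOf('.'));
  if (BINARY_EXTS.has(ext)) {
    return { action: 'refused', reason: 'binary_extension' };
  }
  if (GENERATED.some((prefix) => filePath.includes(prefix))) {
    return { action: 'refused', reason: 'generated_path' };
  }
  if (lineCount > LINE_THRESHOLD) {
    return { action: 'outline', reason: 'over_line_threshold' };
  }
  return { action: 'content', reason: null };
}

console.log(decide('logo.gif', 3).action);           // refused
console.log(decide('node_modules/x.js', 10).action); // refused
console.log(decide('src/big.js', 2048).action);      // outline
console.log(decide('src/small.js', 40).action);      // content
```

The real gate also weighs bytes and special-case patterns; this sketch only shows the decision shape.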
--- docs/design/0003-safe-context/safe-context.md | 384 +++++++++++++++++- 1 file changed, 378 insertions(+), 6 deletions(-) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index e7a77d22..7aa6277a 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -1,10 +1,382 @@ # safe-context Phase 1 — The Governor -**Effort:** L +**Cycle:** 0003-safe-context +**Type:** Feature (new repo: `@git-stunts/safe-context`) +**Pulled from:** `asap/DX_safe-context-phase-1.md` +**Prior art:** `docs/design/0002-code-nav-tool/code-nav-tool.md` -Ship the core context governor as a cross-LLM MCP server + CLI. -Policy-enforced reads, structural outlines, bounded ranges, shell -capture, session state. JS/TS only. +## Sponsor human -Design exploration in `docs/design/0002-code-nav-tool/`. -Empirical grounding in `~/git/blacklight/LLM_TOKEN_USE.md`. +James — maintains JS/TS and Rust codebases. Has empirical proof +(Blacklight, 1,091 sessions) that Read context burden is the +dominant cost in agentic coding. Wants a tool that enforces +replay-safe behavior across Claude Code, Gemini CLI, and Codex CLI +without requiring agents to be disciplined on their own. + +## Sponsor agent + +Claude — 96.2 GB of Read context burden. 58% full-file reads. 64.5% +exploration reads that never lead to an edit. Needs a policy layer +that prevents it from stuffing its own context with gravel, and +structural extraction that makes the policy usable instead of +obnoxious. + +## Hill + +An agent working in a JS/TS codebase can obtain the minimum +structurally correct context required to act — file shape, export +surface, or bounded source range — without injecting large raw +artifacts into long-lived conversation state. The tool runs as an +MCP server and CLI and enforces replay-safe behavior by default. + +Phase 1 scope: JS/TS only. 
`safe_read`, `file_outline`, +`read_range`, `run_capture`, `state_save`/`state_load`. No +`code_show` or `code_find` yet (Phase 2). + +## Playback questions + +### Agent + +1. When I `safe_read` a 2000-line JS file, do I get an outline + instead of raw content? **YES/NO** +2. When I `safe_read` a 50-line config file, do I get the raw + content? **YES/NO** +3. Am I blocked from reading `.gif`, `.png`, `.wasm`, and + `node_modules/`? **YES/NO** +4. Does `run_capture("npm test")` return only the tail, with full + output on disk? **YES/NO** +5. Can I `state_save` before a `/clear` and `state_load` after? + **YES/NO** +6. Can I call every operation as an MCP tool from Claude Code? + **YES/NO** + +### Human + +1. Can I `npm install -g @git-stunts/safe-context` and it works? + **YES/NO** +2. Does `safe-context outline src/domain/services/StrandService.js` + return a useful structural skeleton from the CLI? **YES/NO** +3. Can I point it at any JS/TS project with zero config? **YES/NO** +4. Can I register it as an MCP server in one line of JSON? **YES/NO** + +## Non-goals + +- Rust support (Phase 2) +- `code_show` / `code_find` / `exports` (Phase 2) +- Persistent whole-repo index +- Cross-file reference resolution +- Code modification +- LSP replacement +- Semantic type resolution +- Session tripwires and auto-nudges (Phase 3) + +## Command contracts + +### `safe_read(path, intent?)` + +**Input:** +- `path` — file path (absolute or relative to project root) +- `intent` — optional string hint ("understand shape", "find + method X", "edit line 45") + +**Policy decisions:** + +| Condition | Response | +|---|---| +| Binary extension (`.gif`, `.png`, `.jpg`, `.pdf`, `.zip`, `.wasm`, `.bin`, `.sqlite`, `.ico`, `.mp4`, `.mov`) | Refuse. Return file type + size metadata. | +| Build/generated path (`node_modules/`, `dist/`, `build/`, `.next/`, `target/`, `coverage/`) | Refuse. Suggest source path. | +| File does not exist | Error with path. 
| +| File <= threshold (default 150 lines) | Return raw content. | +| File > threshold | Return `file_outline` result + "use read_range for details". | + +The threshold is configurable. Default 150 lines balances the data: +most utility files, configs, and small modules pass through; god +objects and large services get outlined. + +**Output shape:** +```json +{ + "action": "content" | "outline" | "refused", + "path": "src/foo.js", + "lines": 2048, + "bytes": 68402, + "content": "..." | null, + "outline": { ... } | null, + "reason": "..." | null +} +``` + +### `file_outline(path)` + +**Input:** file path. + +**Output:** structural skeleton of the file. + +```json +{ + "path": "src/domain/services/StrandService.js", + "lines": 2048, + "language": "javascript", + "exports": [ + { "name": "STRAND_SCHEMA_VERSION", "kind": "const", "line": 89 }, + { "name": "default", "kind": "class", "alias": "StrandService", "line": 901 } + ], + "declarations": [ + { "name": "compareStrings", "kind": "function", "line": 100, "endLine": 102 }, + { "name": "normalizeCreateOptions", "kind": "function", "line": 245, "endLine": 308 } + ], + "classes": [ + { + "name": "StrandService", + "line": 901, + "endLine": 2048, + "members": [ + { "name": "constructor", "kind": "method", "line": 907, "params": "{ graph }" }, + { "name": "create", "kind": "method", "line": 917, "async": true, "params": "options = {}" }, + { "name": "braid", "kind": "method", "line": 952, "async": true, "params": "strandId, options = {}" }, + { "name": "get", "kind": "method", "line": 985, "async": true, "params": "strandId" }, + { "name": "tick", "kind": "method", "line": 1240, "async": true, "params": "strandId" }, + { "name": "_buildRef", "kind": "method", "line": 1563, "params": "strandId", "private": true } + ] + } + ] +} +``` + +**CLI text output:** + +``` +src/domain/services/StrandService.js (2048 lines, javascript) + + exports: + const STRAND_SCHEMA_VERSION :89 + default class StrandService :901 + + functions: + 
compareStrings(a, b) :100-102 + normalizeCreateOptions(options) :245-308 + ... + + class StrandService :901-2048 + constructor({ graph }) :907 + async create(options = {}) :917 + async braid(strandId, options = {}) :952 + async get(strandId) :985 + async list() :1000 + async drop(strandId) :1024 + async materialize(strandId, options = {}) :1055 + async createPatchBuilder(strandId) :1076 + async patch(strandId, build) :1134 + async queueIntent(strandId, build) :1165 + async listIntents(strandId) :1207 + async tick(strandId) :1240 + async getPatchEntries(strandId, options = {}) :1505 + async patchesFor(strandId, entityId, options = {}) :1520 + async getOrThrow(strandId) :1545 + _buildRef(strandId) :1563 [private] + _buildOverlayRef(strandId) :1582 [private] + _buildBraidPrefix(strandId) :1601 [private] + _buildBraidRef(strandId, braidedStrandId) :1621 [private] + _readDescriptorByOid(oid, strandId) :1642 [private] + _writeDescriptor(descriptor) :1677 [private] + _loadBraidedReadOverlays(target, braidedStrandIds) :1693 [private] + _readOverlayMetadata(strandId) :1724 [private] + _hydrateOverlayMetadata(descriptor) :1744 [private] + _collectBasePatches(descriptor) :1781 [private] + _collectOverlayPatches(descriptor) :1813 [private] + _collectBraidedOverlayPatches(descriptor) :1827 [private] + _collectPatchEntries(descriptor, { ceiling }) :1850 [private] + _materializeDescriptor(descriptor, opts) :1881 [private] + _syncOverlayDescriptor(descriptor, { patch, sha }) :1936 [private] + _commitQueuedPatch(params) :1973 [private] + _syncBraidRefs(strandId, readOverlays) :2027 [private] +``` + +That is 35 lines. Not 2048. + +### `read_range(path, start, end)` + +**Input:** file path, start line (1-indexed), end line (inclusive). + +**Output:** raw content of the specified range with line numbers. + +No policy interception — the caller already has a precise target. +This is the escape hatch when the agent knows exactly what it needs. 
+ +### `run_capture(cmd, tail?)` + +**Input:** +- `cmd` — shell command string +- `tail` — number of lines to return (default 60) + +**Behavior:** +1. Execute `cmd` via shell +2. Tee full output to a temp log file +3. Return last `tail` lines + the log file path +4. Return exit code + +**Output shape:** +```json +{ + "exitCode": 1, + "tail": "... last 60 lines ...", + "logFile": "/tmp/safe-context/capture-1712023456.log", + "totalLines": 342, + "truncated": true +} +``` + +Agent can `read_range` the log file if it needs more. + +### `state_save(content)` / `state_load()` + +**Input (save):** markdown string of session state. +**Output (load):** the saved content, or null if no state file. + +**Storage:** `.safe-context/WORKING_STATE.md` in the project root. + +This is deliberately simple. A markdown file. No schema, no +structure enforcement. The agent writes what it needs to remember. +The human can read it with `cat`. + +## Technology + +### Tree-sitter + +- `tree-sitter` npm package (native addon) +- `tree-sitter-javascript` grammar (covers JS + JSX) +- `tree-sitter-typescript` grammar (covers TS + TSX) +- Parses any file in single-digit ms +- No persistent process needed — parse on demand + +### MCP + +- `@modelcontextprotocol/sdk` for server implementation +- stdio transport (standard for all three LLM agents) +- One tool definition per command + +### Runtime + +- Node.js >= 20 (tree-sitter native addon) +- Zero config — no tsconfig, no build step, no daemon +- `pnpm` for package management + +## Project structure + +```text +safe-context/ + bin/ + safe-context.js CLI entry point + src/ + policy/ + rules.js Ban lists, thresholds + gate.js Decision engine + parser/ + index.js Tree-sitter init + grammar loading + javascript.js JS/TS/TSX outline extraction + operations/ + safe-read.js Policy-enforced read + outline.js Structural skeleton + range.js Bounded reads + capture.js Shell output tailing + state.js Session state save/load + mcp/ + server.js MCP server 
(stdio) + tools.js Tool definitions + handlers + format/ + text.js CLI text formatter + json.js JSON output formatter + test/ + fixtures/ + small.js Under threshold (pass-through) + large-class.js Over threshold (outline) + binary.gif Binary refusal + generated/ Build path refusal + unit/ + policy.test.js Gate decisions + outline.test.js Structural extraction + safe-read.test.js Integration (policy + extraction) + capture.test.js Shell capture + state.test.js State save/load + integration/ + mcp.test.js MCP server round-trip + package.json + LICENSE Apache 2.0 + README.md +``` + +## Test strategy + +Tests are the spec. Playback questions map directly to test cases. + +### Policy tests (`policy.test.js`) + +``` +safe_read("foo.gif") -> action: "refused" +safe_read("node_modules/x") -> action: "refused" +safe_read("small.js") -> action: "content" (under threshold) +safe_read("large-class.js") -> action: "outline" (over threshold) +safe_read("missing.js") -> error +``` + +### Outline tests (`outline.test.js`) + +``` +outline("large-class.js") + -> has exports array + -> has classes array with members + -> members have name, kind, line, params + -> async methods marked async: true + -> private methods (leading _) marked private: true + -> no function bodies in output + -> line numbers are accurate (spot-check) + +outline("plain-functions.js") + -> has declarations array + -> each has name, kind, line, endLine + +outline("typescript.ts") + -> handles interfaces, type aliases, enums + -> handles decorated classes +``` + +### Capture tests (`capture.test.js`) + +``` +run_capture("echo hello", 10) + -> exitCode: 0 + -> tail contains "hello" + -> logFile exists on disk + -> logFile contains "hello" + +run_capture("seq 1 500", 5) + -> tail contains lines 496-500 + -> truncated: true + -> totalLines: 500 + -> logFile contains all 500 lines +``` + +### State tests (`state.test.js`) + +``` +state_save("# Working on X") + -> file exists at .safe-context/WORKING_STATE.md + -> 
content matches + +state_load() + -> returns saved content + +state_load() with no prior save + -> returns null +``` + +### MCP integration tests (`mcp.test.js`) + +``` +spawn MCP server via stdio + -> server lists all 6 tools + -> safe_read call returns valid response + -> file_outline call returns valid response + -> run_capture call returns valid response + -> state_save + state_load round-trips +``` From 98126ec7b6b22808e61835f3956d5db33258bec1 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:21:07 -0700 Subject: [PATCH 18/73] cool-idea: WARP provenance layer for safe-context MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Tree-sitter is the parser. WARP graphs are the memory of structural truth over time — symbol lineage, structural deltas, observer- relative views. Not for MVP, but the ramp from ephemeral extraction to provenance-aware incremental context is clear. --- ...ROTO_safe-context-warp-provenance-layer.md | 45 +++++++++++++++++++ 1 file changed, 45 insertions(+) create mode 100644 docs/method/backlog/cool-ideas/PROTO_safe-context-warp-provenance-layer.md diff --git a/docs/method/backlog/cool-ideas/PROTO_safe-context-warp-provenance-layer.md b/docs/method/backlog/cool-ideas/PROTO_safe-context-warp-provenance-layer.md new file mode 100644 index 00000000..1ffe2d3a --- /dev/null +++ b/docs/method/backlog/cool-ideas/PROTO_safe-context-warp-provenance-layer.md @@ -0,0 +1,45 @@ +# WARP provenance layer for safe-context + +Tree-sitter is the parser. WARP graphs are the memory of structural +truth over time. + +## The insight + +Line numbers are trash — they drift constantly. A provenance-aware +structure can track symbol lineage across edits: + +- "this is the same symbol, just transformed" +- "what changed since I last observed this file?" 
+- "read only the delta, in the smallest meaningful unit" + +## What WARP models here + +- File revision worldlines +- Symbol identity across revisions (stable even when lines drift) +- Structural rewrite events (method moved, param list changed, + export surface widened — not line diffs) +- Agent observations of symbols +- Tool outputs as witnesses + +## Concrete features this unlocks + +- `since_last_read` — symbols changed since last observation +- `symbol_diff` — structural delta between worldlines +- `hot_regions` — symbols that churn most under edit-test loops +- `structural_checkpoint` — working state as touched symbol lineage +- Observer-relative views — human sees public API changes, agent + sees exact changed symbols + dependencies + +## The ramp + +1. MVP (Phase 1): tree-sitter, no WARP. Ship safe-context. +2. Provenance model: file version, symbol identities, ranges, + hashes, parent container, export status, observation timestamps. +3. Full: worldlines, structural deltas, observer-relative views. + +## Key distinction + +Track symbol lineage and structural deltas, not raw AST tombstones. +Current parse tree is ephemeral. Structural entities matter. +Provenance of those entities matters. Replayable transformations +matter. From ac2dce2610ebd6b146f16ddd27a88c11fc6ed90a Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:28:01 -0700 Subject: [PATCH 19/73] design: rename safe-context to graft Git has trees and branches. Grafting is attaching new growth onto existing rootstock. CLI: git graft, MCP: graft-mcp, npm: @git-stunts/graft. 
--- docs/design/0003-safe-context/safe-context.md | 25 ++++++++++++------- 1 file changed, 16 insertions(+), 9 deletions(-) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 7aa6277a..00c74b73 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -1,10 +1,17 @@ -# safe-context Phase 1 — The Governor +# Graft — Phase 1: The Governor **Cycle:** 0003-safe-context -**Type:** Feature (new repo: `@git-stunts/safe-context`) +**Type:** Feature (new repo: `@git-stunts/graft`) **Pulled from:** `asap/DX_safe-context-phase-1.md` **Prior art:** `docs/design/0002-code-nav-tool/code-nav-tool.md` +**Product:** `graft` — structural reads and context governance for +coding agents. CLI as `git graft`, MCP as `graft-mcp`. + +The name: Git has trees and branches. Grafting is attaching new +growth onto existing rootstock — semantic eyesight grafted onto +Git's history substrate. + ## Sponsor human James — maintains JS/TS and Rust codebases. Has empirical proof @@ -52,9 +59,9 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, ### Human -1. Can I `npm install -g @git-stunts/safe-context` and it works? +1. Can I `npm install -g @git-stunts/graft` and it works? **YES/NO** -2. Does `safe-context outline src/domain/services/StrandService.js` +2. Does `git graft outline src/domain/services/StrandService.js` return a useful structural skeleton from the CLI? **YES/NO** 3. Can I point it at any JS/TS project with zero config? **YES/NO** 4. Can I register it as an MCP server in one line of JSON? **YES/NO** @@ -220,7 +227,7 @@ This is the escape hatch when the agent knows exactly what it needs. { "exitCode": 1, "tail": "... last 60 lines ...", - "logFile": "/tmp/safe-context/capture-1712023456.log", + "logFile": "/tmp/graft/capture-1712023456.log", "totalLines": 342, "truncated": true } @@ -233,7 +240,7 @@ Agent can `read_range` the log file if it needs more. 
**Input (save):** markdown string of session state. **Output (load):** the saved content, or null if no state file. -**Storage:** `.safe-context/WORKING_STATE.md` in the project root. +**Storage:** `.graft/WORKING_STATE.md` in the project root. This is deliberately simple. A markdown file. No schema, no structure enforcement. The agent writes what it needs to remember. @@ -264,9 +271,9 @@ The human can read it with `cat`. ## Project structure ```text -safe-context/ +graft/ bin/ - safe-context.js CLI entry point + graft.js CLI entry point src/ policy/ rules.js Ban lists, thresholds @@ -360,7 +367,7 @@ run_capture("seq 1 500", 5) ``` state_save("# Working on X") - -> file exists at .safe-context/WORKING_STATE.md + -> file exists at .graft/WORKING_STATE.md -> content matches state_load() From 22a23e195c9e0188b92734f7afb1c109a2c1b920 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:37:44 -0700 Subject: [PATCH 20/73] =?UTF-8?q?design:=20apply=20review=20feedback=20?= =?UTF-8?q?=E2=80=94=20close=20all=20escape=20hatches?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit REJECT fixes: - read_range bounded: max 250 lines, 20 KB, clipped with metadata - state_save capped at 8 KB with recommended template - safe_read uses lines + bytes (not lines alone), special-cases lockfiles and .min.js MUST fixes: - file_outline bounded: 60-char params, 200-entry cap - Project root defined: .git detection, symlink resolution, path escaping refused, logs in .graft/logs/ - Machine-stable reason codes for all policy decisions - intent documented as advisory only — never weakens policy - Build path refusal: no source-path guessing Repo: flyingrobots/graft (not git-stunts) --- docs/design/0003-safe-context/safe-context.md | 163 +++++++++++++++--- 1 file changed, 142 insertions(+), 21 deletions(-) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 00c74b73..35f0dcfe 100644 
--- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -1,7 +1,7 @@ # Graft — Phase 1: The Governor **Cycle:** 0003-safe-context -**Type:** Feature (new repo: `@git-stunts/graft`) +**Type:** Feature (new repo: `@flyingrobots/graft`) **Pulled from:** `asap/DX_safe-context-phase-1.md` **Prior art:** `docs/design/0002-code-nav-tool/code-nav-tool.md` @@ -59,7 +59,7 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, ### Human -1. Can I `npm install -g @git-stunts/graft` and it works? +1. Can I `npm install -g @flyingrobots/graft` and it works? **YES/NO** 2. Does `git graft outline src/domain/services/StrandService.js` return a useful structural skeleton from the CLI? **YES/NO** @@ -91,14 +91,27 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, | Condition | Response | |---|---| | Binary extension (`.gif`, `.png`, `.jpg`, `.pdf`, `.zip`, `.wasm`, `.bin`, `.sqlite`, `.ico`, `.mp4`, `.mov`) | Refuse. Return file type + size metadata. | -| Build/generated path (`node_modules/`, `dist/`, `build/`, `.next/`, `target/`, `coverage/`) | Refuse. Suggest source path. | +| Build/generated path (`node_modules/`, `dist/`, `build/`, `.next/`, `target/`, `coverage/`) | Refuse. No source-path guessing — just state what was blocked and why. | | File does not exist | Error with path. | -| File <= threshold (default 150 lines) | Return raw content. | -| File > threshold | Return `file_outline` result + "use read_range for details". | +| File <= line threshold AND <= byte threshold | Return raw content. | +| File > either threshold | Return `file_outline` result + next-step hints. | +| Known junk patterns (`.min.js`, lockfiles, giant JSON) | Refuse. Return metadata only. | -The threshold is configurable. Default 150 lines balances the data: -most utility files, configs, and small modules pass through; god -objects and large services get outlined. 
+**Thresholds (configurable):** + +| Metric | Default | +|---|---| +| Max lines | 150 | +| Max bytes | 12 KB | + +Both must pass for raw content. A 40-line minified atrocity that's +50 KB still gets outlined. Lockfiles (`package-lock.json`, +`pnpm-lock.yaml`, `yarn.lock`) and `.min.js` files are always +refused regardless of size. + +**Intent is advisory only.** It may affect messaging and next-step +hints. It never weakens safety bounds. An agent saying "edit line +45" does not unlock a larger read. **Output shape:** ```json @@ -109,7 +122,9 @@ objects and large services get outlined. "bytes": 68402, "content": "..." | null, "outline": { ... } | null, - "reason": "..." | null + "reason": "over_line_threshold" | "binary_extension" | "generated_path" | "lockfile" | "minified" | null, + "policy": { "lineThreshold": 150, "byteThreshold": 12000, "triggeredBy": "over_line_threshold" } | null, + "next": ["read_range(path, 1240, 1271) for method tick", "file_outline(path) for full shape"] | null } ``` @@ -119,6 +134,14 @@ objects and large services get outlined. **Output:** structural skeleton of the file. +**Formatting bounds:** +- Parameter strings truncated at 60 chars (ellipsized) +- Default values and destructuring patterns compacted +- Generic type parameters summarized, not expanded +- Max 80 chars per signature line +- Output capped at 200 entries (declarations + members). If a file + has more, the tail is elided with a count. + ```json { "path": "src/domain/services/StrandService.js", @@ -207,8 +230,31 @@ That is 35 lines. Not 2048. **Output:** raw content of the specified range with line numbers. -No policy interception — the caller already has a precise target. -This is the escape hatch when the agent knows exactly what it needs. 
+**Bounded.** The governor still governs: + +| Constraint | Default | +|---|---| +| Max line span | 250 lines | +| Max byte output | 20 KB | + +If the requested range exceeds either cap, the response is clipped +and metadata shows what happened: + +```json +{ + "path": "src/foo.js", + "requested": { "start": 1, "end": 800 }, + "returned": { "start": 1, "end": 250 }, + "truncated": true, + "reason": "range_exceeds_max_lines", + "content": "..." +} +``` + +Binary and build-path bans still apply. `read_range("foo.gif", 1, 10)` +is still refused. + +This is a scoped read, not a policy bypass. ### `run_capture(cmd, tail?)` @@ -242,9 +288,58 @@ Agent can `read_range` the log file if it needs more. **Storage:** `.graft/WORKING_STATE.md` in the project root. -This is deliberately simple. A markdown file. No schema, no -structure enforcement. The agent writes what it needs to remember. -The human can read it with `cat`. +**Capped at 8 KB.** If content exceeds the cap, the save is +rejected with `reason: "state_exceeds_max_bytes"`. The agent must +be concise. This is a breadcrumb trail, not a second context +window. + +Recommended template (not enforced, but nudged in error messages): + +```markdown +# Task +# Current hypothesis +# Files touched +# Next 3 actions +# Open questions +``` + +The human can read it with `cat`. The agent can load it after +`/clear` and pick up where it left off. + +## Project root + +All paths are resolved relative to the project root. Detection +order: + +1. Explicit `--root` flag (CLI) or `root` param (MCP) +2. Nearest ancestor directory containing `.git/` +3. 
Current working directory (fallback) + +**Rules:** +- Symlinks are resolved before path checks +- Paths that escape the project root are refused + (`reason: "path_escapes_root"`) +- Temp log files from `run_capture` live in `.graft/logs/` inside + the project root, not `/tmp/` +- `.graft/` should be added to `.gitignore` + +## Reason codes + +All policy decisions use machine-stable enum strings, not prose. + +| Code | Trigger | +|---|---| +| `binary_extension` | File has banned extension | +| `generated_path` | Path matches build/generated pattern | +| `lockfile` | `package-lock.json`, `pnpm-lock.yaml`, `yarn.lock` | +| `minified` | `.min.js`, `.min.css` | +| `over_line_threshold` | Lines exceed safe_read threshold | +| `over_byte_threshold` | Bytes exceed safe_read threshold | +| `range_exceeds_max_lines` | read_range span too large | +| `range_exceeds_max_bytes` | read_range output too large | +| `state_exceeds_max_bytes` | state_save content too large | +| `path_escapes_root` | Path resolves outside project root | +| `missing_file` | File does not exist | ## Technology @@ -295,9 +390,14 @@ graft/ json.js JSON output formatter test/ fixtures/ - small.js Under threshold (pass-through) - large-class.js Over threshold (outline) + small.js Under both thresholds (pass-through) + large-class.js Over line threshold (outline) + wide-minified.js Under lines, over bytes (outline) + huge-file.js 300+ declarations (outline cap test) + plain-functions.js Top-level functions only + typescript.ts TS-specific constructs binary.gif Binary refusal + vendor.min.js Minified refusal generated/ Build path refusal unit/ policy.test.js Gate decisions @@ -319,11 +419,26 @@ Tests are the spec. Playback questions map directly to test cases. 
### Policy tests (`policy.test.js`) ``` -safe_read("foo.gif") -> action: "refused" -safe_read("node_modules/x") -> action: "refused" -safe_read("small.js") -> action: "content" (under threshold) -safe_read("large-class.js") -> action: "outline" (over threshold) -safe_read("missing.js") -> error +safe_read("foo.gif") -> refused, reason: binary_extension +safe_read("node_modules/x.js") -> refused, reason: generated_path +safe_read("dist/bundle.js") -> refused, reason: generated_path +safe_read("package-lock.json") -> refused, reason: lockfile +safe_read("vendor.min.js") -> refused, reason: minified +safe_read("small.js") -> content (under both thresholds) +safe_read("large-class.js") -> outline (over line threshold) +safe_read("wide-minified.js") -> outline (under lines, over bytes) +safe_read("missing.js") -> error, reason: missing_file +safe_read("../../etc/passwd") -> refused, reason: path_escapes_root +safe_read(bigFile, intent="edit") -> outline (intent does NOT relax policy) + +read_range("foo.js", 1, 800) -> truncated to 250 lines +read_range("foo.js", 1, 100) -> exact range returned +read_range("foo.gif", 1, 10) -> refused, reason: binary_extension + +state_save("# short") -> saved +state_save("x".repeat(9000)) -> refused, reason: state_exceeds_max_bytes +state_load() after save -> returns saved content +state_load() with no prior save -> returns null ``` ### Outline tests (`outline.test.js`) @@ -337,6 +452,8 @@ outline("large-class.js") -> private methods (leading _) marked private: true -> no function bodies in output -> line numbers are accurate (spot-check) + -> params truncated at 60 chars when long + -> total entries <= 200 outline("plain-functions.js") -> has declarations array @@ -345,6 +462,10 @@ outline("plain-functions.js") outline("typescript.ts") -> handles interfaces, type aliases, enums -> handles decorated classes + +outline("huge-file-300-functions.js") + -> entries capped at 200 + -> tail elided with count ``` ### Capture tests 
(`capture.test.js`) From 4227e4cd26ef4fab5f59aca9ad9486a01de3384a Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:41:27 -0700 Subject: [PATCH 21/73] design: add enforcement hooks, doctor, stats, agent ideas MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Two enforcement layers: MCP server (cross-LLM, voluntary) and Claude Code hooks (enforced). PreToolUse hooks on Read and Bash route through graft's policy gate — the agent never sees raw large files or full test dumps. New commands: graft doctor (diagnostic), graft stats (decision metrics). NDJSON metrics log for Blacklight validation. Agent ideas: parse cache, outline focus mode, smarter next-step hints, estimated savings in every response. --- docs/design/0003-safe-context/safe-context.md | 188 +++++++++++++++++- 1 file changed, 187 insertions(+), 1 deletion(-) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 35f0dcfe..44460885 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -382,12 +382,17 @@ graft/ range.js Bounded reads capture.js Shell output tailing state.js Session state save/load + hooks/ + gate.js Hook enforcement (Read gate, Bash gate) mcp/ server.js MCP server (stdio) tools.js Tool definitions + handlers format/ text.js CLI text formatter json.js JSON output formatter + metrics/ + logger.js NDJSON decision logger + stats.js Summary stats from log test/ fixtures/ small.js Under both thresholds (pass-through) @@ -502,9 +507,190 @@ state_load() with no prior save ``` spawn MCP server via stdio - -> server lists all 6 tools + -> server lists all tools -> safe_read call returns valid response -> file_outline call returns valid response -> run_capture call returns valid response -> state_save + state_load round-trips ``` + +## Enforcement: hooks + +The MCP server is voluntary. 
The agent can still call native `Read` +and bypass the governor entirely. And it will — not maliciously, +just because `Read` is familiar and consequences are later. + +The research says it plainly: + +> Models often "agree and then ignore" instruction-only rules. +> Enforcement is stronger. + +So graft ships **two layers**: + +### Layer 1: MCP server (cross-LLM, voluntary) + +The tools described above. Works on Claude Code, Gemini CLI, +Codex CLI. Agent uses these instead of native Read/Bash. Relies on +project instructions (CLAUDE.md, GEMINI.md) to prefer graft tools. + +### Layer 2: Claude Code hooks (enforced) + +`PreToolUse` hooks intercept native tool calls and route them +through graft's policy gate. + +**Read hook:** + +When the agent calls native `Read`, the hook: + +1. Runs the path through graft's policy (binary? build? over + threshold?) +2. If policy says **content** (small, safe): allow the Read through + unchanged +3. If policy says **outline** (too large): block the Read, return + the outline as the tool result with next-step hints +4. If policy says **refused** (binary, build, lockfile): block the + Read, return the reason and metadata + +The agent never sees the raw 2000-line file. It gets the outline +and can follow up with `read_range` for specific sections. + +**Bash hook (test capture):** + +When the agent calls native `Bash` with a command matching known +test runners (`npm test`, `vitest`, `jest`, `cargo test`, `pytest`, +`make test`), the hook: + +1. Routes through `run_capture` instead +2. Tees full output to `.graft/logs/` +3. Returns only the tail + +The agent gets the test result without the full dump. The full +output is on disk if needed. + +**Hook configuration:** + +Graft ships a `graft hooks install` command that writes the hook +config. 
For Claude Code: + +```json +{ + "hooks": { + "PreToolUse": [ + { + "matcher": "Read", + "command": "graft gate read" + }, + { + "matcher": "Bash", + "command": "graft gate bash" + } + ] + } +} +``` + +The `graft gate` subcommands read the tool call input from stdin +(hook protocol), apply policy, and exit 0 (allow) or exit 2 +(block + replacement output). + +**Gemini/Codex:** No equivalent hook mechanism yet. Enforcement is +MCP-only + project instructions. When those agents add hooks, graft +adapts. + +## Additional commands + +### `graft doctor` + +Diagnostic command for debugging policy behavior. + +```bash +git graft doctor +``` + +``` +project root: /Users/james/git/git-stunts/git-warp (.git detected) +line threshold: 150 +byte threshold: 12,000 +range max lines: 250 +range max bytes: 20,000 +state max bytes: 8,192 +log directory: .graft/logs/ (exists, 3 files, 42 KB) +state file: .graft/WORKING_STATE.md (exists, 1.2 KB) +parser: tree-sitter (javascript, typescript loaded) +node version: v22.3.0 +hooks installed: yes (Read gate, Bash gate) +.gitignore: .graft/ present +``` + +Answers "why did my read get blocked?" before anyone has to ask. + +### `graft stats` + +Minimal decision metrics. Not a dashboard — a quick summary. + +```bash +git graft stats +``` + +``` +session decisions (since last clear): + content: 12 reads passed through + outline: 8 reads downgraded to outline + refused: 3 reads blocked (2 binary, 1 generated) + ranges: 5 bounded reads + captures: 4 shell captures (avg 47 tail lines) + +estimated bytes avoided: ~340 KB +``` + +Graft logs every decision to `.graft/metrics.jsonl` as append-only +NDJSON. One line per decision. This is how we prove graft works +when Blacklight re-analyzes post-deployment. 
+ +```json +{"ts":"...","op":"safe_read","action":"outline","path":"StrandService.js","lines":2048,"bytes":68402,"reason":"over_line_threshold"} +{"ts":"...","op":"read_range","path":"StrandService.js","start":1240,"end":1271,"truncated":false} +{"ts":"...","op":"safe_read","action":"refused","path":"foo.gif","reason":"binary_extension"} +``` + +## Agent ideas (from the equal collaborator at the table) + +A few things I want to surface as the agent who will actually use +this tool every day: + +### Parse cache + +Tree-sitter is fast, but if I outline the same file twice in a +session, cache the parse tree. Simple `Map` +with mtime invalidation. Not persistence — just in-memory for the +MCP server's lifetime. This matters because I will absolutely +outline a file, read a range, then outline it again to re-orient. + +### Outline focus mode + +`file_outline(path, { focus: "StrandService" })` returns only that +class's members, not the whole file skeleton. Still Phase 1 friendly. +Huge for files with multiple classes or hundreds of top-level +functions — I usually care about one class at a time. + +### Smarter next-step hints + +When `safe_read` returns an outline, the `next` array should +reference specific interesting symbols by name, not generic +"use read_range." If the outline shows a class with 25 methods, the +hint should say "StrandService has 25 methods — use +read_range(path, 917, 950) for create() or +read_range(path, 1240, 1271) for tick()." This is agent candy that +makes the outline → targeted read flow feel instant. + +### Estimated savings in every response + +Every graft response includes: + +```json +"savings": { "bytesAvoided": 68402 } +``` + +Not rigorous. Perfect for a README. And it makes the value visible +on every single call — the agent and human both see "this outline +saved 68 KB of context" in real time. 
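The append-only NDJSON logger described above can be sketched as follows. The field names (`ts`, `op`, `action`, `path`, `reason`) follow the example lines; the helper names and the `logDir` parameter are illustrative assumptions, and the retention/pruning behavior is left out.

```javascript
import fs from "node:fs";
import path from "node:path";

// One JSON object per line, timestamp added at write time.
function formatDecision(decision) {
  return JSON.stringify({ ts: new Date().toISOString(), ...decision }) + "\n";
}

// Append a decision to <logDir>/metrics.jsonl, creating the
// directory if needed. Pruning to the 1 MB cap is not shown.
function appendDecision(logDir, decision) {
  fs.mkdirSync(logDir, { recursive: true });
  fs.appendFileSync(path.join(logDir, "metrics.jsonl"), formatDecision(decision));
}
```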
From 48a870726186c36ab4ca4d4b62d211202ceb03b4 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:41:59 -0700 Subject: [PATCH 22/73] =?UTF-8?q?design:=20add=20.graftignore=20=E2=80=94?= =?UTF-8?q?=20gitignore-style=20policy=20file?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit User-defined patterns for paths that should always be refused. Covers secrets (.env, credentials), large data files, project- specific generated code. Optional — built-in bans still apply without it. Reason code: graftignore. --- docs/design/0003-safe-context/safe-context.md | 33 +++++++++++++++++++ 1 file changed, 33 insertions(+) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 44460885..4f726e52 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -306,6 +306,36 @@ Recommended template (not enforced, but nudged in error messages): The human can read it with `cat`. The agent can load it after `/clear` and pick up where it left off. +## `.graftignore` + +A gitignore-style file in the project root. Paths matching any +pattern are always refused by `safe_read` and `read_range`, with +`reason: "graftignore"`. + +```text +# Secrets +.env +.env.* +credentials.json +**/secrets/** + +# Large generated files +*.sql.dump +*.csv +data/ + +# Project-specific +src/generated/** +``` + +If `.graftignore` does not exist, only the built-in bans (binary +extensions, build paths, lockfiles, minified) apply. The file is +optional — graft works without it. + +Uses the same glob syntax as `.gitignore` (via the `ignore` or +`picomatch` npm package — whichever tree-sitter already pulls in, +to avoid a new dep). + ## Project root All paths are resolved relative to the project root. Detection @@ -339,6 +369,7 @@ All policy decisions use machine-stable enum strings, not prose. 
| `range_exceeds_max_bytes` | read_range output too large | | `state_exceeds_max_bytes` | state_save content too large | | `path_escapes_root` | Path resolves outside project root | +| `graftignore` | Path matches `.graftignore` pattern | | `missing_file` | File does not exist | ## Technology @@ -434,6 +465,7 @@ safe_read("large-class.js") -> outline (over line threshold) safe_read("wide-minified.js") -> outline (under lines, over bytes) safe_read("missing.js") -> error, reason: missing_file safe_read("../../etc/passwd") -> refused, reason: path_escapes_root +safe_read(".env") -> refused, reason: graftignore (with .graftignore) safe_read(bigFile, intent="edit") -> outline (intent does NOT relax policy) read_range("foo.js", 1, 800) -> truncated to 250 lines @@ -619,6 +651,7 @@ state file: .graft/WORKING_STATE.md (exists, 1.2 KB) parser: tree-sitter (javascript, typescript loaded) node version: v22.3.0 hooks installed: yes (Read gate, Bash gate) +.graftignore: present (7 patterns) .gitignore: .graft/ present ``` From d75186cb3b9bdff224eed0d34b86ab4395e29c4a Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:49:36 -0700 Subject: [PATCH 23/73] =?UTF-8?q?design:=20apply=20round=202=20review=20?= =?UTF-8?q?=E2=80=94=20close=20all=20remaining=20gaps?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit REJECTs fixed: - action: "error" distinct from "refused" (operational vs policy) - Broken-file outline: best-effort with partial: true + parseErrors - .graftignore uses picomatch (declared dep, not transitive) - run_capture: cwd, env, timeout, max log size all specified Promoted from COULD/SHOULD/COOL to MUST: - Built-in secret file bans (.env, *.pem, *.key, id_rsa, etc.) 
- Outline truncation metadata (entryCount, elidedCount) - focus param on file_outline - doctor --json, stats --json, stats --since-clear - Log retention (metrics 1MB, logs 10MB, oldest pruned) - Symlink escape test, broken-syntax test, secret-file tests - Smart next-step hints with symbol names from outline - Estimated savings in every response - Parse cache (mtime-invalidated Map) - Explicit CLI binary names (graft + git-graft) - root param on safe_read, read_range, file_outline - explain field (one-sentence human-readable reason) --- docs/design/0003-safe-context/safe-context.md | 210 ++++++++++++++---- 1 file changed, 169 insertions(+), 41 deletions(-) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 4f726e52..ce3cf4b5 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -56,6 +56,8 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, **YES/NO** 6. Can I call every operation as an MCP tool from Claude Code? **YES/NO** +7. When I outline a half-edited file with broken syntax, do I get + a best-effort outline with `partial: true`? **YES/NO** ### Human @@ -83,6 +85,7 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, **Input:** - `path` — file path (absolute or relative to project root) +- `root` — optional project root override - `intent` — optional string hint ("understand shape", "find method X", "edit line 45") @@ -92,7 +95,8 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, |---|---| | Binary extension (`.gif`, `.png`, `.jpg`, `.pdf`, `.zip`, `.wasm`, `.bin`, `.sqlite`, `.ico`, `.mp4`, `.mov`) | Refuse. Return file type + size metadata. | | Build/generated path (`node_modules/`, `dist/`, `build/`, `.next/`, `target/`, `coverage/`) | Refuse. No source-path guessing — just state what was blocked and why. | -| File does not exist | Error with path. 
| +| File does not exist | Error (not a refusal — see error model below). | +| Secret file (`.env`, `*.pem`, `*.key`, `id_rsa`, `id_ed25519`, `credentials.json`) | Refuse. Built-in, not `.graftignore`-dependent. | | File <= line threshold AND <= byte threshold | Return raw content. | | File > either threshold | Return `file_outline` result + next-step hints. | | Known junk patterns (`.min.js`, lockfiles, giant JSON) | Refuse. Return metadata only. | @@ -113,18 +117,32 @@ refused regardless of size. hints. It never weakens safety bounds. An agent saying "edit line 45" does not unlock a larger read. +**Action model:** + +| Action | Meaning | +|---|---| +| `content` | Raw file returned (under thresholds) | +| `outline` | Structural skeleton returned (over thresholds) | +| `refused` | Policy blocked the read (binary, build, secret, graftignore) | +| `error` | Operational failure (missing file, unreadable, bad path) | + +`refused` = the governor said no. `error` = something broke. These +are different: a refusal is correct behavior; an error is a problem. + **Output shape:** ```json { - "action": "content" | "outline" | "refused", + "action": "content" | "outline" | "refused" | "error", "path": "src/foo.js", "lines": 2048, "bytes": 68402, "content": "..." | null, "outline": { ... } | null, - "reason": "over_line_threshold" | "binary_extension" | "generated_path" | "lockfile" | "minified" | null, + "reason": "over_line_threshold" | "binary_extension" | ... | null, + "explain": "File exceeded 150-line cap; outline returned instead." | null, "policy": { "lineThreshold": 150, "byteThreshold": 12000, "triggeredBy": "over_line_threshold" } | null, - "next": ["read_range(path, 1240, 1271) for method tick", "file_outline(path) for full shape"] | null + "next": ["read_range(path, 1240, 1271) for method tick"] | null, + "savings": { "bytesAvoided": 68402 } | null } ``` @@ -140,7 +158,45 @@ hints. It never weakens safety bounds. 
An agent saying "edit line - Generic type parameters summarized, not expanded - Max 80 chars per signature line - Output capped at 200 entries (declarations + members). If a file - has more, the tail is elided with a count. + has more, the tail is elided with metadata: + +```json +{ + "entryCount": 200, + "totalEntryCount": 317, + "truncated": true, + "elidedCount": 117 +} +``` + +**Broken files (syntactically invalid JS/TS):** + +Agents constantly work on half-edited, mid-refactor files. This is +normal, not an error. Tree-sitter produces partial parse trees for +broken syntax — it does not bail. + +Contract: outline is **best-effort**. If the file has parse errors, +the outline includes whatever structure tree-sitter recovered, plus +metadata: + +```json +{ + "partial": true, + "parseErrors": [ + { "line": 188, "message": "unterminated class body" } + ] +} +``` + +The outline is still useful — it shows the symbols that parsed +cleanly. The `partial` flag tells the agent "this file is broken, +so the outline may be incomplete." This is strictly better than +refusing to outline a broken file. + +**Root parameter:** `file_outline(path, { root?, focus? })` + +`root` overrides project root detection for this call. `focus` +limits output to a single class or top-level declaration by name. ```json { @@ -226,7 +282,8 @@ That is 35 lines. Not 2048. ### `read_range(path, start, end)` -**Input:** file path, start line (1-indexed), end line (inclusive). +**Input:** file path, start line (1-indexed), end line (inclusive), +optional `root` override. **Output:** raw content of the specified range with line numbers. @@ -261,19 +318,31 @@ This is a scoped read, not a policy bypass. **Input:** - `cmd` — shell command string - `tail` — number of lines to return (default 60) +- `cwd` — working directory (default: project root) +- `timeout` — max seconds (default: 120) **Behavior:** -1. Execute `cmd` via shell -2. Tee full output to a temp log file +1. 
Execute `cmd` via the user's default shell +2. Tee full output (stdout + stderr merged) to a log file 3. Return last `tail` lines + the log file path 4. Return exit code +**Execution contract:** + +| Setting | Value | +|---|---| +| Working directory | Project root (or explicit `cwd` param) | +| Environment | Inherited from parent process | +| Timeout | 120 seconds default (configurable via `timeout` param) | +| Max log size | 5 MB. If output exceeds this, the log is truncated from the head and the tail is preserved. | +| Nonzero exit | Not an error — return the exit code + tail normally. Tests fail; that's expected. | + **Output shape:** ```json { "exitCode": 1, "tail": "... last 60 lines ...", - "logFile": "/tmp/graft/capture-1712023456.log", + "logFile": ".graft/logs/capture-1712023456.log", "totalLines": 342, "truncated": true } @@ -332,9 +401,9 @@ If `.graftignore` does not exist, only the built-in bans (binary extensions, build paths, lockfiles, minified) apply. The file is optional — graft works without it. -Uses the same glob syntax as `.gitignore` (via the `ignore` or -`picomatch` npm package — whichever tree-sitter already pulls in, -to avoid a new dep). +Uses `.gitignore` glob syntax via `picomatch` (declared dependency, +not transitive — don't build product behavior on accidental dep +chains). ## Project root @@ -369,6 +438,7 @@ All policy decisions use machine-stable enum strings, not prose. | `range_exceeds_max_bytes` | read_range output too large | | `state_exceeds_max_bytes` | state_save content too large | | `path_escapes_root` | Path resolves outside project root | +| `secret_file` | Built-in secret ban (`.env`, `*.pem`, etc.) | | `graftignore` | Path matches `.graftignore` pattern | | `missing_file` | File does not exist | @@ -394,6 +464,41 @@ All policy decisions use machine-stable enum strings, not prose. 
- Zero config — no tsconfig, no build step, no daemon - `pnpm` for package management +### Install and binary names + +```bash +npm install -g @flyingrobots/graft +``` + +This installs two binaries: + +| Binary | Purpose | +|---|---| +| `graft` | Standalone CLI (`graft outline foo.js`) | +| `git-graft` | Git subcommand shim (`git graft outline foo.js`) | + +Git automatically finds `git-graft` on `$PATH` and exposes it as +`git graft`. Both binaries are the same entrypoint. + +MCP server is started via: + +```bash +graft mcp +``` + +Claude Code config: + +```json +{ + "mcpServers": { + "graft": { + "command": "graft", + "args": ["mcp"] + } + } +} +``` + ## Project structure ```text @@ -434,6 +539,8 @@ graft/ typescript.ts TS-specific constructs binary.gif Binary refusal vendor.min.js Minified refusal + broken-syntax.js Partial parse (missing braces) + secret.env Secret file refusal generated/ Build path refusal unit/ policy.test.js Gate decisions @@ -465,8 +572,13 @@ safe_read("large-class.js") -> outline (over line threshold) safe_read("wide-minified.js") -> outline (under lines, over bytes) safe_read("missing.js") -> error, reason: missing_file safe_read("../../etc/passwd") -> refused, reason: path_escapes_root -safe_read(".env") -> refused, reason: graftignore (with .graftignore) +safe_read("/tmp -> ../../etc") -> refused (symlink resolved, escapes root) +safe_read(".env") -> refused, reason: secret_file (built-in, no .graftignore needed) +safe_read(".env.production") -> refused, reason: secret_file +safe_read("deploy.pem") -> refused, reason: secret_file +safe_read("data/dump.csv") -> refused, reason: graftignore (with .graftignore) safe_read(bigFile, intent="edit") -> outline (intent does NOT relax policy) +safe_read("missing.js") -> action: error, reason: missing_file read_range("foo.js", 1, 800) -> truncated to 250 lines read_range("foo.js", 1, 100) -> exact range returned @@ -502,7 +614,17 @@ outline("typescript.ts") outline("huge-file-300-functions.js") -> 
entries capped at 200 - -> tail elided with count + -> tail elided with elidedCount: 100+ + +outline("broken-syntax.js") + -> partial: true + -> parseErrors array present + -> recovered symbols still included + -> still useful, not an error + +outline("large-class.js", { focus: "StrandService" }) + -> only StrandService members returned + -> other classes/functions excluded ``` ### Capture tests (`capture.test.js`) @@ -686,44 +808,50 @@ when Blacklight re-analyzes post-deployment. {"ts":"...","op":"safe_read","action":"refused","path":"foo.gif","reason":"binary_extension"} ``` -## Agent ideas (from the equal collaborator at the table) +**Log retention:** +- `metrics.jsonl`: max 1 MB. When exceeded, oldest entries are + pruned (keep the tail). +- `.graft/logs/` (capture logs): max 10 MB total. Oldest logs + pruned first. Individual capture logs capped at 5 MB. +- `graft stats --since-clear` resets the metric window. -A few things I want to surface as the agent who will actually use -this tool every day: +`graft doctor` and `graft stats` both accept `--json` for machine +consumption. -### Parse cache +## Parse cache -Tree-sitter is fast, but if I outline the same file twice in a -session, cache the parse tree. Simple `Map` -with mtime invalidation. Not persistence — just in-memory for the -MCP server's lifetime. This matters because I will absolutely -outline a file, read a range, then outline it again to re-orient. +Tree-sitter is fast, but the MCP server lives for the session +duration. If the same file is outlined twice, cache the parse tree. -### Outline focus mode +`Map` — invalidated by mtime change. +In-memory only, no persistence. This matters because the agent will +outline a file, read a range, then outline again to re-orient. -`file_outline(path, { focus: "StrandService" })` returns only that -class's members, not the whole file skeleton. Still Phase 1 friendly. 
-Huge for files with multiple classes or hundreds of top-level -functions — I usually care about one class at a time. +## Smart next-step hints -### Smarter next-step hints +When `safe_read` returns an outline, the `next` array references +specific symbols by name, not generic suggestions. If the outline +shows a class with 25 methods, the hints name the public methods +and their line ranges: + +```json +"next": [ + "read_range(path, 917, 950) — create()", + "read_range(path, 1240, 1271) — tick()", + "file_outline(path, { focus: 'StrandService' }) — just this class" +] +``` -When `safe_read` returns an outline, the `next` array should -reference specific interesting symbols by name, not generic -"use read_range." If the outline shows a class with 25 methods, the -hint should say "StrandService has 25 methods — use -read_range(path, 917, 950) for create() or -read_range(path, 1240, 1271) for tick()." This is agent candy that -makes the outline → targeted read flow feel instant. +When `intent` mentions a symbol name and it appears in the outline, +that symbol's range is promoted to the first hint. -### Estimated savings in every response +## Estimated savings -Every graft response includes: +Every graft response that avoids returning raw content includes: ```json "savings": { "bytesAvoided": 68402 } ``` -Not rigorous. Perfect for a README. And it makes the value visible -on every single call — the agent and human both see "this outline -saved 68 KB of context" in real time. +Not rigorous. Perfect for a README. Makes the value visible on +every call — agent and human both see "this outline saved 68 KB." 
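The hint-promotion behavior above (name specific symbols, and move an intent-mentioned symbol to the front) can be sketched like this. The outline shape (`methods` entries with `name`/`start`/`end`) and the helper name are assumptions for illustration, not the tool's actual schema; the hint strings mirror the format shown in the examples.

```javascript
// Build symbol-specific next-step hints from an outline, promoting
// any symbol the intent string mentions to the first position.
function buildNextHints(filePath, outline, intent = "") {
  const hints = outline.methods.map(
    (m) => `read_range(${filePath}, ${m.start}, ${m.end}) — ${m.name}()`
  );
  const i = outline.methods.findIndex((m) => intent.includes(m.name));
  if (i > 0) hints.unshift(hints.splice(i, 1)[0]); // promote the matched symbol
  return hints;
}
```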
From 3ebf3b75df9867ca8e1c074066cdd72833379a79 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 20:57:36 -0700 Subject: [PATCH 24/73] cool-ideas: graft post-Phase 1 ideas from design review graft pack, graft since, graft explain, graft init, focus auto, capture handles, policy profiles (balanced/strict/feral), receipt mode, symbol heatmap, and the line where graft becomes a WARP substrate. --- .../backlog/cool-ideas/DX_graft-cool-ideas.md | 40 +++++++++++++++++++ 1 file changed, 40 insertions(+) create mode 100644 docs/method/backlog/cool-ideas/DX_graft-cool-ideas.md diff --git a/docs/method/backlog/cool-ideas/DX_graft-cool-ideas.md b/docs/method/backlog/cool-ideas/DX_graft-cool-ideas.md new file mode 100644 index 00000000..ac63e581 --- /dev/null +++ b/docs/method/backlog/cool-ideas/DX_graft-cool-ideas.md @@ -0,0 +1,40 @@ +# Graft cool ideas (post-Phase 1) + +Ideas surfaced during the design review. Not Phase 1 scope. + +## Commands + +- **graft pack** — one-shot handoff bundle: WORKING_STATE.md, top + touched files, last 10 decisions, recommended next reads. Great + for `/clear`, bug reports, "what was I doing yesterday?" +- **graft since ``** — symbols changed since HEAD~1, main, + or a specific commit. The Git/WARP bridge starts here. +- **graft explain ``** — built-in help for machine + codes (`graft explain over_byte_threshold`) +- **graft init** — scaffolds `.graftignore`, `.gitignore` update, + CLAUDE/GEMINI/Codex instruction snippets, optional hook install + +## Features + +- **focus: "auto"** — if intent mentions a symbol name, auto-promote + it in next hints and optionally return focused outline first +- **capture_range(handle, start, end)** — opaque log handles instead + of path-based artifacts. Cleaner, harder to misuse. +- **policy profiles** — `balanced`, `strict`, `feral`. Yes, feral + is ridiculous. Yes, people will use it immediately. 
+- **receipt mode** — every decision emits a compact receipt blob for + Blacklight: what was requested, returned, why, bytes avoided, what + the agent did next +- **symbol heatmap** — after enough metrics, show which files/symbols + most often trigger outlines, bounded reads, re-orientation. Gold + for Phase 2 prioritization. + +## The line to WARP + +- **graft changed-since-last-read** — the doorway. Not Phase 1. + This is where graft stops being a governor and starts being a + substrate. +- Graft = governor at the edge (what context is allowed) +- WARP = memory underneath (structural truth over time) +- The mutation happens when "current file shape" stops being enough + and you need observer-relative structural history as a primitive From 3e442e894ca5c86be8a8106b38b286dd792390e9 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:03:06 -0700 Subject: [PATCH 25/73] =?UTF-8?q?design:=20add=20internal=20vocabulary=20?= =?UTF-8?q?=E2=80=94=20graft=20as=20applied=20WARP=20optics?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit projection, focus, residual, receipt, witness. Not public CLI names — internal architecture doctrine. Gives coherence: policy is not arbitrary, projection is not truncation, focus is not slicing. "Derive the smallest lawful view of the code that lets the agent act without poisoning its own context." --- docs/design/0003-safe-context/safe-context.md | 21 +++++++++++++++++++ 1 file changed, 21 insertions(+) diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index ce3cf4b5..2d59788e 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -68,6 +68,27 @@ Phase 1 scope: JS/TS only. `safe_read`, `file_outline`, 3. Can I point it at any JS/TS project with zero config? **YES/NO** 4. Can I register it as an MCP server in one line of JSON? 
**YES/NO** +## Internal vocabulary + +Graft is WARP optics with a job: derive the smallest lawful view +of the code that lets the agent act without poisoning its own +context. + +These terms are internal architecture language, not public CLI +names: + +| Term | Meaning | +|---|---| +| **projection** | The output mode chosen by policy: content, outline, refusal, error. Not the full file — a lawful reduced view. | +| **focus** | The targeting mechanism: file, class, method, range, export surface. Bounds what the agent sees. | +| **residual** | The hidden context not surfaced to the agent. The 1,900 lines of StrandService that the outline doesn't show. Exists, acknowledged, not transmitted. | +| **receipt** | A structured decision log entry: what was requested, what was returned, why, bytes avoided. | +| **witness** | (Future) The exact focus chosen, the lines returned, why that focus was selected, what larger whole it came from. "What did the agent see before it made this edit?" | + +This vocabulary gives the architecture coherence: policy is not +arbitrary, projection is not just truncation, focus is not just +slicing, receipts are not just logs. + ## Non-goals - Rust support (Phase 2) From 125a772c6f9b6b32cd3ce35b40c257b0deb8af25 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:13:36 -0700 Subject: [PATCH 26/73] =?UTF-8?q?retro:=20close=20cycle=200003=20=E2=80=94?= =?UTF-8?q?=20graft=20design=20complete,=20impl=20moves=20to=20new=20repo?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Design cycle delivered a locked spec for Graft (context governor for coding agents) through two APPROVE/REJECT/ENHANCE review rounds. All escape hatches bounded, broken-file behavior specified, hooks enforcement defined. Implementation begins as graft cycle 0001 at flyingrobots/graft. 
--- .../retro/0003-safe-context/safe-context.md | 115 ++++++++++++++++++ 1 file changed, 115 insertions(+) create mode 100644 docs/method/retro/0003-safe-context/safe-context.md diff --git a/docs/method/retro/0003-safe-context/safe-context.md b/docs/method/retro/0003-safe-context/safe-context.md new file mode 100644 index 00000000..53a55284 --- /dev/null +++ b/docs/method/retro/0003-safe-context/safe-context.md @@ -0,0 +1,115 @@ +# Retrospective: 0003-safe-context + +**Date:** 2026-04-01 +**Type:** Design +**Outcome:** Hill met (pivoted to new repo) + +## What happened + +Design cycle for Graft — a context governor for coding agents. +Started from the code-nav pivot (cycle 0002) and iterated through +two full review rounds with APPROVE/REJECT/ENHANCE feedback. + +The design doc went through three major evolutions: + +1. **Initial draft** — command contracts, output shapes, test + strategy, project structure. Tree-sitter for parsing, MCP + CLI + for transport. + +2. **Round 1 review** — closed all escape hatches. read_range + bounded (250 lines / 20 KB), state_save capped (8 KB), dual + thresholds (lines + bytes), built-in secret bans, machine-stable + reason codes, project root definition, .graftignore. + +3. **Round 2 review** — error vs refused distinction, broken-file + best-effort outlines (partial: true), run_capture execution + contract (cwd/env/timeout/log size), explicit CLI binary names, + log retention, outline truncation metadata. + +Final additions: enforcement hooks (PreToolUse on Read and Bash), +graft doctor/stats, internal vocabulary (projection, focus, +residual, receipt, witness), and the WARP optics framing. + +Product named "Graft" — grafting semantic eyesight onto Git's +history substrate. Repo created at `flyingrobots/graft`, scaffolded +with METHOD.md, pushed to GitHub. 
+ +## Hill assessment + +**Hill:** "An agent working in a JS/TS codebase can obtain the +minimum structurally correct context required to act — without +injecting large raw artifacts into long-lived conversation state." + +**Status:** Design complete. The hill is fully specified with +command contracts, policy rules, enforcement layers, error models, +edge cases (broken files, secrets, symlinks), and test strategy. +Implementation begins as graft cycle 0001. + +## Drift check + +- Design doc lives in `docs/design/0003-safe-context/safe-context.md` +- Cool ideas logged in `docs/method/backlog/cool-ideas/DX_graft-cool-ideas.md` +- WARP provenance layer logged in `cool-ideas/PROTO_safe-context-warp-provenance-layer.md` +- CLEAN_CODE legend declared in `docs/method/legends/CLEAN_CODE.md` +- No code written. No test drift. No architecture drift. +- Cycle directory still named `0003-safe-context` (pre-rename to + graft). Provenance preserved intentionally. + +## Playback + +### Agent + +Design questions answered clearly: +- Command contracts with exact output shapes? **YES** +- All escape hatches bounded? **YES** (read_range, state, outline) +- Broken-file behavior specified? **YES** (best-effort, partial) +- Enforcement architecture defined? **YES** (MCP + hooks) +- Internal vocabulary coherent? **YES** (projection/focus/residual/receipt/witness) + +### Human + +- Does the design feel like a product? **YES** (per review: "first + version that feels like a product instead of a clever utility") +- Are the governor's bounds tight? **YES** (per review: "stops + feeling like a design sketch and starts feeling like a repo that + wants to exist") +- Is the naming right? **YES** — Graft. Git has trees and branches. + +## What we learned + +1. **Data before design.** The Blacklight research transformed a + nice utility into a real product. Without empirical evidence of + 96.2 GB Read burden, we would have built code-nav and missed 75% + of the problem. + +2. 
**Two review rounds caught real bugs.** Unrestricted read_range + was a policy bypass. Unbounded state_save would recreate the + problem in markdown. Flat line thresholds ignored byte-heavy + files. These weren't obvious until someone said "the governor + only works if it's hard to accidentally bypass." + +3. **Internal vocabulary matters.** Naming the concepts (projection, + focus, residual, receipt) gave the architecture coherence that + made the review rounds productive instead of circular. + +4. **Design cycles can spawn repos.** The Method worked across the + boundary — design doc in git-warp, product in flyingrobots/graft. + The cycle closes here; implementation opens there. + +## New debt + +None in git-warp. + +## Cool ideas + +Logged during cycle: +- graft pack, graft since, graft explain, graft init +- focus auto, capture handles, policy profiles (balanced/strict/feral) +- receipt mode, symbol heatmap, changed-since-last-read +- WARP provenance layer as Phase 3+ substrate + +## Backlog impact + +Implementation continues as graft repo cycle 0001. +`asap/DX_safe-context-phase-1.md` was consumed by this cycle's +design doc — no orphan backlog item remains. From 251e165dfe8c145afc1e5b66d4ca815355a75513 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:21:38 -0700 Subject: [PATCH 27/73] fix(test): update release-policy-shape path after docs restructure docs/release.md moved to docs/method/release.md during the Method bootstrap (cycle 0001) but the test path was not updated. 
--- test/unit/scripts/release-policy-shape.test.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/test/unit/scripts/release-policy-shape.test.js b/test/unit/scripts/release-policy-shape.test.js index 6b2291e9..4caed9b8 100644 --- a/test/unit/scripts/release-policy-shape.test.js +++ b/test/unit/scripts/release-policy-shape.test.js @@ -3,7 +3,7 @@ import { fileURLToPath } from 'node:url'; import { describe, expect, it } from 'vitest'; const releaseDoc = readFileSync( - fileURLToPath(new URL('../../../docs/release.md', import.meta.url)), + fileURLToPath(new URL('../../../docs/method/release.md', import.meta.url)), 'utf8', ); const preflight = readFileSync( From f6a6e33047af71fe213da731263b0c9cfc0b379d Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:26:20 -0700 Subject: [PATCH 28/73] refactor: replace RECEIPT_OP_TYPE mapping with OpStrategy.receiptName Each OpStrategy object now carries its own TickReceipt-compatible operation name (e.g. NodeRemove strategy has receiptName 'NodeTombstone'). This eliminates the redundant RECEIPT_OP_TYPE lookup tables that duplicated knowledge already present in the strategy registry. 
- Added `receiptName` property to all 8 strategy objects - Added `receiptName` to OpStrategy typedef and load-time validation - Deleted RECEIPT_OP_TYPE from JoinReducer.js - Deleted duplicate RECEIPT_OP_TYPE from ConflictAnalyzerService.js - ConflictAnalyzerService now imports OP_STRATEGIES (already imports from JoinReducer, so no new dependency direction) Closes bad-code/PROTO_receipt-op-type-redundant --- .../services/ConflictAnalyzerService.js | 25 +++++------- src/domain/services/JoinReducer.js | 39 +++++++------------ 2 files changed, 23 insertions(+), 41 deletions(-) diff --git a/src/domain/services/ConflictAnalyzerService.js b/src/domain/services/ConflictAnalyzerService.js index 5e9a91ee..dba6ac3b 100644 --- a/src/domain/services/ConflictAnalyzerService.js +++ b/src/domain/services/ConflictAnalyzerService.js @@ -9,7 +9,7 @@ */ import QueryError from '../errors/QueryError.js'; -import { reduceV5, normalizeRawOp } from './JoinReducer.js'; +import { reduceV5, normalizeRawOp, OP_STRATEGIES } from './JoinReducer.js'; import { canonicalStringify } from '../utils/canonicalStringify.js'; import { createEventId } from '../utils/EventId.js'; import { decodeEdgeKey } from './KeyCodec.js'; @@ -32,20 +32,15 @@ const VALID_TARGET_KINDS = new Set(['node', 'edge', 'node_property', 'edge_prope const TARGET_SELECTOR_FIELDS = ['entityId', 'propertyKey', 'from', 'to', 'label']; /** - * Receipt op type mapping. Kept local so the analyzer can interpret canonical ops - * without depending on JoinReducer internals that are not part of the public API. + * Resolves a canonical op type to its TickReceipt-compatible name via OP_STRATEGIES. + * Returns undefined for unknown/forward-compatible op types. 
+ * @param {string} opType + * @returns {string|undefined} */ -/** @type {Readonly<Record<string, string>>} */ -const RECEIPT_OP_TYPE = Object.freeze({ - NodeAdd: 'NodeAdd', - NodeRemove: 'NodeTombstone', - EdgeAdd: 'EdgeAdd', - EdgeRemove: 'EdgeTombstone', - PropSet: 'PropSet', - NodePropSet: 'NodePropSet', - EdgePropSet: 'EdgePropSet', - BlobValue: 'BlobValue', -}); +function receiptNameForOp(opType) { + const strategy = OP_STRATEGIES.get(opType); + return strategy !== undefined ? strategy.receiptName : undefined; +} const CLASSIFICATION_NOTES = Object.freeze({ RECEIPT_SUPERSEDED: 'receipt_superseded', @@ -1517,7 +1512,7 @@ async function analyzeFrameOps(service, { frame, scannedPatchShas, diagnostics, async function analyzeOneOp(service, { frame, opIndex, receiptOpIndex, receipt, diagnostics }) { const rawOp = /** @type {import('../types/WarpTypesV2.js').RawOpV2 | {type: string}} */ (frame.patch.ops[opIndex]); const canonOp = cloneObject(/** @type {Record<string, unknown>} */ (normalizeRawOp(rawOp))); - const receiptOpType = RECEIPT_OP_TYPE[/** @type {string} */ (canonOp['type'])]; + const receiptOpType = receiptNameForOp(/** @type {string} */ (canonOp['type'])); if (typeof receiptOpType !== 'string' || receiptOpType.length === 0) { return null; } diff --git a/src/domain/services/JoinReducer.js b/src/domain/services/JoinReducer.js index 4b127f30..faaad4a5 100644 --- a/src/domain/services/JoinReducer.js +++ b/src/domain/services/JoinReducer.js @@ -244,6 +244,7 @@ function requireDot(op) { /** * @typedef {Object} OpStrategy + * @property {string} receiptName - The TickReceipt-compatible operation type name (e.g. 
'NodeTombstone' for NodeRemove) * @property {(state: WarpStateV5, op: OpLike, eventId: import('../utils/EventId.js').EventId) => void} mutate * @property {(state: WarpStateV5, op: OpLike, eventId: import('../utils/EventId.js').EventId) => OpOutcomeResult} outcome * @property {(state: WarpStateV5, op: OpLike) => SnapshotBeforeOp} snapshot @@ -253,6 +254,7 @@ function requireDot(op) { /** @type {OpStrategy} */ const nodeAddStrategy = { + receiptName: 'NodeAdd', validate(op) { requireString(op, 'node'); requireDot(op); }, mutate(state, op) { orsetAdd(state.nodeAlive, /** @type {string} */ (op.node), /** @type {import('../crdt/Dot.js').Dot} */ (op.dot)); @@ -272,6 +274,7 @@ const nodeAddStrategy = { /** @type {OpStrategy} */ const nodeRemoveStrategy = { + receiptName: 'NodeTombstone', validate(op) { requireIterable(op, 'observedDots'); }, mutate(state, op) { orsetRemove(state.nodeAlive, /** @type {Set} */ (/** @type {unknown} */ (op.observedDots))); @@ -292,6 +295,7 @@ const nodeRemoveStrategy = { /** @type {OpStrategy} */ const edgeAddStrategy = { + receiptName: 'EdgeAdd', validate(op) { requireString(op, 'from'); requireString(op, 'to'); requireString(op, 'label'); requireDot(op); }, mutate(state, op, eventId) { const edgeKey = encodeEdgeKey(/** @type {string} */ (op.from), /** @type {string} */ (op.to), /** @type {string} */ (op.label)); @@ -320,6 +324,7 @@ const edgeAddStrategy = { /** @type {OpStrategy} */ const edgeRemoveStrategy = { + receiptName: 'EdgeTombstone', validate(op) { requireIterable(op, 'observedDots'); }, mutate(state, op) { orsetRemove(state.edgeAlive, /** @type {Set} */ (/** @type {unknown} */ (op.observedDots))); @@ -379,6 +384,7 @@ function accumulatePropDiff(diff, state, nodeId, key, before) { /** @type {OpStrategy} */ const nodePropSetStrategy = { + receiptName: 'NodePropSet', validate(op) { requireString(op, 'node'); requireString(op, 'key'); }, mutate(state, op, eventId) { mutateProp(state, encodePropKey(/** @type {string} */ (op.node), /** 
@type {string} */ (op.key)), eventId, op.value); @@ -396,6 +402,7 @@ const nodePropSetStrategy = { /** @type {OpStrategy} */ const edgePropSetStrategy = { + receiptName: 'EdgePropSet', validate(op) { requireString(op, 'from'); requireString(op, 'to'); requireString(op, 'label'); requireString(op, 'key'); }, mutate(state, op, eventId) { mutateProp(state, encodeEdgePropKey(/** @type {string} */ (op.from), /** @type {string} */ (op.to), /** @type {string} */ (op.label), /** @type {string} */ (op.key)), eventId, op.value); @@ -413,6 +420,7 @@ const edgePropSetStrategy = { /** @type {OpStrategy} */ const propSetStrategy = { + receiptName: 'PropSet', validate(op) { requireString(op, 'node'); requireString(op, 'key'); }, mutate(state, op, eventId) { // Legacy raw PropSet — must NOT carry edge-property encoding at this point. @@ -438,6 +446,7 @@ const propSetStrategy = { /** @type {OpStrategy} */ const blobValueStrategy = { + receiptName: 'BlobValue', validate() { /* no-op: forward-compat */ }, mutate() { /* no-op: BlobValue has no state effect */ }, outcome(_state, op) { @@ -473,6 +482,9 @@ for (const [type, strategy] of OP_STRATEGIES) { throw new Error(`OpStrategy '${type}' missing required method '${method}'`); } } + if (typeof strategy.receiptName !== 'string' || strategy.receiptName.length === 0) { + throw new Error(`OpStrategy '${type}' missing required property 'receiptName'`); + } } /** @@ -495,30 +507,6 @@ export function applyOpV2(state, op, eventId) { strategy.mutate(state, op, eventId); } -/** - * Maps internal operation type names to TickReceipt-compatible operation type names. - * - * The internal representation uses "Remove" for tombstone operations, but the - * TickReceipt API uses "Tombstone" to be more explicit about CRDT semantics. - * This mapping ensures receipt consumers see the canonical operation names. 
- * - * Mappings: - * - NodeRemove -> NodeTombstone (CRDT tombstone semantics) - * - EdgeRemove -> EdgeTombstone (CRDT tombstone semantics) - * - All others pass through unchanged - * - * @const {Record<string, string>} - */ -const RECEIPT_OP_TYPE = { - NodeAdd: 'NodeAdd', - NodeRemove: 'NodeTombstone', - EdgeAdd: 'EdgeAdd', - EdgeRemove: 'EdgeTombstone', - PropSet: 'PropSet', - NodePropSet: 'NodePropSet', - EdgePropSet: 'EdgePropSet', - BlobValue: 'BlobValue', -}; /** * Set of valid receipt op types (from TickReceipt) for fast membership checks. @@ -908,8 +896,7 @@ export function applyWithReceipt(state, patch, patchSha) { // Apply the op (mutates state) strategy.mutate(state, canonOp, eventId); - const mappedOp = /** @type {Record<string, string>} */ (RECEIPT_OP_TYPE)[canonOp.type]; - const receiptOp = (typeof mappedOp === 'string' && mappedOp.length > 0) ? mappedOp : canonOp.type; + const receiptOp = strategy.receiptName; // Skip unknown/forward-compatible op types that aren't valid receipt ops if (!VALID_RECEIPT_OPS.has(receiptOp)) { continue; From 4d3fd44b6c12539b9f490d6a389bb395146c5eb9 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:27:49 -0700 Subject: [PATCH 29/73] fix: replace raw Error with SyncError for E_SYNC_DIVERGENCE SyncProtocol.loadPatchRange() was constructing a plain Error and manually attaching a code property. Now uses SyncError which provides proper error typing, context serialization, and consistent error hierarchy. Also added E_SYNC_DIVERGENCE to the SyncError error code table. 
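As a rough illustration of the pattern (the real `WarpError` base class lives in `src/domain/errors/` and its constructor signature may differ — this stand-in only shows the shape of a typed error carrying a machine-readable code plus serializable context):

```javascript
// Minimal stand-in for the WarpError/SyncError hierarchy.
class WarpError extends Error {
  constructor(message, code, { context = {} } = {}) {
    super(message);
    this.name = new.target.name;  // 'SyncError' for subclass instances
    this.code = code;
    this.context = context;
  }
}

class SyncError extends WarpError {
  constructor(message, { code = 'SYNC_ERROR', context = {} } = {}) {
    super(message, code, { context });
  }
}

// Divergence becomes a typed error instead of a raw Error with a
// manually attached code property.
function divergence(writerId, fromSha, toSha) {
  return new SyncError(
    `Divergence detected: ${toSha} does not descend from ${fromSha} for writer ${writerId}`,
    { code: 'E_SYNC_DIVERGENCE', context: { writerId, fromSha, toSha } },
  );
}
```
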
Closes bad-code/PROTO_sync-protocol-raw-error --- src/domain/errors/SyncError.js | 1 + src/domain/services/SyncProtocol.js | 10 +++++----- 2 files changed, 6 insertions(+), 5 deletions(-) diff --git a/src/domain/errors/SyncError.js b/src/domain/errors/SyncError.js index 7c94c1e5..e233bd70 100644 --- a/src/domain/errors/SyncError.js +++ b/src/domain/errors/SyncError.js @@ -15,6 +15,7 @@ import WarpError from './WarpError.js'; * | `E_SYNC_REMOTE` | Remote server returned a 5xx error | * | `E_SYNC_PROTOCOL` | Protocol violation: 4xx, invalid JSON, or malformed response | * | `E_SYNC_PAYLOAD_INVALID` | Sync payload failed shape/resource-limit validation (B64) | + * | `E_SYNC_DIVERGENCE` | Writer chains have diverged (no common ancestor) | * | `SYNC_ERROR` | Generic/default sync error | * * @class SyncError diff --git a/src/domain/services/SyncProtocol.js b/src/domain/services/SyncProtocol.js index 5f80519a..cb73d8e7 100644 --- a/src/domain/services/SyncProtocol.js +++ b/src/domain/services/SyncProtocol.js @@ -41,6 +41,7 @@ import nullLogger from '../utils/nullLogger.js'; import { decodePatchMessage, assertOpsCompatible, SCHEMA_V3 } from './WarpMessageCodec.js'; import { join, cloneStateV5, isKnownRawOp } from './JoinReducer.js'; import SchemaUnsupportedError from '../errors/SchemaUnsupportedError.js'; +import SyncError from '../errors/SyncError.js'; import EncryptionError from '../errors/EncryptionError.js'; import PersistenceError from '../errors/PersistenceError.js'; import { cloneFrontier, updateFrontier } from './Frontier.js'; @@ -233,11 +234,10 @@ export async function loadPatchRange(persistence, _graphName, writerId, fromSha, // If fromSha was specified but we didn't reach it, we have divergence if (fromSha !== null && fromSha !== undefined && fromSha.length > 0 && cur === null) { - const err = /** @type {Error & { code: string }} */ (new Error( - `Divergence detected: ${toSha} does not descend from ${fromSha} for writer ${writerId}` - )); - err.code = 
'E_SYNC_DIVERGENCE'; - throw err; + throw new SyncError( + `Divergence detected: ${toSha} does not descend from ${fromSha} for writer ${writerId}`, + { code: 'E_SYNC_DIVERGENCE', context: { writerId, fromSha, toSha } }, + ); } return patches; From 55d45b8e01003e3f38433a4720f3c14b1e1b8809 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:32:01 -0700 Subject: [PATCH 30/73] refactor: replace raw Errors in AuditReceiptService with AuditError Introduced AuditError domain error class with four error codes: - E_AUDIT_INVALID: receipt field validation failures - E_AUDIT_CAS_FAILED: compare-and-swap ref update failures - E_AUDIT_DEGRADED: service degraded after exhausting retries - E_AUDIT_CHAIN_GAP: missing commit in audit chain ancestry Replaced all 16 raw Error throws in AuditReceiptService with typed AuditError instances carrying serializable context. Added AuditError to package exports, index.d.ts, and type surface contract. Closes bad-code/PROTO_audit-receipt-raw-error --- contracts/type-surface.m8.json | 3 ++ index.d.ts | 19 ++++++++++ index.js | 2 + src/domain/errors/AuditError.js | 43 ++++++++++++++++++++++ src/domain/errors/index.js | 1 + src/domain/services/AuditReceiptService.js | 36 +++++++++--------- 6 files changed, 87 insertions(+), 17 deletions(-) create mode 100644 src/domain/errors/AuditError.js diff --git a/contracts/type-surface.m8.json b/contracts/type-surface.m8.json index 21b3a4c8..95bd66b5 100644 --- a/contracts/type-surface.m8.json +++ b/contracts/type-surface.m8.json @@ -3,6 +3,9 @@ "$comment": "M8 IRONCLAD type surface manifest — explicit runtime exports plus type-only declarations for index.d.ts validation", "version": 1, "exports": { + "AuditError": { + "kind": "class" + }, "BisectService": { "kind": "class" }, diff --git a/index.d.ts b/index.d.ts index 20011749..22f7c04a 100644 --- a/index.d.ts +++ b/index.d.ts @@ -1210,6 +1210,25 @@ export class PatchError extends Error { }); } +/** + * Error class for audit receipt 
 validation and persistence failures. + */ +export class AuditError extends Error { + readonly name: 'AuditError'; + readonly code: string; + readonly context: Record<string, unknown>; + + static readonly E_AUDIT_INVALID: 'E_AUDIT_INVALID'; + static readonly E_AUDIT_CAS_FAILED: 'E_AUDIT_CAS_FAILED'; + static readonly E_AUDIT_DEGRADED: 'E_AUDIT_DEGRADED'; + static readonly E_AUDIT_CHAIN_GAP: 'E_AUDIT_CHAIN_GAP'; + + constructor(message: string, options?: { + code?: string; + context?: Record<string, unknown>; + }); +} + /** * Error class for sync transport operations. */ diff --git a/index.js b/index.js index 07c12f5e..ab4c18ba 100644 --- a/index.js +++ b/index.js @@ -49,6 +49,7 @@ import NoOpLogger from './src/infrastructure/adapters/NoOpLogger.js'; import ConsoleLogger, { LogLevel } from './src/infrastructure/adapters/ConsoleLogger.js'; import ClockAdapter from './src/infrastructure/adapters/ClockAdapter.js'; import { + AuditError, EncryptionError, ForkError, IndexError, @@ -210,6 +211,7 @@ export { DenoHttpAdapter, // Error types for integrity failure handling + AuditError, EncryptionError, PatchError, ForkError, diff --git a/src/domain/errors/AuditError.js b/src/domain/errors/AuditError.js new file mode 100644 index 00000000..59dd6d91 --- /dev/null +++ b/src/domain/errors/AuditError.js @@ -0,0 +1,43 @@ +import WarpError from './WarpError.js'; + +/** + * Error class for audit receipt validation and persistence failures. + * + * ## Error Codes + * + * | Code | Description | + * |------|-------------| + * | `E_AUDIT_INVALID` | Receipt field validation failed (version, OIDs, ticks, etc.) 
| + * | `E_AUDIT_CAS_FAILED` | Compare-and-swap failed during audit commit | + * | `E_AUDIT_DEGRADED` | Audit service degraded after exhausting retries | + * | `E_AUDIT_CHAIN_GAP` | Audit chain has a gap (missing commit in ancestry) | + * + * @class AuditError + * @extends WarpError + * + * @property {string} name - Always 'AuditError' for instanceof checks + * @property {string} code - Machine-readable error code for programmatic handling + * @property {Record<string, unknown>} context - Serializable context object with error details + */ +export default class AuditError extends WarpError { + /** Receipt field validation failed. */ + static E_AUDIT_INVALID = 'E_AUDIT_INVALID'; + + /** Compare-and-swap failed during audit commit. */ + static E_AUDIT_CAS_FAILED = 'E_AUDIT_CAS_FAILED'; + + /** Audit service degraded after exhausting retries. */ + static E_AUDIT_DEGRADED = 'E_AUDIT_DEGRADED'; + + /** Audit chain has a gap (missing commit in ancestry). */ + static E_AUDIT_CHAIN_GAP = 'E_AUDIT_CHAIN_GAP'; + + /** + * Creates an AuditError with the given message and error code. + * @param {string} message - Human-readable error description + * @param {{ code?: string, context?: Record<string, unknown> }} [options={}] - Error options + */ + constructor(message, options = {}) { + super(message, options.code ?? 
'E_AUDIT_INVALID', options); + } +} diff --git a/src/domain/errors/index.js b/src/domain/errors/index.js index 592af009..3c919190 100644 --- a/src/domain/errors/index.js +++ b/src/domain/errors/index.js @@ -4,6 +4,7 @@ * @module domain/errors */ +export { default as AuditError } from './AuditError.js'; export { default as EmptyMessageError } from './EmptyMessageError.js'; export { default as EncryptionError } from './EncryptionError.js'; export { default as PersistenceError } from './PersistenceError.js'; diff --git a/src/domain/services/AuditReceiptService.js b/src/domain/services/AuditReceiptService.js index 3d9cae25..650b6411 100644 --- a/src/domain/services/AuditReceiptService.js +++ b/src/domain/services/AuditReceiptService.js @@ -10,6 +10,7 @@ * @see docs/specs/AUDIT_RECEIPT.md */ +import AuditError from '../errors/AuditError.js'; import { buildAuditRef } from '../utils/RefLayout.js'; import { encodeAuditMessage } from './AuditMessageCodec.js'; @@ -92,7 +93,7 @@ const OID_HEX_PATTERN = /^[0-9a-f]{40}([0-9a-f]{24})?$/; * * @param {{ version: number, graphName: string, writerId: string, dataCommit: string, tickStart: number, tickEnd: number, opsDigest: string, prevAuditCommit: string, timestamp: number }} fields * @returns {Readonly<Record<string, unknown>>} - * @throws {Error} If any field is invalid + * @throws {AuditError} If any field is invalid (code: E_AUDIT_INVALID) */ export function buildReceiptRecord(fields) { const { @@ -102,66 +103,66 @@ export function buildReceiptRecord(fields) { // version if (version !== 1) { - throw new Error(`Invalid version: must be 1, got ${version}`); + throw new AuditError(`Invalid version: must be 1, got ${version}`, { context: { version } }); } // graphName — validated by RefLayout if (typeof graphName !== 'string' || graphName.length === 0) { - throw new Error('Invalid graphName: must be a non-empty string'); + throw new AuditError('Invalid graphName: must be a non-empty string', { context: { graphName } }); } // writerId — validated by 
RefLayout if (typeof writerId !== 'string' || writerId.length === 0) { - throw new Error('Invalid writerId: must be a non-empty string'); + throw new AuditError('Invalid writerId: must be a non-empty string', { context: { writerId } }); } // dataCommit const dc = dataCommit.toLowerCase(); if (!OID_HEX_PATTERN.test(dc)) { - throw new Error(`Invalid dataCommit OID: ${dataCommit}`); + throw new AuditError(`Invalid dataCommit OID: ${dataCommit}`, { context: { dataCommit } }); } // opsDigest const od = opsDigest.toLowerCase(); if (!/^[0-9a-f]{64}$/.test(od)) { - throw new Error(`Invalid opsDigest: must be 64-char lowercase hex, got ${opsDigest}`); + throw new AuditError(`Invalid opsDigest: must be 64-char lowercase hex, got ${opsDigest}`, { context: { opsDigest } }); } // prevAuditCommit const pac = prevAuditCommit.toLowerCase(); if (!OID_HEX_PATTERN.test(pac)) { - throw new Error(`Invalid prevAuditCommit OID: ${prevAuditCommit}`); + throw new AuditError(`Invalid prevAuditCommit OID: ${prevAuditCommit}`, { context: { prevAuditCommit } }); } // OID length consistency const oidLen = dc.length; if (pac.length !== oidLen) { - throw new Error(`OID length mismatch: dataCommit=${dc.length}, prevAuditCommit=${pac.length}`); + throw new AuditError(`OID length mismatch: dataCommit=${dc.length}, prevAuditCommit=${pac.length}`, { context: { dataCommitLen: dc.length, prevAuditCommitLen: pac.length } }); } // tick constraints if (!Number.isInteger(tickStart) || tickStart < 1) { - throw new Error(`Invalid tickStart: must be integer >= 1, got ${tickStart}`); + throw new AuditError(`Invalid tickStart: must be integer >= 1, got ${tickStart}`, { context: { tickStart } }); } if (!Number.isInteger(tickEnd) || tickEnd < tickStart) { - throw new Error(`Invalid tickEnd: must be integer >= tickStart, got ${tickEnd}`); + throw new AuditError(`Invalid tickEnd: must be integer >= tickStart, got ${tickEnd}`, { context: { tickEnd, tickStart } }); } if (version === 1 && tickStart !== tickEnd) { - 
throw new Error(`v1 requires tickStart === tickEnd, got ${tickStart} !== ${tickEnd}`); + throw new AuditError(`v1 requires tickStart === tickEnd, got ${tickStart} !== ${tickEnd}`, { context: { tickStart, tickEnd } }); } // Zero-hash sentinel only for genesis (tickStart === 1) const zeroHash = '0'.repeat(oidLen); if (pac === zeroHash && tickStart > 1) { - throw new Error('Non-genesis receipt cannot use zero-hash sentinel'); + throw new AuditError('Non-genesis receipt cannot use zero-hash sentinel', { context: { tickStart, prevAuditCommit: pac } }); } // timestamp if (!Number.isInteger(timestamp) || timestamp < 0) { - throw new Error(`Invalid timestamp: must be non-negative safe integer, got ${timestamp}`); + throw new AuditError(`Invalid timestamp: must be non-negative safe integer, got ${timestamp}`, { context: { timestamp } }); } if (!Number.isSafeInteger(timestamp)) { - throw new Error(`Invalid timestamp: exceeds Number.MAX_SAFE_INTEGER: ${timestamp}`); + throw new AuditError(`Invalid timestamp: exceeds Number.MAX_SAFE_INTEGER: ${timestamp}`, { context: { timestamp } }); } // Build with keys in sorted order (canonical for CBOR) @@ -317,8 +318,9 @@ export class AuditReceiptService { actual: writer, patchSha, }); - throw new Error( + throw new AuditError( `Audit writer mismatch: expected '${this._writerId}', got '${writer}'`, + { context: { expected: this._writerId, actual: writer, patchSha } }, ); } @@ -405,7 +407,7 @@ export class AuditReceiptService { } catch { if (this._retrying) { // Second CAS failure during retry → degrade - throw new Error('CAS failed during retry'); + throw new AuditError('CAS failed during retry', { code: AuditError.E_AUDIT_CAS_FAILED, context: { writerId: this._writerId, ref: this._auditRef } }); } // CAS mismatch — retry once with refreshed tip return await this._retryAfterCasConflict(commitSha, tickReceipt); @@ -450,7 +452,7 @@ export class AuditReceiptService { writerId: this._writerId, reason: 'second CAS failure', }); - throw new 
Error('Audit service degraded after second CAS failure'); + throw new AuditError('Audit service degraded after second CAS failure', { code: AuditError.E_AUDIT_DEGRADED, context: { writerId: this._writerId } }); } finally { this._retrying = false; } From cf368678403edc113d0528d681609bb60ae60cd9 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:34:07 -0700 Subject: [PATCH 31/73] fix(cli): replace __dirname polyfill with import.meta.url resolution MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit The CLI used new URL(import.meta.url).pathname → path.dirname() → path.resolve() chains to locate package.json and hook templates. This is fragile under bundling or npm link. Now uses fileURLToPath(new URL('../..', import.meta.url)) to resolve the package root, which is the idiomatic ESM pattern and works across Node, Bun, and Deno. Closes backlog/asap/TUI_cli-dirname-fragility --- bin/cli/shared.js | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/bin/cli/shared.js b/bin/cli/shared.js index 3518561f..d9a2c35d 100644 --- a/bin/cli/shared.js +++ b/bin/cli/shared.js @@ -1,6 +1,7 @@ import fs from 'node:fs'; import path from 'node:path'; import process from 'node:process'; +import { fileURLToPath } from 'node:url'; import readline from 'node:readline'; import { execFileSync } from 'node:child_process'; import { textEncode } from '../../src/domain/utils/bytes.js'; @@ -202,10 +203,9 @@ export async function readCheckpointDate(persistence, checkpointSha) { * @returns {import('../../src/domain/services/HookInstaller.js').HookInstaller} */ export function createHookInstaller() { - const __filename = new URL(import.meta.url).pathname; - const __dirname = path.dirname(__filename); - const templateDir = path.resolve(__dirname, '..', '..', 'scripts', 'hooks'); - const rawJson = fs.readFileSync(path.resolve(__dirname, '..', '..', 'package.json'), 'utf8'); + const packageRoot = fileURLToPath(new URL('../..', 
import.meta.url)); + const templateDir = path.join(packageRoot, 'scripts', 'hooks'); + const rawJson = fs.readFileSync(path.join(packageRoot, 'package.json'), 'utf8'); const version = readPackageVersion(rawJson); return new HookInstaller({ fs: /** @type {import('../../src/domain/services/HookInstaller.js').FsAdapter} */ (/** @type {unknown} */ (fs)), From a98e8f1a3ecce29a7a9844b514d0510e00cbd02d Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:36:34 -0700 Subject: [PATCH 32/73] refactor(lint): restore dot-notation via @typescript-eslint/dot-notation Re-enabled dot-notation enforcement using the type-aware rule which respects tsconfig noPropertyAccessFromIndexSignature. The base ESLint dot-notation remains off; the type-aware variant handles index-signature vs regular-property distinction correctly. Auto-fixed 53 violations across 5 files. Three sites that cast to Record need bracket access (TS4111) and get targeted eslint-disable comments. Closes backlog/asap/DX_restore-dot-notation --- bin/cli/infrastructure.js | 14 ++++---- bin/warp-graph.js | 1 + eslint.config.js | 3 +- src/domain/services/AuditVerifierService.js | 34 +++++++++---------- .../services/StreamingBitmapIndexBuilder.js | 2 +- src/domain/trust/reasonCodes.js | 22 ++++++------ src/domain/types/WarpErrors.js | 2 ++ 7 files changed, 41 insertions(+), 37 deletions(-) diff --git a/bin/cli/infrastructure.js b/bin/cli/infrastructure.js index 6f03259b..13fbc472 100644 --- a/bin/cli/infrastructure.js +++ b/bin/cli/infrastructure.js @@ -407,13 +407,13 @@ export function parseArgs(argv) { /** @type {CliOptions} */ const options = { - repo: path.resolve(typeof values['repo'] === 'string' ? values['repo'] : process.cwd()), - json: Boolean(values['json']), - ndjson: Boolean(values['ndjson']), - view: typeof values['view'] === 'string' ? values['view'] : null, - graph: typeof values['graph'] === 'string' ? values['graph'] : null, - writer: typeof values['writer'] === 'string' ? 
values['writer'] : 'cli', - help: Boolean(values['help']), + repo: path.resolve(typeof values.repo === 'string' ? values.repo : process.cwd()), + json: Boolean(values.json), + ndjson: Boolean(values.ndjson), + view: typeof values.view === 'string' ? values.view : null, + graph: typeof values.graph === 'string' ? values.graph : null, + writer: typeof values.writer === 'string' ? values.writer : 'cli', + help: Boolean(values.help), }; return { options, command, commandArgs }; diff --git a/bin/warp-graph.js b/bin/warp-graph.js index 7b1a99eb..22d3da64 100755 --- a/bin/warp-graph.js +++ b/bin/warp-graph.js @@ -76,6 +76,7 @@ async function main() { // Long-running commands may return a `close` function. // Wait for SIGINT/SIGTERM instead of exiting immediately. const close = result !== null && result !== undefined && typeof result === 'object' && 'close' in /** @type {Record<string, unknown>} */ (result) + // eslint-disable-next-line @typescript-eslint/dot-notation -- Record requires bracket access (TS4111) ? 
/** @type {() => Promise<void>} */ (/** @type {Record<string, unknown>} */ (result))['close'] : null; diff --git a/eslint.config.js b/eslint.config.js index e2d6dff0..84d38175 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -170,8 +170,9 @@ }], "no-useless-computed-key": "error", "no-useless-rename": "error", - // dot-notation disabled: conflicts with tsconfig noPropertyAccessFromIndexSignature + // Base dot-notation off; type-aware version below respects noPropertyAccessFromIndexSignature "dot-notation": "off", + "@typescript-eslint/dot-notation": "error", "grouped-accessor-pairs": ["error", "getBeforeSet"], "accessor-pairs": "error", diff --git a/src/domain/services/AuditVerifierService.js b/src/domain/services/AuditVerifierService.js index 432edfb1..6795a047 100644 --- a/src/domain/services/AuditVerifierService.js +++ b/src/domain/services/AuditVerifierService.js @@ -106,35 +106,35 @@ function validateReceiptSchema(receipt) { return `missing field: ${k}`; } } - if (rec['version'] !== 1) { - return `unsupported version: ${rec['version']}`; + if (rec.version !== 1) { + return `unsupported version: ${rec.version}`; } - if (typeof rec['graphName'] !== 'string' || rec['graphName'].length === 0) { + if (typeof rec.graphName !== 'string' || rec.graphName.length === 0) { return 'graphName must be a non-empty string'; } - if (typeof rec['writerId'] !== 'string' || rec['writerId'].length === 0) { + if (typeof rec.writerId !== 'string' || rec.writerId.length === 0) { return 'writerId must be a non-empty string'; } - if (typeof rec['dataCommit'] !== 'string') { + if (typeof rec.dataCommit !== 'string') { return 'dataCommit must be a string'; } - if (typeof rec['opsDigest'] !== 'string') { + if (typeof rec.opsDigest !== 'string') { return 'opsDigest must be a string'; } - if (typeof rec['prevAuditCommit'] !== 'string') { + if (typeof rec.prevAuditCommit !== 'string') { return 'prevAuditCommit must be a string'; } - if (!Number.isInteger(rec['tickStart']) || 
/** @type {number} */ (rec['tickStart']) < 1) { - return `tickStart must be integer >= 1, got ${rec['tickStart']}`; + if (!Number.isInteger(rec.tickStart) || /** @type {number} */ (rec.tickStart) < 1) { + return `tickStart must be integer >= 1, got ${rec.tickStart}`; } - if (!Number.isInteger(rec['tickEnd']) || /** @type {number} */ (rec['tickEnd']) < /** @type {number} */ (rec['tickStart'])) { - return `tickEnd must be integer >= tickStart, got ${rec['tickEnd']}`; + if (!Number.isInteger(rec.tickEnd) || /** @type {number} */ (rec.tickEnd) < /** @type {number} */ (rec.tickStart)) { + return `tickEnd must be integer >= tickStart, got ${rec.tickEnd}`; } - if (rec['version'] === 1 && rec['tickStart'] !== rec['tickEnd']) { - return `v1 requires tickStart === tickEnd, got ${rec['tickStart']} !== ${rec['tickEnd']}`; + if (rec.version === 1 && rec.tickStart !== rec.tickEnd) { + return `v1 requires tickStart === tickEnd, got ${rec.tickStart} !== ${rec.tickEnd}`; } - if (!Number.isInteger(rec['timestamp']) || /** @type {number} */ (rec['timestamp']) < 0) { - return `timestamp must be non-negative integer, got ${rec['timestamp']}`; + if (!Number.isInteger(rec.timestamp) || /** @type {number} */ (rec.timestamp) < 0) { + return `timestamp must be non-negative integer, got ${rec.timestamp}`; } return null; } @@ -758,7 +758,7 @@ export class AuditVerifierService { status: 'error', source, sourceDetail, - reasonCode: TRUST_REASON_CODES['TRUST_RECORD_CHAIN_INVALID'], + reasonCode: TRUST_REASON_CODES.TRUST_RECORD_CHAIN_INVALID, reason: `Trust chain read failed: ${recordsResult.error.message}`, }); } @@ -795,7 +795,7 @@ export class AuditVerifierService { sourceDetail, writerIds: options.writerIds || [], recordsScanned: records.length, - reasonCode: TRUST_REASON_CODES['TRUST_RECORD_CHAIN_INVALID'], + reasonCode: TRUST_REASON_CODES.TRUST_RECORD_CHAIN_INVALID, reason: `Trust chain invalid: ${(typeof chainResult.errors[0]?.error === 'string' && chainResult.errors[0].error.length > 0) ? 
chainResult.errors[0].error : 'unknown chain error'}`, }); } diff --git a/src/domain/services/StreamingBitmapIndexBuilder.js b/src/domain/services/StreamingBitmapIndexBuilder.js index 35c4e4fa..95f8371c 100644 --- a/src/domain/services/StreamingBitmapIndexBuilder.js +++ b/src/domain/services/StreamingBitmapIndexBuilder.js @@ -331,7 +331,7 @@ export default class StreamingBitmapIndexBuilder { const type = key.substring(0, 3); const sha = key.substring(4); const prefix = sha.substring(0, 2); - const bucket = type === 'fwd' ? bitmapShards['fwd'] : bitmapShards['rev']; + const bucket = type === 'fwd' ? bitmapShards.fwd : bitmapShards.rev; if (bucket[prefix] === undefined) { bucket[prefix] = {}; diff --git a/src/domain/trust/reasonCodes.js b/src/domain/trust/reasonCodes.js index 9a5daa15..df8c4336 100644 --- a/src/domain/trust/reasonCodes.js +++ b/src/domain/trust/reasonCodes.js @@ -56,23 +56,23 @@ export const TRUST_REASON_CODES = Object.freeze({ /** @type {ReadonlySet<string>} */ export const POSITIVE_CODES = Object.freeze(new Set([ - TRUST_REASON_CODES['WRITER_BOUND_TO_ACTIVE_KEY'], + TRUST_REASON_CODES.WRITER_BOUND_TO_ACTIVE_KEY, ])); /** @type {ReadonlySet<string>} */ export const NEGATIVE_CODES = Object.freeze(new Set([ - TRUST_REASON_CODES['WRITER_HAS_NO_ACTIVE_BINDING'], - TRUST_REASON_CODES['WRITER_BOUND_KEY_REVOKED'], - TRUST_REASON_CODES['BINDING_REVOKED'], - TRUST_REASON_CODES['KEY_UNKNOWN'], + TRUST_REASON_CODES.WRITER_HAS_NO_ACTIVE_BINDING, + TRUST_REASON_CODES.WRITER_BOUND_KEY_REVOKED, + TRUST_REASON_CODES.BINDING_REVOKED, + TRUST_REASON_CODES.KEY_UNKNOWN, ])); /** @type {ReadonlySet<string>} */ export const SYSTEM_CODES = Object.freeze(new Set([ - TRUST_REASON_CODES['TRUST_REF_MISSING'], - TRUST_REASON_CODES['TRUST_PIN_INVALID'], - TRUST_REASON_CODES['TRUST_RECORD_SCHEMA_INVALID'], - TRUST_REASON_CODES['TRUST_SIGNATURE_INVALID'], - TRUST_REASON_CODES['TRUST_RECORD_CHAIN_INVALID'], - TRUST_REASON_CODES['TRUST_POLICY_INVALID'], + TRUST_REASON_CODES.TRUST_REF_MISSING, +
TRUST_REASON_CODES.TRUST_PIN_INVALID, + TRUST_REASON_CODES.TRUST_RECORD_SCHEMA_INVALID, + TRUST_REASON_CODES.TRUST_SIGNATURE_INVALID, + TRUST_REASON_CODES.TRUST_RECORD_CHAIN_INVALID, + TRUST_REASON_CODES.TRUST_POLICY_INVALID, ])); diff --git a/src/domain/types/WarpErrors.js b/src/domain/types/WarpErrors.js index 58edeaf1..c71c5d8d 100644 --- a/src/domain/types/WarpErrors.js +++ b/src/domain/types/WarpErrors.js @@ -26,6 +26,7 @@ export function hasErrorCode(err) { typeof err === 'object' && err !== null && 'code' in err && + // eslint-disable-next-line @typescript-eslint/dot-notation -- Record requires bracket access (TS4111) typeof (/** @type {Record<string, unknown>} */ (err))['code'] === 'string' ); } @@ -40,6 +41,7 @@ export function hasMessage(err) { typeof err === 'object' && err !== null && 'message' in err && + // eslint-disable-next-line @typescript-eslint/dot-notation -- Record requires bracket access (TS4111) typeof (/** @type {Record<string, unknown>} */ (err))['message'] === 'string' ); } From ef80da5c452b27f56d06b94fa56b9be7298b3791 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 21:37:45 -0700 Subject: [PATCH 33/73] chore: update CHANGELOG and close resolved backlog items Removed 6 resolved backlog items: - bad-code/PROTO_receipt-op-type-redundant (replaced by OpStrategy.receiptName) - bad-code/PROTO_sync-protocol-raw-error (replaced by SyncError) - bad-code/PROTO_audit-receipt-raw-error (replaced by AuditError) - bad-code/PERF_transitive-reduction-redundant-adjlist (already fixed) - asap/DX_restore-dot-notation (restored via @typescript-eslint) - asap/TUI_cli-dirname-fragility (fixed via import.meta.url) Updated CHANGELOG [Unreleased] section with all changes.
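The `import.meta.url` pattern the CLI fix adopts can be sketched as follows. This is a minimal illustration, not code from the repo: `packageRootFrom` and the literal `file:` URL (standing in for `import.meta.url`) are assumptions for demonstration.

```js
import { fileURLToPath } from 'node:url';

// Resolve the package root from a module's URL — no __dirname polyfill.
// In the real CLI the base argument would be import.meta.url.
function packageRootFrom(moduleUrl) {
  // '../..' climbs from bin/cli/<file> up to the package root
  return fileURLToPath(new URL('../..', moduleUrl));
}

packageRootFrom('file:///opt/app/bin/cli/shared.js'); // → '/opt/app/' on POSIX
```

Because resolution is anchored to the module URL rather than the process working directory, the same code keeps working under `npm link` and across Node, Bun, and Deno.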
--- CHANGELOG.md | 8 +++++- .../backlog/asap/DX_restore-dot-notation.md | 12 -------- .../backlog/asap/TUI_cli-dirname-fragility.md | 28 ------------------- ..._transitive-reduction-redundant-adjlist.md | 16 ----------- .../bad-code/PROTO_audit-receipt-raw-error.md | 10 ------- .../PROTO_receipt-op-type-redundant.md | 10 ------- .../bad-code/PROTO_sync-protocol-raw-error.md | 9 ------ 7 files changed, 7 insertions(+), 86 deletions(-) delete mode 100644 docs/method/backlog/asap/DX_restore-dot-notation.md delete mode 100644 docs/method/backlog/asap/TUI_cli-dirname-fragility.md delete mode 100644 docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md delete mode 100644 docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md delete mode 100644 docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md delete mode 100644 docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md diff --git a/CHANGELOG.md b/CHANGELOG.md index bfb5821e..24fef184 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -14,14 +14,20 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **Zero-error TypeScript campaign complete** — eliminated all 1,707 `tsc --noEmit` errors across 271 files. Mechanical TS4111 bracket-access sweep (614), null guards for `noUncheckedIndexedAccess`, conditional spreads for `exactOptionalPropertyTypes`, unused variable removal. All 8 pre-push IRONCLAD gates now pass. - **JoinReducer OpStrategy registry** — replaced five triplicated switch statements over 8 canonical op types with a frozen `Map` registry. Each strategy defines `mutate`, `outcome`, `snapshot`, `accumulate`, `validate`. Adding a new op type without all five methods is a hard error at module load time. Cross-path equivalence tests verify `applyFast`, `applyWithReceipt`, and `applyWithDiff` produce identical CRDT state. -- **ESLint `dot-notation` disabled** — conflicts with `noPropertyAccessFromIndexSignature` tsconfig flag. 
The TypeScript flag provides type safety; the ESLint rule is purely stylistic. +- **ESLint `dot-notation` restored** — re-enabled via `@typescript-eslint/dot-notation` which respects `noPropertyAccessFromIndexSignature`. The type-aware variant correctly allows bracket access on index-signature types while enforcing dot notation elsewhere. - `EffectSinkPort.deliver()` return type widened to `DeliveryObservation | DeliveryObservation[]` to match `MultiplexSink` behavior. - **Zero-error lint campaign complete** — eliminated all 1,876 ESLint errors across ~180 source files. Every raw `Error` replaced with domain error classes. Every port stub uses `WarpError` with `E_NOT_IMPLEMENTED`. `MessageCodecInternal` type-poisoning from `@git-stunts/trailer-codec` fixed at root via `unknown` intermediary casts. Errors barrel (`src/domain/errors/index.js`) now exports all 27 error classes. - **Lint ratchet enforcement** — `npm run lint:ratchet` asserts zero ESLint errors codebase-wide. Added as CI Gate 4b. Pre-push hook (Gate 4) already blocked non-zero exits; ratchet makes the invariant explicit and auditable. - **Git hooks wired** — `core.hooksPath` set to `scripts/hooks/` on `npm install`. Pre-commit lints staged JS files. Pre-push runs full 8-gate IRONCLAD firewall. +- **OpStrategy.receiptName** — each OpStrategy entry now carries its own TickReceipt-compatible operation name, eliminating the redundant `RECEIPT_OP_TYPE` lookup tables in JoinReducer and ConflictAnalyzerService. +- **SyncProtocol uses SyncError** — `E_SYNC_DIVERGENCE` now throws `SyncError` instead of a raw `Error` with manually attached code property. +- **AuditReceiptService uses AuditError** — all 16 raw `Error` throws replaced with typed `AuditError` carrying serializable context and machine-readable error codes (`E_AUDIT_INVALID`, `E_AUDIT_CAS_FAILED`, `E_AUDIT_DEGRADED`). 
+- **CLI import.meta.url resolution** — replaced `__dirname` polyfill pattern in CLI with idiomatic `fileURLToPath(new URL('../..', import.meta.url))` for resilient package root resolution. ### Added +- **`AuditError`** — domain error class for audit receipt validation and persistence failures. Exported from package root with four static error codes. + - **Effect emission & delivery observation substrate slice** — new receipt families for outbound effects and their delivery lifecycle. `EffectEmission` records that the system produced an outbound effect candidate at a causal coordinate. `DeliveryObservation` records how a sink handled that emission (delivered, suppressed, failed, skipped). `ExternalizationPolicy` provides execution context (live/replay/inspect) that shapes delivery behavior. Preset lenses `LIVE_LENS`, `REPLAY_LENS`, and `INSPECT_LENS` cover common modes. - **`EffectSinkPort`** — abstract port for effect delivery sinks, following the hexagonal architecture pattern. - **`MultiplexSink`** — domain service that fans out one emission to multiple child sinks (composite pattern over `EffectSinkPort`). diff --git a/docs/method/backlog/asap/DX_restore-dot-notation.md b/docs/method/backlog/asap/DX_restore-dot-notation.md deleted file mode 100644 index 52d4080f..00000000 --- a/docs/method/backlog/asap/DX_restore-dot-notation.md +++ /dev/null @@ -1,12 +0,0 @@ -# Restore `dot-notation` via `@typescript-eslint/dot-notation` - -**Effort:** S - -## Problem - -ESLint `dot-notation` was disabled globally to resolve conflict with `noPropertyAccessFromIndexSignature`. The proper fix is switching to `@typescript-eslint/dot-notation` which respects the tsconfig flag. This restores lint coverage for actual dot-notation misuse while allowing bracket access on index signatures. 
- -## Notes - -- Source: P1b priority tier (TSC Zero Campaign Drift Audit) -- High priority diff --git a/docs/method/backlog/asap/TUI_cli-dirname-fragility.md b/docs/method/backlog/asap/TUI_cli-dirname-fragility.md deleted file mode 100644 index af6db976..00000000 --- a/docs/method/backlog/asap/TUI_cli-dirname-fragility.md +++ /dev/null @@ -1,28 +0,0 @@ -# CLI __dirname Path Traversal Fragility - -**Effort:** S - -## Problem - -`bin/cli/shared.js` uses relative path traversal to locate static assets: - -```js -const __dirname = path.dirname(__filename); -const templateDir = path.resolve(__dirname, '..', '..', 'scripts', 'hooks'); -const rawJson = fs.readFileSync(path.resolve(__dirname, '..', '..', 'package.json'), 'utf8'); -``` - -This binds runtime behavior to the physical directory layout of the source repository. If the CLI is ever bundled (esbuild, rollup, webpack) or installed via a global npm link with a different structure, `__dirname` resolves incorrectly and causes fatal runtime crashes. - -## Fix Options - -1. **Inline static assets at build time** — read `package.json` version at build, embed as a constant. Hook templates could be inlined or resolved relative to `import.meta.url`. -2. **Use `import.meta.url`** — already ESM, so `new URL('../..', import.meta.url)` is the standard pattern. More resilient than `__dirname` polyfill. -3. **`createRequire(import.meta.url)` for JSON** — `createRequire(import.meta.url)('../package.json')` works in Node, Bun, Deno. 
- -## Notes - -- The CLI already uses ESM (`import` statements throughout) -- `import.meta.url` is the idiomatic ESM approach -- This also affects `bin/cli/commands/install-hooks.js` which references `scripts/hooks/` -- Multi-runtime support (Node/Bun/Deno) means the fix must work across all three diff --git a/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md b/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md deleted file mode 100644 index 863b3d86..00000000 --- a/docs/method/backlog/bad-code/PERF_transitive-reduction-redundant-adjlist.md +++ /dev/null @@ -1,16 +0,0 @@ -# transitiveReduction builds adjacency list redundantly - -**Effort:** S - -## Problem - -After getting `_neighborEdgeMap` from topo sort (which already has -full neighbor data), `transitiveReduction()` builds a *second* -`adjList: Map` by extracting just the neighborIds. -Two representations of the same edge set sit in memory -simultaneously. - -## Fix - -Use `_neighborEdgeMap` directly in the BFS, accessing `.neighborId` -inline instead of pre-extracting. diff --git a/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md b/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md deleted file mode 100644 index 16e6fce0..00000000 --- a/docs/method/backlog/bad-code/PROTO_audit-receipt-raw-error.md +++ /dev/null @@ -1,10 +0,0 @@ -# AuditReceiptService uses raw Error (18 occurrences) - -**Effort:** S - -## Problem - -All validation and CAS errors in `AuditReceiptService.js` throw -plain `Error` instead of a domain error class. Should use a -dedicated `AuditError` (which doesn't exist yet) or `PatchError` -for validation failures and `PersistenceError` for CAS conflicts. 
diff --git a/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md b/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md deleted file mode 100644 index 29ab944a..00000000 --- a/docs/method/backlog/bad-code/PROTO_receipt-op-type-redundant.md +++ /dev/null @@ -1,10 +0,0 @@ -# RECEIPT_OP_TYPE mapping redundant with OpStrategy - -**Effort:** XS - -## Problem - -`JoinReducer.js` `RECEIPT_OP_TYPE` maps internal names to receipt -names (e.g. `NodeRemove` -> `NodeTombstone`). With the OpStrategy -registry, this could be a `receiptName` property on each strategy -object. Low priority — cosmetic. diff --git a/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md b/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md deleted file mode 100644 index 0f6c5632..00000000 --- a/docs/method/backlog/bad-code/PROTO_sync-protocol-raw-error.md +++ /dev/null @@ -1,9 +0,0 @@ -# SyncProtocol uses raw Error with manual code property - -**Effort:** XS - -## Problem - -`SyncProtocol.js` (~line 233) constructs `new Error()` then manually -casts to `Error & { code: string }`. Should use `SyncError` from -domain errors. From a0bb2107aea77bc319ab7029c298a082ccb8e144 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 22:13:38 -0700 Subject: [PATCH 34/73] refactor: promote AuditReceipt from typedef to concrete class AuditReceipt is now a real JavaScript class with a constructor, frozen after construction. This replaces the JSDoc @typedef that existed only at type-check time. 
- Class fields declared in alphabetical order for deterministic Object.keys() (canonical CBOR serialization depends on this) - buildReceiptRecord() returns AuditReceipt instances - AuditVerifierService imports the class, validates CBOR-decoded blobs via Record bracket access (honest typing for deserialized data) - Deleted UncheckedAuditReceipt typedef (never needed with proper Record cast) --- src/domain/services/AuditReceiptService.js | 55 ++++++++++++++++++--- src/domain/services/AuditVerifierService.js | 45 +++++++---------- 2 files changed, 64 insertions(+), 36 deletions(-) diff --git a/src/domain/services/AuditReceiptService.js b/src/domain/services/AuditReceiptService.js index 650b6411..923ce539 100644 --- a/src/domain/services/AuditReceiptService.js +++ b/src/domain/services/AuditReceiptService.js @@ -81,6 +81,46 @@ export async function computeOpsDigest(ops, crypto) { return await crypto.hash('sha256', combined); } +// ============================================================================ +// Receipt Value Object +// ============================================================================ + +/** + * Immutable audit receipt value object. + * + * Instances are frozen after construction. Keys are stored in sorted + * order for deterministic CBOR serialization. + */ +export class AuditReceipt { + /** @type {string} */ dataCommit; + /** @type {string} */ graphName; + /** @type {string} */ opsDigest; + /** @type {string} */ prevAuditCommit; + /** @type {number} */ tickEnd; + /** @type {number} */ tickStart; + /** @type {number} */ timestamp; + /** @type {number} */ version; + /** @type {string} */ writerId; + + /** + * Creates an immutable audit receipt from validated fields. 
+ * @param {{ version: number, graphName: string, writerId: string, dataCommit: string, tickStart: number, tickEnd: number, opsDigest: string, prevAuditCommit: string, timestamp: number }} fields + */ + constructor({ version, graphName, writerId, dataCommit, tickStart, tickEnd, opsDigest, prevAuditCommit, timestamp }) { + // Alphabetical key order for canonical CBOR + this.dataCommit = dataCommit; + this.graphName = graphName; + this.opsDigest = opsDigest; + this.prevAuditCommit = prevAuditCommit; + this.tickEnd = tickEnd; + this.tickStart = tickStart; + this.timestamp = timestamp; + this.version = version; + this.writerId = writerId; + Object.freeze(this); + } +} + // ============================================================================ // Receipt Construction // ============================================================================ @@ -92,7 +132,7 @@ const OID_HEX_PATTERN = /^[0-9a-f]{40}([0-9a-f]{24})?$/; * Validates and builds a frozen receipt record with keys in sorted order. 
* * @param {{ version: number, graphName: string, writerId: string, dataCommit: string, tickStart: number, tickEnd: number, opsDigest: string, prevAuditCommit: string, timestamp: number }} fields - * @returns {Readonly>} + * @returns {AuditReceipt} * @throws {AuditError} If any field is invalid (code: E_AUDIT_INVALID) */ export function buildReceiptRecord(fields) { @@ -165,17 +205,16 @@ export function buildReceiptRecord(fields) { throw new AuditError(`Invalid timestamp: exceeds Number.MAX_SAFE_INTEGER: ${timestamp}`, { context: { timestamp } }); } - // Build with keys in sorted order (canonical for CBOR) - return Object.freeze({ - dataCommit: dc, + return new AuditReceipt({ + version, graphName, + writerId, + dataCommit: dc, + tickStart, + tickEnd, opsDigest: od, prevAuditCommit: pac, - tickEnd, - tickStart, timestamp, - version, - writerId, }); } diff --git a/src/domain/services/AuditVerifierService.js b/src/domain/services/AuditVerifierService.js index 6795a047..9a53f331 100644 --- a/src/domain/services/AuditVerifierService.js +++ b/src/domain/services/AuditVerifierService.js @@ -16,18 +16,7 @@ * @see docs/specs/AUDIT_RECEIPT.md Section 8 */ -/** - * @typedef {Object} AuditReceipt - * @property {number} version - * @property {string} graphName - * @property {string} writerId - * @property {string} dataCommit - * @property {string} opsDigest - * @property {string} prevAuditCommit - * @property {number} tickStart - * @property {number} tickEnd - * @property {number} timestamp - */ +/** @typedef {import('./AuditReceiptService.js').AuditReceipt} AuditReceipt */ import { buildAuditPrefix, buildAuditRef } from '../utils/RefLayout.js'; import { decodeAuditMessage } from './AuditMessageCodec.js'; @@ -92,7 +81,7 @@ function validateReceiptSchema(receipt) { if (receipt === null || receipt === undefined || typeof receipt !== 'object') { return 'receipt is not an object'; } - const rec = /** @type {{ version?: unknown, graphName?: unknown, writerId?: unknown, dataCommit?: 
unknown, opsDigest?: unknown, prevAuditCommit?: unknown, tickStart?: unknown, tickEnd?: unknown, timestamp?: unknown }} */ (receipt); + const rec = /** @type {Record<string, unknown>} */ (receipt); const keys = Object.keys(rec); if (keys.length !== 9) { return `expected 9 fields, got ${keys.length}`; @@ -106,35 +95,35 @@ return `missing field: ${k}`; } } - if (rec.version !== 1) { - return `unsupported version: ${rec.version}`; + if (rec['version'] !== 1) { + return `unsupported version: ${rec['version']}`; } - if (typeof rec.graphName !== 'string' || rec.graphName.length === 0) { + if (typeof rec['graphName'] !== 'string' || rec['graphName'].length === 0) { return 'graphName must be a non-empty string'; } - if (typeof rec.writerId !== 'string' || rec.writerId.length === 0) { + if (typeof rec['writerId'] !== 'string' || rec['writerId'].length === 0) { return 'writerId must be a non-empty string'; } - if (typeof rec.dataCommit !== 'string') { + if (typeof rec['dataCommit'] !== 'string') { return 'dataCommit must be a string'; } - if (typeof rec.opsDigest !== 'string') { + if (typeof rec['opsDigest'] !== 'string') { return 'opsDigest must be a string'; } - if (typeof rec.prevAuditCommit !== 'string') { + if (typeof rec['prevAuditCommit'] !== 'string') { return 'prevAuditCommit must be a string'; } - if (!Number.isInteger(rec.tickStart) || /** @type {number} */ (rec.tickStart) < 1) { - return `tickStart must be integer >= 1, got ${rec.tickStart}`; + if (!Number.isInteger(rec['tickStart']) || /** @type {number} */ (rec['tickStart']) < 1) { + return `tickStart must be integer >= 1, got ${rec['tickStart']}`; } - if (!Number.isInteger(rec.tickEnd) || /** @type {number} */ (rec.tickEnd) < /** @type {number} */ (rec.tickStart)) { - return `tickEnd must be integer >= tickStart, got ${rec.tickEnd}`; + if (!Number.isInteger(rec['tickEnd']) || /** @type {number} */ (rec['tickEnd']) < /** @type {number} */ (rec['tickStart'])) { + return `tickEnd must be
integer >= tickStart, got ${rec['tickEnd']}`; } - if (rec.version === 1 && rec.tickStart !== rec.tickEnd) { - return `v1 requires tickStart === tickEnd, got ${rec.tickStart} !== ${rec.tickEnd}`; + if (rec['version'] === 1 && rec['tickStart'] !== rec['tickEnd']) { + return `v1 requires tickStart === tickEnd, got ${rec['tickStart']} !== ${rec['tickEnd']}`; } - if (!Number.isInteger(rec.timestamp) || /** @type {number} */ (rec.timestamp) < 0) { - return `timestamp must be non-negative integer, got ${rec.timestamp}`; + if (!Number.isInteger(rec['timestamp']) || /** @type {number} */ (rec['timestamp']) < 0) { + return `timestamp must be non-negative integer, got ${rec['timestamp']}`; } return null; } From 235e2d66b3a030728172ac287ceec7ac26f5d30d Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 22:22:30 -0700 Subject: [PATCH 35/73] chore: log 14 typedef-to-class backlog items in bad-code/ MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Each @typedef {Object} that is constructed, stored, and passed around as a real data entity should be a concrete JavaScript class — not a phantom type that vanishes at runtime. 
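The runtime difference can be sketched as follows — illustrative names that echo the audit's `Dot` candidate, not the actual modules:

```js
// The @typedef style: a factory returning a plain frozen object.
// The "type" exists only for the checker; at runtime there is
// nothing but an anonymous Object.
function createDot(actor, counter) {
  return Object.freeze({ actor, counter });
}

// The class style: same data, but with a constructor and a runtime
// identity you can check and grep for.
class Dot {
  constructor(actor, counter) {
    this.actor = actor;
    this.counter = counter;
    Object.freeze(this);
  }
}

const phantom = createDot('writer-a', 1);
const real = new Dot('writer-a', 1);
phantom.constructor === Object; // true — no identity, nothing to grep for
real instanceof Dot;            // true — the entity exists at runtime
```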
Candidates identified by audit: - CRDT primitives: Dot (XS), ORSet (M), LWWRegister (S) - Domain types: EventId (XS), PatchV2 (M), TickReceipt (S), PatchDiff (S) - Effect system: EffectEmission (XS), DeliveryObservation (XS) - Trust: TrustRecord (S), TrustState (S) - Core state: WarpStateV5 (L), BTR (S), StateDiffResult (S) --- .../backlog/bad-code/PROTO_typedef-btr-to-class.md | 9 +++++++++ .../PROTO_typedef-deliveryobservation-to-class.md | 9 +++++++++ .../backlog/bad-code/PROTO_typedef-dot-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-effectemission-to-class.md | 10 ++++++++++ .../backlog/bad-code/PROTO_typedef-eventid-to-class.md | 8 ++++++++ .../backlog/bad-code/PROTO_typedef-lww-to-class.md | 9 +++++++++ .../backlog/bad-code/PROTO_typedef-orset-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-patchdiff-to-class.md | 9 +++++++++ .../backlog/bad-code/PROTO_typedef-patchv2-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-statediffresult-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-tickreceipt-to-class.md | 10 ++++++++++ .../bad-code/PROTO_typedef-trustrecord-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-truststate-to-class.md | 9 +++++++++ .../bad-code/PROTO_typedef-warpstatev5-to-class.md | 10 ++++++++++ 14 files changed, 128 insertions(+) create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md create mode 100644 
docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-statediffresult-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md create mode 100644 docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md new file mode 100644 index 00000000..e057bf44 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md @@ -0,0 +1,9 @@ +# Promote BTR from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/services/BoundaryTransitionRecord.js` defines `BTR` as a +`@typedef {Object}`. Tamper-evident package — constructed, frozen, +verified, serialized. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md new file mode 100644 index 00000000..45df6f55 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md @@ -0,0 +1,9 @@ +# Promote DeliveryObservation from @typedef to class + +**Effort:** XS + +## Problem + +`src/domain/types/DeliveryObservation.js` defines `DeliveryObservation` +as a `@typedef {Object}` with a factory (`createDeliveryObservation`) +returning a frozen object. Should be a class. 
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md new file mode 100644 index 00000000..dcacdfa0 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md @@ -0,0 +1,9 @@ +# Promote Dot from @typedef to class + +**Effort:** XS + +## Problem + +`src/domain/crdt/Dot.js` defines `Dot` as a `@typedef {Object}` but it +has factory (`createDot`), encode/decode, and comparison functions. Should +be a class with those as methods. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md new file mode 100644 index 00000000..5b0efde8 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md @@ -0,0 +1,10 @@ +# Promote EffectEmission from @typedef to class + +**Effort:** XS + +## Problem + +`src/domain/types/EffectEmission.js` defines `EffectEmission` as a +`@typedef {Object}` but has a factory (`createEffectEmission`) that +returns a frozen object. Should be a class. `EffectCoordinate` could +merge into it as a nested shape or separate class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md new file mode 100644 index 00000000..d141a91d --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md @@ -0,0 +1,8 @@ +# Promote EventId from @typedef to class + +**Effort:** XS + +## Problem + +`src/domain/utils/EventId.js` defines `EventId` as a `@typedef {Object}` +but has a factory (`createEventId`) and comparison logic. Should be a class. 
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md new file mode 100644 index 00000000..80841e78 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md @@ -0,0 +1,9 @@ +# Promote LWWRegister from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/crdt/LWW.js` defines `LWWRegister` as a `@typedef {Object}` +but it has semilattice merge semantics (`lwwMax`) and setter logic (`lwwSet`). +Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md new file mode 100644 index 00000000..e487fd25 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md @@ -0,0 +1,9 @@ +# Promote ORSet from @typedef to class + +**Effort:** M + +## Problem + +`src/domain/crdt/ORSet.js` defines `ORSet` as a `@typedef {Object}` but +it has 10+ functions operating on it (add, remove, join, compact, contains, +encode). This is a full CRDT data structure — should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md new file mode 100644 index 00000000..04598257 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md @@ -0,0 +1,9 @@ +# Promote PatchDiff from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/types/PatchDiff.js` defines `PatchDiff` as a `@typedef {Object}` +with a factory (`createEmptyDiff`) and merge logic (`mergeDiffs`). Real +data entity accumulated during reduce. Should be a class. 
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md new file mode 100644 index 00000000..171715d2 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md @@ -0,0 +1,9 @@ +# Promote PatchV2 from @typedef to class + +**Effort:** M + +## Problem + +`src/domain/types/WarpTypesV2.js` defines `PatchV2` as a `@typedef {Object}`. +Core domain entity — created by PatchBuilder, serialized to CBOR, consumed +by JoinReducer. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-statediffresult-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-statediffresult-to-class.md new file mode 100644 index 00000000..15e55c12 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-statediffresult-to-class.md @@ -0,0 +1,9 @@ +# Promote StateDiffResult from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/services/StateDiff.js` defines `StateDiffResult` as a +`@typedef {Object}`. Computed diffs pushed to subscribers via +`graph.subscribe()`. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md new file mode 100644 index 00000000..d0f2ff17 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md @@ -0,0 +1,10 @@ +# Promote TickReceipt from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/types/TickReceipt.js` defines `TickReceipt` as a +`@typedef {Object}` with a factory (`createTickReceipt`), validation, +canonical JSON serialization, and public export. Should be a class. +Part of the public API surface. 
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md new file mode 100644 index 00000000..5d982e48 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md @@ -0,0 +1,9 @@ +# Promote TrustRecord from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/trust/TrustStateBuilder.js` defines `TrustRecord` as a +`@typedef {Object}`. Parsed, validated, and chained — full entity +lifecycle. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md new file mode 100644 index 00000000..8abe3a13 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md @@ -0,0 +1,9 @@ +# Promote TrustState from @typedef to class + +**Effort:** S + +## Problem + +`src/domain/trust/TrustStateBuilder.js` defines `TrustState` as a +`@typedef {Object}`. Built by `buildState`, frozen, queried by +TrustEvaluator. Maps of bindings/keys. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md new file mode 100644 index 00000000..a62b1f3c --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md @@ -0,0 +1,10 @@ +# Promote WarpStateV5 from @typedef to class + +**Effort:** L + +## Problem + +`src/domain/services/JoinReducer.js` defines `WarpStateV5` as a +`@typedef {Object}`. This is the core CRDT materialized state — +constructed, cloned, mutated by all apply paths, checkpointed, and +serialized. Large blast radius; many consumers. From addb2908cb6277382a37626d502969a18186038f Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 22:25:21 -0700 Subject: [PATCH 36/73] chore: add NO_DOGS_NO_MASTERS legend (NDNM) New legend for god object decomposition and typedef-to-class liberation. 
Covers 2 god objects (WarpRuntime, StrandService) and 14 phantom typedefs that should be real JavaScript classes. "Break up the gods. Free their vassals." See docs/method/legends/NO_DOGS_NO_MASTERS.md for the full manifesto. --- docs/method/legends/NO_DOGS_NO_MASTERS.md | 102 ++++++++++++++++++++++ 1 file changed, 102 insertions(+) create mode 100644 docs/method/legends/NO_DOGS_NO_MASTERS.md diff --git a/docs/method/legends/NO_DOGS_NO_MASTERS.md b/docs/method/legends/NO_DOGS_NO_MASTERS.md new file mode 100644 index 00000000..7fe241fc --- /dev/null +++ b/docs/method/legends/NO_DOGS_NO_MASTERS.md @@ -0,0 +1,102 @@ +# NO_DOGS_NO_MASTERS + +Break up the gods. Free their vassals. + +## What it covers + +God object decomposition and phantom-type liberation. Two sides +of the same coin: god objects hoard responsibilities behind a +single class, and `@typedef {Object}` phantoms masquerade as data +structures while contributing nothing at runtime. + +**The Gods** — bloated service classes that own everything and +delegate nothing. StrandService (2,048 LOC, 40+ methods). +WarpRuntime (6,613 LOC across warp/ mixins). They know too much, +do too much, and make every edit a full-context-window affair. + +**The Vassals** — `@typedef {Object}` shapes that get constructed, +frozen, serialized, passed around, and queried — doing all the +work of a class without ever becoming one. They exist only at +type-check time. No `instanceof`. No constructors. No methods. +Phantom types serving phantom masters. + +The fix is the same for both: real JavaScript. Classes with +constructors that validate. Methods that live next to their data. +Files you can grep for with `instanceof`. Code that exists at +runtime because it has something to do at runtime. + +## Who cares + +### Sponsor human + +James — wants to open a file and find one thing. Wants `instanceof` +to work. 
Wants the TypeScript layer to describe reality, not +invent a parallel universe of shapes that vanish when you +`console.log` them. + +### Sponsor agent + +Claude — god objects force full-file reads that burn context window. +Phantom types force guessing at runtime shapes. Both multiply the +cost of every edit. A 200-LOC class with a constructor is readable +in one pass. A 2,000-LOC god object with 14 typedef vassals is not. + +## What success looks like + +- No service file exceeds 500 LOC +- Every data entity that gets constructed is a `class`, not a + `@typedef {Object}` +- `@typedef` is reserved for genuinely type-only concepts: unions, + callback signatures, import aliases +- `grep -rn '@typedef {Object}' src/domain/` returns only options + bags, never entities +- `instanceof` works on every domain value object + +## How you know + +- Count of `@typedef {Object}` in `src/domain/` trends toward zero + (options bags excepted) +- God object LOC counts shrink each cycle +- New domain entities are born as classes, never typedefs + +## Current surface + +### The Gods + +| Item | LOC | Location | +|------|-----|----------| +| `PROTO_warpruntime-god-class` (asap/) | 6,613 | WarpRuntime + warp/ mixins | +| `PROTO_strand-service-god-object` (bad-code/) | 2,048 | StrandService.js | + +### The Vassals (typedef → class) + +| Item | Effort | Entity | +|------|--------|--------| +| `PROTO_typedef-dot-to-class` | XS | Dot (CRDT primitive) | +| `PROTO_typedef-eventid-to-class` | XS | EventId (causal ordering) | +| `PROTO_typedef-effectemission-to-class` | XS | EffectEmission (domain event) | +| `PROTO_typedef-deliveryobservation-to-class` | XS | DeliveryObservation (trace record) | +| `PROTO_typedef-lww-to-class` | S | LWWRegister (CRDT) | +| `PROTO_typedef-tickreceipt-to-class` | S | TickReceipt (public API) | +| `PROTO_typedef-patchdiff-to-class` | S | PatchDiff (reduce output) | +| `PROTO_typedef-trustrecord-to-class` | S | TrustRecord (trust chain) | +| 
`PROTO_typedef-truststate-to-class` | S | TrustState (trust aggregate) | +| `PROTO_typedef-btr-to-class` | S | BTR (tamper-evident package) | +| `PROTO_typedef-statediffresult-to-class` | S | StateDiffResult (subscriber diffs) | +| `PROTO_typedef-orset-to-class` | M | ORSet (CRDT, 10+ operations) | +| `PROTO_typedef-patchv2-to-class` | M | PatchV2 (core domain entity) | +| `PROTO_typedef-warpstatev5-to-class` | L | WarpStateV5 (CRDT materialized state) | + +### Already liberated + +- `AuditReceipt` — promoted from typedef to class (this session) + +## Legend code + +`NDNM` — for backlog items that belong to this legend. + +``` +NDNM_warpruntime-decomposition.md +NDNM_typedef-tickreceipt.md +NDNM_typedef-orset.md +``` From 48889be4bbfa23c7ca51f9427cab7872e5190fc2 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 22:55:20 -0700 Subject: [PATCH 37/73] refactor(NDNM): extract StrandController from WarpRuntime MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Moved strand.methods.js (200 LOC) and conflict.methods.js (23 LOC) into a new StrandController service class. WarpRuntime now delegates 14 strand/conflict methods to this._strandController via defineProperty loops, following the SyncController precedent. StrandController caches a single StrandService instance instead of creating one per method call. ConflictAnalyzerService is still instantiated per call (stateless analyzer). Phase 1a of the WarpRuntime god class decomposition. No public API surface changes — all 14 methods remain on the prototype. 
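The delegation shape, reduced to its essentials — a minimal sketch, not the real classes; `Host` stands in for WarpRuntime and `Controller` for StrandController, and the method names are illustrative:

```javascript
// Sketch of the defineProperty delegation pattern: the host constructs one
// controller instance up front, then wires prototype methods that forward
// to it. Illustrative names only.
class Controller {
  constructor(host) { this._host = host; }
  greet(name) { return `hello ${name} from ${this._host.id}`; }
}

class Host {
  constructor(id) {
    this.id = id;
    // Single cached controller instance, not one per method call.
    this._controller = new Controller(this);
  }
}

for (const method of ['greet']) {
  Object.defineProperty(Host.prototype, method, {
    // `function` (not arrow) so `this` is the Host instance at call time.
    value: function (...args) {
      const ctrl = this._controller;
      return ctrl[method](...args);
    },
    writable: true,
    configurable: true,
    enumerable: false, // delegates stay off Object.keys(host)
  });
}
```

Callers see no difference: `new Host('h1').greet('x')` resolves through the prototype delegate exactly as a directly defined method would.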
--- src/domain/WarpRuntime.js | 33 +++- src/domain/services/StrandController.js | 182 +++++++++++++++++++++ src/domain/warp/conflict.methods.js | 23 --- src/domain/warp/strand.methods.js | 200 ------------------------ 4 files changed, 211 insertions(+), 227 deletions(-) create mode 100644 src/domain/services/StrandController.js delete mode 100644 src/domain/warp/conflict.methods.js delete mode 100644 src/domain/warp/strand.methods.js diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index 87e50b72..6d2ef0d2 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -19,6 +19,7 @@ import defaultClock from './utils/defaultClock.js'; import LogicalTraversal from './services/LogicalTraversal.js'; import LRUCache from './utils/LRUCache.js'; import SyncController from './services/SyncController.js'; +import StrandController from './services/StrandController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; @@ -32,8 +33,6 @@ import * as checkpointMethods from './warp/checkpoint.methods.js'; import * as patchMethods from './warp/patch.methods.js'; import * as materializeMethods from './warp/materialize.methods.js'; import * as materializeAdvancedMethods from './warp/materializeAdvanced.methods.js'; -import * as strandMethods from './warp/strand.methods.js'; -import * as conflictMethods from './warp/conflict.methods.js'; import * as comparisonMethods from './warp/comparison.methods.js'; /** @typedef {import('./types/WarpPersistence.js').CorePersistence} CorePersistence */ @@ -310,6 +309,9 @@ export default class WarpRuntime { ...(trustGate !== undefined ? 
{ trustGate } : {}), }); + /** @type {StrandController} */ + this._strandController = new StrandController(this); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -654,11 +656,34 @@ wireWarpMethods(WarpRuntime, [ patchMethods, materializeMethods, materializeAdvancedMethods, - strandMethods, - conflictMethods, comparisonMethods, ]); +// ── Strand + conflict methods: direct delegation to StrandController ──────── +const strandDelegates = /** @type {const} */ ([ + 'createStrand', 'braidStrand', 'getStrand', 'listStrands', 'dropStrand', + 'materializeStrand', 'getStrandPatches', 'patchesForStrand', + 'createStrandPatch', 'patchStrand', + 'queueStrandIntent', 'listStrandIntents', 'tickStrand', + 'analyzeConflicts', +]); +for (const method of strandDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to StrandController. @param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record} */ (/** @type {unknown} */ (self._strandController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Sync methods: direct delegation to SyncController (no stub file) ──────── const syncDelegates = /** @type {const} */ ([ 'getFrontier', 'hasFrontierChanged', 'status', diff --git a/src/domain/services/StrandController.js b/src/domain/services/StrandController.js new file mode 100644 index 00000000..bea98445 --- /dev/null +++ b/src/domain/services/StrandController.js @@ -0,0 +1,182 @@ +/** + * StrandController — encapsulates strand and conflict analysis operations. + * + * Extracted from strand.methods.js and conflict.methods.js. 
WarpRuntime + * delegates directly to this controller via defineProperty loops. + * + * @module domain/services/StrandController + */ + +import StrandService from './StrandService.js'; +import ConflictAnalyzerService from './ConflictAnalyzerService.js'; + +/** + * The host interface that StrandController depends on. + * + * StrandService and ConflictAnalyzerService both accept `{ graph }` where + * graph is the full WarpRuntime instance. This typedef documents that + * coupling explicitly. + * + * @typedef {import('../WarpRuntime.js').default} StrandHost + */ + +export default class StrandController { + /** @type {StrandHost} */ + _host; + + /** @type {StrandService} */ + _strandService; + + /** + * Creates a StrandController bound to a WarpRuntime host. + * @param {StrandHost} host - The WarpRuntime instance + */ + constructor(host) { + this._host = host; + this._strandService = new StrandService({ graph: host }); + } + + // ── Strand lifecycle ──────────────────────────────────────────────────── + + /** + * Creates a new strand with the given options. + * @param {import('./StrandService.js').StrandCreateOptions} [options] + * @returns {Promise} + */ + async createStrand(options) { + return await this._strandService.create(options); + } + + /** + * Braids a strand, merging its overlay back into the base graph. + * @param {string} strandId + * @param {import('./StrandService.js').StrandBraidOptions} [options] + * @returns {Promise} + */ + async braidStrand(strandId, options) { + return await this._strandService.braid(strandId, options); + } + + /** + * Retrieves the descriptor for a strand by its identifier. + * @param {string} strandId + * @returns {Promise} + */ + async getStrand(strandId) { + return await this._strandService.get(strandId); + } + + /** + * Lists all strand descriptors in the current graph. 
+ * @returns {Promise} + */ + async listStrands() { + return await this._strandService.list(); + } + + /** + * Drops (deletes) a strand, removing its refs and overlay data. + * @param {string} strandId + * @returns {Promise} + */ + async dropStrand(strandId) { + return await this._strandService.drop(strandId); + } + + // ── Strand materialization & queries ───────────────────────────────────── + + /** + * Materializes the graph state scoped to a single strand. + * @param {string} strandId + * @param {{ receipts?: boolean, ceiling?: number|null }} [options] + * @returns {Promise} + */ + async materializeStrand(strandId, options) { + return await this._strandService.materialize(strandId, options); + } + + /** + * Retrieves all patch entries belonging to a strand. + * @param {string} strandId + * @param {{ ceiling?: number|null }} [options] + * @returns {Promise>} + */ + async getStrandPatches(strandId, options) { + return await this._strandService.getPatchEntries(strandId, options); + } + + /** + * Returns the patch SHAs that touched a given entity within a strand. + * @param {string} strandId + * @param {string} entityId + * @param {{ ceiling?: number|null }} [options] + * @returns {Promise} + */ + async patchesForStrand(strandId, entityId, options) { + return await this._strandService.patchesFor(strandId, entityId, options); + } + + // ── Strand patching ───────────────────────────────────────────────────── + + /** + * Creates a PatchBuilderV2 scoped to a strand for manual patch construction. + * @param {string} strandId + * @returns {Promise} + */ + async createStrandPatch(strandId) { + return await this._strandService.createPatchBuilder(strandId); + } + + /** + * Applies a patch to a strand using a builder callback and commits it. 
+ * @param {string} strandId + * @param {(p: import('./PatchBuilderV2.js').PatchBuilderV2) => void | Promise} build + * @returns {Promise} + */ + async patchStrand(strandId, build) { + return await this._strandService.patch(strandId, build); + } + + // ── Speculative intents ───────────────────────────────────────────────── + + /** + * Queues a speculative intent on a strand without committing it. + * @param {string} strandId + * @param {(p: import('./PatchBuilderV2.js').PatchBuilderV2) => void | Promise} build + * @returns {Promise<{ intentId: string, enqueuedAt: string, patch: import('../types/WarpTypesV2.js').PatchV2, reads: string[], writes: string[], contentBlobOids: string[] }>} + */ + async queueStrandIntent(strandId, build) { + return await this._strandService.queueIntent(strandId, build); + } + + /** + * Lists all pending intents queued on a strand. + * @param {string} strandId + * @returns {Promise>} + */ + async listStrandIntents(strandId) { + return await this._strandService.listIntents(strandId); + } + + /** + * Advances a strand by one tick, draining queued intents with conflict detection. + * @param {string} strandId + * @returns {Promise<{ tickId: string, strandId: string, tickIndex: number, createdAt: string, drainedIntentCount: number, admittedIntentIds: string[], rejected: Array<{ intentId: string, reason: string, conflictsWith: string[], reads: string[], writes: string[] }>, baseOverlayHeadPatchSha: string|null, overlayHeadPatchSha: string|null, overlayPatchShas: string[] }>} + */ + async tickStrand(strandId) { + return await this._strandService.tick(strandId); + } + + // ── Conflict analysis ─────────────────────────────────────────────────── + + /** + * Analyze read-only conflict provenance over either the current frontier + * or an explicit strand, with an optional Lamport ceiling. 
+ * @param {import('./ConflictAnalyzerService.js').ConflictAnalyzeOptions} [options] + * @returns {Promise} + */ + async analyzeConflicts(options) { + const analyzer = new ConflictAnalyzerService({ graph: this._host }); + return await analyzer.analyze(options); + } +} diff --git a/src/domain/warp/conflict.methods.js b/src/domain/warp/conflict.methods.js deleted file mode 100644 index 483eba82..00000000 --- a/src/domain/warp/conflict.methods.js +++ /dev/null @@ -1,23 +0,0 @@ -/** - * Conflict analysis methods for WarpRuntime. - * - * @module domain/warp/conflict.methods - */ - -import ConflictAnalyzerService from '../services/ConflictAnalyzerService.js'; - -/** - * Analyze read-only conflict provenance over either the current frontier or - * an explicit strand, with an optional Lamport ceiling. - * - * This method performs zero durable writes. It does not materialize or mutate - * cached graph state, checkpoints, or persistent caches. - * - * @this {import('../WarpRuntime.js').default} - * @param {import('../services/ConflictAnalyzerService.js').ConflictAnalyzeOptions} [options] - * @returns {Promise} - */ -export async function analyzeConflicts(options) { - const analyzer = new ConflictAnalyzerService({ graph: this }); - return await analyzer.analyze(options); -} diff --git a/src/domain/warp/strand.methods.js b/src/domain/warp/strand.methods.js deleted file mode 100644 index bd72803e..00000000 --- a/src/domain/warp/strand.methods.js +++ /dev/null @@ -1,200 +0,0 @@ -/** - * Strand methods for WarpRuntime. - * - * @module domain/warp/strand.methods - */ - -import StrandService from '../services/StrandService.js'; - -/** - * Creates a new strand with the given options. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {import('../services/StrandService.js').StrandCreateOptions} [options] - * @returns {Promise} - */ -export async function createStrand(options) { - const service = new StrandService({ graph: this }); - return await service.create(options); -} - -/** - * Braids a strand, merging its overlay back into the base graph. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {import('../services/StrandService.js').StrandBraidOptions} [options] - * @returns {Promise} - */ -export async function braidStrand(strandId, options) { - const service = new StrandService({ graph: this }); - return await service.braid(strandId, options); -} - -/** - * Retrieves the descriptor for a strand by its identifier. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @returns {Promise} - */ -export async function getStrand(strandId) { - const service = new StrandService({ graph: this }); - return await service.get(strandId); -} - -/** - * Lists all strand descriptors in the current graph. - * - * @this {import('../WarpRuntime.js').default} - * @returns {Promise} - */ -export async function listStrands() { - const service = new StrandService({ graph: this }); - return await service.list(); -} - -/** - * Drops (deletes) a strand, removing its refs and overlay data. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @returns {Promise} - */ -export async function dropStrand(strandId) { - const service = new StrandService({ graph: this }); - return await service.drop(strandId); -} - -/** - * Materializes the graph state scoped to a single strand. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {{ receipts?: boolean, ceiling?: number|null }} [options] - * @returns {Promise} - */ -export async function materializeStrand(strandId, options) { - const service = new StrandService({ graph: this }); - return await service.materialize(strandId, options); -} - -/** - * Retrieves all patch entries belonging to a strand. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {{ ceiling?: number|null }} [options] - * @returns {Promise>} - */ -export async function getStrandPatches(strandId, options) { - const service = new StrandService({ graph: this }); - return await service.getPatchEntries(strandId, options); -} - -/** - * Returns the patch SHAs that touched a given entity within a strand. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {string} entityId - * @param {{ ceiling?: number|null }} [options] - * @returns {Promise} - */ -export async function patchesForStrand(strandId, entityId, options) { - const service = new StrandService({ graph: this }); - return await service.patchesFor(strandId, entityId, options); -} - -/** - * Creates a PatchBuilderV2 scoped to a strand for manual patch construction. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @returns {Promise} - */ -export async function createStrandPatch(strandId) { - const service = new StrandService({ graph: this }); - return await service.createPatchBuilder(strandId); -} - -/** - * Applies a patch to a strand using a builder callback and commits it. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {(p: import('../services/PatchBuilderV2.js').PatchBuilderV2) => void | Promise} build - * @returns {Promise} - */ -export async function patchStrand(strandId, build) { - const service = new StrandService({ graph: this }); - return await service.patch(strandId, build); -} - -/** - * Queues a speculative intent on a strand without committing it. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @param {(p: import('../services/PatchBuilderV2.js').PatchBuilderV2) => void | Promise} build - * @returns {Promise<{ - * intentId: string, - * enqueuedAt: string, - * patch: import('../types/WarpTypesV2.js').PatchV2, - * reads: string[], - * writes: string[], - * contentBlobOids: string[] - * }>} - */ -export async function queueStrandIntent(strandId, build) { - const service = new StrandService({ graph: this }); - return await service.queueIntent(strandId, build); -} - -/** - * Lists all pending intents queued on a strand. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @returns {Promise>} - */ -export async function listStrandIntents(strandId) { - const service = new StrandService({ graph: this }); - return await service.listIntents(strandId); -} - -/** - * Advances a strand by one tick, draining queued intents with conflict detection. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} strandId - * @returns {Promise<{ - * tickId: string, - * strandId: string, - * tickIndex: number, - * createdAt: string, - * drainedIntentCount: number, - * admittedIntentIds: string[], - * rejected: Array<{ - * intentId: string, - * reason: string, - * conflictsWith: string[], - * reads: string[], - * writes: string[] - * }>, - * baseOverlayHeadPatchSha: string|null, - * overlayHeadPatchSha: string|null, - * overlayPatchShas: string[] - * }>} - */ -export async function tickStrand(strandId) { - const service = new StrandService({ graph: this }); - return await service.tick(strandId); -} From 6080e268d80abe5ffb9f039b6bc3fc69d49bf1b5 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 23:02:42 -0700 Subject: [PATCH 38/73] refactor(NDNM): extract ComparisonController from WarpRuntime MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Moved comparison.methods.js (1,088 LOC) into ComparisonController service class. The 5 public methods (compareCoordinates, compareStrand, planCoordinateTransfer, planStrandTransfer, buildPatchDivergence) are now instance methods on the controller; ~40 private helpers remain as module-level functions receiving the host as a `graph` parameter. Key change: resolveComparisonSide() no longer uses .call(this) to thread the host — it takes graph as an explicit first parameter, making the data flow honest. Phase 1b of the WarpRuntime god class decomposition. No public API surface changes. 
--- src/domain/WarpRuntime.js | 28 +++- .../ComparisonController.js} | 153 +++++++++++++----- 2 files changed, 136 insertions(+), 45 deletions(-) rename src/domain/{warp/comparison.methods.js => services/ComparisonController.js} (88%) diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index 6d2ef0d2..04156820 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -20,6 +20,7 @@ import LogicalTraversal from './services/LogicalTraversal.js'; import LRUCache from './utils/LRUCache.js'; import SyncController from './services/SyncController.js'; import StrandController from './services/StrandController.js'; +import ComparisonController from './services/ComparisonController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; @@ -33,7 +34,6 @@ import * as checkpointMethods from './warp/checkpoint.methods.js'; import * as patchMethods from './warp/patch.methods.js'; import * as materializeMethods from './warp/materialize.methods.js'; import * as materializeAdvancedMethods from './warp/materializeAdvanced.methods.js'; -import * as comparisonMethods from './warp/comparison.methods.js'; /** @typedef {import('./types/WarpPersistence.js').CorePersistence} CorePersistence */ @@ -312,6 +312,9 @@ export default class WarpRuntime { /** @type {StrandController} */ this._strandController = new StrandController(this); + /** @type {ComparisonController} */ + this._comparisonController = new ComparisonController(this); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -656,7 +659,6 @@ wireWarpMethods(WarpRuntime, [ patchMethods, materializeMethods, materializeAdvancedMethods, - comparisonMethods, ]); // ── Strand + conflict methods: direct delegation to StrandController ──────── @@ -684,6 +686,28 @@ for (const method of 
strandDelegates) { }); } +// ── Comparison methods: direct delegation to ComparisonController ──────────── +const comparisonDelegates = /** @type {const} */ ([ + 'buildPatchDivergence', 'compareStrand', 'planStrandTransfer', + 'planCoordinateTransfer', 'compareCoordinates', +]); +for (const method of comparisonDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to ComparisonController. @param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record} */ (/** @type {unknown} */ (self._comparisonController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Sync methods: direct delegation to SyncController (no stub file) ──────── const syncDelegates = /** @type {const} */ ([ 'getFrontier', 'hasFrontierChanged', 'status', diff --git a/src/domain/warp/comparison.methods.js b/src/domain/services/ComparisonController.js similarity index 88% rename from src/domain/warp/comparison.methods.js rename to src/domain/services/ComparisonController.js index dae500d9..16d2eee1 100644 --- a/src/domain/warp/comparison.methods.js +++ b/src/domain/services/ComparisonController.js @@ -1,31 +1,28 @@ /** - * Comparison methods for substrate-visible coordinate and strand reads. + * ComparisonController — substrate-visible coordinate and strand comparison. * - * These helpers compare only deterministic substrate facts: - * - visible patch-universe divergence - * - visible node / edge / property deltas - * - optional node-local target diffs + * Extracted from comparison.methods.js. 
Compares only deterministic + * substrate facts: visible patch-universe divergence, visible node/edge/ + * property deltas, and optional node-local target diffs. * - * They do not introduce application semantics. - * - * @module domain/warp/comparison.methods + * @module domain/services/ComparisonController */ import QueryError from '../errors/QueryError.js'; import { buildCoordinateComparisonFact, buildCoordinateTransferPlanFact, -} from '../services/CoordinateFactExport.js'; -import { createStateReaderV5 } from '../services/StateReaderV5.js'; -import { computeStateHashV5 } from '../services/StateSerializerV5.js'; +} from './CoordinateFactExport.js'; +import { createStateReaderV5 } from './StateReaderV5.js'; +import { computeStateHashV5 } from './StateSerializerV5.js'; import { normalizeVisibleStateScopeV1, scopeMaterializedStateV5, scopePatchEntriesV1, -} from '../services/VisibleStateScopeV1.js'; -import { compareVisibleStateV5 } from '../services/VisibleStateComparisonV5.js'; -import { planVisibleStateTransferV5 } from '../services/VisibleStateTransferPlannerV5.js'; -import StrandService from '../services/StrandService.js'; +} from './VisibleStateScopeV1.js'; +import { compareVisibleStateV5 } from './VisibleStateComparisonV5.js'; +import { planVisibleStateTransferV5 } from './VisibleStateTransferPlannerV5.js'; +import StrandService from './StrandService.js'; import { computeChecksum } from '../utils/checksumUtils.js'; import { callInternalRuntimeMethod } from '../utils/callInternalRuntimeMethod.js'; @@ -422,7 +419,7 @@ function buildTargetDivergence(leftEntries, rightEntries, targetId) { * @param {string|null} targetId * @returns {Record} */ -export function buildPatchDivergence(leftEntries, rightEntries, targetId) { +function buildPatchDivergenceImpl(leftEntries, rightEntries, targetId) { const leftShas = uniqueSortedPatchShas(leftEntries); const rightShas = uniqueSortedPatchShas(rightEntries); const rightSet = new Set(rightShas); @@ -772,26 +769,26 @@ async 
function resolveStrandBaseComparisonSide(graph, selector, scope) { /** * Dispatches coordinate side resolution based on selector kind. * - * @this {import('../WarpRuntime.js').default} + * @param {import('../WarpRuntime.js').default} graph * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} scope + * @param {VisibleStateScopeV1|null} [scope] * @returns {Promise>} * @private */ -async function resolveComparisonSide(selector, scope = null) { +async function resolveComparisonSide(graph, selector, scope = null) { if (selector.kind === 'live') { - return await resolveLiveComparisonSide(this, selector, scope); + return await resolveLiveComparisonSide(graph, selector, scope); } if (selector.kind === 'coordinate') { - return await resolveCoordinateComparisonSide(this, selector, scope); + return await resolveCoordinateComparisonSide(graph, selector, scope); } if (selector.kind === 'strand') { - return await resolveStrandComparisonSide(this, selector, scope); + return await resolveStrandComparisonSide(graph, selector, scope); } - return await resolveStrandBaseComparisonSide(this, selector, scope); + return await resolveStrandBaseComparisonSide(graph, selector, scope); } /** @@ -832,7 +829,7 @@ function normalizeAgainstSelector(normalizedStrandId, against, againstCeiling) { * Compares a strand against its base observation, the live frontier, or * another strand. 
* - * @this {import('../WarpRuntime.js').default} + * @param {import('../WarpRuntime.js').default} graph * @param {string} strandId * @param {{ * against?: 'base'|'live'|{ kind: 'strand', strandId: string }, @@ -843,7 +840,7 @@ function normalizeAgainstSelector(normalizedStrandId, against, againstCeiling) { * }} [options] * @returns {Promise} */ -export async function compareStrand(strandId, options = {}) { +async function compareStrandImpl(graph, strandId, options = {}) { const normalizedStrandId = normalizeRequiredString(strandId, 'strandId'); const ceiling = normalizeLamportCeiling(options.ceiling, 'ceiling'); const againstCeiling = normalizeLamportCeiling(options.againstCeiling, 'againstCeiling'); @@ -853,7 +850,7 @@ export async function compareStrand(strandId, options = {}) { const left = { kind: 'strand', strandId: normalizedStrandId, ceiling }; const right = normalizeAgainstSelector(normalizedStrandId, options.against ?? 'base', againstCeiling); - return await this.compareCoordinates({ + return await compareCoordinatesImpl(graph, { left: /** @type {CoordinateComparisonSelectorV1} */ (left), right: /** @type {CoordinateComparisonSelectorV1} */ (right), targetId, @@ -906,7 +903,7 @@ function normalizeIntoSelector(normalizedStrandId, into, intoCeiling) { * Plans a deterministic transfer from one strand into live truth, its * pinned base observation, or another strand. 
* - * @this {import('../WarpRuntime.js').default} + * @param {import('../WarpRuntime.js').default} graph * @param {string} strandId * @param {{ * into?: 'base'|'live'|{ kind: 'strand', strandId: string }, @@ -916,7 +913,7 @@ function normalizeIntoSelector(normalizedStrandId, into, intoCeiling) { * }} [options] * @returns {Promise} */ -export async function planStrandTransfer(strandId, options = {}) { +async function planStrandTransferImpl(graph, strandId, options = {}) { const normalizedStrandId = normalizeRequiredString(strandId, 'strandId'); const ceiling = normalizeLamportCeiling(options.ceiling, 'ceiling'); const intoCeiling = normalizeLamportCeiling(options.intoCeiling, 'intoCeiling'); @@ -925,7 +922,7 @@ export async function planStrandTransfer(strandId, options = {}) { const source = { kind: 'strand', strandId: normalizedStrandId, ceiling }; const target = normalizeIntoSelector(normalizedStrandId, options.into ?? 'live', intoCeiling); - return await this.planCoordinateTransfer({ + return await planCoordinateTransferImpl(graph, { source: /** @type {CoordinateTransferPlanSelectorV1} */ (source), target: /** @type {CoordinateTransferPlanSelectorV1} */ (target), ...(scope ? { scope } : {}), @@ -988,7 +985,7 @@ async function finalizeTransferPlan(params) { /** * Plans a deterministic transfer between two substrate observation selectors. 
* - * @this {import('../WarpRuntime.js').default} + * @param {import('../WarpRuntime.js').default} graph * @param {{ * source: Record, * target: Record, @@ -996,28 +993,28 @@ async function finalizeTransferPlan(params) { * }} options * @returns {Promise} */ -export async function planCoordinateTransfer(options) { +async function planCoordinateTransferImpl(graph, options) { assertTransferOptions(options); const normalizedSource = /** @type {NormalizedSelector} */ (normalizeSelector(options.source, 'source')); const normalizedTarget = /** @type {NormalizedSelector} */ (normalizeSelector(options.target, 'target')); const scope = normalizeVisibleStateScopeV1(options.scope, 'scope'); - const comp = await this.compareCoordinates({ + const comp = await compareCoordinatesImpl(graph, { left: /** @type {CoordinateComparisonSelectorV1} */ (/** @type {unknown} */ (normalizedSource)), right: /** @type {CoordinateComparisonSelectorV1} */ (/** @type {unknown} */ (normalizedTarget)), ...(scope !== null && scope !== undefined ? { scope } : {}), }); - const sourceSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide.call(this, normalizedSource, scope)); - const targetSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide.call(this, normalizedTarget, scope)); + const sourceSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedSource, scope)); + const targetSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedTarget, scope)); /** Loads node content blob by OID. @type {(nodeId: string, meta: { oid: string }) => Promise} */ - const loadNodeContent = async (_nodeId, meta) => await readContentBlobByOid(this, meta.oid); + const loadNodeContent = async (_nodeId, meta) => await readContentBlobByOid(graph, meta.oid); /** Loads edge content blob by OID. 
@type {(edge: unknown, meta: { oid: string }) => Promise} */ - const loadEdgeContent = async (_edge, meta) => await readContentBlobByOid(this, meta.oid); + const loadEdgeContent = async (_edge, meta) => await readContentBlobByOid(graph, meta.oid); const transfer = await planVisibleStateTransferV5(createStateReaderV5(sourceSide.state), createStateReaderV5(targetSide.state), { loadNodeContent, loadEdgeContent, }); - return await finalizeTransferPlan({ graph: this, sourceSide, targetSide, transfer, comparisonDigest: comp.comparisonDigest, scope }); + return await finalizeTransferPlan({ graph, sourceSide, targetSide, transfer, comparisonDigest: comp.comparisonDigest, scope }); } /** @@ -1057,7 +1054,7 @@ function assertComparisonOptions(options) { /** * Compares two substrate observation selectors. * - * @this {import('../WarpRuntime.js').default} + * @param {import('../WarpRuntime.js').default} graph * @param {{ * left: Record, * right: Record, @@ -1066,12 +1063,12 @@ function assertComparisonOptions(options) { * }} options * @returns {Promise} */ -export async function compareCoordinates(options) { +async function compareCoordinatesImpl(graph, options) { const { normalizedLeft, normalizedRight, targetId, scope } = extractComparisonInputs(options); - const left = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide.call(this, normalizedLeft, scope)); - const right = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide.call(this, normalizedRight, scope)); - const visiblePatchDivergence = buildPatchDivergence(left.patchEntries, right.patchEntries, targetId); + const left = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedLeft, scope)); + const right = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedRight, scope)); + const visiblePatchDivergence = buildPatchDivergenceImpl(left.patchEntries, right.patchEntries, targetId); const visibleState = 
compareVisibleStateV5(left.state, right.state, { targetId }); const fact = buildCoordinateComparisonFact({ @@ -1082,7 +1079,77 @@ export async function compareCoordinates(options) { visiblePatchDivergence, visibleState, }); - const digest = await computeChecksum(/** @type {Record} */ (/** @type {unknown} */ (fact)), this._crypto); + const digest = await computeChecksum(/** @type {Record} */ (/** @type {unknown} */ (fact)), graph._crypto); return /** @type {CoordinateComparisonV1} */ ({ ...fact, comparisonDigest: digest }); } + +// ── Controller class ────────────────────────────────────────────────────────── + +/** + * The host interface that ComparisonController depends on. + * + * @typedef {import('../WarpRuntime.js').default} ComparisonHost + */ + +export default class ComparisonController { + /** @type {ComparisonHost} */ + _host; + + /** + * Creates a ComparisonController bound to a WarpRuntime host. + * @param {ComparisonHost} host + */ + constructor(host) { + this._host = host; + } + + /** + * Builds a deterministic patch divergence analysis between two sets of patch entries. + * @param {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} leftEntries + * @param {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} rightEntries + * @param {string|null} [targetId] + * @returns {Record} + */ + buildPatchDivergence(leftEntries, rightEntries, targetId) { + return buildPatchDivergenceImpl(leftEntries, rightEntries, targetId ?? null); + } + + /** + * Compares a strand against its base, live truth, or another strand. + * @param {string} strandId + * @param {Record} [options] + * @returns {Promise} + */ + async compareStrand(strandId, options = {}) { + return await compareStrandImpl(this._host, strandId, options); + } + + /** + * Plans a transfer from one strand into another observation point. 
+ * @param {string} strandId + * @param {Record} [options] + * @returns {Promise} + */ + async planStrandTransfer(strandId, options = {}) { + return await planStrandTransferImpl(this._host, strandId, options); + } + + /** + * Plans a deterministic transfer between two substrate observation selectors. + * @param {{ source: Record, target: Record, scope?: VisibleStateScopeV1|null }} options + * @returns {Promise} + */ + async planCoordinateTransfer(options) { + return await planCoordinateTransferImpl(this._host, options); + } + + /** + * Compares two substrate observation selectors. + * @param {{ left: Record, right: Record, targetId?: string|null, scope?: VisibleStateScopeV1|null }} options + * @returns {Promise} + */ + async compareCoordinates(options) { + return await compareCoordinatesImpl(this._host, options); + } +} From f92dadf37cbcb8dc00b34bd9e4720a3e232b85a2 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 23:30:21 -0700 Subject: [PATCH 39/73] refactor(NDNM): extract SubscriptionController from WarpRuntime MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Moved subscribe.methods.js (284 LOC) into SubscriptionController service class. Three methods extracted: subscribe(), watch(), and _notifySubscribers(). The _notifySubscribers() method remains on the prototype as a delegation stub because materialize.methods.js (still a mixin) calls it during post-materialization notification. This cross-call will become a direct controller-to-controller call when the kernel is extracted. Updated ESLint relaxation list: subscribe.methods.js → SubscriptionController.js. Phase 2a of the WarpRuntime god class decomposition. No public API surface changes. 
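The delegation-stub mechanism this commit relies on can be sketched in isolation. This is a minimal, hypothetical reduction (DemoHost/DemoController are illustration names, not code from this repo): controller methods are re-exposed on the host prototype as non-enumerable `function`-keyword stubs, so existing callers of `host.subscribe()` keep working while the logic lives on the controller.

```javascript
// Hypothetical minimal sketch of the prototype-delegation pattern used in
// this patch series; names are illustrative only.
class DemoController {
  constructor(host) { this._host = host; }
  subscribe(onChange) {
    this._host._subscribers.push(onChange);
    return {
      unsubscribe: () => {
        const i = this._host._subscribers.indexOf(onChange);
        if (i !== -1) this._host._subscribers.splice(i, 1);
      },
    };
  }
}

class DemoHost {
  constructor() {
    this._subscribers = [];
    this._controller = new DemoController(this);
  }
}

for (const method of ['subscribe']) {
  Object.defineProperty(DemoHost.prototype, method, {
    // `function` keyword (not an arrow) so `this` is the host instance
    // at call time; the stub forwards to that instance's controller.
    value: function (...args) {
      return this._controller[method](...args);
    },
    writable: true,
    configurable: true,
    enumerable: false, // keep stubs out of Object.keys / object spreads
  });
}

const host = new DemoHost();
host.subscribe(() => {});
console.log(host._subscribers.length); // 1
```

The stubs being writable and configurable also keeps them easy to monkey-patch in tests, which matters while some method groups are still mixins.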
--- eslint.config.js | 2 +- src/domain/WarpRuntime.js | 27 +- src/domain/services/SubscriptionController.js | 244 +++++++++++++++ src/domain/warp/subscribe.methods.js | 284 ------------------ 4 files changed, 270 insertions(+), 287 deletions(-) create mode 100644 src/domain/services/SubscriptionController.js delete mode 100644 src/domain/warp/subscribe.methods.js diff --git a/eslint.config.js b/eslint.config.js index 84d38175..4a3a0aca 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -240,7 +240,7 @@ export default tseslint.config( files: [ "src/domain/WarpGraph.js", "src/domain/warp/query.methods.js", - "src/domain/warp/subscribe.methods.js", + "src/domain/services/SubscriptionController.js", "src/domain/warp/provenance.methods.js", "src/domain/warp/fork.methods.js", "src/domain/warp/checkpoint.methods.js", diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index 04156820..10b63104 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -21,13 +21,13 @@ import LRUCache from './utils/LRUCache.js'; import SyncController from './services/SyncController.js'; import StrandController from './services/StrandController.js'; import ComparisonController from './services/ComparisonController.js'; +import SubscriptionController from './services/SubscriptionController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; import InMemoryBlobStorageAdapter from './utils/defaultBlobStorage.js'; import { wireWarpMethods } from './warp/_wire.js'; import * as queryMethods from './warp/query.methods.js'; -import * as subscribeMethods from './warp/subscribe.methods.js'; import * as provenanceMethods from './warp/provenance.methods.js'; import * as forkMethods from './warp/fork.methods.js'; import * as checkpointMethods from './warp/checkpoint.methods.js'; @@ -315,6 +315,9 @@ export 
default class WarpRuntime { /** @type {ComparisonController} */ this._comparisonController = new ComparisonController(this); + /** @type {SubscriptionController} */ + this._subscriptionController = new SubscriptionController(this); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -652,7 +655,6 @@ export default class WarpRuntime { // ── Wire extracted method groups onto WarpRuntime.prototype ─────────────────── wireWarpMethods(WarpRuntime, [ queryMethods, - subscribeMethods, provenanceMethods, forkMethods, checkpointMethods, @@ -686,6 +688,27 @@ for (const method of strandDelegates) { }); } +// ── Subscription methods: direct delegation to SubscriptionController ──────── +const subscriptionDelegates = /** @type {const} */ ([ + 'subscribe', 'watch', '_notifySubscribers', +]); +for (const method of subscriptionDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to SubscriptionController. 
@param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record} */ (/** @type {unknown} */ (self._subscriptionController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Comparison methods: direct delegation to ComparisonController ──────────── const comparisonDelegates = /** @type {const} */ ([ 'buildPatchDivergence', 'compareStrand', 'planStrandTransfer', diff --git a/src/domain/services/SubscriptionController.js b/src/domain/services/SubscriptionController.js new file mode 100644 index 00000000..a999ec66 --- /dev/null +++ b/src/domain/services/SubscriptionController.js @@ -0,0 +1,244 @@ +/** + * SubscriptionController — graph change subscription and watch logic. + * + * Extracted from subscribe.methods.js. Manages subscriber registration, + * glob-filtered watches with optional polling, and deferred replay. + * + * @module domain/services/SubscriptionController + */ + +import { diffStates, isEmptyDiff } from './StateDiff.js'; +import { matchGlob } from '../utils/matchGlob.js'; + +/** + * @typedef {Object} Subscriber + * @property {(diff: import('./StateDiff.js').StateDiffResult) => void} onChange + * @property {((error: unknown) => void)|undefined} [onError] + * @property {boolean} pendingReplay + */ + +/** + * The host interface that SubscriptionController depends on. 
+ * + * @typedef {Object} SubscriptionHost + * @property {import('./JoinReducer.js').WarpStateV5|null} _cachedState + * @property {Array<{onChange: Function, onError?: Function, pendingReplay?: boolean}>} _subscribers + * @property {() => Promise} hasFrontierChanged + * @property {(options?: Record) => Promise} materialize + */ + +export default class SubscriptionController { + /** @type {SubscriptionHost} */ + _host; + + /** + * Creates a SubscriptionController bound to a WarpRuntime host. + * @param {SubscriptionHost} host + */ + constructor(host) { + this._host = host; + } + + /** + * Subscribes to graph changes. + * + * The `onChange` handler is called after each `materialize()` that results in + * state changes. The handler receives a diff object describing what changed. + * + * When `replay: true` is set and `_cachedState` is available, immediately + * fires `onChange` with a diff from empty state to current state. If + * `_cachedState` is null, replay is deferred until the first materialize. + * + * @param {{ onChange: (diff: import('./StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, replay?: boolean }} options + * @returns {{ unsubscribe: () => void }} + */ + subscribe({ onChange, onError, replay = false }) { + if (typeof onChange !== 'function') { + throw new Error('onChange must be a function'); + } + + const host = this._host; + const subscriber = { + onChange, + ...(onError !== undefined ? 
{ onError } : {}), + pendingReplay: replay && !host._cachedState, + }; + host._subscribers.push(subscriber); + + // Immediate replay if requested and cached state is available + if (replay && host._cachedState) { + const diff = diffStates(null, host._cachedState); + if (!isEmptyDiff(diff)) { + try { + onChange(diff); + } catch (err) { + if (onError) { + try { + onError(/** @type {Error} */ (err)); + } catch { + // onError itself threw — swallow to prevent cascade + } + } + } + } + } + + return { + /** Removes this subscriber from the notification list. */ + unsubscribe: () => { + const index = host._subscribers.indexOf(subscriber); + if (index !== -1) { + host._subscribers.splice(index, 1); + } + }, + }; + } + + /** + * Watches for graph changes matching a pattern. + * + * Like `subscribe()`, but only fires for changes where node IDs match the + * provided glob pattern. When `poll` is set, periodically checks + * `hasFrontierChanged()` and auto-materializes if changed. + * + * @param {string|string[]} pattern + * @param {{ onChange: (diff: import('./StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, poll?: number }} options + * @returns {{ unsubscribe: () => void }} + */ + watch(pattern, { onChange, onError, poll }) { + /** Checks whether a pattern is a non-empty string or array of strings. @param {string|string[]} p @returns {boolean} */ + const isValidPattern = (p) => typeof p === 'string' || (Array.isArray(p) && p.length > 0 && p.every(i => typeof i === 'string')); + if (!isValidPattern(pattern)) { + throw new Error('pattern must be a non-empty string or non-empty array of strings'); + } + if (typeof onChange !== 'function') { + throw new Error('onChange must be a function'); + } + if (poll !== undefined) { + if (typeof poll !== 'number' || !Number.isFinite(poll) || poll < 1000) { + throw new Error('poll must be a finite number >= 1000'); + } + } + + /** Tests whether a node ID matches the subscription pattern. 
@param {string} nodeId @returns {boolean} */ + const matchesPattern = (nodeId) => matchGlob(pattern, nodeId); + + /** + * Filtered onChange that only passes matching changes. + * @param {import('./StateDiff.js').StateDiffResult} diff + */ + const filteredOnChange = (diff) => { + const filteredDiff = { + nodes: { + added: diff.nodes.added.filter(matchesPattern), + removed: diff.nodes.removed.filter(matchesPattern), + }, + edges: { + added: diff.edges.added.filter((/** @type {import('./StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), + removed: diff.edges.removed.filter((/** @type {import('./StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), + }, + props: { + set: diff.props.set.filter((/** @type {import('./StateDiff.js').PropSet} */ p) => matchesPattern(p.nodeId)), + removed: diff.props.removed.filter((/** @type {import('./StateDiff.js').PropRemoved} */ p) => matchesPattern(p.nodeId)), + }, + }; + + const hasChanges = + filteredDiff.nodes.added.length > 0 || + filteredDiff.nodes.removed.length > 0 || + filteredDiff.edges.added.length > 0 || + filteredDiff.edges.removed.length > 0 || + filteredDiff.props.set.length > 0 || + filteredDiff.props.removed.length > 0; + + if (hasChanges) { + onChange(filteredDiff); + } + }; + + // Reuse own subscription infrastructure + const subscription = this.subscribe({ + onChange: filteredOnChange, + ...(onError !== undefined ? 
{ onError } : {}), + }); + + const host = this._host; + + // Polling: periodically check frontier and auto-materialize if changed + /** @type {ReturnType|null} */ + let pollIntervalId = null; + let pollInFlight = false; + if (poll !== undefined) { + pollIntervalId = setInterval(() => { + if (pollInFlight) { + return; + } + pollInFlight = true; + host.hasFrontierChanged() + .then(async (changed) => { + if (changed) { + await host.materialize(); + } + }) + .catch((err) => { + if (onError) { + try { + onError(err); + } catch { + // onError itself threw — swallow to prevent cascade + } + } + }) + .finally(() => { + pollInFlight = false; + }); + }, poll); + } + + return { + /** Stops polling and removes the filtered subscriber. */ + unsubscribe: () => { + if (pollIntervalId !== null) { + clearInterval(pollIntervalId); + pollIntervalId = null; + } + subscription.unsubscribe(); + }, + }; + } + + /** + * Notifies all subscribers of state changes. + * Handles deferred replay for subscribers added with `replay: true` before + * cached state was available. 
+ * + * @param {import('./StateDiff.js').StateDiffResult} diff + * @param {import('./JoinReducer.js').WarpStateV5} currentState + */ + _notifySubscribers(diff, currentState) { + for (const subscriber of /** @type {Subscriber[]} */ ([...this._host._subscribers])) { + try { + if (subscriber.pendingReplay) { + subscriber.pendingReplay = false; + const replayDiff = diffStates(null, currentState); + if (!isEmptyDiff(replayDiff)) { + subscriber.onChange(replayDiff); + } + } else { + if (isEmptyDiff(diff)) { + continue; + } + subscriber.onChange(diff); + } + } catch (err) { + if (typeof subscriber.onError === 'function') { + try { + subscriber.onError(err); + } catch { + // onError itself threw — swallow to prevent cascade + } + } + } + } + } +} diff --git a/src/domain/warp/subscribe.methods.js b/src/domain/warp/subscribe.methods.js deleted file mode 100644 index 0ae64195..00000000 --- a/src/domain/warp/subscribe.methods.js +++ /dev/null @@ -1,284 +0,0 @@ -/** - * @module domain/warp/subscribe.methods - * - * Extracted subscribe, watch, and _notifySubscribers methods from WarpRuntime. - * Each function is bound to a WarpRuntime instance at runtime via `this`. - */ - -import { diffStates, isEmptyDiff } from '../services/StateDiff.js'; -import { matchGlob } from '../utils/matchGlob.js'; - -/** - * Subscribes to graph changes. - * - * The `onChange` handler is called after each `materialize()` that results in - * state changes. The handler receives a diff object describing what changed. - * - * When `replay: true` is set and `_cachedState` is available, immediately - * fires `onChange` with a diff from empty state to current state. If - * `_cachedState` is null, replay is deferred until the first materialize. - * - * Errors thrown by handlers are caught and forwarded to `onError` if provided. - * One handler's error does not prevent other handlers from being called. 
- * - * @public - * @since 13.0.0 (stable) - * @stability stable - * @this {import('../WarpRuntime.js').default} - * @param {{ onChange: (diff: import('../services/StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, replay?: boolean }} options - Subscription options - * @returns {{unsubscribe: () => void}} Subscription handle - * @throws {Error} If onChange is not a function - * - * @example - * const { unsubscribe } = graph.subscribe({ - * onChange: (diff) => { - * console.log('Nodes added:', diff.nodes.added); - * console.log('Nodes removed:', diff.nodes.removed); - * }, - * onError: (err) => console.error('Handler error:', err), - * }); - * - * // Later, to stop receiving updates: - * unsubscribe(); - * - * @example - * // With replay: get initial state immediately - * await graph.materialize(); - * graph.subscribe({ - * onChange: (diff) => console.log('Initial or changed:', diff), - * replay: true, // Immediately fires with current state as additions - * }); - */ -export function subscribe({ onChange, onError, replay = false }) { - if (typeof onChange !== 'function') { - throw new Error('onChange must be a function'); - } - - const subscriber = { - onChange, - ...(onError !== undefined ? { onError } : {}), - pendingReplay: replay && !this._cachedState, - }; - this._subscribers.push(subscriber); - - // Immediate replay if requested and cached state is available - if (replay && this._cachedState) { - const diff = diffStates(null, this._cachedState); - if (!isEmptyDiff(diff)) { - try { - onChange(diff); - } catch (err) { - if (onError) { - try { - onError(/** @type {Error} */ (err)); - } catch { - // onError itself threw — swallow to prevent cascade - } - } - } - } - } - - return { - /** Removes this subscriber from the notification list. 
*/ - unsubscribe: () => { - const index = this._subscribers.indexOf(subscriber); - if (index !== -1) { - this._subscribers.splice(index, 1); - } - }, - }; -} - -/** - * Watches for graph changes matching a pattern. - * - * Like `subscribe()`, but only fires for changes where node IDs match the - * provided glob pattern. Uses the same pattern syntax as `query().match()`. - * - * - Nodes: filters `added` and `removed` to matching IDs - * - Edges: filters to edges where `from` or `to` matches the pattern - * - Props: filters to properties where `nodeId` matches the pattern - * - * If all changes are filtered out, the handler is not called. - * - * When `poll` is set, periodically checks `hasFrontierChanged()` and auto-materializes - * if the frontier has changed (e.g., remote writes detected). The poll interval must - * be at least 1000ms. - * - * @public - * @since 13.0.0 (stable) - * @stability stable - * @this {import('../WarpRuntime.js').default} - * @param {string|string[]} pattern - Glob pattern(s) (e.g., 'user:*', 'order:123', '*') - * @param {{ onChange: (diff: import('../services/StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, poll?: number }} options - Watch options - * @returns {{unsubscribe: () => void}} Subscription handle - * @throws {Error} If pattern is not a string or array of strings - * @throws {Error} If onChange is not a function - * @throws {Error} If poll is provided but less than 1000 - * - * @example - * const { unsubscribe } = graph.watch('user:*', { - * onChange: (diff) => { - * // Only user node changes arrive here - * console.log('User nodes added:', diff.nodes.added); - * }, - * }); - * - * @example - * // With polling: checks every 5s for remote changes - * const { unsubscribe } = graph.watch('user:*', { - * onChange: (diff) => console.log('User changed:', diff), - * poll: 5000, - * }); - * - * // Later, to stop receiving updates: - * unsubscribe(); - */ -export function watch(pattern, { onChange, onError, 
poll }) { - /** Checks whether a pattern is a non-empty string or array of strings. @param {string|string[]} p @returns {boolean} */ - const isValidPattern = (p) => typeof p === 'string' || (Array.isArray(p) && p.length > 0 && p.every(i => typeof i === 'string')); - if (!isValidPattern(pattern)) { - throw new Error('pattern must be a non-empty string or non-empty array of strings'); - } - if (typeof onChange !== 'function') { - throw new Error('onChange must be a function'); - } - if (poll !== undefined) { - if (typeof poll !== 'number' || !Number.isFinite(poll) || poll < 1000) { - throw new Error('poll must be a finite number >= 1000'); - } - } - - // Pattern matching logic - /** Tests whether a node ID matches the subscription pattern. @param {string} nodeId @returns {boolean} */ - const matchesPattern = (nodeId) => matchGlob(pattern, nodeId); - - /** - * Filtered onChange that only passes matching changes. - * @param {import('../services/StateDiff.js').StateDiffResult} diff - */ - const filteredOnChange = (diff) => { - const filteredDiff = { - nodes: { - added: diff.nodes.added.filter(matchesPattern), - removed: diff.nodes.removed.filter(matchesPattern), - }, - edges: { - added: diff.edges.added.filter((/** @type {import('../services/StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), - removed: diff.edges.removed.filter((/** @type {import('../services/StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), - }, - props: { - set: diff.props.set.filter((/** @type {import('../services/StateDiff.js').PropSet} */ p) => matchesPattern(p.nodeId)), - removed: diff.props.removed.filter((/** @type {import('../services/StateDiff.js').PropRemoved} */ p) => matchesPattern(p.nodeId)), - }, - }; - - // Only call handler if there are matching changes - const hasChanges = - filteredDiff.nodes.added.length > 0 || - filteredDiff.nodes.removed.length > 0 || - filteredDiff.edges.added.length > 0 || - 
filteredDiff.edges.removed.length > 0 || - filteredDiff.props.set.length > 0 || - filteredDiff.props.removed.length > 0; - - if (hasChanges) { - onChange(filteredDiff); - } - }; - - // Reuse subscription infrastructure - const subscription = this.subscribe({ - onChange: filteredOnChange, - ...(onError !== undefined ? { onError } : {}), - }); - - // Polling: periodically check frontier and auto-materialize if changed - /** @type {ReturnType|null} */ - let pollIntervalId = null; - let pollInFlight = false; - if (poll !== undefined) { - pollIntervalId = setInterval(() => { - if (pollInFlight) { - return; - } - pollInFlight = true; - this.hasFrontierChanged() - .then(async (changed) => { - if (changed) { - await this.materialize(); - } - }) - .catch((err) => { - if (onError) { - try { - onError(err); - } catch { - // onError itself threw — swallow to prevent cascade - } - } - }) - .finally(() => { - pollInFlight = false; - }); - }, poll); - } - - return { - /** Stops polling and removes the filtered subscriber. */ - unsubscribe: () => { - if (pollIntervalId !== null) { - clearInterval(pollIntervalId); - pollIntervalId = null; - } - subscription.unsubscribe(); - }, - }; -} - -/** - * @typedef {Object} Subscriber - * @property {(diff: import('../services/StateDiff.js').StateDiffResult) => void} onChange - * @property {((error: unknown) => void)|undefined} [onError] - * @property {boolean} pendingReplay - */ - -/** - * Notifies all subscribers of state changes. - * Handles deferred replay for subscribers added with `replay: true` before - * cached state was available. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {import('../services/StateDiff.js').StateDiffResult} diff - * @param {import('../services/JoinReducer.js').WarpStateV5} currentState - The current state for deferred replay - * @private - */ -export function _notifySubscribers(diff, currentState) { - for (const subscriber of /** @type {Subscriber[]} */ ([...this._subscribers])) { - try { - // Handle deferred replay: on first notification, send full state diff instead - if (subscriber.pendingReplay) { - subscriber.pendingReplay = false; - const replayDiff = diffStates(null, currentState); - if (!isEmptyDiff(replayDiff)) { - subscriber.onChange(replayDiff); - } - } else { - // Skip non-replay subscribers when diff is empty - if (isEmptyDiff(diff)) { - continue; - } - subscriber.onChange(diff); - } - } catch (err) { - if (typeof subscriber.onError === 'function') { - try { - subscriber.onError(err); - } catch { - // onError itself threw — swallow to prevent cascade - } - } - } - } -} From a83d64950c6f57c6508e7ee279521d405d47b636 Mon Sep 17 00:00:00 2001 From: James Ross Date: Wed, 1 Apr 2026 23:37:19 -0700 Subject: [PATCH 40/73] refactor(NDNM): extract ProvenanceController from WarpRuntime Moved provenance.methods.js (286 LOC) into ProvenanceController service class. Seven methods extracted: patchesFor, materializeSlice, _computeBackwardCone, loadPatchBySha, _loadPatchBySha, _loadPatchesBySha, _sortPatchesCausally. Host typed as WarpGraphWithMixins (from _internal.js) because _readPatchBlob is wired onto the prototype by patch.methods.js and invisible to TSC as a class member. Constructor casts through unknown at the single construction site. Updated ESLint relaxation list. Phase 2b of the WarpRuntime god class decomposition. No public API surface changes. 
--- eslint.config.js | 2 +- src/domain/WarpRuntime.js | 28 +- src/domain/services/ProvenanceController.js | 242 +++++++++++++++++ src/domain/warp/provenance.methods.js | 286 -------------------- 4 files changed, 269 insertions(+), 289 deletions(-) create mode 100644 src/domain/services/ProvenanceController.js delete mode 100644 src/domain/warp/provenance.methods.js diff --git a/eslint.config.js b/eslint.config.js index 4a3a0aca..5264942c 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -241,7 +241,7 @@ export default tseslint.config( "src/domain/WarpGraph.js", "src/domain/warp/query.methods.js", "src/domain/services/SubscriptionController.js", - "src/domain/warp/provenance.methods.js", + "src/domain/services/ProvenanceController.js", "src/domain/warp/fork.methods.js", "src/domain/warp/checkpoint.methods.js", "src/domain/warp/patch.methods.js", diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index 10b63104..93b1fd9e 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -22,13 +22,13 @@ import SyncController from './services/SyncController.js'; import StrandController from './services/StrandController.js'; import ComparisonController from './services/ComparisonController.js'; import SubscriptionController from './services/SubscriptionController.js'; +import ProvenanceController from './services/ProvenanceController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; import InMemoryBlobStorageAdapter from './utils/defaultBlobStorage.js'; import { wireWarpMethods } from './warp/_wire.js'; import * as queryMethods from './warp/query.methods.js'; -import * as provenanceMethods from './warp/provenance.methods.js'; import * as forkMethods from './warp/fork.methods.js'; import * as checkpointMethods from './warp/checkpoint.methods.js'; import * as patchMethods from 
'./warp/patch.methods.js'; @@ -318,6 +318,9 @@ export default class WarpRuntime { /** @type {SubscriptionController} */ this._subscriptionController = new SubscriptionController(this); + /** @type {ProvenanceController} */ + this._provenanceController = new ProvenanceController(/** @type {import('./warp/_internal.js').WarpGraphWithMixins} */ (/** @type {unknown} */ (this))); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -655,7 +658,6 @@ export default class WarpRuntime { // ── Wire extracted method groups onto WarpRuntime.prototype ─────────────────── wireWarpMethods(WarpRuntime, [ queryMethods, - provenanceMethods, forkMethods, checkpointMethods, patchMethods, @@ -688,6 +690,28 @@ for (const method of strandDelegates) { }); } +// ── Provenance methods: direct delegation to ProvenanceController ──────────── +const provenanceDelegates = /** @type {const} */ ([ + 'patchesFor', 'materializeSlice', '_computeBackwardCone', + 'loadPatchBySha', '_loadPatchBySha', '_loadPatchesBySha', '_sortPatchesCausally', +]); +for (const method of provenanceDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to ProvenanceController. 
@param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record<string, unknown>} */ (/** @type {unknown} */ (self._provenanceController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Subscription methods: direct delegation to SubscriptionController ──────── const subscriptionDelegates = /** @type {const} */ ([ 'subscribe', 'watch', '_notifySubscribers', diff --git a/src/domain/services/ProvenanceController.js b/src/domain/services/ProvenanceController.js new file mode 100644 index 00000000..73444fe3 --- /dev/null +++ b/src/domain/services/ProvenanceController.js @@ -0,0 +1,242 @@ +/** + * ProvenanceController — patch lookups, slice materialization, + * backward causal cone computation, and causal sorting. + * + * Extracted from provenance.methods.js. + * + * @module domain/services/ProvenanceController + */ + +import QueryError from '../errors/QueryError.js'; +import { createEmptyStateV5, reduceV5 } from './JoinReducer.js'; +import { ProvenancePayload } from './ProvenancePayload.js'; +import { decodePatchMessage, detectMessageKind } from './WarpMessageCodec.js'; + +/** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ + +/** + * The host interface that ProvenanceController depends on. + * + * Uses WarpRuntime directly because several required methods + * (_readPatchBlob, _ensureFreshState) are wired onto the prototype + * by other mixin files and not visible to TSC as class members. + * + * @typedef {import('../warp/_internal.js').WarpGraphWithMixins} ProvenanceHost + */ + +export default class ProvenanceController { + /** @type {ProvenanceHost} */ + _host; + + /** + * Creates a ProvenanceController bound to a WarpRuntime host. 
+ * @param {ProvenanceHost} host + */ + constructor(host) { + this._host = host; + } + + /** + * Returns all patch SHAs that affected a given node or edge. + * + * @param {string} entityId + * @returns {Promise<string[]>} + */ + async patchesFor(entityId) { + await this._host._ensureFreshState(); + + if (this._host._provenanceDegraded) { + throw new QueryError('Provenance unavailable for cached seek. Re-seek with --no-persistent-cache or call materialize({ ceiling }) directly.', { + code: 'E_PROVENANCE_DEGRADED', + }); + } + + if (!this._host._provenanceIndex) { + throw new QueryError('No provenance index. Call materialize() first.', { + code: 'E_NO_STATE', + }); + } + return this._host._provenanceIndex.patchesFor(entityId); + } + + /** + * Materializes only the backward causal cone for a specific node. + * + * @param {string} nodeId + * @param {{receipts?: boolean}} [options] + * @returns {Promise<{state: import('./JoinReducer.js').WarpStateV5, patchCount: number, receipts?: import('../types/TickReceipt.js').TickReceipt[]}>} + */ + async materializeSlice(nodeId, options) { + const host = this._host; + const t0 = host._clock.now(); + const collectReceipts = options?.receipts === true; + + try { + await host._ensureFreshState(); + + if (host._provenanceDegraded) { + throw new QueryError('Provenance unavailable for cached seek. Re-seek with --no-persistent-cache or call materialize({ ceiling }) directly.', { + code: 'E_PROVENANCE_DEGRADED', + }); + } + + if (!host._provenanceIndex) { + throw new QueryError('No provenance index. Call materialize() first.', { + code: 'E_NO_STATE', + }); + } + + const conePatchMap = await this._computeBackwardCone(nodeId); + + if (conePatchMap.size === 0) { + const emptyState = createEmptyStateV5(); + host._logTiming('materializeSlice', t0, { metrics: '0 patches (empty cone)' }); + return { + state: emptyState, + patchCount: 0, + ...(collectReceipts ? 
{ receipts: [] } : {}), + }; + } + + const patchEntries = []; + for (const [sha, patch] of conePatchMap) { + patchEntries.push({ patch, sha }); + } + + const sortedPatches = this._sortPatchesCausally(patchEntries); + host._logTiming('materializeSlice', t0, { metrics: `${sortedPatches.length} patches` }); + + if (collectReceipts) { + const result = /** @type {{state: import('./JoinReducer.js').WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[]}} */ (reduceV5(sortedPatches, undefined, { receipts: true })); + return { + state: result.state, + patchCount: sortedPatches.length, + receipts: result.receipts, + }; + } + + const payload = new ProvenancePayload(sortedPatches); + return { + state: payload.replay(), + patchCount: sortedPatches.length, + }; + } catch (err) { + host._logTiming('materializeSlice', t0, { error: /** @type {Error} */ (err) }); + throw err; + } + } + + /** + * Computes the backward causal cone for a node via BFS over the provenance index. + * + * @param {string} nodeId + * @returns {Promise<Map<string, PatchV2>>} + */ + async _computeBackwardCone(nodeId) { + const host = this._host; + if (!host._provenanceIndex) { + throw new QueryError('No provenance index. 
Call materialize() first.', { + code: 'E_NO_STATE', + }); + } + /** @type {Map<string, PatchV2>} */ + const cone = new Map(); + /** @type {Set<string>} */ + const visited = new Set(); + const queue = [nodeId]; + let qi = 0; + + while (qi < queue.length) { + const entityId = /** @type {string} */ (queue[qi++]); + if (visited.has(entityId)) { + continue; + } + visited.add(entityId); + + const patchShas = host._provenanceIndex.patchesFor(entityId); + for (const sha of patchShas) { + if (cone.has(sha)) { + continue; + } + const patch = await this._loadPatchBySha(sha); + cone.set(sha, patch); + + const patchReads = /** @type {{reads?: string[]}} */ (patch).reads; + if (patchReads) { + for (const readEntity of patchReads) { + if (!visited.has(readEntity)) { + queue.push(readEntity); + } + } + } + } + } + + return cone; + } + + /** + * Loads a single patch by its SHA (public API for CLI/debug tooling). + * + * @param {string} sha + * @returns {Promise<PatchV2>} + */ + async loadPatchBySha(sha) { + return await this._loadPatchBySha(sha); + } + + /** + * Loads a single patch by its SHA. + * + * @param {string} sha + * @returns {Promise<PatchV2>} + */ + async _loadPatchBySha(sha) { + const host = this._host; + const nodeInfo = await host._persistence.getNodeInfo(sha); + const kind = detectMessageKind(nodeInfo.message); + + if (kind !== 'patch') { + throw new Error(`Commit ${sha} is not a patch`); + } + + const patchMeta = decodePatchMessage(nodeInfo.message); + const patchBuffer = await host._readPatchBlob(patchMeta); + return /** @type {PatchV2} */ (host._codec.decode(patchBuffer)); + } + + /** + * Loads multiple patches by their SHAs. + * + * @param {string[]} shas + * @returns {Promise<Array<{patch: PatchV2, sha: string}>>} + */ + async _loadPatchesBySha(shas) { + const entries = []; + for (const sha of shas) { + const patch = await this._loadPatchBySha(sha); + entries.push({ patch, sha }); + } + return entries; + } + + /** + * Sorts patches in causal order for deterministic replay. 
+ * + * @param {Array<{patch: PatchV2, sha: string}>} patches + * @returns {Array<{patch: PatchV2, sha: string}>} + */ + _sortPatchesCausally(patches) { + return [...patches].sort((a, b) => { + const lamportDiff = (a.patch.lamport || 0) - (b.patch.lamport || 0); + if (lamportDiff !== 0) { + return lamportDiff; + } + const writerCmp = (a.patch.writer || '').localeCompare(b.patch.writer || ''); + if (writerCmp !== 0) { + return writerCmp; + } + return a.sha.localeCompare(b.sha); + }); + } +} diff --git a/src/domain/warp/provenance.methods.js b/src/domain/warp/provenance.methods.js deleted file mode 100644 index 928b1c7e..00000000 --- a/src/domain/warp/provenance.methods.js +++ /dev/null @@ -1,286 +0,0 @@ -/** - * Provenance methods for WarpRuntime — patch lookups, slice materialization, - * backward causal cone computation, and causal sorting. - * - * Every function uses `this` bound to a WarpRuntime instance at runtime - * via wireWarpMethods(). - * - * @module domain/warp/provenance.methods - */ - -import { QueryError } from './_internal.js'; -import { createEmptyStateV5, reduceV5 } from '../services/JoinReducer.js'; -import { ProvenancePayload } from '../services/ProvenancePayload.js'; -import { decodePatchMessage, detectMessageKind } from '../services/WarpMessageCodec.js'; - -/** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ - -/** - * Returns all patch SHAs that affected a given node or edge. - * - * "Affected" means the patch either read from or wrote to the entity - * (based on the patch's I/O declarations from HG/IO/1). - * - * If `autoMaterialize` is enabled, this will automatically materialize - * the state if dirty. Otherwise, call `materialize()` first. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} entityId - The node ID or edge key to query - * @returns {Promise<string[]>} Array of patch SHAs that affected the entity, sorted alphabetically - * @throws {QueryError} If no cached state exists and autoMaterialize is off (code: `E_NO_STATE`) - */ -export async function patchesFor(entityId) { - await this._ensureFreshState(); - - if (this._provenanceDegraded) { - throw new QueryError('Provenance unavailable for cached seek. Re-seek with --no-persistent-cache or call materialize({ ceiling }) directly.', { - code: 'E_PROVENANCE_DEGRADED', - }); - } - - if (!this._provenanceIndex) { - throw new QueryError('No provenance index. Call materialize() first.', { - code: 'E_NO_STATE', - }); - } - return this._provenanceIndex.patchesFor(entityId); -} - -/** - * Materializes only the backward causal cone for a specific node. - * - * This implements the slicing theorem from Paper III (Computational Holography): - * Given a target node v, compute its backward causal cone D(v) - the set of - * all patches that contributed to v's current state - and replay only those. - * - * The algorithm: - * 1. Start with patches that directly wrote to the target node - * 2. For each patch, find entities it read from - * 3. Recursively gather all dependencies - * 4. Topologically sort by Lamport timestamp (causal order) - * 5. Replay the sorted patches against empty state - * - * **Requires a cached state.** Call materialize() first to build the provenance index. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} nodeId - The target node ID to materialize the cone for - * @param {{receipts?: boolean}} [options] - Optional configuration - * @returns {Promise<{state: import('../services/JoinReducer.js').WarpStateV5, patchCount: number, receipts?: import('../types/TickReceipt.js').TickReceipt[]}>} - * Returns the sliced state with the patch count (for comparison with full materialization) - * @throws {QueryError} If no provenance index exists (code: `E_NO_STATE`) - * @throws {Error} If patch loading fails - */ -export async function materializeSlice(nodeId, options) { - const t0 = this._clock.now(); - const collectReceipts = options?.receipts === true; - - try { - // Ensure fresh state before accessing provenance index - await this._ensureFreshState(); - - if (this._provenanceDegraded) { - throw new QueryError('Provenance unavailable for cached seek. Re-seek with --no-persistent-cache or call materialize({ ceiling }) directly.', { - code: 'E_PROVENANCE_DEGRADED', - }); - } - - if (!this._provenanceIndex) { - throw new QueryError('No provenance index. Call materialize() first.', { - code: 'E_NO_STATE', - }); - } - - // 1. Compute backward causal cone using BFS over the provenance index - // Returns Map with patches already loaded (avoids double I/O) - const conePatchMap = await this._computeBackwardCone(nodeId); - - // 2. If no patches in cone, return empty state - if (conePatchMap.size === 0) { - const emptyState = createEmptyStateV5(); - this._logTiming('materializeSlice', t0, { metrics: '0 patches (empty cone)' }); - return { - state: emptyState, - patchCount: 0, - ...(collectReceipts ? { receipts: [] } : {}), - }; - } - - // 3. Convert cached patches to entry format (patches already loaded by _computeBackwardCone) - const patchEntries = []; - for (const [sha, patch] of conePatchMap) { - patchEntries.push({ patch, sha }); - } - - // 4. 
Topologically sort by causal order (Lamport timestamp, then writer, then SHA) - const sortedPatches = this._sortPatchesCausally(patchEntries); - - // 5. Replay: use reduceV5 directly when collecting receipts, otherwise use ProvenancePayload - this._logTiming('materializeSlice', t0, { metrics: `${sortedPatches.length} patches` }); - - if (collectReceipts) { - const result = /** @type {{state: import('../services/JoinReducer.js').WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[]}} */ (reduceV5(sortedPatches, undefined, { receipts: true })); - return { - state: result.state, - patchCount: sortedPatches.length, - receipts: result.receipts, - }; - } - - const payload = new ProvenancePayload(sortedPatches); - return { - state: payload.replay(), - patchCount: sortedPatches.length, - }; - } catch (err) { - this._logTiming('materializeSlice', t0, { error: /** @type {Error} */ (err) }); - throw err; - } -} - -/** - * Computes the backward causal cone for a node. - * - * Uses BFS over the provenance index: - * 1. Find all patches that wrote to the target node - * 2. For each patch, find entities it read from - * 3. Find all patches that wrote to those entities - * 4. Repeat until no new patches are found - * - * Returns a Map of SHA -> patch to avoid double-loading (the cone - * computation needs to read patches for their read-dependencies, - * so we cache them for later replay). - * - * @this {import('../WarpRuntime.js').default} - * @param {string} nodeId - The target node ID - * @returns {Promise<Map<string, PatchV2>>} Map of patch SHA to loaded patch object - */ -export async function _computeBackwardCone(nodeId) { - if (!this._provenanceIndex) { - throw new QueryError('No provenance index. 
Call materialize() first.', { - code: 'E_NO_STATE', - }); - } - const cone = new Map(); // sha -> patch (cache loaded patches) - const visited = new Set(); // Visited entities - const queue = [nodeId]; // BFS queue of entities to process - let qi = 0; - - while (qi < queue.length) { - const entityId = /** @type {string} */ (queue[qi++]); - - if (visited.has(entityId)) { - continue; - } - visited.add(entityId); - - // Get all patches that affected this entity - const patchShas = /** @type {import('../services/ProvenanceIndex.js').ProvenanceIndex} */ (this._provenanceIndex).patchesFor(entityId); - - for (const sha of patchShas) { - if (cone.has(sha)) { - continue; - } - - // Load the patch and cache it - const patch = await this._loadPatchBySha(sha); - cone.set(sha, patch); - - // Add read dependencies to the queue - const patchReads = /** @type {{reads?: string[]}} */ (patch).reads; - if (patchReads) { - for (const readEntity of patchReads) { - if (!visited.has(readEntity)) { - queue.push(readEntity); - } - } - } - } - } - - return cone; -} - -/** - * Loads a single patch by its SHA. - * - * Thin wrapper around the internal `_loadPatchBySha` helper. Exposed for - * CLI/debug tooling (e.g. seek tick receipts) that needs to inspect patch - * operations without re-materializing intermediate states. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} sha - The patch commit SHA - * @returns {Promise<PatchV2>} The decoded patch object - * @throws {Error} If the commit is not a patch or loading fails - */ -export async function loadPatchBySha(sha) { - return await this._loadPatchBySha(sha); -} - -/** - * Loads a single patch by its SHA. 
- * - * @this {import('./_internal.js').WarpGraphWithMixins} - * @param {string} sha - The patch commit SHA - * @returns {Promise<PatchV2>} The decoded patch object - * @throws {Error} If the commit is not a patch or loading fails - */ -export async function _loadPatchBySha(sha) { - const nodeInfo = await this._persistence.getNodeInfo(sha); - const kind = detectMessageKind(nodeInfo.message); - - if (kind !== 'patch') { - throw new Error(`Commit ${sha} is not a patch`); - } - - const patchMeta = decodePatchMessage(nodeInfo.message); - const patchBuffer = await this._readPatchBlob(patchMeta); - return /** @type {import('../types/WarpTypesV2.js').PatchV2} */ (this._codec.decode(patchBuffer)); -} - -/** - * Loads multiple patches by their SHAs. - * - * @this {import('../WarpRuntime.js').default} - * @param {string[]} shas - Array of patch commit SHAs - * @returns {Promise<Array<{patch: PatchV2, sha: string}>>} Array of patch entries - * @throws {Error} If any SHA is not a patch or loading fails - */ -export async function _loadPatchesBySha(shas) { - const entries = []; - - for (const sha of shas) { - const patch = await this._loadPatchBySha(sha); - entries.push({ patch, sha }); - } - - return entries; -} - -/** - * Sorts patches in causal order for deterministic replay. - * - * Sort order: Lamport timestamp (ascending), then writer ID, then SHA. - * This ensures deterministic ordering regardless of discovery order. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {Array<{patch: PatchV2, sha: string}>} patches - Unsorted patch entries - * @returns {Array<{patch: PatchV2, sha: string}>} Sorted patch entries - */ -export function _sortPatchesCausally(patches) { - return [...patches].sort((a, b) => { - // Primary: Lamport timestamp (ascending - earlier patches first) - const lamportDiff = (a.patch.lamport || 0) - (b.patch.lamport || 0); - if (lamportDiff !== 0) { - return lamportDiff; - } - - // Secondary: Writer ID (lexicographic) - const writerCmp = (a.patch.writer || '').localeCompare(b.patch.writer || ''); - if (writerCmp !== 0) { - return writerCmp; - } - - // Tertiary: SHA (lexicographic) for total ordering - return a.sha.localeCompare(b.sha); - }); -} From 8fb1799bfc8e7f241cc65755a24e9486abfc8b1f Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 01:04:14 -0700 Subject: [PATCH 41/73] refactor(NDNM): extract ForkController from WarpRuntime Moved fork.methods.js (320 LOC) into ForkController service class. Five methods extracted: fork, createWormhole, _isAncestor, _relationToCheckpointHead, _validatePatchAgainstCheckpoint. Private helpers (_isAncestor, _relationToCheckpointHead, _validatePatchAgainstCheckpoint) remain on the prototype as delegation stubs because checkpoint.methods.js (still a mixin) calls them during backfill rejection. Updated ESLint relaxation list. Phase 3a of the WarpRuntime god class decomposition. No public API surface changes. 
--- eslint.config.js | 2 +- src/domain/WarpRuntime.js | 28 ++- src/domain/services/ForkController.js | 274 ++++++++++++++++++++++ src/domain/warp/fork.methods.js | 320 -------------------------- 4 files changed, 301 insertions(+), 323 deletions(-) create mode 100644 src/domain/services/ForkController.js delete mode 100644 src/domain/warp/fork.methods.js diff --git a/eslint.config.js b/eslint.config.js index 5264942c..a4abd20c 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -242,7 +242,7 @@ export default tseslint.config( "src/domain/warp/query.methods.js", "src/domain/services/SubscriptionController.js", "src/domain/services/ProvenanceController.js", - "src/domain/warp/fork.methods.js", + "src/domain/services/ForkController.js", "src/domain/warp/checkpoint.methods.js", "src/domain/warp/patch.methods.js", "src/domain/warp/materialize.methods.js", diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index 93b1fd9e..fd5e9b4f 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -23,13 +23,13 @@ import StrandController from './services/StrandController.js'; import ComparisonController from './services/ComparisonController.js'; import SubscriptionController from './services/SubscriptionController.js'; import ProvenanceController from './services/ProvenanceController.js'; +import ForkController from './services/ForkController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; import InMemoryBlobStorageAdapter from './utils/defaultBlobStorage.js'; import { wireWarpMethods } from './warp/_wire.js'; import * as queryMethods from './warp/query.methods.js'; -import * as forkMethods from './warp/fork.methods.js'; import * as checkpointMethods from './warp/checkpoint.methods.js'; import * as patchMethods from './warp/patch.methods.js'; import * as materializeMethods from 
'./warp/materialize.methods.js'; @@ -321,6 +321,9 @@ export default class WarpRuntime { /** @type {ProvenanceController} */ this._provenanceController = new ProvenanceController(/** @type {import('./warp/_internal.js').WarpGraphWithMixins} */ (/** @type {unknown} */ (this))); + /** @type {ForkController} */ + this._forkController = new ForkController(this); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -658,7 +661,6 @@ export default class WarpRuntime { // ── Wire extracted method groups onto WarpRuntime.prototype ─────────────────── wireWarpMethods(WarpRuntime, [ queryMethods, - forkMethods, checkpointMethods, patchMethods, materializeMethods, @@ -690,6 +692,28 @@ for (const method of strandDelegates) { }); } +// ── Fork methods: direct delegation to ForkController ──────────────────────── +const forkDelegates = /** @type {const} */ ([ + 'fork', 'createWormhole', + '_isAncestor', '_relationToCheckpointHead', '_validatePatchAgainstCheckpoint', +]); +for (const method of forkDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to ForkController. 
@param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record<string, unknown>} */ (/** @type {unknown} */ (self._forkController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Provenance methods: direct delegation to ProvenanceController ──────────── const provenanceDelegates = /** @type {const} */ ([ + 'patchesFor', 'materializeSlice', '_computeBackwardCone', diff --git a/src/domain/services/ForkController.js b/src/domain/services/ForkController.js new file mode 100644 index 00000000..d7893273 --- /dev/null +++ b/src/domain/services/ForkController.js @@ -0,0 +1,274 @@ +/** + * ForkController — fork creation, wormhole compression, and + * backfill-rejection helpers. + * + * Extracted from fork.methods.js. + * + * @module domain/services/ForkController + */ + +import ForkError from '../errors/ForkError.js'; +import { validateGraphName, validateWriterId, buildWriterRef, buildWritersPrefix } from '../utils/RefLayout.js'; +import { generateWriterId } from '../utils/WriterId.js'; +import { createWormhole as createWormholeImpl } from './WormholeService.js'; + +const DEFAULT_ADJACENCY_CACHE_SIZE = 3; + +/** + * The host interface that ForkController depends on. + * + * @typedef {import('../WarpRuntime.js').default} ForkHost + */ + +export default class ForkController { + /** @type {ForkHost} */ + _host; + + /** + * Creates a ForkController bound to a WarpRuntime host. + * @param {ForkHost} host + */ + constructor(host) { + this._host = host; + } + + /** + * Creates a fork of this graph at a specific point in a writer's history. 
+ * + * @param {{ from: string, at: string, forkName?: string, forkWriterId?: string }} options + * @returns {Promise<import('../WarpRuntime.js').default>} + */ + async fork({ from, at, forkName, forkWriterId }) { + const host = this._host; + const t0 = host._clock.now(); + + try { + if (!from || typeof from !== 'string') { + throw new ForkError("Required parameter 'from' is missing or not a string", { + code: 'E_FORK_INVALID_ARGS', + context: { from }, + }); + } + + if (!at || typeof at !== 'string') { + throw new ForkError("Required parameter 'at' is missing or not a string", { + code: 'E_FORK_INVALID_ARGS', + context: { at }, + }); + } + + const writers = await host.discoverWriters(); + if (!writers.includes(from)) { + throw new ForkError(`Writer '${from}' does not exist in graph '${host._graphName}'`, { + code: 'E_FORK_WRITER_NOT_FOUND', + context: { writerId: from, graphName: host._graphName, existingWriters: writers }, + }); + } + + const nodeExists = await host._persistence.nodeExists(at); + if (!nodeExists) { + throw new ForkError(`Patch SHA '${at}' does not exist`, { + code: 'E_FORK_PATCH_NOT_FOUND', + context: { patchSha: at, writerId: from }, + }); + } + + const writerRef = buildWriterRef(host._graphName, from); + const tipSha = await host._persistence.readRef(writerRef); + + if (tipSha === null || tipSha === undefined || tipSha === '') { + throw new ForkError(`Writer '${from}' has no commits`, { + code: 'E_FORK_WRITER_NOT_FOUND', + context: { writerId: from }, + }); + } + + const isInChain = await this._isAncestor(at, tipSha); + if (!isInChain) { + throw new ForkError(`Patch SHA '${at}' is not in writer '${from}' chain`, { + code: 'E_FORK_PATCH_NOT_IN_CHAIN', + context: { patchSha: at, writerId: from, tipSha }, + }); + } + + const resolvedForkName = + forkName ?? 
`${host._graphName}-fork-${Math.random().toString(36).slice(2, 10).padEnd(8, '0')}`; + try { + validateGraphName(resolvedForkName); + } catch (err) { + throw new ForkError(`Invalid fork name: ${/** @type {Error} */ (err).message}`, { + code: 'E_FORK_NAME_INVALID', + context: { forkName: resolvedForkName, originalError: /** @type {Error} */ (err).message }, + }); + } + + const forkWritersPrefix = buildWritersPrefix(resolvedForkName); + const existingForkRefs = await host._persistence.listRefs(forkWritersPrefix); + if (existingForkRefs.length > 0) { + throw new ForkError(`Graph '${resolvedForkName}' already exists`, { + code: 'E_FORK_ALREADY_EXISTS', + context: { forkName: resolvedForkName, existingRefs: existingForkRefs }, + }); + } + + const resolvedForkWriterId = (forkWriterId !== undefined && forkWriterId !== null && forkWriterId !== '') ? forkWriterId : generateWriterId(); + try { + validateWriterId(resolvedForkWriterId); + } catch (err) { + throw new ForkError(`Invalid fork writer ID: ${/** @type {Error} */ (err).message}`, { + code: 'E_FORK_WRITER_ID_INVALID', + context: { forkWriterId: resolvedForkWriterId, originalError: /** @type {Error} */ (err).message }, + }); + } + + const forkWriterRef = buildWriterRef(resolvedForkName, resolvedForkWriterId); + await host._persistence.updateRef(forkWriterRef, at); + + // Dynamic import to avoid circular dependency + const { default: WarpRuntime } = await import('../WarpRuntime.js'); + + const forkGraph = await WarpRuntime.open({ + persistence: host._persistence, + graphName: resolvedForkName, + writerId: resolvedForkWriterId, + gcPolicy: host._gcPolicy, + adjacencyCacheSize: host._adjacencyCache?.maxSize ?? DEFAULT_ADJACENCY_CACHE_SIZE, + ...(host._checkpointPolicy ? { checkpointPolicy: host._checkpointPolicy } : {}), + autoMaterialize: host._autoMaterialize, + onDeleteWithData: host._onDeleteWithData, + ...(host._logger ? 
{ logger: host._logger } : {}), + clock: host._clock, + crypto: host._crypto, + codec: host._codec, + }); + + host._logTiming('fork', t0, { + metrics: `from=${from} at=${at.slice(0, 7)} name=${resolvedForkName}`, + }); + + return forkGraph; + } catch (err) { + host._logTiming('fork', t0, { error: /** @type {Error} */ (err) }); + throw err; + } + } + + /** + * Creates a wormhole compressing a range of patches. + * + * @param {string} fromSha + * @param {string} toSha + * @returns {Promise<{fromSha: string, toSha: string, writerId: string, payload: import('./ProvenancePayload.js').default, patchCount: number}>} + */ + async createWormhole(fromSha, toSha) { + const host = this._host; + const t0 = host._clock.now(); + + try { + const wormhole = await createWormholeImpl({ + persistence: host._persistence, + graphName: host._graphName, + fromSha, + toSha, + codec: host._codec, + }); + + host._logTiming('createWormhole', t0, { + metrics: `${wormhole.patchCount} patches from=${fromSha.slice(0, 7)} to=${toSha.slice(0, 7)}`, + }); + + return wormhole; + } catch (err) { + host._logTiming('createWormhole', t0, { error: /** @type {Error} */ (err) }); + throw err; + } + } + + /** + * Checks if ancestorSha is an ancestor of descendantSha. + * + * @param {string} ancestorSha + * @param {string} descendantSha + * @returns {Promise<boolean>} + */ + async _isAncestor(ancestorSha, descendantSha) { + if (!ancestorSha || !descendantSha) { + return false; + } + if (ancestorSha === descendantSha) { + return true; + } + + /** @type {string | null} */ + let cur = descendantSha; + const MAX_WALK = 100_000; + let steps = 0; + while (cur !== null) { + if (++steps > MAX_WALK) { + throw new Error(`_isAncestor: exceeded ${MAX_WALK} steps — possible cycle`); + } + const nodeInfo = await this._host._persistence.getNodeInfo(cur); + const parent = nodeInfo.parents?.[0] ?? 
null; + if (parent === ancestorSha) { + return true; + } + cur = parent; + } + return false; + } + + /** + * Determines relationship between incoming patch and checkpoint head. + * + * @param {string} ckHead + * @param {string} incomingSha + * @returns {Promise<'same' | 'ahead' | 'behind' | 'diverged'>} + */ + async _relationToCheckpointHead(ckHead, incomingSha) { + if (incomingSha === ckHead) { + return 'same'; + } + if (await this._isAncestor(ckHead, incomingSha)) { + return 'ahead'; + } + if (await this._isAncestor(incomingSha, ckHead)) { + return 'behind'; + } + return 'diverged'; + } + + /** + * Validates an incoming patch against checkpoint frontier. + * + * @param {string} writerId + * @param {string} incomingSha + * @param {{state: import('./JoinReducer.js').WarpStateV5, frontier: Map<string, string>, stateHash: string, schema: number}} checkpoint + * @returns {Promise<void>} + */ + async _validatePatchAgainstCheckpoint(writerId, incomingSha, checkpoint) { + if (checkpoint === null || checkpoint === undefined || (checkpoint.schema !== 2 && checkpoint.schema !== 3)) { + return; + } + + const ckHead = checkpoint.frontier?.get(writerId); + if (ckHead === undefined || ckHead === null || ckHead === '') { + return; + } + + const relation = await this._relationToCheckpointHead(ckHead, incomingSha); + + if (relation === 'same' || relation === 'behind') { + throw new Error( + `Backfill rejected for writer ${writerId}: ` + + `incoming patch is ${relation} checkpoint frontier` + ); + } + + if (relation === 'diverged') { + throw new Error( + `Writer fork detected for ${writerId}: ` + + `incoming patch does not extend checkpoint head` + ); + } + } +} diff --git a/src/domain/warp/fork.methods.js b/src/domain/warp/fork.methods.js deleted file mode 100644 index ee89ded5..00000000 --- a/src/domain/warp/fork.methods.js +++ /dev/null @@ -1,320 +0,0 @@ -/** - * Fork and wormhole methods for WarpRuntime, plus backfill-rejection helpers. 
- * - * Every function uses `this` bound to a WarpRuntime instance at runtime - * via wireWarpMethods(). - * - * @module domain/warp/fork.methods - */ - -import { ForkError, DEFAULT_ADJACENCY_CACHE_SIZE } from './_internal.js'; -import { validateGraphName, validateWriterId, buildWriterRef, buildWritersPrefix } from '../utils/RefLayout.js'; -import { generateWriterId } from '../utils/WriterId.js'; -import { createWormhole as createWormholeImpl } from '../services/WormholeService.js'; - -// ============================================================================ -// Fork API -// ============================================================================ - -/** - * Creates a fork of this graph at a specific point in a writer's history. - * - * A fork creates a new WarpRuntime instance that shares history up to the - * specified patch SHA. Due to Git's content-addressed storage, the shared - * history is automatically deduplicated. The fork gets a new writer ID and - * operates independently from the original graph. 
- * - * **Key Properties:** - * - Fork materializes the same state as the original at the fork point - * - Writes to the fork don't appear in the original - * - Writes to the original after fork don't appear in the fork - * - History up to the fork point is shared (content-addressed dedup) - * - * @this {import('../WarpRuntime.js').default} - * @param {{ from: string, at: string, forkName?: string, forkWriterId?: string }} options - Fork configuration - * @returns {Promise<import('../WarpRuntime.js').default>} A new WarpRuntime instance for the fork - * @throws {ForkError} If `from` writer does not exist (code: `E_FORK_WRITER_NOT_FOUND`) - * @throws {ForkError} If `at` SHA does not exist (code: `E_FORK_PATCH_NOT_FOUND`) - * @throws {ForkError} If `at` SHA is not in the writer's chain (code: `E_FORK_PATCH_NOT_IN_CHAIN`) - * @throws {ForkError} If fork graph name is invalid (code: `E_FORK_NAME_INVALID`) - * @throws {ForkError} If a graph with the fork name already has refs (code: `E_FORK_ALREADY_EXISTS`) - * @throws {ForkError} If required parameters are missing or invalid (code: `E_FORK_INVALID_ARGS`) - * @throws {ForkError} If forkWriterId is invalid (code: `E_FORK_WRITER_ID_INVALID`) - */ -export async function fork({ from, at, forkName, forkWriterId }) { - const t0 = this._clock.now(); - - try { - // Validate required parameters - if (!from || typeof from !== 'string') { - throw new ForkError("Required parameter 'from' is missing or not a string", { - code: 'E_FORK_INVALID_ARGS', - context: { from }, - }); - } - - if (!at || typeof at !== 'string') { - throw new ForkError("Required parameter 'at' is missing or not a string", { - code: 'E_FORK_INVALID_ARGS', - context: { at }, - }); - } - - // 1. 
Validate that the `from` writer exists - const writers = await this.discoverWriters(); - if (!writers.includes(from)) { - throw new ForkError(`Writer '${from}' does not exist in graph '${this._graphName}'`, { - code: 'E_FORK_WRITER_NOT_FOUND', - context: { writerId: from, graphName: this._graphName, existingWriters: writers }, - }); - } - - // 2. Validate that `at` SHA exists in the repository - const nodeExists = await this._persistence.nodeExists(at); - if (!nodeExists) { - throw new ForkError(`Patch SHA '${at}' does not exist`, { - code: 'E_FORK_PATCH_NOT_FOUND', - context: { patchSha: at, writerId: from }, - }); - } - - // 3. Validate that `at` SHA is in the writer's chain - const writerRef = buildWriterRef(this._graphName, from); - const tipSha = await this._persistence.readRef(writerRef); - - if (tipSha === null || tipSha === undefined || tipSha === '') { - throw new ForkError(`Writer '${from}' has no commits`, { - code: 'E_FORK_WRITER_NOT_FOUND', - context: { writerId: from }, - }); - } - - // Walk the chain to verify `at` is reachable from the tip - const isInChain = await this._isAncestor(at, tipSha); - if (!isInChain) { - throw new ForkError(`Patch SHA '${at}' is not in writer '${from}' chain`, { - code: 'E_FORK_PATCH_NOT_IN_CHAIN', - context: { patchSha: at, writerId: from, tipSha }, - }); - } - - // 4. Generate or validate fork name (add random suffix to prevent collisions) - const resolvedForkName = - forkName ?? `${this._graphName}-fork-${Math.random().toString(36).slice(2, 10).padEnd(8, '0')}`; - try { - validateGraphName(resolvedForkName); - } catch (err) { - throw new ForkError(`Invalid fork name: ${/** @type {Error} */ (err).message}`, { - code: 'E_FORK_NAME_INVALID', - context: { forkName: resolvedForkName, originalError: /** @type {Error} */ (err).message }, - }); - } - - // 5. 
Check that the fork graph doesn't already exist (has any refs) - const forkWritersPrefix = buildWritersPrefix(resolvedForkName); - const existingForkRefs = await this._persistence.listRefs(forkWritersPrefix); - if (existingForkRefs.length > 0) { - throw new ForkError(`Graph '${resolvedForkName}' already exists`, { - code: 'E_FORK_ALREADY_EXISTS', - context: { forkName: resolvedForkName, existingRefs: existingForkRefs }, - }); - } - - // 6. Generate or validate fork writer ID - const resolvedForkWriterId = (forkWriterId !== undefined && forkWriterId !== null && forkWriterId !== '') ? forkWriterId : generateWriterId(); - try { - validateWriterId(resolvedForkWriterId); - } catch (err) { - throw new ForkError(`Invalid fork writer ID: ${/** @type {Error} */ (err).message}`, { - code: 'E_FORK_WRITER_ID_INVALID', - context: { forkWriterId: resolvedForkWriterId, originalError: /** @type {Error} */ (err).message }, - }); - } - - // 7. Create the fork's writer ref pointing to the `at` commit - const forkWriterRef = buildWriterRef(resolvedForkName, resolvedForkWriterId); - await this._persistence.updateRef(forkWriterRef, at); - - // 8. Open and return a new WarpRuntime instance for the fork - // Dynamic import to avoid circular dependency (WarpRuntime -> fork.methods -> WarpRuntime) - const { default: WarpRuntime } = await import('../WarpRuntime.js'); - - const forkGraph = await WarpRuntime.open({ - persistence: this._persistence, - graphName: resolvedForkName, - writerId: resolvedForkWriterId, - gcPolicy: this._gcPolicy, - adjacencyCacheSize: this._adjacencyCache?.maxSize ?? DEFAULT_ADJACENCY_CACHE_SIZE, - ...(this._checkpointPolicy ? { checkpointPolicy: this._checkpointPolicy } : {}), - autoMaterialize: this._autoMaterialize, - onDeleteWithData: this._onDeleteWithData, - ...(this._logger ? 
{ logger: this._logger } : {}), - clock: this._clock, - crypto: this._crypto, - codec: this._codec, - }); - - this._logTiming('fork', t0, { - metrics: `from=${from} at=${at.slice(0, 7)} name=${resolvedForkName}`, - }); - - return forkGraph; - } catch (err) { - this._logTiming('fork', t0, { error: /** @type {Error} */ (err) }); - throw err; - } -} - -// ============================================================================ -// Wormhole API (HOLOGRAM) -// ============================================================================ - -/** - * Creates a wormhole compressing a range of patches. - * - * A wormhole is a compressed representation of a contiguous range of patches - * from a single writer. It preserves provenance by storing the original - * patches as a ProvenancePayload that can be replayed during materialization. - * - * **Key Properties:** - * - **Provenance Preservation**: The wormhole contains the full sub-payload, - * allowing exact replay of the compressed segment. - * - **Monoid Composition**: Two consecutive wormholes can be composed by - * concatenating their sub-payloads (use `WormholeService.composeWormholes`). - * - **Materialization Equivalence**: A wormhole + remaining patches produces - * the same state as materializing all patches. 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} fromSha - SHA of the first (oldest) patch commit in the range - * @param {string} toSha - SHA of the last (newest) patch commit in the range - * @returns {Promise<{fromSha: string, toSha: string, writerId: string, payload: import('../services/ProvenancePayload.js').default, patchCount: number}>} The created wormhole edge - * @throws {import('../errors/WormholeError.js').default} If fromSha or toSha doesn't exist (E_WORMHOLE_SHA_NOT_FOUND) - * @throws {import('../errors/WormholeError.js').default} If fromSha is not an ancestor of toSha (E_WORMHOLE_INVALID_RANGE) - * @throws {import('../errors/WormholeError.js').default} If commits span multiple writers (E_WORMHOLE_MULTI_WRITER) - * @throws {import('../errors/WormholeError.js').default} If a commit is not a patch commit (E_WORMHOLE_NOT_PATCH) - */ -export async function createWormhole(fromSha, toSha) { - const t0 = this._clock.now(); - - try { - const wormhole = await createWormholeImpl({ - persistence: this._persistence, - graphName: this._graphName, - fromSha, - toSha, - codec: this._codec, - }); - - this._logTiming('createWormhole', t0, { - metrics: `${wormhole.patchCount} patches from=${fromSha.slice(0, 7)} to=${toSha.slice(0, 7)}`, - }); - - return wormhole; - } catch (err) { - this._logTiming('createWormhole', t0, { error: /** @type {Error} */ (err) }); - throw err; - } -} - -// ============================================================================ -// Backfill Rejection and Divergence Detection -// ============================================================================ - -/** - * Checks if ancestorSha is an ancestor of descendantSha. - * Walks the commit graph (linear per-writer chain assumption). 
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} ancestorSha - The potential ancestor commit SHA - * @param {string} descendantSha - The potential descendant commit SHA - * @returns {Promise<boolean>} True if ancestorSha is an ancestor of descendantSha - * @private - */ -export async function _isAncestor(ancestorSha, descendantSha) { - if (!ancestorSha || !descendantSha) { - return false; - } - if (ancestorSha === descendantSha) { - return true; - } - - /** @type {string | null} */ - let cur = descendantSha; - const MAX_WALK = 100_000; - let steps = 0; - while (cur !== null) { - if (++steps > MAX_WALK) { - throw new Error(`_isAncestor: exceeded ${MAX_WALK} steps — possible cycle`); - } - const nodeInfo = await this._persistence.getNodeInfo(cur); - const parent = nodeInfo.parents?.[0] ?? null; - if (parent === ancestorSha) { - return true; - } - cur = parent; - } - return false; -} - -/** - * Determines relationship between incoming patch and checkpoint head. - * - * @this {import('../WarpRuntime.js').default} - * @param {string} ckHead - The checkpoint head SHA for this writer - * @param {string} incomingSha - The incoming patch commit SHA - * @returns {Promise<'same' | 'ahead' | 'behind' | 'diverged'>} The relationship - * @private - */ -export async function _relationToCheckpointHead(ckHead, incomingSha) { - if (incomingSha === ckHead) { - return 'same'; - } - if (await this._isAncestor(ckHead, incomingSha)) { - return 'ahead'; - } - if (await this._isAncestor(incomingSha, ckHead)) { - return 'behind'; - } - return 'diverged'; -} - -/** - * Validates an incoming patch against checkpoint frontier. - * Uses graph reachability, NOT lamport timestamps.
- * - * @this {import('../WarpRuntime.js').default} - * @param {string} writerId - The writer ID for this patch - * @param {string} incomingSha - The incoming patch commit SHA - * @param {{state: import('../services/JoinReducer.js').WarpStateV5, frontier: Map<string, string>, stateHash: string, schema: number}} checkpoint - The checkpoint to validate against - * @returns {Promise<void>} - * @throws {Error} If patch is behind/same as checkpoint frontier (backfill rejected) - * @throws {Error} If patch does not extend checkpoint head (writer fork detected) - * @private - */ -export async function _validatePatchAgainstCheckpoint(writerId, incomingSha, checkpoint) { - if (checkpoint === null || checkpoint === undefined || (checkpoint.schema !== 2 && checkpoint.schema !== 3)) { - return; - } - - const ckHead = checkpoint.frontier?.get(writerId); - if (ckHead === undefined || ckHead === null || ckHead === '') { - return; // Checkpoint didn't include this writer - } - - const relation = await this._relationToCheckpointHead(ckHead, incomingSha); - - if (relation === 'same' || relation === 'behind') { - throw new Error( - `Backfill rejected for writer ${writerId}: ` + - `incoming patch is ${relation} checkpoint frontier` - ); - } - - if (relation === 'diverged') { - throw new Error( - `Writer fork detected for ${writerId}: ` + - `incoming patch does not extend checkpoint head` - ); - } - // relation === 'ahead' => OK -} From 3b08412796e2b29dff559d85d9fa853d4c257906 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 01:19:19 -0700 Subject: [PATCH 42/73] refactor(NDNM): extract QueryController from WarpRuntime MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Moved query.methods.js (906 LOC) into QueryController service class.
Twenty methods extracted: hasNode, getNodeProps, getEdgeProps, neighbors, getStateSnapshot, getNodes, getEdges, getPropertyCount, query, worldline, observer, translationCost, getContentOid, getContentMeta, getContent, getEdgeContentOid, getEdgeContentMeta, getEdgeContent, getContentStream, getEdgeContentStream. Module-level functions wired onto QueryController.prototype via Object.defineProperty (same mechanism as wireWarpMethods). Each function annotated with @this {QueryController} for TSC. WarpApp and WarpCore updated: replaced direct function imports from query.methods.js with callInternalRuntimeMethod() delegation, which correctly resolves dynamically wired prototype methods. Updated ESLint relaxation list. Phase 3b — final pre-kernel extraction. No public API surface changes. --- eslint.config.js | 2 +- src/domain/WarpApp.js | 29 +- src/domain/WarpCore.js | 118 ++------ src/domain/WarpRuntime.js | 32 +- .../QueryController.js} | 274 +++++++++++------- 5 files changed, 224 insertions(+), 231 deletions(-) rename src/domain/{warp/query.methods.js => services/QueryController.js} (81%) diff --git a/eslint.config.js b/eslint.config.js index a4abd20c..b0d9d3a9 100644 --- a/eslint.config.js +++ b/eslint.config.js @@ -239,7 +239,7 @@ export default tseslint.config( { files: [ "src/domain/WarpGraph.js", - "src/domain/warp/query.methods.js", + "src/domain/services/QueryController.js", "src/domain/services/SubscriptionController.js", "src/domain/services/ProvenanceController.js", "src/domain/services/ForkController.js", diff --git a/src/domain/WarpApp.js b/src/domain/WarpApp.js index 33a3bb19..a0817a62 100644 --- a/src/domain/WarpApp.js +++ b/src/domain/WarpApp.js @@ -1,14 +1,5 @@ import WarpCore from './WarpCore.js'; -import { - getContent as _getContent, - getContentStream as _getContentStream, - getContentOid as _getContentOid, - getContentMeta as _getContentMeta, - getEdgeContent as _getEdgeContent, - getEdgeContentStream as _getEdgeContentStream, - getEdgeContentOid 
as _getEdgeContentOid, - getEdgeContentMeta as _getEdgeContentMeta, -} from './warp/query.methods.js'; +import { callInternalRuntimeMethod } from './utils/callInternalRuntimeMethod.js'; /** * Curated product-facing WARP surface. @@ -182,54 +173,52 @@ export default class WarpApp { } // ── Content attachment reads ────────────────────────────────────────── - // Imported from query.methods.js and called with the runtime as this binding. - /** Reads the full content blob attached to a node. * @param {string} nodeId @returns {Promise<Uint8Array|null>} */ async getContent(nodeId) { - return await _getContent.call(this._runtime(), nodeId); + return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getContent', nodeId)); } /** Returns a streaming reader for the content blob attached to a node. * @param {string} nodeId @returns {Promise<AsyncIterable<Uint8Array>|null>} */ async getContentStream(nodeId) { - return await _getContentStream.call(this._runtime(), nodeId); + return /** @type {AsyncIterable<Uint8Array>|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getContentStream', nodeId)); } /** Returns the Git object ID of the content blob attached to a node. * @param {string} nodeId @returns {Promise<string|null>} */ async getContentOid(nodeId) { - return await _getContentOid.call(this._runtime(), nodeId); + return /** @type {string|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getContentOid', nodeId)); } /** Returns structured content metadata (oid, mime, size) for a node. * @param {string} nodeId @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ async getContentMeta(nodeId) { - return await _getContentMeta.call(this._runtime(), nodeId); + return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getContentMeta', nodeId)); } /** Reads the full content blob attached to an edge.
* @param {string} from @param {string} to @param {string} label @returns {Promise<Uint8Array|null>} */ async getEdgeContent(from, to, label) { - return await _getEdgeContent.call(this._runtime(), from, to, label); + return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getEdgeContent', from, to, label)); } /** Returns a streaming reader for the content blob attached to an edge. * @param {string} from @param {string} to @param {string} label @returns {Promise<AsyncIterable<Uint8Array>|null>} */ async getEdgeContentStream(from, to, label) { - return await _getEdgeContentStream.call(this._runtime(), from, to, label); + return /** @type {AsyncIterable<Uint8Array>|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getEdgeContentStream', from, to, label)); } /** Returns the Git object ID of the content blob attached to an edge. * @param {string} from @param {string} to @param {string} label @returns {Promise<string|null>} */ async getEdgeContentOid(from, to, label) { - return await _getEdgeContentOid.call(this._runtime(), from, to, label); + return /** @type {string|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getEdgeContentOid', from, to, label)); } /** Returns structured content metadata (oid, mime, size) for an edge.
* @param {string} from @param {string} to @param {string} label @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ async getEdgeContentMeta(from, to, label) { - return await _getEdgeContentMeta.call(this._runtime(), from, to, label); + return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this._runtime(), 'getEdgeContentMeta', from, to, label)); } // ── Strands ───────────────────────────────────────────────────────── diff --git a/src/domain/WarpCore.js b/src/domain/WarpCore.js index 84563725..28ce203d 100644 --- a/src/domain/WarpCore.js +++ b/src/domain/WarpCore.js @@ -1,14 +1,5 @@ import WarpRuntime from './WarpRuntime.js'; -import { - getContent as _getContent, - getContentStream as _getContentStream, - getContentOid as _getContentOid, - getContentMeta as _getContentMeta, - getEdgeContent as _getEdgeContent, - getEdgeContentStream as _getEdgeContentStream, - getEdgeContentOid as _getEdgeContentOid, - getEdgeContentMeta as _getEdgeContentMeta, -} from './warp/query.methods.js'; +import { callInternalRuntimeMethod } from './utils/callInternalRuntimeMethod.js'; import { toInternalStrandShape, toPublicStrandShape } from './utils/strandPublicShape.js'; import { buildCoordinateComparisonFact, @@ -164,8 +155,7 @@ export default class WarpCore { } // ── Content attachment reads ────────────────────────────────────────── - // Imported from query.methods.js and called with WarpRuntime-typed this. - // WarpCore is a WarpRuntime at runtime (via Object.setPrototypeOf in _adopt). + // Delegated to the runtime's QueryController via prototype methods. /** * Returns the internal WarpRuntime instance. @@ -177,101 +167,29 @@ export default class WarpCore { return /** @type {WarpRuntime} */ (/** @type {unknown} */ (this)); } - /** - * Returns a content attachment by node ID. 
- * - * @param {string} nodeId - * @returns {Promise} - */ - async getContent(nodeId) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, nodeId: string) => Promise }} */ (/** @type {unknown} */ (_getContent)); - return await fn.call(this._asRuntime(), nodeId); - } + /** Returns a content attachment by node ID. @param {string} nodeId @returns {Promise} */ + async getContent(nodeId) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this, 'getContent', nodeId)); } - /** - * Returns a content attachment stream by node ID. - * - * @param {string} nodeId - * @returns {Promise|null>} - */ - async getContentStream(nodeId) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, nodeId: string) => Promise|null> }} */ (/** @type {unknown} */ (_getContentStream)); - return await fn.call(this._asRuntime(), nodeId); - } + /** Returns a content stream by node ID. @param {string} nodeId @returns {Promise|null>} */ + async getContentStream(nodeId) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this, 'getContentStream', nodeId)); } - /** - * Returns the storage OID for a content attachment. - * - * @param {string} nodeId - * @returns {Promise} - */ - async getContentOid(nodeId) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, nodeId: string) => Promise }} */ (/** @type {unknown} */ (_getContentOid)); - return await fn.call(this._asRuntime(), nodeId); - } + /** Returns the OID for a content attachment. @param {string} nodeId @returns {Promise} */ + async getContentOid(nodeId) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this, 'getContentOid', nodeId)); } - /** - * Returns metadata for a content attachment. 
- * - * @param {string} nodeId - * @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} - */ - async getContentMeta(nodeId) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, nodeId: string) => Promise<{ oid: string, mime: string|null, size: number|null }|null> }} */ (/** @type {unknown} */ (_getContentMeta)); - return await fn.call(this._asRuntime(), nodeId); - } + /** Returns content metadata. @param {string} nodeId @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ + async getContentMeta(nodeId) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this, 'getContentMeta', nodeId)); } - /** - * Returns a content attachment for an edge. - * - * @param {string} from - * @param {string} to - * @param {string} label - * @returns {Promise} - */ - async getEdgeContent(from, to, label) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, from: string, to: string, label: string) => Promise }} */ (/** @type {unknown} */ (_getEdgeContent)); - return await fn.call(this._asRuntime(), from, to, label); - } + /** Returns a content attachment for an edge. @param {string} from @param {string} to @param {string} label @returns {Promise} */ + async getEdgeContent(from, to, label) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContent', from, to, label)); } - /** - * Returns a content attachment stream for an edge. - * - * @param {string} from - * @param {string} to - * @param {string} label - * @returns {Promise|null>} - */ - async getEdgeContentStream(from, to, label) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, from: string, to: string, label: string) => Promise|null> }} */ (/** @type {unknown} */ (_getEdgeContentStream)); - return await fn.call(this._asRuntime(), from, to, label); - } + /** Returns a content stream for an edge. 
@param {string} from @param {string} to @param {string} label @returns {Promise|null>} */ + async getEdgeContentStream(from, to, label) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentStream', from, to, label)); } - /** - * Returns the storage OID for an edge content attachment. - * - * @param {string} from - * @param {string} to - * @param {string} label - * @returns {Promise} - */ - async getEdgeContentOid(from, to, label) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, from: string, to: string, label: string) => Promise }} */ (/** @type {unknown} */ (_getEdgeContentOid)); - return await fn.call(this._asRuntime(), from, to, label); - } + /** Returns the OID for an edge content attachment. @param {string} from @param {string} to @param {string} label @returns {Promise} */ + async getEdgeContentOid(from, to, label) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentOid', from, to, label)); } - /** - * Returns metadata for an edge content attachment. - * - * @param {string} from - * @param {string} to - * @param {string} label - * @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} - */ - async getEdgeContentMeta(from, to, label) { - const fn = /** @type {{ call: (thisArg: WarpRuntime, from: string, to: string, label: string) => Promise<{ oid: string, mime: string|null, size: number|null }|null> }} */ (/** @type {unknown} */ (_getEdgeContentMeta)); - return await fn.call(this._asRuntime(), from, to, label); - } + /** Returns metadata for an edge content attachment. 
@param {string} from @param {string} to @param {string} label @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ + async getEdgeContentMeta(from, to, label) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentMeta', from, to, label)); } // ── Strands ───────────────────────────────────────────────────────── diff --git a/src/domain/WarpRuntime.js b/src/domain/WarpRuntime.js index fd5e9b4f..7b3add17 100644 --- a/src/domain/WarpRuntime.js +++ b/src/domain/WarpRuntime.js @@ -24,12 +24,12 @@ import ComparisonController from './services/ComparisonController.js'; import SubscriptionController from './services/SubscriptionController.js'; import ProvenanceController from './services/ProvenanceController.js'; import ForkController from './services/ForkController.js'; +import QueryController from './services/QueryController.js'; import SyncTrustGate from './services/SyncTrustGate.js'; import { AuditVerifierService } from './services/AuditVerifierService.js'; import MaterializedViewService from './services/MaterializedViewService.js'; import InMemoryBlobStorageAdapter from './utils/defaultBlobStorage.js'; import { wireWarpMethods } from './warp/_wire.js'; -import * as queryMethods from './warp/query.methods.js'; import * as checkpointMethods from './warp/checkpoint.methods.js'; import * as patchMethods from './warp/patch.methods.js'; import * as materializeMethods from './warp/materialize.methods.js'; @@ -324,6 +324,9 @@ export default class WarpRuntime { /** @type {ForkController} */ this._forkController = new ForkController(this); + /** @type {QueryController} */ + this._queryController = new QueryController(/** @type {import('./warp/_internal.js').WarpGraphWithMixins} */ (/** @type {unknown} */ (this))); + /** @type {MaterializedViewService} */ this._viewService = new MaterializedViewService({ codec: this._codec, @@ -660,7 +663,6 @@ export default class 
WarpRuntime { // ── Wire extracted method groups onto WarpRuntime.prototype ─────────────────── wireWarpMethods(WarpRuntime, [ - queryMethods, checkpointMethods, patchMethods, materializeMethods, @@ -692,6 +694,32 @@ for (const method of strandDelegates) { }); } +// ── Query methods: direct delegation to QueryController ────────────────────── +const queryDelegates = /** @type {const} */ ([ + 'hasNode', 'getNodeProps', 'getEdgeProps', 'neighbors', + 'getStateSnapshot', 'getNodes', 'getEdges', 'getPropertyCount', + 'query', 'worldline', 'observer', 'translationCost', + 'getContentOid', 'getContentMeta', 'getContent', + 'getEdgeContentOid', 'getEdgeContentMeta', 'getEdgeContent', + 'getContentStream', 'getEdgeContentStream', +]); +for (const method of queryDelegates) { + Object.defineProperty(WarpRuntime.prototype, method, { + // eslint-disable-next-line object-shorthand -- function keyword needed for `this` binding + value: /** Delegates to QueryController. @param {unknown[]} args @returns {unknown} */ function (...args) { + /** @type {unknown} */ + const raw = this; + const self = /** @type {WarpRuntime} */ (raw); + const ctrl = /** @type {Record} */ (/** @type {unknown} */ (self._queryController)); + const fn = /** @type {(...a: unknown[]) => unknown} */ (ctrl[method]); + return fn.call(ctrl, ...args); + }, + writable: true, + configurable: true, + enumerable: false, + }); +} + // ── Fork methods: direct delegation to ForkController ──────────────────────── const forkDelegates = /** @type {const} */ ([ 'fork', 'createWormhole', diff --git a/src/domain/warp/query.methods.js b/src/domain/services/QueryController.js similarity index 81% rename from src/domain/warp/query.methods.js rename to src/domain/services/QueryController.js index 535502bf..2dcff3c5 100644 --- a/src/domain/warp/query.methods.js +++ b/src/domain/services/QueryController.js @@ -1,10 +1,10 @@ /** - * Query methods for WarpRuntime — pure reads on materialized state. 
+ * QueryController — pure reads on materialized state. * - * Every function uses `this` bound to a WarpRuntime instance at runtime - * via wireWarpMethods(). + * Extracted from query.methods.js. All methods are read-only queries + * against cached CRDT state, indexes, and blob storage. * - * @module domain/warp/query.methods + * @module domain/services/QueryController */ import { orsetContains, orsetElements } from '../crdt/ORSet.js'; @@ -19,18 +19,24 @@ import { CONTENT_PROPERTY_KEY, CONTENT_MIME_PROPERTY_KEY, CONTENT_SIZE_PROPERTY_KEY, -} from '../services/KeyCodec.js'; +} from './KeyCodec.js'; import { compareEventIds } from '../utils/EventId.js'; -import { cloneStateV5 } from '../services/JoinReducer.js'; -import { createImmutableWarpStateV5 } from '../services/ImmutableSnapshot.js'; -import QueryBuilder from '../services/QueryBuilder.js'; -import Observer from '../services/Observer.js'; -import Worldline from '../services/Worldline.js'; -import { computeTranslationCost } from '../services/TranslationCost.js'; -import { computeStateHashV5 } from '../services/StateSerializerV5.js'; +import { cloneStateV5 } from './JoinReducer.js'; +import { createImmutableWarpStateV5 } from './ImmutableSnapshot.js'; +import QueryBuilder from './QueryBuilder.js'; +import Observer from './Observer.js'; +import Worldline from './Worldline.js'; +import { computeTranslationCost } from './TranslationCost.js'; +import { computeStateHashV5 } from './StateSerializerV5.js'; import { toInternalStrandShape } from '../utils/strandPublicShape.js'; import { callInternalRuntimeMethod } from '../utils/callInternalRuntimeMethod.js'; +/** + * The host interface that QueryController depends on. + * + * @typedef {import('../warp/_internal.js').WarpGraphWithMixins} QueryHost + */ + /** * @typedef {{ * source?: { @@ -198,33 +204,33 @@ async function resolveObserverSnapshot(graph, options) { * * **Requires a cached state.** Call materialize() first if not already cached. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to check * @returns {Promise<boolean>} True if the node exists in the materialized state * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) * @throws {import('../errors/QueryError.js').default} If cached state is dirty (code: `E_STALE_STATE`) + * @this {QueryController} */ -export async function hasNode(nodeId) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function hasNode(nodeId) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); return orsetContains(s.nodeAlive, nodeId); } /** * Gets all properties for a node from the materialized state. * - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to get properties for * @returns {Promise<Record<string, unknown>|null>} Object of property key → value, or null if node doesn't exist * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getNodeProps(nodeId) { - await this._ensureFreshState(); +async function getNodeProps(nodeId) { + await this._host._ensureFreshState(); // ── Indexed fast path (positive results only; stale index falls through) ── - if (this._propertyReader !== null && this._propertyReader !== undefined && this._logicalIndex?.isAlive(nodeId) === true) { + if (this._host._propertyReader !== null && this._host._propertyReader !== undefined && this._host._logicalIndex?.isAlive(nodeId) === true) { try { - const record = await this._propertyReader.getNodeProps(nodeId); + const record = await this._host._propertyReader.getNodeProps(nodeId); if (record !== null) { return record; } @@ -235,7 +241,7 @@ export async function getNodeProps(nodeId) { } // ── Linear scan fallback
───────────────────────────────────────────── - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); if (!orsetContains(s.nodeAlive, nodeId)) { return null; @@ -256,16 +262,16 @@ export async function getNodeProps(nodeId) { /** * Gets all properties for an edge from the materialized state. * - * @this {import('../WarpRuntime.js').default} * @param {string} from - Source node ID * @param {string} to - Target node ID * @param {string} label - Edge label * @returns {Promise<Record<string, unknown>|null>} Object of property key → value, or null if edge doesn't exist * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getEdgeProps(from, to, label) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdgeProps(from, to, label) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const edgeKey = encodeEdgeKey(from, to, label); if (!orsetContains(s.edgeAlive, edgeKey)) { @@ -311,19 +317,19 @@ function tagDirection(edges, dir) { /** * Gets neighbors of a node from the materialized state.
* - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to get neighbors for * @param {'outgoing' | 'incoming' | 'both'} [direction='both'] - Edge direction to follow * @param {string} [edgeLabel] - Optional edge label filter * @returns {Promise>} Array of neighbor info * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function neighbors(nodeId, direction = 'both', edgeLabel = undefined) { - await this._ensureFreshState(); +async function neighbors(nodeId, direction = 'both', edgeLabel = undefined) { + await this._host._ensureFreshState(); // ── Indexed fast path (only when node is in index; stale falls through) ── - const provider = this._materializedGraph?.provider; - if (provider !== null && provider !== undefined && this._logicalIndex?.isAlive(nodeId) === true) { + const provider = this._host._materializedGraph?.provider; + if (provider !== null && provider !== undefined && this._host._logicalIndex?.isAlive(nodeId) === true) { try { const opts = typeof edgeLabel === 'string' && edgeLabel.length > 0 ? { labels: new Set([edgeLabel]) } : undefined; return await _indexedNeighbors(provider, nodeId, direction, opts); @@ -333,7 +339,7 @@ export async function neighbors(nodeId, direction = 'both', edgeLabel = undefine } // ── Linear scan fallback ───────────────────────────────────────────── - return _linearNeighbors(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState), nodeId, direction, edgeLabel); + return _linearNeighbors(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState), nodeId, direction, edgeLabel); } /** @@ -394,43 +400,43 @@ function _linearNeighbors(cachedState, nodeId, direction, edgeLabel) { /** * Returns a defensive copy of the current materialized state. 
* - * @this {import('../WarpRuntime.js').default} * @returns {Promise} + * @this {QueryController} */ -export async function getStateSnapshot() { - if (!this._cachedState && !this._autoMaterialize) { +async function getStateSnapshot() { + if (!this._host._cachedState && !this._host._autoMaterialize) { return null; } - await this._ensureFreshState(); - if (!this._cachedState) { + await this._host._ensureFreshState(); + if (!this._host._cachedState) { return null; } - return createImmutableWarpStateV5(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState)); + return createImmutableWarpStateV5(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState)); } /** * Gets all visible nodes in the materialized state. * - * @this {import('../WarpRuntime.js').default} * @returns {Promise} Array of node IDs * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getNodes() { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getNodes() { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); return [...orsetElements(s.nodeAlive)]; } /** * Gets all visible edges in the materialized state. 
* - * @this {import('../WarpRuntime.js').default} * @returns {Promise}>>} Array of edge info * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getEdges() { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdges() { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); /** @type {Map>} */ const edgePropsByKey = new Map(); @@ -471,36 +477,36 @@ export async function getEdges() { /** * Returns the number of property entries in the materialized state. * - * @this {import('../WarpRuntime.js').default} * @returns {Promise} Number of property entries * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getPropertyCount() { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getPropertyCount() { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); return s.prop.size; } /** * Creates a fluent query builder for the logical graph. * - * @this {import('../WarpRuntime.js').default} * @returns {import('../services/QueryBuilder.js').default} A fluent query builder + * @this {QueryController} */ -export function query() { - return new QueryBuilder(this); +function query() { + return new QueryBuilder(this._host); } /** * Creates a first-class worldline handle over a pinned read source. 
* - * @this {import('../WarpRuntime.js').default} * @param {ObserverOptions} [options] * @returns {import('../services/Worldline.js').default} + * @this {QueryController} */ -export function worldline(options = undefined) { +function worldline(options = undefined) { return new Worldline({ - graph: this, + graph: this._host, source: cloneObserverSource(options?.source) || { kind: 'live' }, }); } @@ -534,26 +540,26 @@ function normalizeObserverArgs(nameOrConfig, configOrOptions, maybeOptions) { /** * Creates a read-only observer over the current materialized state. * - * @this {import('../WarpRuntime.js').default} * @param {string|{ match: string|string[], expose?: string[], redact?: string[] }} nameOrConfig * Observer name or observer configuration * @param {{ match: string|string[], expose?: string[], redact?: string[] }|ObserverOptions} [configOrOptions] * Observer configuration when a name is supplied, otherwise observer options * @param {ObserverOptions} [maybeOptions] - Optional pinned read source * @returns {Promise} A read-only observer + * @this {QueryController} */ -export async function observer(nameOrConfig, configOrOptions = undefined, maybeOptions = undefined) { +async function observer(nameOrConfig, configOrOptions = undefined, maybeOptions = undefined) { const { name, config, options } = normalizeObserverArgs(nameOrConfig, configOrOptions, maybeOptions); /** Validates that a match value is a non-empty string or non-empty string array. @param {unknown} m - Match value to validate @returns {boolean} True if valid */ const isValidMatch = (m) => typeof m === 'string' || (Array.isArray(m) && m.length > 0 && m.every(/** Checks that an element is a string. 
@param {unknown} i - Array element @returns {boolean} True if string */ i => typeof i === 'string')); if (!config || !isValidMatch(config.match)) { throw new Error('observer config.match must be a non-empty string or non-empty array of strings'); } - const snapshot = await resolveObserverSnapshot(this, options); + const snapshot = await resolveObserverSnapshot(this._host, options); return new Observer({ name, config, - graph: this, + graph: this._host, snapshot, source: cloneObserverSource(options?.source) || { kind: 'live' }, }); @@ -562,14 +568,14 @@ export async function observer(nameOrConfig, configOrOptions = undefined, maybeO /** * Computes the directed MDL translation cost from observer A to observer B. * - * @this {import('../WarpRuntime.js').default} * @param {{ match: string|string[], expose?: string[], redact?: string[] }} configA - Observer configuration for A * @param {{ match: string|string[], expose?: string[], redact?: string[] }} configB - Observer configuration for B * @returns {Promise<{cost: number, breakdown: {nodeLoss: number, edgeLoss: number, propLoss: number}}>} + * @this {QueryController} */ -export async function translationCost(configA, configB) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function translationCost(configA, configB) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); return computeTranslationCost(configA, configB, s); } @@ -706,14 +712,14 @@ function extractContentMeta(contentRegister, mimeRegister, sizeRegister) { /** * Gets the content blob OID for a node, or null if none is attached. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to check * @returns {Promise} Hex blob OID or null * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getContentOid(nodeId) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getContentOid(nodeId) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); return registers?.contentRegister.value ?? null; } @@ -721,14 +727,14 @@ export async function getContentOid(nodeId) { /** * Gets structured content metadata for a node attachment, or null if none is attached. * - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to check * @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} Content metadata or null * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getContentMeta(nodeId) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getContentMeta(nodeId) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); return registers ? extractContentMeta(registers.contentRegister, registers.mimeRegister, registers.sizeRegister) @@ -741,41 +747,41 @@ export async function getContentMeta(nodeId) { * Returns the raw bytes from `readBlob()`. Consumers wanting text * should decode the result with `new TextDecoder().decode(buf)`. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to get content for * @returns {Promise} Content bytes or null * @throws {import('../errors/PersistenceError.js').default} If the referenced * blob OID is not in the object store (code: `E_MISSING_OBJECT`), such as * after repository corruption, aggressive GC, or a partial clone missing the * blob object. + * @this {QueryController} */ -export async function getContent(nodeId) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getContent(nodeId) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); if (!registers) { return null; } const { value: oid } = registers.contentRegister; - if (this._blobStorage) { - return await this._blobStorage.retrieve(oid); + if (this._host._blobStorage) { + return await this._host._blobStorage.retrieve(oid); } - return await this._persistence.readBlob(oid); + return await this._host._persistence.readBlob(oid); } /** * Gets the content blob OID for an edge, or null if none is attached. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} from - Source node ID * @param {string} to - Target node ID * @param {string} label - Edge label * @returns {Promise} Hex blob OID or null * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getEdgeContentOid(from, to, label) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdgeContentOid(from, to, label) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); return registers?.contentRegister.value ?? null; } @@ -783,16 +789,16 @@ export async function getEdgeContentOid(from, to, label) { /** * Gets structured content metadata for an edge attachment, or null if none is attached. * - * @this {import('../WarpRuntime.js').default} * @param {string} from - Source node ID * @param {string} to - Target node ID * @param {string} label - Edge label * @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} Content metadata or null * @throws {import('../errors/QueryError.js').default} If no cached state exists (code: `E_NO_STATE`) + * @this {QueryController} */ -export async function getEdgeContentMeta(from, to, label) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdgeContentMeta(from, to, label) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); return registers ? 
extractContentMeta(registers.contentRegister, registers.mimeRegister, registers.sizeRegister) @@ -805,7 +811,6 @@ export async function getEdgeContentMeta(from, to, label) { * Returns the raw bytes from `readBlob()`. Consumers wanting text * should decode the result with `new TextDecoder().decode(buf)`. * - * @this {import('../WarpRuntime.js').default} * @param {string} from - Source node ID * @param {string} to - Target node ID * @param {string} label - Edge label @@ -814,19 +819,20 @@ export async function getEdgeContentMeta(from, to, label) { * blob OID is not in the object store (code: `E_MISSING_OBJECT`), such as * after repository corruption, aggressive GC, or a partial clone missing the * blob object. + * @this {QueryController} */ -export async function getEdgeContent(from, to, label) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdgeContent(from, to, label) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); if (!registers) { return null; } const { value: oid } = registers.contentRegister; - if (this._blobStorage) { - return await this._blobStorage.retrieve(oid); + if (this._host._blobStorage) { + return await this._host._blobStorage.retrieve(oid); } - return await this._persistence.readBlob(oid); + return await this._host._persistence.readBlob(oid); } /** @@ -835,23 +841,23 @@ export async function getEdgeContent(from, to, label) { * Returns an async iterable of Uint8Array chunks for incremental * consumption. Use `getContent()` when you want the full buffer. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} nodeId - The node ID to get content for * @returns {Promise|null>} Async iterable of content chunks, or null + * @this {QueryController} */ -export async function getContentStream(nodeId) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getContentStream(nodeId) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); if (!registers) { return null; } const { value: oid } = registers.contentRegister; - if (this._blobStorage && typeof this._blobStorage.retrieveStream === 'function') { - return this._blobStorage.retrieveStream(oid); + if (this._host._blobStorage && typeof this._host._blobStorage.retrieveStream === 'function') { + return this._host._blobStorage.retrieveStream(oid); } // Fallback: wrap buffered read as single-chunk async iterable - const buf = await this._persistence.readBlob(oid); + const buf = await this._host._persistence.readBlob(oid); return singleChunkAsyncIterable(buf); } @@ -861,24 +867,24 @@ export async function getContentStream(nodeId) { * Returns an async iterable of Uint8Array chunks for incremental * consumption. Use `getEdgeContent()` when you want the full buffer. 
* - * @this {import('../WarpRuntime.js').default} * @param {string} from - Source node ID * @param {string} to - Target node ID * @param {string} label - Edge label * @returns {Promise|null>} Async iterable of content chunks, or null + * @this {QueryController} */ -export async function getEdgeContentStream(from, to, label) { - await this._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._cachedState); +async function getEdgeContentStream(from, to, label) { + await this._host._ensureFreshState(); + const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); if (!registers) { return null; } const { value: oid } = registers.contentRegister; - if (this._blobStorage && typeof this._blobStorage.retrieveStream === 'function') { - return this._blobStorage.retrieveStream(oid); + if (this._host._blobStorage && typeof this._host._blobStorage.retrieveStream === 'function') { + return this._host._blobStorage.retrieveStream(oid); } - const buf = await this._persistence.readBlob(oid); + const buf = await this._host._persistence.readBlob(oid); return singleChunkAsyncIterable(buf); } @@ -904,3 +910,55 @@ function singleChunkAsyncIterable(buf) { }, }; } + +// ── Controller class ────────────────────────────────────────────────────────── + +/** + * QueryController — read-only query surface for materialized graph state. + * + * Each public method delegates to the module-level function above, + * bound with `this` as the controller (which carries `_host`). + */ +export default class QueryController { + /** @type {QueryHost} */ + _host; + + /** + * Creates a QueryController bound to a WarpRuntime host. + * @param {QueryHost} host + */ + constructor(host) { + this._host = host; + } +} + +// Wire all query functions as methods on the controller prototype. +// The functions use this._host._xxx, so they work when this = controller. 
+const queryFunctions = /** @type {const} */ ([ + 'hasNode', 'getNodeProps', 'getEdgeProps', 'neighbors', + 'getStateSnapshot', 'getNodes', 'getEdges', 'getPropertyCount', + 'query', 'worldline', 'observer', 'translationCost', + 'getContentOid', 'getContentMeta', 'getContent', + 'getEdgeContentOid', 'getEdgeContentMeta', 'getEdgeContent', + 'getContentStream', 'getEdgeContentStream', +]); + +/** @type {Record} */ +const fnMap = { + hasNode, getNodeProps, getEdgeProps, neighbors, + getStateSnapshot, getNodes, getEdges, getPropertyCount, + query, worldline, observer, translationCost, + getContentOid, getContentMeta, getContent, + getEdgeContentOid, getEdgeContentMeta, getEdgeContent, + getContentStream, getEdgeContentStream, +}; + +for (const name of queryFunctions) { + const fn = fnMap[name]; + Object.defineProperty(QueryController.prototype, name, { + value: fn, + writable: true, + configurable: true, + enumerable: false, + }); +} From 090b658ffc5b7f7328e370908f6de2a33d6f21a3 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 05:15:49 -0700 Subject: [PATCH 43/73] docs: update CHANGELOG with decomposition and tech debt work --- CHANGELOG.md | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/CHANGELOG.md b/CHANGELOG.md index 24fef184..fc6363ea 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -24,9 +24,20 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **AuditReceiptService uses AuditError** — all 16 raw `Error` throws replaced with typed `AuditError` carrying serializable context and machine-readable error codes (`E_AUDIT_INVALID`, `E_AUDIT_CAS_FAILED`, `E_AUDIT_DEGRADED`). - **CLI import.meta.url resolution** — replaced `__dirname` polyfill pattern in CLI with idiomatic `fileURLToPath(new URL('../..', import.meta.url))` for resilient package root resolution. 
+- **WarpRuntime god class decomposition (NO_DOGS_NO_MASTERS)** — extracted 6 of 11 mixin method groups into independent service controllers, following the SyncController precedent. Each controller receives the runtime host via constructor injection and delegates through `defineProperty` loops on the prototype. Public API surface unchanged — all 100+ methods remain on `WarpRuntime.prototype`. The remaining 4 mixins (checkpoint, patch, materialize, materializeAdvanced) form the core mutation kernel and are deferred to a future cycle. + - **`StrandController`** (182 LOC) — strand lifecycle + conflict analysis, cached StrandService instance + - **`ComparisonController`** (1,155 LOC) — coordinate/strand comparison, transfer planning + - **`SubscriptionController`** (244 LOC) — subscribe, watch, notification dispatch + - **`ProvenanceController`** (242 LOC) — patch lookups, backward causal cone, slice materialization + - **`ForkController`** (274 LOC) — fork creation, wormhole compression, backfill rejection + - **`QueryController`** (964 LOC) — all read queries, observer/worldline factories, content access +- **`AuditReceipt` promoted to class** — replaced `@typedef {Object}` with a real JavaScript class. Constructor validates and freezes. Fields declared in alphabetical order for canonical CBOR serialization. +- **WarpApp/WarpCore content methods** — replaced direct function imports from `query.methods.js` with `callInternalRuntimeMethod()` delegation, which correctly resolves dynamically wired prototype methods. + ### Added - **`AuditError`** — domain error class for audit receipt validation and persistence failures. Exported from package root with four static error codes. +- **`NO_DOGS_NO_MASTERS` legend** — backlog legend for god object decomposition and typedef-to-class liberation. Code: `NDNM_`. - **Effect emission & delivery observation substrate slice** — new receipt families for outbound effects and their delivery lifecycle. 
`EffectEmission` records that the system produced an outbound effect candidate at a causal coordinate. `DeliveryObservation` records how a sink handled that emission (delivered, suppressed, failed, skipped). `ExternalizationPolicy` provides execution context (live/replay/inspect) that shapes delivery behavior. Preset lenses `LIVE_LENS`, `REPLAY_LENS`, and `INSPECT_LENS` cover common modes. - **`EffectSinkPort`** — abstract port for effect delivery sinks, following the hexagonal architecture pattern. From fbccc1d34e67573c3fe73e87a97d8c0510c5e694 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 12:52:14 -0700 Subject: [PATCH 44/73] fix: address CodeRabbit PR #74 review feedback Critical: - WarpCore content methods now pass this._asRuntime() instead of this to callInternalRuntimeMethod(), fixing prototype chain resolution for dynamically-wired query methods Major: - ConflictAnalyzerService.normalizeEffectPayload() now handles legacy PropSet receipt type (was silently returning null) Minor (docs): - Fixed sentence fragments in typedef-to-class backlog items - Removed legacy B143 reference from drift detector backlog item - Fixed root-relative link in process.md (../../METHOD.md) - Fixed sentence fragment in cross-path equivalence cool idea --- .../backlog/DX_rfc-field-count-drift-detector.md | 2 +- .../bad-code/PROTO_typedef-dot-to-class.md | 4 ++-- .../bad-code/PROTO_typedef-patchdiff-to-class.md | 2 +- .../DX_cross-path-equivalence-test-dsl.md | 4 ++-- docs/method/process.md | 2 +- src/domain/WarpCore.js | 16 ++++++++-------- src/domain/services/ConflictAnalyzerService.js | 2 ++ 7 files changed, 17 insertions(+), 15 deletions(-) diff --git a/docs/method/backlog/DX_rfc-field-count-drift-detector.md b/docs/method/backlog/DX_rfc-field-count-drift-detector.md index 54ff6ce6..9d984c3c 100644 --- a/docs/method/backlog/DX_rfc-field-count-drift-detector.md +++ b/docs/method/backlog/DX_rfc-field-count-drift-detector.md @@ -8,5 +8,5 @@ Script that counts 
WarpGraph instance fields (grep `this._` in constructor) and ## Notes -- Depends on B143 RFC (exists at `docs/design/warpgraph-decomposition.md`) +- Depends on `docs/design/warpgraph-decomposition.md` - Low urgency — fold into PRs that touch related files diff --git a/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md index dcacdfa0..f915f5c7 100644 --- a/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md +++ b/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md @@ -5,5 +5,5 @@ ## Problem `src/domain/crdt/Dot.js` defines `Dot` as a `@typedef {Object}` but it -has factory (`createDot`), encode/decode, and comparison functions. Should -be a class with those as methods. +has factory (`createDot`), encode/decode, and comparison functions. It should +be a class with those behaviors as methods. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md index 04598257..0df4a5aa 100644 --- a/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md +++ b/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md @@ -6,4 +6,4 @@ `src/domain/types/PatchDiff.js` defines `PatchDiff` as a `@typedef {Object}` with a factory (`createEmptyDiff`) and merge logic (`mergeDiffs`). Real -data entity accumulated during reduce. Should be a class. +data entity accumulated during reduce. It should be a class. diff --git a/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md b/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md index 401f3c6e..e2a65cd8 100644 --- a/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md +++ b/docs/method/backlog/cool-ideas/DX_cross-path-equivalence-test-dsl.md @@ -8,5 +8,5 @@ through N code paths and asserts identical output. 
This generalizes: - Sync request/response vs local materialize - Incremental vs full reduce -Could be a test DSL: -`assertPathEquivalence(input, [pathA, pathB, pathC], comparator)` +A possible test DSL is: +`assertPathEquivalence(input, [pathA, pathB, pathC], comparator)`. diff --git a/docs/method/process.md b/docs/method/process.md index 8221ed22..f6f92b51 100644 --- a/docs/method/process.md +++ b/docs/method/process.md @@ -1,6 +1,6 @@ # How cycles run -See [METHOD.md](/METHOD.md) for the full philosophy. This file is +See [METHOD.md](../../METHOD.md) for the full philosophy. This file is the quick-reference for operating a cycle. ## Starting a cycle diff --git a/src/domain/WarpCore.js b/src/domain/WarpCore.js index 28ce203d..1c7baee8 100644 --- a/src/domain/WarpCore.js +++ b/src/domain/WarpCore.js @@ -168,28 +168,28 @@ export default class WarpCore { } /** Returns a content attachment by node ID. @param {string} nodeId @returns {Promise} */ - async getContent(nodeId) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this, 'getContent', nodeId)); } + async getContent(nodeId) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getContent', nodeId)); } /** Returns a content stream by node ID. @param {string} nodeId @returns {Promise|null>} */ - async getContentStream(nodeId) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this, 'getContentStream', nodeId)); } + async getContentStream(nodeId) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getContentStream', nodeId)); } /** Returns the OID for a content attachment. 
@param {string} nodeId @returns {Promise} */ - async getContentOid(nodeId) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this, 'getContentOid', nodeId)); } + async getContentOid(nodeId) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getContentOid', nodeId)); } /** Returns content metadata. @param {string} nodeId @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ - async getContentMeta(nodeId) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this, 'getContentMeta', nodeId)); } + async getContentMeta(nodeId) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getContentMeta', nodeId)); } /** Returns a content attachment for an edge. @param {string} from @param {string} to @param {string} label @returns {Promise} */ - async getEdgeContent(from, to, label) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContent', from, to, label)); } + async getEdgeContent(from, to, label) { return /** @type {Uint8Array|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getEdgeContent', from, to, label)); } /** Returns a content stream for an edge. @param {string} from @param {string} to @param {string} label @returns {Promise|null>} */ - async getEdgeContentStream(from, to, label) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentStream', from, to, label)); } + async getEdgeContentStream(from, to, label) { return /** @type {AsyncIterable|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getEdgeContentStream', from, to, label)); } /** Returns the OID for an edge content attachment. 
@param {string} from @param {string} to @param {string} label @returns {Promise} */ - async getEdgeContentOid(from, to, label) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentOid', from, to, label)); } + async getEdgeContentOid(from, to, label) { return /** @type {string|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getEdgeContentOid', from, to, label)); } /** Returns metadata for an edge content attachment. @param {string} from @param {string} to @param {string} label @returns {Promise<{ oid: string, mime: string|null, size: number|null }|null>} */ - async getEdgeContentMeta(from, to, label) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this, 'getEdgeContentMeta', from, to, label)); } + async getEdgeContentMeta(from, to, label) { return /** @type {{ oid: string, mime: string|null, size: number|null }|null} */ (await callInternalRuntimeMethod(this._asRuntime(), 'getEdgeContentMeta', from, to, label)); } // ── Strands ───────────────────────────────────────────────────────── diff --git a/src/domain/services/ConflictAnalyzerService.js b/src/domain/services/ConflictAnalyzerService.js index dba6ac3b..f86a3632 100644 --- a/src/domain/services/ConflictAnalyzerService.js +++ b/src/domain/services/ConflictAnalyzerService.js @@ -945,6 +945,8 @@ function normalizeEffectPayload(_target, opType, canonOp) { EdgeAdd: () => ({ dot: canonOp['dot'] ?? null }), /** Extracts observed dots from an EdgeTombstone operation. */ EdgeTombstone: () => ({ observedDots: normalizeObservedDots(canonOp['observedDots']) }), + /** Extracts the value from a PropSet operation (legacy raw type). */ + PropSet: () => ({ value: canonOp['value'] ?? null }), /** Extracts the value from a NodePropSet operation. */ NodePropSet: () => ({ value: canonOp['value'] ?? null }), /** Extracts the value from an EdgePropSet operation. 
*/ From 07d0736f906fde8ec5374668ee3948202d8ee307 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 12:57:14 -0700 Subject: [PATCH 45/73] fix: repair 4 broken links from backlog migration MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - .github/maintainers/README.md: BACKLOG/README.md → docs/method/backlog/ - docs/ADVANCED_GUIDE.md: BACKLOG/OG-013/014 → current backlog paths - docs/README.md: docs/release.md → docs/method/release.md - Updated advanced guide shape test to match new backlog references --- .github/maintainers/README.md | 4 ++-- docs/ADVANCED_GUIDE.md | 4 ++-- docs/README.md | 2 +- test/unit/scripts/public-api-advanced-guide-shape.test.js | 4 ++-- 4 files changed, 7 insertions(+), 7 deletions(-) diff --git a/.github/maintainers/README.md b/.github/maintainers/README.md index 2fc18ccd..cf861474 100644 --- a/.github/maintainers/README.md +++ b/.github/maintainers/README.md @@ -18,8 +18,8 @@ evaluating or using the product API. ## Related project artifacts -- [Backlog](../../BACKLOG/README.md) - Active and promoted work tracked as repo-operating artifacts. +- [Backlog](../../docs/method/backlog/) + Lane-organized backlog items with legend prefixes. - [Design notes](../../docs/design/) Governing design docs for promoted backlog items and active cycles. - [Retrospectives](../../docs/archive/retrospectives/) diff --git a/docs/ADVANCED_GUIDE.md b/docs/ADVANCED_GUIDE.md index 3f6499d5..01b7f03f 100644 --- a/docs/ADVANCED_GUIDE.md +++ b/docs/ADVANCED_GUIDE.md @@ -206,8 +206,8 @@ That is not a law of physics. 
It is a good operating default until real measurem Current design backlog: -- [OG-013 out-of-core materialization and streaming reads](../BACKLOG/OG-013-out-of-core-materialization-and-streaming-reads.md) -- [OG-014 streaming content attachments](../BACKLOG/OG-014-streaming-content-attachments.md) +- [Out-of-core materialization](method/backlog/PERF_out-of-core-materialization.md) +- [Streaming graph traversal](method/backlog/cool-ideas/PERF_streaming-graph-traversal.md) ## Where next diff --git a/docs/README.md b/docs/README.md index 42e3d207..a5a1cf0a 100644 --- a/docs/README.md +++ b/docs/README.md @@ -42,7 +42,7 @@ when you need that level of detail. System structure, public/core boundaries, and internal layering. - [Roadmap](ROADMAP.md) Current committed release and milestone inventory. -- [Release Guide](release.md) +- [Release Guide](method/release.md) Release and preflight process. - [Trust Migration](trust/TRUST_MIGRATION.md) Migration path for signed trust evidence. diff --git a/test/unit/scripts/public-api-advanced-guide-shape.test.js b/test/unit/scripts/public-api-advanced-guide-shape.test.js index 710a2b34..fc4ad4f5 100644 --- a/test/unit/scripts/public-api-advanced-guide-shape.test.js +++ b/test/unit/scripts/public-api-advanced-guide-shape.test.js @@ -31,7 +31,7 @@ describe('Advanced Guide engine-room shape', () => { expect(advancedGuide).toContain("factKind: 'coordinate-transfer-plan'"); expect(advancedGuide).toContain('[API Reference](API_REFERENCE.md)'); expect(advancedGuide).toContain('[Architecture](ARCHITECTURE.md)'); - expect(advancedGuide).toContain('OG-013'); - expect(advancedGuide).toContain('OG-014'); + expect(advancedGuide).toContain('Out-of-core materialization'); + expect(advancedGuide).toContain('Streaming graph traversal'); }); }); From 217a7af2a7a2bbf6984056b6d12a6cc84bfafc4e Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 13:03:12 -0700 Subject: [PATCH 46/73] fix(lint): add language specifiers to bare fenced code 
blocks Markdown lint (MD040) requires all fenced code blocks to have a language identifier. Added `text` to 12 bare fences across: - docs/design/0003-safe-context/safe-context.md (8 blocks) - docs/method/legends/CLEAN_CODE.md (1 block) - docs/method/legends/NO_DOGS_NO_MASTERS.md (1 block) - METHOD.md (2 blocks) --- METHOD.md | 4 ++-- docs/design/0003-safe-context/safe-context.md | 16 ++++++++-------- docs/method/legends/CLEAN_CODE.md | 2 +- docs/method/legends/NO_DOGS_NO_MASTERS.md | 2 +- 4 files changed, 12 insertions(+), 12 deletions(-) diff --git a/METHOD.md b/METHOD.md index 942dcb2e..e62da7cd 100644 --- a/METHOD.md +++ b/METHOD.md @@ -76,7 +76,7 @@ work that matters, but does not currently belong in a named lane. Legend prefix if applicable. No numeric IDs. -``` +```text VIZ_braille-rendering.md PROTO_strand-lifecycle.md debt-trailer-codec-dts.md @@ -86,7 +86,7 @@ debt-trailer-codec-dts.md Pulled into a cycle, a backlog item becomes a design doc: -``` +```text backlog/asap/PROTO_strand-lifecycle.md → design//strand-lifecycle.md ``` diff --git a/docs/design/0003-safe-context/safe-context.md b/docs/design/0003-safe-context/safe-context.md index 2d59788e..598f7ee4 100644 --- a/docs/design/0003-safe-context/safe-context.md +++ b/docs/design/0003-safe-context/safe-context.md @@ -252,7 +252,7 @@ limits output to a single class or top-level declaration by name. **CLI text output:** -``` +```text src/domain/services/StrandService.js (2048 lines, javascript) exports: @@ -582,7 +582,7 @@ Tests are the spec. Playback questions map directly to test cases. 
### Policy tests (`policy.test.js`) -``` +```text safe_read("foo.gif") -> refused, reason: binary_extension safe_read("node_modules/x.js") -> refused, reason: generated_path safe_read("dist/bundle.js") -> refused, reason: generated_path @@ -613,7 +613,7 @@ state_load() with no prior save -> returns null ### Outline tests (`outline.test.js`) -``` +```text outline("large-class.js") -> has exports array -> has classes array with members @@ -650,7 +650,7 @@ outline("large-class.js", { focus: "StrandService" }) ### Capture tests (`capture.test.js`) -``` +```text run_capture("echo hello", 10) -> exitCode: 0 -> tail contains "hello" @@ -666,7 +666,7 @@ run_capture("seq 1 500", 5) ### State tests (`state.test.js`) -``` +```text state_save("# Working on X") -> file exists at .graft/WORKING_STATE.md -> content matches @@ -680,7 +680,7 @@ state_load() with no prior save ### MCP integration tests (`mcp.test.js`) -``` +```text spawn MCP server via stdio -> server lists all tools -> safe_read call returns valid response @@ -782,7 +782,7 @@ Diagnostic command for debugging policy behavior. git graft doctor ``` -``` +```text project root: /Users/james/git/git-stunts/git-warp (.git detected) line threshold: 150 byte threshold: 12,000 @@ -808,7 +808,7 @@ Minimal decision metrics. Not a dashboard — a quick summary. git graft stats ``` -``` +```text session decisions (since last clear): content: 12 reads passed through outline: 8 reads downgraded to outline diff --git a/docs/method/legends/CLEAN_CODE.md b/docs/method/legends/CLEAN_CODE.md index 67e971ca..2936d1ce 100644 --- a/docs/method/legends/CLEAN_CODE.md +++ b/docs/method/legends/CLEAN_CODE.md @@ -86,7 +86,7 @@ legend: `CC` — for backlog items that belong to this legend. 
-``` +```text CC_strand-service-decomposition.md CC_raw-error-purge.md CC_max-lines-ratchet.md diff --git a/docs/method/legends/NO_DOGS_NO_MASTERS.md b/docs/method/legends/NO_DOGS_NO_MASTERS.md index 7fe241fc..a78bec89 100644 --- a/docs/method/legends/NO_DOGS_NO_MASTERS.md +++ b/docs/method/legends/NO_DOGS_NO_MASTERS.md @@ -95,7 +95,7 @@ in one pass. A 2,000-LOC god object with 14 typedef vassals is not. `NDNM` — for backlog items that belong to this legend. -``` +```text NDNM_warpruntime-decomposition.md NDNM_typedef-tickreceipt.md NDNM_typedef-orset.md From 471cda608726ce9b5cccca2323f6ba7ecde29a90 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 13:08:12 -0700 Subject: [PATCH 47/73] fix: correct stale JSDoc import path in ComparisonController Three @typedef imports used '../services/JoinReducer.js' instead of './JoinReducer.js' (file is already in services/). The duplicate resolution caused a Deno AST panic during JSR dry-run publish. --- src/domain/services/ComparisonController.js | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index 16d2eee1..2979277d 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -57,7 +57,7 @@ const COORDINATE_TRANSFER_PLAN_VERSION = 'coordinate-transfer-plan/v1'; * * @typedef {Object} ResolvedComparisonSide * @property {Record} requested - Original requested selector - * @property {import('../services/JoinReducer.js').WarpStateV5} state - Materialized state + * @property {import('./JoinReducer.js').WarpStateV5} state - Materialized state * @property {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} patchEntries - Patch entries * @property {Record} resolved - Resolved metadata with digests */ @@ -604,7 +604,7 @@ function buildStrandMetadata(strandId, descriptor) { * @param {import('../WarpRuntime.js').default} graph * @param {{ * 
requested: Record, - * state: import('../services/JoinReducer.js').WarpStateV5, + * state: import('./JoinReducer.js').WarpStateV5, * patchEntries: Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>, * coordinateKind: 'frontier'|'strand'|'strand_base', * lamportCeiling: number|null, @@ -710,7 +710,7 @@ async function resolveStrandComparisonSide(graph, selector, scope) { const strandId = /** @type {string} */ (selector.strandId ?? ''); const strands = new StrandService({ graph }); const descriptor = await strands.getOrThrow(strandId); - const state = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (await callInternalRuntimeMethod( + const state = /** @type {import('./JoinReducer.js').WarpStateV5} */ (await callInternalRuntimeMethod( graph, 'materializeStrand', strandId, From cd41f998d9e8f0dc7edb01826c5b573fae052a3e Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 13:14:02 -0700 Subject: [PATCH 48/73] fix: deduplicate JSDoc import paths to work around Deno AST panic Deno's deno_ast panics when two inline JSDoc import() paths resolve to the same module at nearby byte offsets during publish --dry-run. Consolidated duplicate import('./JoinReducer.js').WarpStateV5 refs in SubscriptionController into a single top-level @typedef alias. 
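The consolidation described in this commit is a standard JSDoc pattern: hoist the repeated inline `import()` into one top-level `@typedef` alias. A minimal sketch follows, using this repo's `WarpStateV5` / `JoinReducer.js` names; `touchState` is a hypothetical helper added purely for illustration:

```javascript
// One top-level alias replaces every inline import() occurrence, so a
// publish-time module-path rewrite touches a single text span instead of
// several overlapping ones (the condition that triggered the deno_ast panic).
/** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */

/**
 * Hypothetical helper: the alias is referenced by bare name from here on.
 * @param {WarpStateV5} state
 * @returns {WarpStateV5}
 */
function touchState(state) {
  return state;
}
```

The type-level `import()` lives only inside comments, so nothing changes at runtime; the benefit is entirely in how tools rewrite the source.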
--- src/domain/services/SubscriptionController.js | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/src/domain/services/SubscriptionController.js b/src/domain/services/SubscriptionController.js index a999ec66..9bb30433 100644 --- a/src/domain/services/SubscriptionController.js +++ b/src/domain/services/SubscriptionController.js @@ -10,6 +10,8 @@ import { diffStates, isEmptyDiff } from './StateDiff.js'; import { matchGlob } from '../utils/matchGlob.js'; +/** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ + /** * @typedef {Object} Subscriber * @property {(diff: import('./StateDiff.js').StateDiffResult) => void} onChange @@ -21,7 +23,7 @@ import { matchGlob } from '../utils/matchGlob.js'; * The host interface that SubscriptionController depends on. * * @typedef {Object} SubscriptionHost - * @property {import('./JoinReducer.js').WarpStateV5|null} _cachedState + * @property {WarpStateV5|null} _cachedState * @property {Array<{onChange: Function, onError?: Function, pendingReplay?: boolean}>} _subscribers * @property {() => Promise} hasFrontierChanged * @property {(options?: Record) => Promise} materialize @@ -213,7 +215,7 @@ export default class SubscriptionController { * cached state was available. 
* * @param {import('./StateDiff.js').StateDiffResult} diff - * @param {import('./JoinReducer.js').WarpStateV5} currentState + * @param {WarpStateV5} currentState */ _notifySubscribers(diff, currentState) { for (const subscriber of /** @type {Subscriber[]} */ ([...this._host._subscribers])) { From 61defa48362aabfd575593dbcc994c5dc7fc00ce Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 13:21:16 -0700 Subject: [PATCH 49/73] fix: deduplicate inline JSDoc import() paths for Deno AST compatibility Add top-level @typedef aliases in three controller files to avoid repeated inline import() paths that trigger a Deno AST panic: - SubscriptionController.js: StateDiffResult, EdgeChange, PropSet, PropRemoved - ProvenanceController.js: WarpStateV5 - ComparisonController.js: WarpStateV5 --- src/domain/services/ComparisonController.js | 7 +++--- src/domain/services/ProvenanceController.js | 5 +++-- src/domain/services/SubscriptionController.js | 22 +++++++++++-------- 3 files changed, 20 insertions(+), 14 deletions(-) diff --git a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index 2979277d..e3b9e24a 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -38,6 +38,7 @@ const COORDINATE_TRANSFER_PLAN_VERSION = 'coordinate-transfer-plan/v1'; * @typedef {import('../../../index.js').StrandDescriptor} StrandDescriptorV1 * @typedef {import('../../../index.js').CoordinateComparisonV1} CoordinateComparisonV1 * @typedef {import('../../../index.js').CoordinateTransferPlanV1} CoordinateTransferPlanV1 + * @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 * @typedef {{ left: Record, right: Record, targetId?: string|null, scope?: VisibleStateScopeV1|null }} InternalCompareCoordinatesOptions * @typedef {{ source: Record, target: Record, scope?: VisibleStateScopeV1|null }} InternalPlanCoordinateTransferOptions */ @@ -57,7 +58,7 @@ const COORDINATE_TRANSFER_PLAN_VERSION = 
'coordinate-transfer-plan/v1'; * * @typedef {Object} ResolvedComparisonSide * @property {Record} requested - Original requested selector - * @property {import('./JoinReducer.js').WarpStateV5} state - Materialized state + * @property {WarpStateV5} state - Materialized state * @property {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} patchEntries - Patch entries * @property {Record} resolved - Resolved metadata with digests */ @@ -604,7 +605,7 @@ function buildStrandMetadata(strandId, descriptor) { * @param {import('../WarpRuntime.js').default} graph * @param {{ * requested: Record, - * state: import('./JoinReducer.js').WarpStateV5, + * state: WarpStateV5, * patchEntries: Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>, * coordinateKind: 'frontier'|'strand'|'strand_base', * lamportCeiling: number|null, @@ -710,7 +711,7 @@ async function resolveStrandComparisonSide(graph, selector, scope) { const strandId = /** @type {string} */ (selector.strandId ?? 
''); const strands = new StrandService({ graph }); const descriptor = await strands.getOrThrow(strandId); - const state = /** @type {import('./JoinReducer.js').WarpStateV5} */ (await callInternalRuntimeMethod( + const state = /** @type {WarpStateV5} */ (await callInternalRuntimeMethod( graph, 'materializeStrand', strandId, diff --git a/src/domain/services/ProvenanceController.js b/src/domain/services/ProvenanceController.js index 73444fe3..cd1180c0 100644 --- a/src/domain/services/ProvenanceController.js +++ b/src/domain/services/ProvenanceController.js @@ -12,6 +12,7 @@ import { createEmptyStateV5, reduceV5 } from './JoinReducer.js'; import { ProvenancePayload } from './ProvenancePayload.js'; import { decodePatchMessage, detectMessageKind } from './WarpMessageCodec.js'; +/** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ /** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ /** @@ -64,7 +65,7 @@ export default class ProvenanceController { * * @param {string} nodeId * @param {{receipts?: boolean}} [options] - * @returns {Promise<{state: import('./JoinReducer.js').WarpStateV5, patchCount: number, receipts?: import('../types/TickReceipt.js').TickReceipt[]}>} + * @returns {Promise<{state: WarpStateV5, patchCount: number, receipts?: import('../types/TickReceipt.js').TickReceipt[]}>} */ async materializeSlice(nodeId, options) { const host = this._host; @@ -107,7 +108,7 @@ export default class ProvenanceController { host._logTiming('materializeSlice', t0, { metrics: `${sortedPatches.length} patches` }); if (collectReceipts) { - const result = /** @type {{state: import('./JoinReducer.js').WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[]}} */ (reduceV5(sortedPatches, undefined, { receipts: true })); + const result = /** @type {{state: WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[]}} */ (reduceV5(sortedPatches, undefined, { receipts: true })); return { state: result.state, patchCount: 
sortedPatches.length, diff --git a/src/domain/services/SubscriptionController.js b/src/domain/services/SubscriptionController.js index 9bb30433..fc615409 100644 --- a/src/domain/services/SubscriptionController.js +++ b/src/domain/services/SubscriptionController.js @@ -11,10 +11,14 @@ import { diffStates, isEmptyDiff } from './StateDiff.js'; import { matchGlob } from '../utils/matchGlob.js'; /** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ +/** @typedef {import('./StateDiff.js').StateDiffResult} StateDiffResult */ +/** @typedef {import('./StateDiff.js').EdgeChange} EdgeChange */ +/** @typedef {import('./StateDiff.js').PropSet} PropSet */ +/** @typedef {import('./StateDiff.js').PropRemoved} PropRemoved */ /** * @typedef {Object} Subscriber - * @property {(diff: import('./StateDiff.js').StateDiffResult) => void} onChange + * @property {(diff: StateDiffResult) => void} onChange * @property {((error: unknown) => void)|undefined} [onError] * @property {boolean} pendingReplay */ @@ -51,7 +55,7 @@ export default class SubscriptionController { * fires `onChange` with a diff from empty state to current state. If * `_cachedState` is null, replay is deferred until the first materialize. * - * @param {{ onChange: (diff: import('./StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, replay?: boolean }} options + * @param {{ onChange: (diff: StateDiffResult) => void, onError?: (error: unknown) => void, replay?: boolean }} options * @returns {{ unsubscribe: () => void }} */ subscribe({ onChange, onError, replay = false }) { @@ -104,7 +108,7 @@ export default class SubscriptionController { * `hasFrontierChanged()` and auto-materializes if changed. 
* * @param {string|string[]} pattern - * @param {{ onChange: (diff: import('./StateDiff.js').StateDiffResult) => void, onError?: (error: unknown) => void, poll?: number }} options + * @param {{ onChange: (diff: StateDiffResult) => void, onError?: (error: unknown) => void, poll?: number }} options * @returns {{ unsubscribe: () => void }} */ watch(pattern, { onChange, onError, poll }) { @@ -127,7 +131,7 @@ export default class SubscriptionController { /** * Filtered onChange that only passes matching changes. - * @param {import('./StateDiff.js').StateDiffResult} diff + * @param {StateDiffResult} diff */ const filteredOnChange = (diff) => { const filteredDiff = { @@ -136,12 +140,12 @@ export default class SubscriptionController { removed: diff.nodes.removed.filter(matchesPattern), }, edges: { - added: diff.edges.added.filter((/** @type {import('./StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), - removed: diff.edges.removed.filter((/** @type {import('./StateDiff.js').EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), + added: diff.edges.added.filter((/** @type {EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), + removed: diff.edges.removed.filter((/** @type {EdgeChange} */ e) => matchesPattern(e.from) || matchesPattern(e.to)), }, props: { - set: diff.props.set.filter((/** @type {import('./StateDiff.js').PropSet} */ p) => matchesPattern(p.nodeId)), - removed: diff.props.removed.filter((/** @type {import('./StateDiff.js').PropRemoved} */ p) => matchesPattern(p.nodeId)), + set: diff.props.set.filter((/** @type {PropSet} */ p) => matchesPattern(p.nodeId)), + removed: diff.props.removed.filter((/** @type {PropRemoved} */ p) => matchesPattern(p.nodeId)), }, }; @@ -214,7 +218,7 @@ export default class SubscriptionController { * Handles deferred replay for subscribers added with `replay: true` before * cached state was available. 
* - * @param {import('./StateDiff.js').StateDiffResult} diff + * @param {StateDiffResult} diff * @param {WarpStateV5} currentState */ _notifySubscribers(diff, currentState) { From 9546707bd08d61b0bc77443ca57a17eff7584191 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 13:39:06 -0700 Subject: [PATCH 50/73] fix: use @import JSDoc tag to fix Deno AST panic on JSR publish MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Deno 2.6.7 deno_ast panics when multiple @typedef declarations import from the same module path (overlapping text changes during path rewriting). The @import tag (TypeScript 5.5+) consolidates multiple type imports into a single declaration per module, producing exactly one text change per import path. Converted: - SubscriptionController: 5 @typedef → 2 @import (JoinReducer, StateDiff) - ComparisonController: 9 @typedef → 2 @import (index.js, JoinReducer) - ProvenanceController: 2 @typedef → 2 @import (JoinReducer, WarpTypesV2) Removed unused VisibleStateScopePrefixFilterV1 import (TS6196). 
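The `@import` form this commit adopts (a JSDoc tag added in TypeScript 5.5) can be sketched as follows. The names come from `StateDiff.js` in this repo; `countEdgeChanges` and the diff literal in the comment are hypothetical illustrations:

```javascript
// A single @import declaration per module. TypeScript 5.5+ treats it like
// `import type { ... }` in a .ts file, so a module-specifier rewrite now
// touches exactly one comment instead of N separate @typedef lines.
/** @import { StateDiffResult, EdgeChange } from './StateDiff.js' */

/**
 * Hypothetical consumer using the imported names directly.
 * @param {StateDiffResult} diff
 * @returns {number} total count of edge additions and removals
 */
function countEdgeChanges(diff) {
  return diff.edges.added.length + diff.edges.removed.length;
}
```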
--- src/domain/services/ComparisonController.js | 12 +++--------- src/domain/services/ProvenanceController.js | 4 ++-- src/domain/services/SubscriptionController.js | 7 ++----- 3 files changed, 7 insertions(+), 16 deletions(-) diff --git a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index e3b9e24a..4dd4e006 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -29,16 +29,10 @@ import { callInternalRuntimeMethod } from '../utils/callInternalRuntimeMethod.js const COORDINATE_COMPARISON_VERSION = 'coordinate-compare/v1'; const COORDINATE_TRANSFER_PLAN_VERSION = 'coordinate-transfer-plan/v1'; +/** @import { VisibleStateScopeV1, VisibleStateReaderV5, CoordinateComparisonSelectorV1, CoordinateTransferPlanSelectorV1, CoordinateComparisonV1, CoordinateTransferPlanV1, StrandDescriptor as StrandDescriptorV1 } from '../../../index.js' */ +/** @import { WarpStateV5 } from './JoinReducer.js' */ + /** - * @typedef {import('../../../index.js').VisibleStateScopePrefixFilterV1} VisibleStateScopePrefixFilterV1 - * @typedef {import('../../../index.js').VisibleStateScopeV1} VisibleStateScopeV1 - * @typedef {import('../../../index.js').VisibleStateReaderV5} VisibleStateReaderV5 - * @typedef {import('../../../index.js').CoordinateComparisonSelectorV1} CoordinateComparisonSelectorV1 - * @typedef {import('../../../index.js').CoordinateTransferPlanSelectorV1} CoordinateTransferPlanSelectorV1 - * @typedef {import('../../../index.js').StrandDescriptor} StrandDescriptorV1 - * @typedef {import('../../../index.js').CoordinateComparisonV1} CoordinateComparisonV1 - * @typedef {import('../../../index.js').CoordinateTransferPlanV1} CoordinateTransferPlanV1 - * @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 * @typedef {{ left: Record, right: Record, targetId?: string|null, scope?: VisibleStateScopeV1|null }} InternalCompareCoordinatesOptions * @typedef {{ source: Record, target: 
Record, scope?: VisibleStateScopeV1|null }} InternalPlanCoordinateTransferOptions */ diff --git a/src/domain/services/ProvenanceController.js b/src/domain/services/ProvenanceController.js index cd1180c0..9767ea2b 100644 --- a/src/domain/services/ProvenanceController.js +++ b/src/domain/services/ProvenanceController.js @@ -12,8 +12,8 @@ import { createEmptyStateV5, reduceV5 } from './JoinReducer.js'; import { ProvenancePayload } from './ProvenancePayload.js'; import { decodePatchMessage, detectMessageKind } from './WarpMessageCodec.js'; -/** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ -/** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ +/** @import { WarpStateV5 } from './JoinReducer.js' */ +/** @import { PatchV2 } from '../types/WarpTypesV2.js' */ /** * The host interface that ProvenanceController depends on. diff --git a/src/domain/services/SubscriptionController.js b/src/domain/services/SubscriptionController.js index fc615409..0ebcfeda 100644 --- a/src/domain/services/SubscriptionController.js +++ b/src/domain/services/SubscriptionController.js @@ -10,11 +10,8 @@ import { diffStates, isEmptyDiff } from './StateDiff.js'; import { matchGlob } from '../utils/matchGlob.js'; -/** @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ -/** @typedef {import('./StateDiff.js').StateDiffResult} StateDiffResult */ -/** @typedef {import('./StateDiff.js').EdgeChange} EdgeChange */ -/** @typedef {import('./StateDiff.js').PropSet} PropSet */ -/** @typedef {import('./StateDiff.js').PropRemoved} PropRemoved */ +/** @import { WarpStateV5 } from './JoinReducer.js' */ +/** @import { StateDiffResult, EdgeChange, PropSet, PropRemoved } from './StateDiff.js' */ /** * @typedef {Object} Subscriber From b4dd4beb5b7db1dc246aefa01d66656d70a20b4f Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 14:06:10 -0700 Subject: [PATCH 51/73] refactor(NDNM): promote WarpStateV5 from typedef to class MIME-Version: 1.0 Content-Type: 
text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit WarpStateV5 is now a real JavaScript class in its own module (src/domain/services/WarpStateV5.js). This eliminates the most duplicated JSDoc import path in the codebase — 30+ files imported WarpStateV5 from JoinReducer.js, causing Deno AST panics during JSR publish when combined with other JoinReducer type imports. The class provides: - Constructor with named fields - static empty() factory (replaces createEmptyStateV5 internals) - clone() method (replaces cloneStateV5 internals) cloneStateV5() now throws PatchError if passed a non-instance, enforcing that all state construction goes through the class. JoinReducer re-exports the class for backward compatibility. Four construction sites updated: JoinReducer.joinStates, CheckpointSerializerV5.deserializeFullStateV5, CheckpointService.materializeIncremental, VisibleStateScopeV1.scopeMaterializedStateV5. Closes bad-code/PROTO_typedef-warpstatev5-to-class. --- src/domain/WarpCore.js | 14 +-- src/domain/services/BitmapIndexReader.js | 10 ++- src/domain/services/CheckpointSerializerV5.js | 5 +- src/domain/services/CheckpointService.js | 3 +- src/domain/services/ComparisonController.js | 4 +- .../services/ConflictAnalyzerService.js | 4 +- src/domain/services/ForkController.js | 4 +- src/domain/services/GraphTraversal.js | 5 +- src/domain/services/JoinReducer.js | 45 ++++------ .../services/MaterializedViewService.js | 2 +- src/domain/services/Observer.js | 3 +- src/domain/services/PatchBuilderV2.js | 4 +- src/domain/services/StrandController.js | 2 +- src/domain/services/StrandService.js | 7 +- src/domain/services/SyncController.js | 4 +- .../services/VisibleStateComparisonV5.js | 8 +- src/domain/services/VisibleStateScopeV1.js | 5 +- .../services/VisibleStateTransferPlannerV5.js | 8 +- src/domain/services/WarpStateV5.js | 86 +++++++++++++++++++ src/domain/services/Worldline.js | 19 ++-- src/domain/types/WarpPersistence.js | 4 +- src/domain/utils/defaultCrypto.js | 
6 +- .../warp/materializeAdvanced.methods.js | 6 +- .../services/VisibleStateScopeV1.test.js | 5 +- 24 files changed, 169 insertions(+), 94 deletions(-) create mode 100644 src/domain/services/WarpStateV5.js diff --git a/src/domain/WarpCore.js b/src/domain/WarpCore.js index 1c7baee8..35078c9b 100644 --- a/src/domain/WarpCore.js +++ b/src/domain/WarpCore.js @@ -7,6 +7,8 @@ import { } from './services/CoordinateFactExport.js'; import { computeChecksum } from './utils/checksumUtils.js'; + +/** @import { CoordinateComparisonSelectorV1, CoordinateComparisonV1, CoordinateTransferPlanSelectorV1, CoordinateTransferPlanV1, CryptoPort, StrandBraidOptions, StrandCreateOptions, StrandDescriptor, StrandIntentDescriptor, StrandTickRecord, VisibleStateScopeV1 } from '../../index.js' */ /** @typedef {Parameters[1]} InternalBraidStrandOptions */ /** @typedef {Parameters[1]} InternalMaterializeStrandOptions */ /** @typedef {Parameters[1]} InternalCompareStrandOptions */ @@ -14,17 +16,7 @@ import { computeChecksum } from './utils/checksumUtils.js'; /** @typedef {Parameters[0]} InternalCompareCoordinatesOptions */ /** @typedef {Parameters[0]} InternalPlanCoordinateTransferOptions */ /** @typedef {Parameters[0]} InternalConflictAnalyzeOptions */ -/** @typedef {import('../../index.js').CoordinateComparisonV1} CoordinateComparisonV1 */ -/** @typedef {import('../../index.js').CoordinateTransferPlanV1} CoordinateTransferPlanV1 */ -/** @typedef {import('../../index.js').CoordinateComparisonSelectorV1} CoordinateComparisonSelectorV1 */ -/** @typedef {import('../../index.js').CoordinateTransferPlanSelectorV1} CoordinateTransferPlanSelectorV1 */ -/** @typedef {import('../../index.js').VisibleStateScopeV1} VisibleStateScopeV1 */ -/** @typedef {import('../../index.js').CryptoPort} CryptoPort */ -/** @typedef {import('../../index.js').StrandCreateOptions} StrandCreateOptions */ -/** @typedef {import('../../index.js').StrandBraidOptions} StrandBraidOptions */ -/** @typedef 
{import('../../index.js').StrandDescriptor} StrandDescriptor */ -/** @typedef {import('../../index.js').StrandIntentDescriptor} StrandIntentDescriptor */ -/** @typedef {import('../../index.js').StrandTickRecord} StrandTickRecord */ + /** * Refreshes the comparison digest for a coordinate comparison result. diff --git a/src/domain/services/BitmapIndexReader.js b/src/domain/services/BitmapIndexReader.js index 3060ecdc..096ce891 100644 --- a/src/domain/services/BitmapIndexReader.js +++ b/src/domain/services/BitmapIndexReader.js @@ -7,12 +7,16 @@ import { canonicalStringify } from '../utils/canonicalStringify.js'; import { isValidShardOid } from '../utils/validateShardOid.js'; import { base64Decode } from '../utils/bytes.js'; + +/** @import { default as CryptoPort } from '../../ports/CryptoPort.js' */ +/** @import { default as LoggerPort } from '../../ports/LoggerPort.js' */ +/** @import { RoaringBitmapSubset as BitmapShard } from '../utils/roaring.js' */ /** @typedef {import('../../ports/IndexStoragePort.js').default} IndexStoragePort */ /** @typedef {import('../types/WarpPersistence.js').IndexStorage} IndexStorage */ -/** @typedef {import('../../ports/LoggerPort.js').default} LoggerPort */ -/** @typedef {import('../../ports/CryptoPort.js').default} CryptoPort */ + + /** @typedef {Record} JsonShard */ -/** @typedef {import('../utils/roaring.js').RoaringBitmapSubset} BitmapShard */ + /** @typedef {JsonShard | BitmapShard} LoadedShard */ /** diff --git a/src/domain/services/CheckpointSerializerV5.js b/src/domain/services/CheckpointSerializerV5.js index fe7802c3..09227d63 100644 --- a/src/domain/services/CheckpointSerializerV5.js +++ b/src/domain/services/CheckpointSerializerV5.js @@ -17,6 +17,7 @@ import { orsetSerialize, orsetDeserialize } from '../crdt/ORSet.js'; import { vvSerialize, vvDeserialize } from '../crdt/VersionVector.js'; import { decodeDot } from '../crdt/Dot.js'; import { createEmptyStateV5 } from './JoinReducer.js'; +import WarpStateV5 from 
'./WarpStateV5.js'; // ============================================================================ // Full State Serialization (for Checkpoints) @@ -108,13 +109,13 @@ export function deserializeFullStateV5(buffer, { codec: codecOpt } = {}) { if (obj['version'] !== undefined && obj['version'] !== 'full-v5') { throw new Error(`Unsupported full state version: expected 'full-v5', got '${JSON.stringify(obj['version'])}'`); } - return { + return new WarpStateV5({ nodeAlive: orsetDeserialize(obj['nodeAlive'] ?? {}), edgeAlive: orsetDeserialize(obj['edgeAlive'] ?? {}), prop: deserializeProps(/** @type {[string, unknown][]} */ (obj['prop'])), observedFrontier: vvDeserialize(/** @type {{[x: string]: number}} */ (obj['observedFrontier'] ?? {})), edgeBirthEvent: /** @type {Map} */ (deserializeEdgeBirthEvent(obj)), - }; + }); } // ============================================================================ diff --git a/src/domain/services/CheckpointService.js b/src/domain/services/CheckpointService.js index 61370498..28be85f2 100644 --- a/src/domain/services/CheckpointService.js +++ b/src/domain/services/CheckpointService.js @@ -25,6 +25,7 @@ import { createORSet, orsetAdd, orsetCompact } from '../crdt/ORSet.js'; import { createDot } from '../crdt/Dot.js'; import { createVersionVector } from '../crdt/VersionVector.js'; import { cloneStateV5, reduceV5 } from './JoinReducer.js'; +import WarpStateV5 from './WarpStateV5.js'; import { encodeEdgeKey, encodePropKey, CONTENT_PROPERTY_KEY, decodePropKey, isEdgePropKey, decodeEdgePropKey } from './KeyCodec.js'; import { ProvenanceIndex } from './ProvenanceIndex.js'; @@ -563,5 +564,5 @@ export function reconstructStateV5FromCheckpoint(visibleProjection) { edgeBirthEvent.set(edgeKey, { lamport: 0, writerId: '', patchSha: '0000', opIndex: 0 }); } - return { nodeAlive, edgeAlive, prop, observedFrontier, edgeBirthEvent }; + return new WarpStateV5({ nodeAlive, edgeAlive, prop, observedFrontier, edgeBirthEvent }); } diff --git 
a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index 4dd4e006..f63f5d61 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -26,6 +26,8 @@ import StrandService from './StrandService.js'; import { computeChecksum } from '../utils/checksumUtils.js'; import { callInternalRuntimeMethod } from '../utils/callInternalRuntimeMethod.js'; + +/** @import { default as ComparisonHost } from '../WarpRuntime.js' */ const COORDINATE_COMPARISON_VERSION = 'coordinate-compare/v1'; const COORDINATE_TRANSFER_PLAN_VERSION = 'coordinate-transfer-plan/v1'; @@ -1084,7 +1086,7 @@ async function compareCoordinatesImpl(graph, options) { /** * The host interface that ComparisonController depends on. * - * @typedef {import('../WarpRuntime.js').default} ComparisonHost + */ export default class ComparisonController { diff --git a/src/domain/services/ConflictAnalyzerService.js b/src/domain/services/ConflictAnalyzerService.js index f86a3632..5da87409 100644 --- a/src/domain/services/ConflictAnalyzerService.js +++ b/src/domain/services/ConflictAnalyzerService.js @@ -15,8 +15,10 @@ import { createEventId } from '../utils/EventId.js'; import { decodeEdgeKey } from './KeyCodec.js'; import StrandService from './StrandService.js'; + +/** @import { PatchV2 } from '../types/WarpTypesV2.js' */ /** @typedef {import('../WarpRuntime.js').default} WarpRuntime */ -/** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ + /** @typedef {import('../types/TickReceipt.js').TickReceipt} TickReceipt */ /** @typedef {import('../utils/EventId.js').EventId} EventId */ diff --git a/src/domain/services/ForkController.js b/src/domain/services/ForkController.js index d7893273..b4b30ddf 100644 --- a/src/domain/services/ForkController.js +++ b/src/domain/services/ForkController.js @@ -12,12 +12,14 @@ import { validateGraphName, validateWriterId, buildWriterRef, buildWritersPrefix import { generateWriterId } from 
'../utils/WriterId.js'; import { createWormhole as createWormholeImpl } from './WormholeService.js'; + +/** @import { default as ForkHost } from '../WarpRuntime.js' */ const DEFAULT_ADJACENCY_CACHE_SIZE = 3; /** * The host interface that ForkController depends on. * - * @typedef {import('../WarpRuntime.js').default} ForkHost + */ export default class ForkController { diff --git a/src/domain/services/GraphTraversal.js b/src/domain/services/GraphTraversal.js index 7140cda3..227ba455 100644 --- a/src/domain/services/GraphTraversal.js +++ b/src/domain/services/GraphTraversal.js @@ -42,11 +42,8 @@ import MinHeap from '../utils/MinHeap.js'; import LRUCache from '../utils/LRUCache.js'; import { checkAborted } from '../utils/cancellation.js'; -/** @typedef {import('../../ports/NeighborProviderPort.js').default} NeighborProviderPort */ -/** @typedef {import('../../ports/NeighborProviderPort.js').Direction} Direction */ -/** @typedef {import('../../ports/NeighborProviderPort.js').NeighborEdge} NeighborEdge */ -/** @typedef {import('../../ports/NeighborProviderPort.js').NeighborOptions} NeighborOptions */ +/** @import { Direction, NeighborEdge, NeighborOptions, default as NeighborProviderPort } from '../../ports/NeighborProviderPort.js' */ /** * @typedef {Object} TraversalStats * @property {number} nodesVisited diff --git a/src/domain/services/JoinReducer.js b/src/domain/services/JoinReducer.js index faaad4a5..e1d24155 100644 --- a/src/domain/services/JoinReducer.js +++ b/src/domain/services/JoinReducer.js @@ -9,8 +9,8 @@ * } */ -import { createORSet, orsetAdd, orsetRemove, orsetJoin, orsetContains, orsetClone } from '../crdt/ORSet.js'; -import { createVersionVector, vvMerge, vvClone, vvDeserialize } from '../crdt/VersionVector.js'; +import { orsetAdd, orsetRemove, orsetJoin, orsetContains } from '../crdt/ORSet.js'; +import { vvMerge, vvDeserialize } from '../crdt/VersionVector.js'; import { lwwSet, lwwMax } from '../crdt/LWW.js'; import { createEventId, compareEventIds } 
from '../utils/EventId.js'; import { createTickReceipt, OP_TYPES } from '../types/TickReceipt.js'; @@ -19,6 +19,9 @@ import { encodeEdgeKey, decodeEdgeKey, encodePropKey, encodeEdgePropKey, EDGE_PR import { normalizeRawOp } from './OpNormalizer.js'; import { createEmptyDiff, mergeDiffs } from '../types/PatchDiff.js'; import PatchError from '../errors/PatchError.js'; +import WarpStateV5 from './WarpStateV5.js'; + +export { default as WarpStateV5 } from './WarpStateV5.js'; // Re-export key codec functions for backward compatibility export { @@ -31,18 +34,7 @@ export { // Re-export op normalization for consumers that operate on raw patches export { normalizeRawOp, lowerCanonicalOp } from './OpNormalizer.js'; -/** - * @typedef {Object} WarpStateV5 - * @property {import('../crdt/ORSet.js').ORSet} nodeAlive - ORSet of alive nodes - * @property {import('../crdt/ORSet.js').ORSet} edgeAlive - ORSet of alive edges - * @property {Map>} prop - Properties with LWW - * @property {import('../crdt/VersionVector.js').VersionVector} observedFrontier - Observed version vector - * @property {Map} edgeBirthEvent - EdgeKey → EventId of most recent EdgeAdd (for clean-slate prop visibility). - * Always present at runtime (initialized to empty Map by createEmptyStateV5 and - * deserializeFullStateV5). Edge birth events were introduced in a later schema - * version; older checkpoints serialize without this field, but the deserializer - * always produces an empty Map for them. 
- */ +// WarpStateV5 class imported from ./WarpStateV5.js (re-exported above) /** * @typedef {Object} OpLike @@ -79,13 +71,7 @@ export { normalizeRawOp, lowerCanonicalOp } from './OpNormalizer.js'; * @returns {WarpStateV5} A fresh, empty WARP state ready for patch application */ export function createEmptyStateV5() { - return { - nodeAlive: createORSet(), - edgeAlive: createORSet(), - prop: new Map(), - observedFrontier: createVersionVector(), - edgeBirthEvent: new Map(), - }; + return WarpStateV5.empty(); } /** @@ -968,13 +954,13 @@ export function join(state, patch, patchSha, collectReceipts) { * @returns {WarpStateV5} New state representing the join of a and b */ export function joinStates(a, b) { - return { + return new WarpStateV5({ nodeAlive: orsetJoin(a.nodeAlive, b.nodeAlive), edgeAlive: orsetJoin(a.edgeAlive, b.edgeAlive), prop: mergeProps(a.prop, b.prop), observedFrontier: vvMerge(a.observedFrontier, b.observedFrontier), edgeBirthEvent: mergeEdgeBirthEvent(a.edgeBirthEvent, b.edgeBirthEvent), - }; + }); } /** @@ -1102,11 +1088,10 @@ export function reduceV5(patches, initialState, options) { * @returns {WarpStateV5} A new state with identical contents but independent data structures */ export function cloneStateV5(state) { - return { - nodeAlive: orsetClone(state.nodeAlive), - edgeAlive: orsetClone(state.edgeAlive), - prop: new Map(state.prop), - observedFrontier: vvClone(state.observedFrontier), - edgeBirthEvent: new Map(state.edgeBirthEvent ?? 
[]), - }; + if (!(state instanceof WarpStateV5)) { + throw new PatchError('cloneStateV5: expected WarpStateV5 instance', { + context: { actual: typeof state }, + }); + } + return state.clone(); } diff --git a/src/domain/services/MaterializedViewService.js b/src/domain/services/MaterializedViewService.js index 9507ef0c..b3ba815b 100644 --- a/src/domain/services/MaterializedViewService.js +++ b/src/domain/services/MaterializedViewService.js @@ -155,7 +155,7 @@ function sampleNodes(allNodes, sampleRate, seed) { /** * Builds adjacency maps from state for ground-truth verification. * - * @param {import('../services/JoinReducer.js').WarpStateV5} state + * @param {import('./JoinReducer.js').WarpStateV5} state * @returns {{ outgoing: Map>, incoming: Map> }} */ function buildGroundTruthAdjacency(state) { diff --git a/src/domain/services/Observer.js b/src/domain/services/Observer.js index 01b16e6c..65c05018 100644 --- a/src/domain/services/Observer.js +++ b/src/domain/services/Observer.js @@ -15,8 +15,9 @@ import { createStateReaderV5 } from './StateReaderV5.js'; import { orsetContains, orsetElements } from '../crdt/ORSet.js'; import { decodeEdgeKey } from './KeyCodec.js'; import { matchGlob } from '../utils/matchGlob.js'; -/** @typedef {import('../../../index.js').WorldlineSource} WorldlineSource */ + +/** @import { WorldlineSource } from '../../../index.js' */ /** * Clones an observer worldline source descriptor, producing an independent copy. * @param {{ diff --git a/src/domain/services/PatchBuilderV2.js b/src/domain/services/PatchBuilderV2.js index 370316b1..02063cba 100644 --- a/src/domain/services/PatchBuilderV2.js +++ b/src/domain/services/PatchBuilderV2.js @@ -164,7 +164,7 @@ export class PatchBuilderV2 { /** * Creates a new PatchBuilderV2. 
* - * @param {{ persistence: import('../../ports/CommitPort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default & import('../../ports/RefPort.js').default, graphName: string, writerId: string, lamport: number, versionVector: import('../crdt/VersionVector.js').VersionVector, getCurrentState: () => import('../services/JoinReducer.js').WarpStateV5 | null, expectedParentSha?: string|null, targetRefPath?: string, onCommitSuccess?: ((result: {patch: import('../types/WarpTypesV2.js').PatchV2, sha: string}) => void | Promise)|null, onDeleteWithData?: 'reject'|'cascade'|'warn', codec?: import('../../ports/CodecPort.js').default, logger?: import('../../ports/LoggerPort.js').default, blobStorage?: import('../../ports/BlobStoragePort.js').default, patchBlobStorage?: import('../../ports/BlobStoragePort.js').default }} options + * @param {{ persistence: import('../../ports/CommitPort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default & import('../../ports/RefPort.js').default, graphName: string, writerId: string, lamport: number, versionVector: import('../crdt/VersionVector.js').VersionVector, getCurrentState: () => import('./JoinReducer.js').WarpStateV5 | null, expectedParentSha?: string|null, targetRefPath?: string, onCommitSuccess?: ((result: {patch: import('../types/WarpTypesV2.js').PatchV2, sha: string}) => void | Promise)|null, onDeleteWithData?: 'reject'|'cascade'|'warn', codec?: import('../../ports/CodecPort.js').default, logger?: import('../../ports/LoggerPort.js').default, blobStorage?: import('../../ports/BlobStoragePort.js').default, patchBlobStorage?: import('../../ports/BlobStoragePort.js').default }} options */ constructor({ persistence, graphName, writerId, lamport, versionVector, getCurrentState, expectedParentSha = null, targetRefPath, onCommitSuccess = null, onDeleteWithData = 'warn', codec, logger, blobStorage, patchBlobStorage }) { /** @type 
{import('../../ports/CommitPort.js').default & import('../../ports/BlobPort.js').default & import('../../ports/TreePort.js').default & import('../../ports/RefPort.js').default} */ @@ -187,7 +187,7 @@ export class PatchBuilderV2 { /** @type {import('../crdt/VersionVector.js').VersionVector} */ this._vv = vvClone(versionVector); // Clone to track local increments - /** @type {() => import('../services/JoinReducer.js').WarpStateV5 | null} */ + /** @type {() => import('./JoinReducer.js').WarpStateV5 | null} */ this._getCurrentState = getCurrentState; /** diff --git a/src/domain/services/StrandController.js b/src/domain/services/StrandController.js index bea98445..f0e5880b 100644 --- a/src/domain/services/StrandController.js +++ b/src/domain/services/StrandController.js @@ -89,7 +89,7 @@ export default class StrandController { * Materializes the graph state scoped to a single strand. * @param {string} strandId * @param {{ receipts?: boolean, ceiling?: number|null }} [options] - * @returns {Promise} + * @returns {Promise} */ async materializeStrand(strandId, options) { return await this._strandService.materialize(strandId, options); diff --git a/src/domain/services/StrandService.js b/src/domain/services/StrandService.js index 4e57177b..670761dd 100644 --- a/src/domain/services/StrandService.js +++ b/src/domain/services/StrandService.js @@ -27,8 +27,9 @@ import { createImmutableValue, createImmutableWarpStateV5 } from './ImmutableSna import { ProvenanceIndex } from './ProvenanceIndex.js'; import { encodePatchMessage } from './WarpMessageCodec.js'; -/** @typedef {import('../WarpRuntime.js').default} WarpRuntime */ -/** @typedef {import('../types/WarpTypesV2.js').PatchV2} PatchV2 */ + +/** @import { default as WarpRuntime } from '../WarpRuntime.js' */ +/** @import { PatchV2 } from '../types/WarpTypesV2.js' */ /** * @typedef {{ * strandId: string, @@ -1050,7 +1051,7 @@ export default class StrandService { * * @param {string} strandId * @param {{ receipts?: boolean, ceiling?: 
number|null }} [options] - * @returns {Promise} + * @returns {Promise} */ async materialize(strandId, options = {}) { const detached = await openDetachedReadGraph(this._graph); diff --git a/src/domain/services/SyncController.js b/src/domain/services/SyncController.js index c8d3ce3a..e9768f55 100644 --- a/src/domain/services/SyncController.js +++ b/src/domain/services/SyncController.js @@ -38,7 +38,7 @@ import SyncTrustGate from './SyncTrustGate.js'; * in unit tests. * * @typedef {Object} SyncHost - * @property {import('../services/JoinReducer.js').WarpStateV5|null} _cachedState + * @property {import('./JoinReducer.js').WarpStateV5|null} _cachedState * @property {Map|null} _lastFrontier * @property {boolean} _stateDirty * @property {number} _patchesSinceGC @@ -52,7 +52,7 @@ import SyncTrustGate from './SyncTrustGate.js'; * @property {number} _patchesSinceCheckpoint * @property {(op: string, t0: number, opts?: {metrics?: string, error?: Error}) => void} _logTiming * @property {(options?: Record) => Promise} materialize - * @property {(state: import('../services/JoinReducer.js').WarpStateV5) => Promise} _setMaterializedState + * @property {(state: import('./JoinReducer.js').WarpStateV5) => Promise} _setMaterializedState * @property {() => Promise} discoverWriters * @property {((trust: { mode?: 'off'|'log-only'|'enforce', pin?: string|null }|undefined|null) => SyncTrustGate|null)} [_createSyncTrustGate] */ diff --git a/src/domain/services/VisibleStateComparisonV5.js b/src/domain/services/VisibleStateComparisonV5.js index 197db966..f3975b88 100644 --- a/src/domain/services/VisibleStateComparisonV5.js +++ b/src/domain/services/VisibleStateComparisonV5.js @@ -1,13 +1,13 @@ import { canonicalStringify } from '../utils/canonicalStringify.js'; import { createStateReaderV5 } from './StateReaderV5.js'; + +/** @import { VisibleNodeViewV5, VisibleStateComparisonV5, VisibleStateNeighborV5, VisibleStateReaderV5 } from '../../../index.js' */ export const 
VISIBLE_STATE_COMPARISON_VERSION = 'visible-state-compare/v1'; /** - * @typedef {import('../../../index.js').VisibleStateReaderV5} VisibleStateReaderV5 - * @typedef {import('../../../index.js').VisibleNodeViewV5} VisibleNodeViewV5 - * @typedef {import('../../../index.js').VisibleStateNeighborV5} VisibleStateNeighborV5 - * @typedef {import('../../../index.js').VisibleStateComparisonV5} VisibleStateComparisonV5 + + * @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ diff --git a/src/domain/services/VisibleStateScopeV1.js b/src/domain/services/VisibleStateScopeV1.js index cb2b5e41..47447a5f 100644 --- a/src/domain/services/VisibleStateScopeV1.js +++ b/src/domain/services/VisibleStateScopeV1.js @@ -1,6 +1,7 @@ import QueryError from '../errors/QueryError.js'; import { createORSet, orsetContains } from '../crdt/ORSet.js'; import { vvClone } from '../crdt/VersionVector.js'; +import WarpStateV5 from './WarpStateV5.js'; import { normalizeRawOp } from './OpNormalizer.js'; import { decodeEdgeKey, @@ -369,13 +370,13 @@ export function scopeMaterializedStateV5(state, scope) { state.edgeAlive.tombstones, ); - return { + return new WarpStateV5({ nodeAlive: scopedNodeAlive, edgeAlive: scopedEdgeAlive, prop: collectScopedProps(state, scopedNodeIds, scopedEdgeKeys), observedFrontier: vvClone(state.observedFrontier), edgeBirthEvent: collectScopedEdgeBirthEvents(state, scopedEdgeKeys), - }; + }); } /** diff --git a/src/domain/services/VisibleStateTransferPlannerV5.js b/src/domain/services/VisibleStateTransferPlannerV5.js index eff5f21a..1d1d540e 100644 --- a/src/domain/services/VisibleStateTransferPlannerV5.js +++ b/src/domain/services/VisibleStateTransferPlannerV5.js @@ -5,14 +5,10 @@ import { } from './KeyCodec.js'; import { canonicalStringify } from '../utils/canonicalStringify.js'; + +/** @import { ContentMeta, VisibleStateReaderV5, VisibleStateTransferOperationV1, VisibleStateTransferPlanSummaryV1 } from '../../../index.js' */ export const 
VISIBLE_STATE_TRANSFER_PLAN_VERSION = 'visible-state-transfer-plan/v1'; -/** - * @typedef {import('../../../index.js').VisibleStateReaderV5} VisibleStateReaderV5 - * @typedef {import('../../../index.js').ContentMeta} ContentMeta - * @typedef {import('../../../index.js').VisibleStateTransferOperationV1} VisibleStateTransferOperationV1 - * @typedef {import('../../../index.js').VisibleStateTransferPlanSummaryV1} VisibleStateTransferPlanSummaryV1 - */ const ATTACHMENT_PROPERTY_KEYS = new Set([ CONTENT_PROPERTY_KEY, diff --git a/src/domain/services/WarpStateV5.js b/src/domain/services/WarpStateV5.js new file mode 100644 index 00000000..b0b2b5af --- /dev/null +++ b/src/domain/services/WarpStateV5.js @@ -0,0 +1,86 @@ +/** + * WarpStateV5 — the core CRDT materialized state object. + * + * Holds the alive sets (OR-Set for nodes and edges), property registers + * (LWW), the observed version vector frontier, and edge birth events. + * + * @module domain/services/WarpStateV5 + */ + +import { createORSet, orsetClone } from '../crdt/ORSet.js'; +import { createVersionVector, vvClone } from '../crdt/VersionVector.js'; + +/** + * The CRDT materialized state for a WARP graph. + * + * Instances are mutable during reduce (patch application) but should + * be cloned before handing to consumers that expect isolation. + */ +export default class WarpStateV5 { + /** @type {import('../crdt/ORSet.js').ORSet} */ + nodeAlive; + + /** @type {import('../crdt/ORSet.js').ORSet} */ + edgeAlive; + + /** @type {Map>} */ + prop; + + /** @type {import('../crdt/VersionVector.js').VersionVector} */ + observedFrontier; + + /** + * EdgeKey → EventId of most recent EdgeAdd (for clean-slate prop visibility). + * @type {Map} + */ + edgeBirthEvent; + + /** + * Creates a WarpStateV5 from field values. 
+ * + * @param {{ + * nodeAlive: import('../crdt/ORSet.js').ORSet, + * edgeAlive: import('../crdt/ORSet.js').ORSet, + * prop: Map>, + * observedFrontier: import('../crdt/VersionVector.js').VersionVector, + * edgeBirthEvent?: Map + * }} fields + */ + constructor({ nodeAlive, edgeAlive, prop, observedFrontier, edgeBirthEvent }) { + this.nodeAlive = nodeAlive; + this.edgeAlive = edgeAlive; + this.prop = prop; + this.observedFrontier = observedFrontier; + this.edgeBirthEvent = edgeBirthEvent ?? /** @type {Map} */ (new Map()); + } + + /** + * Creates an empty state with fresh OR-Sets and version vector. + * + * @returns {WarpStateV5} + */ + static empty() { + return new WarpStateV5({ + nodeAlive: createORSet(), + edgeAlive: createORSet(), + prop: new Map(), + observedFrontier: createVersionVector(), + edgeBirthEvent: new Map(), + }); + } + + /** + * Creates a deep clone with independent data structures. + * + * @returns {WarpStateV5} + */ + clone() { + return new WarpStateV5({ + nodeAlive: orsetClone(this.nodeAlive), + edgeAlive: orsetClone(this.edgeAlive), + prop: new Map(this.prop), + observedFrontier: vvClone(this.observedFrontier), + edgeBirthEvent: new Map(this.edgeBirthEvent), + }); + } +} diff --git a/src/domain/services/Worldline.js b/src/domain/services/Worldline.js index 3b5762ba..5e7a3933 100644 --- a/src/domain/services/Worldline.js +++ b/src/domain/services/Worldline.js @@ -13,12 +13,13 @@ import LogicalTraversal from './LogicalTraversal.js'; import { toInternalStrandShape } from '../utils/strandPublicShape.js'; import { callInternalRuntimeMethod } from '../utils/callInternalRuntimeMethod.js'; -/** @typedef {import('../WarpRuntime.js').default} WarpRuntime */ -/** @typedef {import('../../../index.js').ObserverConfig} ObserverConfig */ + +/** @import { ObserverConfig, WorldlineOptions, WorldlineSource } from '../../../index.js' */ +/** @import { default as WarpRuntime } from '../WarpRuntime.js' */ /** - * @typedef 
{import('../../../index.js').WorldlineSource} WorldlineSource - * @typedef {import('../../../index.js').WorldlineOptions} WorldlineOptions - * @typedef {import('../services/JoinReducer.js').WarpStateV5 | { state: import('../services/JoinReducer.js').WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[] }} MaterializedSourceResult + + + * @typedef {import('./JoinReducer.js').WarpStateV5 | { state: import('./JoinReducer.js').WarpStateV5, receipts: import('../types/TickReceipt.js').TickReceipt[] }} MaterializedSourceResult * @typedef {{ * _materializeGraph: () => Promise<{ * state: unknown, @@ -141,7 +142,7 @@ function orUndefined(value) { * @param {WarpRuntime} graph * @param {{ kind: 'live', ceiling?: number|null }} source * @param {boolean} collectReceipts - * @returns {Promise} + * @returns {Promise} */ async function materializeLiveSource(graph, source, collectReceipts) { if (collectReceipts) { @@ -161,7 +162,7 @@ async function materializeLiveSource(graph, source, collectReceipts) { * @param {WarpRuntime} graph * @param {{ kind: 'coordinate', frontier: Map|Record, ceiling?: number|null }} source * @param {boolean} collectReceipts - * @returns {Promise} + * @returns {Promise} */ async function materializeCoordinateSource(graph, source, collectReceipts) { if (collectReceipts) { @@ -214,7 +215,7 @@ async function materializeStrandSource(graph, source, collectReceipts) { * @param {WarpRuntime} graph * @param {WorldlineSource} source * @param {boolean} collectReceipts - * @returns {Promise} + * @returns {Promise} */ async function materializeSource(graph, source, collectReceipts) { if (source.kind === 'live') { @@ -288,7 +289,7 @@ export default class Worldline { * Materializes the pinned worldline source into a detached snapshot. 
* * @param {{ receipts?: false } | { receipts: true }} [options] - * @returns {Promise} + * @returns {Promise} */ async materialize(options = undefined) { const detached = await openDetachedReadGraph(this._graph); diff --git a/src/domain/types/WarpPersistence.js b/src/domain/types/WarpPersistence.js index fbcab24d..0ece3253 100644 --- a/src/domain/types/WarpPersistence.js +++ b/src/domain/types/WarpPersistence.js @@ -1,3 +1,5 @@ +/** @import { default as RefPersistence } from '../../ports/RefPort.js' */ + /** * Role-specific persistence port types. * @@ -23,7 +25,7 @@ /** * Ref-only persistence — ref reads, writes, CAS, listing. - * @typedef {import('../../ports/RefPort.js').default} RefPersistence + */ /** diff --git a/src/domain/utils/defaultCrypto.js b/src/domain/utils/defaultCrypto.js index d2991a6d..399c723d 100644 --- a/src/domain/utils/defaultCrypto.js +++ b/src/domain/utils/defaultCrypto.js @@ -1,3 +1,5 @@ +/** @import { Hash, Hmac } from 'node:crypto' */ + /** * Default crypto implementation for domain services. 
* @@ -13,10 +15,6 @@ * @module domain/utils/defaultCrypto */ -/** - * @typedef {import('node:crypto').Hash} Hash - * @typedef {import('node:crypto').Hmac} Hmac - */ /** @type {((algorithm: string) => Hash)|null} */ let _createHash = null; diff --git a/src/domain/warp/materializeAdvanced.methods.js b/src/domain/warp/materializeAdvanced.methods.js index 3a853945..2c136c11 100644 --- a/src/domain/warp/materializeAdvanced.methods.js +++ b/src/domain/warp/materializeAdvanced.methods.js @@ -23,8 +23,7 @@ import BitmapNeighborProvider from '../services/BitmapNeighborProvider.js'; import { QueryError } from './_internal.js'; /** @typedef {import('../types/WarpPersistence.js').CorePersistence} CorePersistence */ -/** @typedef {import('../services/JoinReducer.js').WarpStateV5} WarpStateV5 */ -/** @typedef {import('../types/TickReceipt.js').TickReceipt} TickReceipt */ + /** * @typedef {{ outgoing: Map>, incoming: Map> }} AdjacencyMap @@ -33,6 +32,9 @@ import { QueryError } from './_internal.js'; import { buildWriterRef } from '../utils/RefLayout.js'; + +/** @import { WarpStateV5 } from '../services/JoinReducer.js' */ +/** @import { TickReceipt } from '../types/TickReceipt.js' */ /** * Creates a shallow-frozen public view of materialized state. 
* diff --git a/test/unit/domain/services/VisibleStateScopeV1.test.js b/test/unit/domain/services/VisibleStateScopeV1.test.js index f843b5a0..2adb159f 100644 --- a/test/unit/domain/services/VisibleStateScopeV1.test.js +++ b/test/unit/domain/services/VisibleStateScopeV1.test.js @@ -15,6 +15,7 @@ import { normalizeVisibleStateScopeV1, scopeMaterializedStateV5, } from '../../../../src/domain/services/VisibleStateScopeV1.js'; +import WarpStateV5 from '../../../../src/domain/services/WarpStateV5.js'; function buildScopedFixtureState() { const nodeAlive = createORSet(); @@ -31,7 +32,7 @@ function buildScopedFixtureState() { [encodeEdgePropKey('task:1', 'comparison-artifact:cmp-1', 'governs', 'via'), lwwSet(createEventId(3, 'alice', 'abc1236', 0), 'control-plane')], ]); - return { + return new WarpStateV5({ nodeAlive, edgeAlive, prop, @@ -39,7 +40,7 @@ function buildScopedFixtureState() { edgeBirthEvent: new Map([ [edgeKey, createEventId(3, 'alice', 'abc1236', 0)], ]), - }; + }); } describe('VisibleStateScopeV1', () => { From 07e4339a10ae513fce38cc5e176e242d81f67030 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 14:12:24 -0700 Subject: [PATCH 52/73] fix: resolve TSC conflicts from WarpStateV5 class promotion - VisibleStateScopeV1: removed conflicting @typedef (runtime import of WarpStateV5 class conflicts with JSDoc typedef of same name) - BitmapIndexReader: removed unused @import aliases (CryptoPort, LoggerPort) introduced by earlier JSDoc dedup script - WarpPersistence: removed unused @import alias (RefPersistence) --- src/domain/services/BitmapIndexReader.js | 2 -- src/domain/services/VisibleStateScopeV1.js | 1 - src/domain/types/WarpPersistence.js | 2 -- 3 files changed, 5 deletions(-) diff --git a/src/domain/services/BitmapIndexReader.js b/src/domain/services/BitmapIndexReader.js index 096ce891..b026244d 100644 --- a/src/domain/services/BitmapIndexReader.js +++ b/src/domain/services/BitmapIndexReader.js @@ -8,8 +8,6 @@ import { isValidShardOid } from 
'../utils/validateShardOid.js'; import { base64Decode } from '../utils/bytes.js'; -/** @import { default as CryptoPort } from '../../ports/CryptoPort.js' */ -/** @import { default as LoggerPort } from '../../ports/LoggerPort.js' */ /** @import { RoaringBitmapSubset as BitmapShard } from '../utils/roaring.js' */ /** @typedef {import('../../ports/IndexStoragePort.js').default} IndexStoragePort */ /** @typedef {import('../types/WarpPersistence.js').IndexStorage} IndexStorage */ diff --git a/src/domain/services/VisibleStateScopeV1.js b/src/domain/services/VisibleStateScopeV1.js index 47447a5f..d5f51166 100644 --- a/src/domain/services/VisibleStateScopeV1.js +++ b/src/domain/services/VisibleStateScopeV1.js @@ -19,7 +19,6 @@ import { * @typedef {{ * nodeIdPrefixes?: VisibleStateScopePrefixFilterV1 * }} VisibleStateScopeV1 - * @typedef {import('./JoinReducer.js').WarpStateV5} WarpStateV5 */ /** diff --git a/src/domain/types/WarpPersistence.js b/src/domain/types/WarpPersistence.js index 0ece3253..b1bc5a40 100644 --- a/src/domain/types/WarpPersistence.js +++ b/src/domain/types/WarpPersistence.js @@ -1,5 +1,3 @@ -/** @import { default as RefPersistence } from '../../ports/RefPort.js' */ - /** * Role-specific persistence port types. * From ce9d075adb3fbad68723e314fc36c1a1cb7b36e0 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 14:17:34 -0700 Subject: [PATCH 53/73] fix: normalize stale JSDoc import paths in QueryController QueryController had 33 JSDoc imports using '../services/' (the old path from warp/query.methods.js) instead of './' (correct for services/QueryController.js). Both resolve to the same directory, but Deno's AST rewriter treats them as distinct paths, causing overlapping text changes and a panic during JSR publish. Also redirected WarpStateV5 imports from JoinReducer.js to the new WarpStateV5.js module, eliminating the duplicate import path that was the root cause of the Deno AST panic at byte offset 3780. 
--- src/domain/services/QueryController.js | 66 +++++++++++++------------- 1 file changed, 33 insertions(+), 33 deletions(-) diff --git a/src/domain/services/QueryController.js b/src/domain/services/QueryController.js index 2dcff3c5..199a451b 100644 --- a/src/domain/services/QueryController.js +++ b/src/domain/services/QueryController.js @@ -124,10 +124,10 @@ async function openDetachedObserverGraph(graph) { * Snapshots the current materialized state with a cloned copy and hash. * * @param {import('../WarpRuntime.js').default} graph - * @returns {Promise<{ state: import('../services/JoinReducer.js').WarpStateV5, stateHash: string }>} + * @returns {Promise<{ state: import('./WarpStateV5.js').default, stateHash: string }>} */ async function snapshotCurrentMaterialized(graph) { - const materialized = await /** @type {{ _materializeGraph: () => Promise<{state: import('../services/JoinReducer.js').WarpStateV5, stateHash: string|null}> }} */ (graph)._materializeGraph(); + const materialized = await /** @type {{ _materializeGraph: () => Promise<{state: import('./WarpStateV5.js').default, stateHash: string|null}> }} */ (graph)._materializeGraph(); return { state: cloneStateV5(materialized.state), stateHash: /** @type {string} */ (materialized.stateHash), @@ -138,8 +138,8 @@ async function snapshotCurrentMaterialized(graph) { * Clones and hashes a returned state for snapshot isolation. 
* * @param {import('../WarpRuntime.js').default} graph - * @param {import('../services/JoinReducer.js').WarpStateV5} state - * @returns {Promise<{ state: import('../services/JoinReducer.js').WarpStateV5, stateHash: string }>} + * @param {import('./WarpStateV5.js').default} state + * @returns {Promise<{ state: import('./WarpStateV5.js').default, stateHash: string }>} */ async function snapshotReturnedState(graph, state) { const stateHash = await computeStateHashV5(state, { @@ -157,7 +157,7 @@ async function snapshotReturnedState(graph, state) { * * @param {import('../WarpRuntime.js').default} graph * @param {ObserverOptions|undefined} options - * @returns {Promise<{ state: import('../services/JoinReducer.js').WarpStateV5, stateHash: string }>} + * @returns {Promise<{ state: import('./WarpStateV5.js').default, stateHash: string }>} */ async function resolveObserverSnapshot(graph, options) { const source = cloneObserverSource(options?.source); @@ -168,7 +168,7 @@ async function resolveObserverSnapshot(graph, options) { if (source.kind === 'live') { const detached = await openDetachedObserverGraph(graph); - const state = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (await detached.materialize({ + const state = /** @type {import('./WarpStateV5.js').default} */ (await detached.materialize({ ceiling: source.ceiling ?? null, })); return await snapshotReturnedState(detached, state); @@ -176,7 +176,7 @@ async function resolveObserverSnapshot(graph, options) { if (source.kind === 'coordinate') { const detached = await openDetachedObserverGraph(graph); - const state = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (await detached.materializeCoordinate({ + const state = /** @type {import('./WarpStateV5.js').default} */ (await detached.materializeCoordinate({ frontier: source.frontier, ceiling: source.ceiling ?? 
null, })); @@ -188,7 +188,7 @@ async function resolveObserverSnapshot(graph, options) { const internalSource = /** @type {{ strandId: string, ceiling?: number|null }} */ ( /** @type {unknown} */ (toInternalStrandShape(source)) ); - const state = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ ( + const state = /** @type {import('./WarpStateV5.js').default} */ ( await callInternalRuntimeMethod(detached, 'materializeStrand', internalSource.strandId, { ceiling: internalSource.ceiling ?? null, }) @@ -212,7 +212,7 @@ async function resolveObserverSnapshot(graph, options) { */ async function hasNode(nodeId) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); return orsetContains(s.nodeAlive, nodeId); } @@ -241,7 +241,7 @@ async function getNodeProps(nodeId) { } // ── Linear scan fallback ───────────────────────────────────────────── - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); if (!orsetContains(s.nodeAlive, nodeId)) { return null; @@ -271,7 +271,7 @@ async function getNodeProps(nodeId) { */ async function getEdgeProps(from, to, label) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const edgeKey = encodeEdgeKey(from, to, label); if (!orsetContains(s.edgeAlive, edgeKey)) { @@ -339,7 +339,7 @@ async function neighbors(nodeId, direction = 'both', edgeLabel = undefined) { } // ── Linear scan fallback ───────────────────────────────────────────── - return _linearNeighbors(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ 
(this._host._cachedState), nodeId, direction, edgeLabel); + return _linearNeighbors(/** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState), nodeId, direction, edgeLabel); } /** @@ -368,14 +368,14 @@ async function _indexedNeighbors(provider, nodeId, direction, opts) { /** * Linear-scan neighbor lookup from raw CRDT state. * - * @param {import('../services/JoinReducer.js').WarpStateV5} cachedState + * @param {import('./WarpStateV5.js').default} cachedState * @param {string} nodeId * @param {'outgoing' | 'incoming' | 'both'} direction * @param {string} [edgeLabel] * @returns {Array<{nodeId: string, label: string, direction: 'outgoing' | 'incoming'}>} */ function _linearNeighbors(cachedState, nodeId, direction, edgeLabel) { - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (cachedState); /** @type {Array<{nodeId: string, label: string, direction: 'outgoing' | 'incoming'}>} */ const result = []; const checkOut = direction === 'outgoing' || direction === 'both'; @@ -400,7 +400,7 @@ function _linearNeighbors(cachedState, nodeId, direction, edgeLabel) { /** * Returns a defensive copy of the current materialized state. 
* - * @returns {Promise} + * @returns {Promise} * @this {QueryController} */ async function getStateSnapshot() { @@ -411,7 +411,7 @@ async function getStateSnapshot() { if (!this._host._cachedState) { return null; } - return createImmutableWarpStateV5(/** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState)); + return createImmutableWarpStateV5(/** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState)); } /** @@ -423,7 +423,7 @@ async function getStateSnapshot() { */ async function getNodes() { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); return [...orsetElements(s.nodeAlive)]; } @@ -436,7 +436,7 @@ async function getNodes() { */ async function getEdges() { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); /** @type {Map>} */ const edgePropsByKey = new Map(); @@ -483,14 +483,14 @@ async function getEdges() { */ async function getPropertyCount() { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); return s.prop.size; } /** * Creates a fluent query builder for the logical graph. * - * @returns {import('../services/QueryBuilder.js').default} A fluent query builder + * @returns {import('./QueryBuilder.js').default} A fluent query builder * @this {QueryController} */ function query() { @@ -501,7 +501,7 @@ function query() { * Creates a first-class worldline handle over a pinned read source. 
* * @param {ObserverOptions} [options] - * @returns {import('../services/Worldline.js').default} + * @returns {import('./Worldline.js').default} * @this {QueryController} */ function worldline(options = undefined) { @@ -545,7 +545,7 @@ function normalizeObserverArgs(nameOrConfig, configOrOptions, maybeOptions) { * @param {{ match: string|string[], expose?: string[], redact?: string[] }|ObserverOptions} [configOrOptions] * Observer configuration when a name is supplied, otherwise observer options * @param {ObserverOptions} [maybeOptions] - Optional pinned read source - * @returns {Promise} A read-only observer + * @returns {Promise} A read-only observer * @this {QueryController} */ async function observer(nameOrConfig, configOrOptions = undefined, maybeOptions = undefined) { @@ -575,7 +575,7 @@ async function observer(nameOrConfig, configOrOptions = undefined, maybeOptions */ async function translationCost(configA, configB) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); return computeTranslationCost(configA, configB, s); } @@ -621,7 +621,7 @@ function visibleEdgeRegister(register, birthEvent) { /** * Looks up the current node attachment registers directly from materialized state. * - * @param {import('../services/JoinReducer.js').WarpStateV5} state + * @param {import('./WarpStateV5.js').default} state * @param {string} nodeId * @returns {{ contentRegister: { eventId: import('../utils/EventId.js').EventId|null, value: string }, mimeRegister: { eventId: import('../utils/EventId.js').EventId|null, value: unknown }|null, sizeRegister: { eventId: import('../utils/EventId.js').EventId|null, value: unknown }|null }|null} */ @@ -643,7 +643,7 @@ function getNodeContentRegisters(state, nodeId) { /** * Looks up the current edge attachment registers directly from materialized state. 
* - * @param {import('../services/JoinReducer.js').WarpStateV5} state + * @param {import('./WarpStateV5.js').default} state * @param {string} from * @param {string} to * @param {string} label @@ -719,7 +719,7 @@ function extractContentMeta(contentRegister, mimeRegister, sizeRegister) { */ async function getContentOid(nodeId) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); return registers?.contentRegister.value ?? null; } @@ -734,7 +734,7 @@ async function getContentOid(nodeId) { */ async function getContentMeta(nodeId) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); return registers ? 
extractContentMeta(registers.contentRegister, registers.mimeRegister, registers.sizeRegister) @@ -757,7 +757,7 @@ async function getContentMeta(nodeId) { */ async function getContent(nodeId) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); if (!registers) { return null; @@ -781,7 +781,7 @@ async function getContent(nodeId) { */ async function getEdgeContentOid(from, to, label) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); return registers?.contentRegister.value ?? null; } @@ -798,7 +798,7 @@ async function getEdgeContentOid(from, to, label) { */ async function getEdgeContentMeta(from, to, label) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); return registers ? 
extractContentMeta(registers.contentRegister, registers.mimeRegister, registers.sizeRegister) @@ -823,7 +823,7 @@ async function getEdgeContentMeta(from, to, label) { */ async function getEdgeContent(from, to, label) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); if (!registers) { return null; @@ -847,7 +847,7 @@ async function getEdgeContent(from, to, label) { */ async function getContentStream(nodeId) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getNodeContentRegisters(s, nodeId); if (!registers) { return null; @@ -875,7 +875,7 @@ async function getContentStream(nodeId) { */ async function getEdgeContentStream(from, to, label) { await this._host._ensureFreshState(); - const s = /** @type {import('../services/JoinReducer.js').WarpStateV5} */ (this._host._cachedState); + const s = /** @type {import('./WarpStateV5.js').default} */ (this._host._cachedState); const registers = getEdgeContentRegisters(s, from, to, label); if (!registers) { return null; From 82ff74b3e538cc570235d2e2389083fc7ace72eb Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 14:46:28 -0700 Subject: [PATCH 54/73] chore: inbox 4 backlog items from METHOD v2 draft review - DX_bearing-md: living signpost document (replaces ROADMAP role) - DX_method-cli-tooling: CLI for METHOD workspace operations - DX_method-v2-upgrade: incorporate v2 draft improvements - DX_witness-directory-convention: structured playback evidence --- docs/method/backlog/inbox/DX_bearing-md.md | 7 +++++++ .../backlog/inbox/DX_method-cli-tooling.md | 8 ++++++++ 
.../method/backlog/inbox/DX_method-v2-upgrade.md | 16 ++++++++++++++++ .../inbox/DX_witness-directory-convention.md | 6 ++++++ 4 files changed, 37 insertions(+) create mode 100644 docs/method/backlog/inbox/DX_bearing-md.md create mode 100644 docs/method/backlog/inbox/DX_method-cli-tooling.md create mode 100644 docs/method/backlog/inbox/DX_method-v2-upgrade.md create mode 100644 docs/method/backlog/inbox/DX_witness-directory-convention.md diff --git a/docs/method/backlog/inbox/DX_bearing-md.md b/docs/method/backlog/inbox/DX_bearing-md.md new file mode 100644 index 00000000..313a17a8 --- /dev/null +++ b/docs/method/backlog/inbox/DX_bearing-md.md @@ -0,0 +1,7 @@ +# Create BEARING.md + +Single living signpost at `docs/BEARING.md`. Updated at cycle +boundaries. Three questions: Where are we going? What just shipped? +What feels wrong? + +Replaces the role that ROADMAP.md currently half-fills. diff --git a/docs/method/backlog/inbox/DX_method-cli-tooling.md b/docs/method/backlog/inbox/DX_method-cli-tooling.md new file mode 100644 index 00000000..5f1fd969 --- /dev/null +++ b/docs/method/backlog/inbox/DX_method-cli-tooling.md @@ -0,0 +1,8 @@ +# METHOD CLI tooling + +Small CLI for METHOD workspace operations: `method inbox`, +`method pull`, `method close`, `method status`. Scaffolds files, +numbers cycles, summarizes backlog lanes. + +Open question: TypeScript or plain JavaScript? Bijou dependency +or zero-dep shell scripts? 
diff --git a/docs/method/backlog/inbox/DX_method-v2-upgrade.md b/docs/method/backlog/inbox/DX_method-v2-upgrade.md new file mode 100644 index 00000000..303275c8 --- /dev/null +++ b/docs/method/backlog/inbox/DX_method-v2-upgrade.md @@ -0,0 +1,16 @@ +# Upgrade METHOD.md to v2 draft + +Incorporate improvements from the pre-0002 draft review: + +- Stances section (agent-human parity, agent-surface-first) +- Design constraints (meaning without decoration, a11y, l10n) +- Playback witness definition +- Disagreement protocol (dual-sponsor consensus) +- BEARING.md coordination mechanism +- Cycle types (feature, design, debt) +- Naming conventions table + +Defer or qualify: +- Localization posture (mark as "when applicable" for dev tools) +- Debt cycle posture sections (make optional) +- "Does not go back" commitment language (align with pivot option) diff --git a/docs/method/backlog/inbox/DX_witness-directory-convention.md b/docs/method/backlog/inbox/DX_witness-directory-convention.md new file mode 100644 index 00000000..d83fa670 --- /dev/null +++ b/docs/method/backlog/inbox/DX_witness-directory-convention.md @@ -0,0 +1,6 @@ +# Witness directory convention for playback + +Each cycle retro gets a `witness/` directory containing the concrete +playback artifacts: test output logs, transcripts, screenshots, +recordings. Currently retros are prose-only — adding structured +evidence makes the playback step auditable. From e65c76bb3faa59da13052ca495f87177c841b9e9 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 15:04:39 -0700 Subject: [PATCH 55/73] refactor(NDNM): promote Dot, EventId, EffectEmission, DeliveryObservation to classes MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Four XS typedef-to-class liberations: - Dot: constructor validates writerId + counter. 
Fields declared in alphabetical order (counter, writerId) to match CBOR canonical encoding — property order determines serialization, and HMAC verification of existing BTR data depends on byte-identical output. - EventId: constructor validates lamport, writerId, patchSha, opIndex. Fields in alphabetical order (lamport, opIndex, patchSha, writerId). - EffectEmission: constructor validates and freezes. EffectCoordinate also promoted to a class (nested frozen value object). - DeliveryObservation: constructor validates outcome, freezes lens snapshot. Conditional spread for optional `reason` field to satisfy exactOptionalPropertyTypes. All four retain their factory functions (createDot, createEventId, createEffectEmission, createDeliveryObservation) for backward compatibility — factories now delegate to constructors. Closes bad-code/PROTO_typedef-{dot,eventid,effectemission,deliveryobservation}-to-class. --- ...TO_typedef-deliveryobservation-to-class.md | 9 -- .../bad-code/PROTO_typedef-dot-to-class.md | 9 -- .../PROTO_typedef-effectemission-to-class.md | 10 --- .../PROTO_typedef-eventid-to-class.md | 8 -- src/domain/crdt/Dot.js | 48 +++++++---- src/domain/types/DeliveryObservation.js | 82 ++++++++++-------- src/domain/types/EffectEmission.js | 77 ++++++++++------- src/domain/utils/EventId.js | 84 +++++++++++-------- 8 files changed, 174 insertions(+), 153 deletions(-) delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md deleted file mode 100644 index 45df6f55..00000000 --- 
a/docs/method/backlog/bad-code/PROTO_typedef-deliveryobservation-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote DeliveryObservation from @typedef to class - -**Effort:** XS - -## Problem - -`src/domain/types/DeliveryObservation.js` defines `DeliveryObservation` -as a `@typedef {Object}` with a factory (`createDeliveryObservation`) -returning a frozen object. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md deleted file mode 100644 index f915f5c7..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-dot-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote Dot from @typedef to class - -**Effort:** XS - -## Problem - -`src/domain/crdt/Dot.js` defines `Dot` as a `@typedef {Object}` but it -has factory (`createDot`), encode/decode, and comparison functions. It should -be a class with those behaviors as methods. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md deleted file mode 100644 index 5b0efde8..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-effectemission-to-class.md +++ /dev/null @@ -1,10 +0,0 @@ -# Promote EffectEmission from @typedef to class - -**Effort:** XS - -## Problem - -`src/domain/types/EffectEmission.js` defines `EffectEmission` as a -`@typedef {Object}` but has a factory (`createEffectEmission`) that -returns a frozen object. Should be a class. `EffectCoordinate` could -merge into it as a nested shape or separate class. 
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md deleted file mode 100644 index d141a91d..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-eventid-to-class.md +++ /dev/null @@ -1,8 +0,0 @@ -# Promote EventId from @typedef to class - -**Effort:** XS - -## Problem - -`src/domain/utils/EventId.js` defines `EventId` as a `@typedef {Object}` -but has a factory (`createEventId`) and comparison logic. Should be a class. diff --git a/src/domain/crdt/Dot.js b/src/domain/crdt/Dot.js index 5774deda..e2c4bc96 100644 --- a/src/domain/crdt/Dot.js +++ b/src/domain/crdt/Dot.js @@ -56,13 +56,36 @@ */ /** - * Dot - Unique operation identifier for CRDT operations. - * A dot is a (writerId, counter) pair that uniquely identifies an operation. - * - * @typedef {Object} Dot - * @property {string} writerId - Writer identifier (non-empty string) - * @property {number} counter - Monotonic counter (positive integer) + * Dot — unique operation identity for CRDT semantics. + * A (writerId, counter) pair that serves as a "birth certificate" + * for each CRDT operation. */ +export class Dot { + /** @type {number} Monotonic counter (positive integer) */ + counter; + + /** @type {string} Writer identifier (non-empty string) */ + writerId; + + /** + * Creates a validated Dot. + * + * @param {string} writerId - Must be non-empty string + * @param {number} counter - Must be positive integer (> 0) + */ + constructor(writerId, counter) { + if (typeof writerId !== 'string' || writerId.length === 0) { + throw new Error('writerId must be a non-empty string'); + } + + if (!Number.isInteger(counter) || counter <= 0) { + throw new Error('counter must be a positive integer'); + } + + this.counter = counter; + this.writerId = writerId; + } +} /** * Creates a validated Dot. 
@@ -70,18 +93,9 @@ * @param {string} writerId - Must be non-empty string * @param {number} counter - Must be positive integer (> 0) * @returns {Dot} - * @throws {Error} If validation fails */ export function createDot(writerId, counter) { - if (typeof writerId !== 'string' || writerId.length === 0) { - throw new Error('writerId must be a non-empty string'); - } - - if (!Number.isInteger(counter) || counter <= 0) { - throw new Error('counter must be a positive integer'); - } - - return { writerId, counter }; + return new Dot(writerId, counter); } /** @@ -136,7 +150,7 @@ export function decodeDot(encoded) { throw new Error('Invalid encoded dot format: invalid counter'); } - return { writerId, counter }; + return new Dot(writerId, counter); } /** diff --git a/src/domain/types/DeliveryObservation.js b/src/domain/types/DeliveryObservation.js index 6d950174..dd621572 100644 --- a/src/domain/types/DeliveryObservation.js +++ b/src/domain/types/DeliveryObservation.js @@ -23,14 +23,49 @@ const modeSet = new Set(DELIVERY_MODES); */ /** - * @typedef {Object} DeliveryObservation - * @property {string} emissionId - Links to the EffectEmission - * @property {string} sinkId - Which sink/adapter handled it - * @property {'delivered' | 'suppressed' | 'failed' | 'skipped'} outcome - * @property {string} [reason] - Why (e.g., "replay mode") - * @property {number} timestamp - Wall-clock milliseconds - * @property {Readonly} lens - Execution context at delivery time + * DeliveryObservation — trace record of how a sink handled an emitted effect. 
*/ +export class DeliveryObservation { + /** @type {string} Links to the EffectEmission */ + emissionId; + + /** @type {string} Which sink/adapter handled it */ + sinkId; + + /** @type {'delivered' | 'suppressed' | 'failed' | 'skipped'} */ + outcome; + + /** @type {string | undefined} Why (e.g., "replay mode") */ + reason; + + /** @type {number} Wall-clock milliseconds */ + timestamp; + + /** @type {Readonly} Execution context at delivery time */ + lens; + + /** + * Creates an immutable DeliveryObservation. + * @param {{ emissionId: string, sinkId: string, outcome: string, reason?: string, timestamp: number, lens: { mode: string, suppressExternal: boolean } }} fields + */ + constructor({ emissionId, sinkId, outcome, reason, timestamp, lens }) { + requireNonEmptyString(emissionId, 'emissionId'); + requireNonEmptyString(sinkId, 'sinkId'); + validateOutcome(outcome); + validateTimestamp(timestamp); + validateLens(lens); + + this.emissionId = emissionId; + this.sinkId = sinkId; + this.outcome = /** @type {'delivered' | 'suppressed' | 'failed' | 'skipped'} */ (outcome); + this.timestamp = timestamp; + this.lens = freezeLens(lens); + if (reason !== undefined) { + this.reason = reason; + } + Object.freeze(this); + } +} // ============================================================================ // Validation @@ -124,34 +159,11 @@ function freezeLens(lens) { * }} params * @returns {Readonly} */ -export function createDeliveryObservation({ - emissionId, - sinkId, - outcome, - reason, - timestamp, - lens, -}) { - requireNonEmptyString(emissionId, 'emissionId'); - requireNonEmptyString(sinkId, 'sinkId'); - validateOutcome(outcome); - validateTimestamp(timestamp); - validateLens(lens); - - /** @type {{ emissionId: string, sinkId: string, outcome: 'delivered' | 'suppressed' | 'failed' | 'skipped', timestamp: number, lens: Readonly, reason?: string }} */ - const obs = { - emissionId, - sinkId, - outcome: /** @type {'delivered' | 'suppressed' | 'failed' | 'skipped'} */ 
(outcome), - timestamp, - lens: freezeLens(lens), - }; - - if (reason !== undefined) { - obs.reason = reason; - } - - return Object.freeze(obs); +export function createDeliveryObservation({ emissionId, sinkId, outcome, reason, timestamp, lens }) { + return new DeliveryObservation({ + emissionId, sinkId, outcome, timestamp, lens, + ...(reason !== undefined ? { reason } : {}), + }); } // ============================================================================ diff --git a/src/domain/types/EffectEmission.js b/src/domain/types/EffectEmission.js index e14d8077..f2c97741 100644 --- a/src/domain/types/EffectEmission.js +++ b/src/domain/types/EffectEmission.js @@ -22,20 +22,56 @@ export { DELIVERY_MODES, DELIVERY_OUTCOMES }; // ============================================================================ /** - * @typedef {Object} EffectCoordinate - * @property {Record | null} frontier - Writer tip SHAs at emission time - * @property {number | null} ceiling - Lamport ceiling (if capped) + * Causal coordinate at emission time. */ +export class EffectCoordinate { + /** @type {Record | null} Writer tip SHAs at emission time */ + frontier; + + /** @type {number | null} Lamport ceiling (if capped) */ + ceiling; + + /** + * Creates an immutable EffectCoordinate. + * @param {{ frontier: Record | null, ceiling: number | null }} fields + */ + constructor({ frontier, ceiling }) { + this.frontier = frontier ? Object.freeze({ ...frontier }) : null; + this.ceiling = ceiling ?? 
null; + Object.freeze(this); + } +} /** - * @typedef {Object} EffectEmission - * @property {string} id - Unique emission ID - * @property {string} kind - Effect kind (generic string, app chooses meaning) - * @property {unknown} payload - Opaque effect payload - * @property {number} timestamp - Wall-clock milliseconds - * @property {string | null} writer - Writer ID (null if not writer-scoped) - * @property {Readonly} coordinate - Causal position + * EffectEmission — host-domain trace object for an outbound effect candidate. */ +export class EffectEmission { + /** @type {string} */ id; + /** @type {string} */ kind; + /** @type {unknown} */ payload; + /** @type {number} */ timestamp; + /** @type {string | null} */ writer; + /** @type {Readonly} */ coordinate; + + /** + * Creates an immutable EffectEmission. + * @param {{ id: string, kind: string, payload: unknown, timestamp: number, writer: string | null, coordinate: { frontier: Record | null, ceiling: number | null } }} fields + */ + constructor({ id, kind, payload, timestamp, writer, coordinate }) { + requireNonEmptyString(id, 'id'); + requireNonEmptyString(kind, 'kind'); + validateTimestamp(timestamp); + validateCoordinate(coordinate); + + this.id = id; + this.kind = kind; + this.payload = payload; + this.timestamp = timestamp; + this.writer = writer ?? null; + this.coordinate = new EffectCoordinate(coordinate); + Object.freeze(this); + } +} // ============================================================================ // Validation @@ -96,26 +132,7 @@ function validateCoordinate(value) { * @returns {Readonly} */ export function createEffectEmission({ id, kind, payload, timestamp, writer, coordinate }) { - requireNonEmptyString(id, 'id'); - requireNonEmptyString(kind, 'kind'); - validateTimestamp(timestamp); - validateCoordinate(coordinate); - - const frozenCoordinate = Object.freeze({ - frontier: coordinate.frontier - ? Object.freeze({ ...coordinate.frontier }) - : null, - ceiling: coordinate.ceiling ?? 
null, - }); - - return Object.freeze({ - id, - kind, - payload, - timestamp, - writer: writer ?? null, - coordinate: frozenCoordinate, - }); + return new EffectEmission({ id, kind, payload, timestamp, writer, coordinate }); } // ============================================================================ diff --git a/src/domain/utils/EventId.js b/src/domain/utils/EventId.js index c99aaa16..c7eda823 100644 --- a/src/domain/utils/EventId.js +++ b/src/domain/utils/EventId.js @@ -1,48 +1,62 @@ -/** - * EventId for total ordering of operations (WARP spec Section 7). - * - * @typedef {Object} EventId - * @property {number} lamport - Monotonic counter (positive integer) - * @property {string} writerId - Writer identifier (non-empty string) - * @property {string} patchSha - Patch commit SHA (hex OID, 4-64 chars) - * @property {number} opIndex - Operation index within patch (non-negative integer) - */ - // Regex for validating hex OID (4-64 hex characters) const HEX_OID_REGEX = /^[0-9a-f]{4,64}$/; /** - * Creates a validated EventId. - * - * @param {number} lamport - Must be positive integer (> 0) - * @param {string} writerId - Must be non-empty string - * @param {string} patchSha - Must be valid hex OID (4-64 chars) - * @param {number} opIndex - Must be non-negative integer (>= 0) - * @returns {EventId} - * @throws {Error} If validation fails + * EventId — total ordering identity for CRDT operations (WARP spec Section 7). 
*/ -export function createEventId(lamport, writerId, patchSha, opIndex) { - // Validate lamport is positive integer - if (!Number.isInteger(lamport) || lamport <= 0) { - throw new Error('lamport must be a positive integer'); - } +export class EventId { + /** @type {number} Monotonic counter (positive integer) */ + lamport; - // Validate writerId is non-empty string - if (typeof writerId !== 'string' || writerId.length === 0) { - throw new Error('writerId must be a non-empty string'); - } + /** @type {number} Operation index within patch (non-negative integer) */ + opIndex; - // Validate patchSha is hex string 4-64 chars - if (typeof patchSha !== 'string' || !HEX_OID_REGEX.test(patchSha)) { - throw new Error('patchSha must be a hex string of 4-64 characters'); - } + /** @type {string} Patch commit SHA (hex OID, 4-64 chars) */ + patchSha; + + /** @type {string} Writer identifier (non-empty string) */ + writerId; + + /** + * Creates a validated EventId. + * + * @param {number} lamport - Must be positive integer (> 0) + * @param {string} writerId - Must be non-empty string + * @param {string} patchSha - Must be valid hex OID (4-64 chars) + * @param {number} opIndex - Must be non-negative integer (>= 0) + */ + constructor(lamport, writerId, patchSha, opIndex) { + if (!Number.isInteger(lamport) || lamport <= 0) { + throw new Error('lamport must be a positive integer'); + } + if (typeof writerId !== 'string' || writerId.length === 0) { + throw new Error('writerId must be a non-empty string'); + } + if (typeof patchSha !== 'string' || !HEX_OID_REGEX.test(patchSha)) { + throw new Error('patchSha must be a hex string of 4-64 characters'); + } + if (!Number.isInteger(opIndex) || opIndex < 0) { + throw new Error('opIndex must be a non-negative integer'); + } - // Validate opIndex is non-negative integer - if (!Number.isInteger(opIndex) || opIndex < 0) { - throw new Error('opIndex must be a non-negative integer'); + this.lamport = lamport; + this.opIndex = opIndex; + 
this.patchSha = patchSha;
+    this.writerId = writerId;
   }
+}
-  return { lamport, writerId, patchSha, opIndex };
+/**
+ * Creates a validated EventId.
+ *
+ * @param {number} lamport
+ * @param {string} writerId
+ * @param {string} patchSha
+ * @param {number} opIndex
+ * @returns {EventId}
+ */
+export function createEventId(lamport, writerId, patchSha, opIndex) {
+  return new EventId(lamport, writerId, patchSha, opIndex);
 }
 
 /**

From 1ae6826117fd10aff6cc77704967192ff7fcb895 Mon Sep 17 00:00:00 2001
From: James Ross
Date: Thu, 2 Apr 2026 15:16:36 -0700
Subject: [PATCH 56/73] fix: restore alphabetical field order for CBOR
 compatibility
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Reverted codec changes — making the codec sort class instance keys is a
wire format change that breaks HMAC verification of existing persisted
data. The correct long-term fix (codec-level canonical key sorting for
all types, like Echo's Rust encoder) requires a schema version
migration.

For now, domain classes that get CBOR-serialized MUST declare fields in
alphabetical order to match the codec's sort behavior for plain
objects. This is documented in the new backlog item
PROTO_cbor-codec-class-key-sorting.

Restored the original constructor assignment order in Dot (writerId,
counter) and EventId (lamport, writerId, patchSha, opIndex). The
alphabetical class-field declarations (counter, writerId; lamport,
opIndex, patchSha, writerId) are untouched — class fields initialize in
declaration order before the constructor body runs, so the declarations
are what fix the serialized key order; assignment order in the
constructor is cosmetic.
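The ordering semantics this commit relies on can be checked directly in Node. This is a sketch, not repo code — `DotSketch` is an illustrative name: public class fields are defined in declaration order before the constructor body runs, so declaration order, not assignment order, determines an instance's own-key enumeration.

```javascript
// Sketch (illustrative name, not a real source type): class fields
// initialize in declaration order, BEFORE the constructor body runs,
// so Object.keys() — and therefore any encoder that walks own keys —
// follows the declarations, not the constructor's assignment order.
class DotSketch {
  counter;   // declared first → first in Object.keys()
  writerId;

  constructor(writerId, counter) {
    this.writerId = writerId; // assignment order is cosmetic here
    this.counter = counter;
  }
}

console.log(Object.keys(new DotSketch('w1', 7))); // ['counter', 'writerId']
```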
--- .../PROTO_cbor-codec-class-key-sorting.md | 30 +++++++++++++++++++ src/domain/crdt/Dot.js | 2 +- src/domain/utils/EventId.js | 4 +-- 3 files changed, 33 insertions(+), 3 deletions(-) create mode 100644 docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md diff --git a/docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md b/docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md new file mode 100644 index 00000000..19718d92 --- /dev/null +++ b/docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md @@ -0,0 +1,30 @@ +# CBOR codec skips key sorting for class instances + +**Effort:** M + +## Problem + +Both `CborCodec.js` and `defaultCodec.js` only sort keys for plain +objects (`constructor === Object`). Class instances pass through +unsorted, meaning their CBOR output depends on field declaration +order. This forces all domain classes to declare fields in +alphabetical order — coupling class design to serialization. + +Echo's canonical CBOR encoder (Rust) sorts ALL map keys by encoded +byte representation at encode time, making source property order +irrelevant. git-warp's codec should do the same. + +## Fix + +Update `isPlainObject`/`sortObjectKeys` in both codecs to sort keys +for all object types except built-in CBOR-native types (Uint8Array, +Date, Set, Map, RegExp). This is a wire format change — existing +CBOR data was encoded with unsorted class instance keys. Requires +schema version bump or migration path for persisted data. + +## Why not now + +Changing the codec changes the wire format for all CBOR-encoded data +(patches, checkpoints, BTRs). Existing HMAC-verified data and +content-addressed SHAs depend on byte-identical encoding. A codec +change requires a schema migration (version 4 → 5 or similar). 
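The codec-level fix the backlog item proposes — sorting keys for class instances as well as plain objects — might look roughly like the sketch below. `LegacyDot`, `canonicalize`, and `CBOR_NATIVE` are assumed, illustrative names, not the actual `CborCodec.js` internals.

```javascript
// Sketch of the codec gap: a class instance without field declarations
// leaks its assignment order into enumeration, so a codec that only
// sorts plain-object keys would encode it in declaration-dependent order.
class LegacyDot {
  constructor(writerId, counter) {
    this.writerId = writerId; // no class fields: assignment order leaks
    this.counter = counter;
  }
}

// Types the CBOR encoder handles natively; everything else gets key-sorted.
const CBOR_NATIVE = [Uint8Array, Date, Set, Map, RegExp];

function canonicalize(value) {
  if (value === null || typeof value !== 'object') return value;
  if (CBOR_NATIVE.some((T) => value instanceof T)) return value;
  if (Array.isArray(value)) return value.map(canonicalize);
  // Sort keys for plain objects AND class instances alike, so source
  // property order can never leak into the encoded bytes.
  const out = {};
  for (const key of Object.keys(value).sort()) {
    out[key] = canonicalize(value[key]);
  }
  return out;
}

const dot = new LegacyDot('w1', 3);
console.log(Object.keys(dot));               // ['writerId', 'counter'] — order leak
console.log(Object.keys(canonicalize(dot))); // ['counter', 'writerId'] — canonical
```

As the item notes, shipping this is a wire format change: existing HMAC-verified data was encoded with unsorted class-instance keys, hence the schema-version migration requirement.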
diff --git a/src/domain/crdt/Dot.js b/src/domain/crdt/Dot.js index e2c4bc96..0d956350 100644 --- a/src/domain/crdt/Dot.js +++ b/src/domain/crdt/Dot.js @@ -82,8 +82,8 @@ export class Dot { throw new Error('counter must be a positive integer'); } - this.counter = counter; this.writerId = writerId; + this.counter = counter; } } diff --git a/src/domain/utils/EventId.js b/src/domain/utils/EventId.js index c7eda823..87aceecb 100644 --- a/src/domain/utils/EventId.js +++ b/src/domain/utils/EventId.js @@ -40,9 +40,9 @@ export class EventId { } this.lamport = lamport; - this.opIndex = opIndex; - this.patchSha = patchSha; this.writerId = writerId; + this.patchSha = patchSha; + this.opIndex = opIndex; } } From 17e14a343d2a9dbe8407fd5fb5f410c440aa02b2 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 16:30:44 -0700 Subject: [PATCH 57/73] chore: move CBOR codec fix to ASAP, remove completed WarpStateV5 item --- .../PROTO_cbor-codec-class-key-sorting.md | 0 .../bad-code/PROTO_typedef-warpstatev5-to-class.md | 10 ---------- 2 files changed, 10 deletions(-) rename docs/method/backlog/{bad-code => asap}/PROTO_cbor-codec-class-key-sorting.md (100%) delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md b/docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md similarity index 100% rename from docs/method/backlog/bad-code/PROTO_cbor-codec-class-key-sorting.md rename to docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md deleted file mode 100644 index a62b1f3c..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-warpstatev5-to-class.md +++ /dev/null @@ -1,10 +0,0 @@ -# Promote WarpStateV5 from @typedef to class - -**Effort:** L - -## Problem - -`src/domain/services/JoinReducer.js` defines 
`WarpStateV5` as a -`@typedef {Object}`. This is the core CRDT materialized state — -constructed, cloned, mutated by all apply paths, checkpointed, and -serialized. Large blast radius; many consumers. From b591bd7fefc7fdb75fd085c152d92c27a7163728 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 16:33:36 -0700 Subject: [PATCH 58/73] refactor(NDNM): promote TickReceipt and PatchDiff to classes - TickReceipt: constructor validates all fields and freezes instance. Fields in alphabetical order (lamport, ops, patchSha, writer). Part of the public API surface. - PatchDiff: constructor takes named fields, static empty() factory replaces createEmptyDiff() internals. mergeDiffs() returns class instances. Fields in alphabetical order. Neither type is CBOR-serialized, so no wire format concerns. Closes bad-code/PROTO_typedef-{tickreceipt,patchdiff}-to-class. --- .../PROTO_typedef-patchdiff-to-class.md | 9 --- .../PROTO_typedef-tickreceipt-to-class.md | 10 ---- src/domain/types/PatchDiff.js | 60 ++++++++++++++----- src/domain/types/TickReceipt.js | 51 ++++++++++------ 4 files changed, 78 insertions(+), 52 deletions(-) delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md deleted file mode 100644 index 0df4a5aa..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-patchdiff-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote PatchDiff from @typedef to class - -**Effort:** S - -## Problem - -`src/domain/types/PatchDiff.js` defines `PatchDiff` as a `@typedef {Object}` -with a factory (`createEmptyDiff`) and merge logic (`mergeDiffs`). Real -data entity accumulated during reduce. It should be a class. 
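The promotion pattern these patches apply — the class owns validation and freezing, while the legacy factory survives as a thin delegating wrapper — can be sketched as follows. All names here are invented for illustration, not real source types.

```javascript
// Illustrative sketch of the typedef-to-class promotion pattern:
// validation moves into the constructor, the instance is frozen,
// and the old factory delegates so existing call sites keep working.
class ReceiptSketch {
  constructor(patchSha, writer) {
    if (typeof patchSha !== 'string' || patchSha.length === 0) {
      throw new Error('patchSha must be a non-empty string');
    }
    if (typeof writer !== 'string' || writer.length === 0) {
      throw new Error('writer must be a non-empty string');
    }
    this.patchSha = patchSha;
    this.writer = writer;
    Object.freeze(this);
  }
}

// Backward-compatible factory: same call shape as before the promotion.
function createReceiptSketch(patchSha, writer) {
  return new ReceiptSketch(patchSha, writer);
}

const r = createReceiptSketch('abcd1234', 'writer-1');
console.log(r instanceof ReceiptSketch); // true
console.log(Object.isFrozen(r));         // true
```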
diff --git a/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md deleted file mode 100644 index d0f2ff17..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-tickreceipt-to-class.md +++ /dev/null @@ -1,10 +0,0 @@ -# Promote TickReceipt from @typedef to class - -**Effort:** S - -## Problem - -`src/domain/types/TickReceipt.js` defines `TickReceipt` as a -`@typedef {Object}` with a factory (`createTickReceipt`), validation, -canonical JSON serialization, and public export. Should be a class. -Part of the public API surface. diff --git a/src/domain/types/PatchDiff.js b/src/domain/types/PatchDiff.js index e3f14521..015415fc 100644 --- a/src/domain/types/PatchDiff.js +++ b/src/domain/types/PatchDiff.js @@ -24,27 +24,57 @@ */ /** - * @typedef {Object} PatchDiff - * @property {string[]} nodesAdded - Nodes that transitioned not-alive → alive - * @property {string[]} nodesRemoved - Nodes that transitioned alive → not-alive - * @property {EdgeDiffEntry[]} edgesAdded - Edges that transitioned not-alive → alive - * @property {EdgeDiffEntry[]} edgesRemoved - Edges that transitioned alive → not-alive - * @property {PropDiffEntry[]} propsChanged - Properties whose LWW winner actually changed + * PatchDiff — captures alive-ness transitions during patch application. */ +export class PatchDiff { + /** @type {EdgeDiffEntry[]} Edges that transitioned not-alive → alive */ + edgesAdded; + + /** @type {EdgeDiffEntry[]} Edges that transitioned alive → not-alive */ + edgesRemoved; + + /** @type {string[]} Nodes that transitioned not-alive → alive */ + nodesAdded; + + /** @type {string[]} Nodes that transitioned alive → not-alive */ + nodesRemoved; + + /** @type {PropDiffEntry[]} Properties whose LWW winner actually changed */ + propsChanged; + + /** + * Creates a PatchDiff from field values. 
+ * @param {{ nodesAdded: string[], nodesRemoved: string[], edgesAdded: EdgeDiffEntry[], edgesRemoved: EdgeDiffEntry[], propsChanged: PropDiffEntry[] }} fields + */ + constructor({ nodesAdded, nodesRemoved, edgesAdded, edgesRemoved, propsChanged }) { + this.nodesAdded = nodesAdded; + this.nodesRemoved = nodesRemoved; + this.edgesAdded = edgesAdded; + this.edgesRemoved = edgesRemoved; + this.propsChanged = propsChanged; + } + + /** + * Creates an empty PatchDiff. + * @returns {PatchDiff} + */ + static empty() { + return new PatchDiff({ + nodesAdded: [], + nodesRemoved: [], + edgesAdded: [], + edgesRemoved: [], + propsChanged: [], + }); + } +} /** * Creates an empty PatchDiff. - * * @returns {PatchDiff} */ export function createEmptyDiff() { - return { - nodesAdded: [], - nodesRemoved: [], - edgesAdded: [], - edgesRemoved: [], - propsChanged: [], - }; + return PatchDiff.empty(); } /** @@ -110,5 +140,5 @@ export function mergeDiffs(a, b) { const propsChanged = deduplicateProps(a.propsChanged.concat(b.propsChanged)); - return { nodesAdded, nodesRemoved, edgesAdded, edgesRemoved, propsChanged }; + return new PatchDiff({ nodesAdded, nodesRemoved, edgesAdded, edgesRemoved, propsChanged }); } diff --git a/src/domain/types/TickReceipt.js b/src/domain/types/TickReceipt.js index ae1f5fc3..436893c1 100644 --- a/src/domain/types/TickReceipt.js +++ b/src/domain/types/TickReceipt.js @@ -165,32 +165,47 @@ function validateOpResult(value, i) { */ /** - * @typedef {Object} TickReceipt - * @property {string} patchSha - SHA of the patch commit - * @property {string} writer - Writer ID that produced the patch - * @property {number} lamport - Lamport timestamp of the patch - * @property {ReadonlyArray>} ops - Per-operation outcomes (frozen) + * TickReceipt — immutable record of per-operation outcomes from a single patch. 
*/ +export class TickReceipt { + /** @type {number} Lamport timestamp of the patch */ + lamport; + + /** @type {ReadonlyArray>} Per-operation outcomes (frozen) */ + ops; + + /** @type {string} SHA of the patch commit */ + patchSha; + + /** @type {string} Writer ID that produced the patch */ + writer; + + /** + * Creates an immutable TickReceipt. + * @param {{ patchSha: string, writer: string, lamport: number, ops: OpOutcome[] }} fields + */ + constructor({ patchSha, writer, lamport, ops }) { + assertNonEmptyString(patchSha, 'patchSha'); + assertNonEmptyString(writer, 'writer'); + assertNonNegativeInt(lamport); + assertOpsArray(ops); + + this.lamport = lamport; + this.ops = freezeOps(ops); + this.patchSha = patchSha; + this.writer = writer; + Object.freeze(this); + } +} /** * Creates an immutable TickReceipt. * * @param {{ patchSha: string, writer: string, lamport: number, ops: OpOutcome[] }} params - * @returns {Readonly} Frozen tick receipt - * @throws {Error} If any parameter is invalid + * @returns {TickReceipt} */ export function createTickReceipt({ patchSha, writer, lamport, ops }) { - assertNonEmptyString(patchSha, 'patchSha'); - assertNonEmptyString(writer, 'writer'); - assertNonNegativeInt(lamport); - assertOpsArray(ops); - - return Object.freeze({ - patchSha, - writer, - lamport, - ops: freezeOps(ops), - }); + return new TickReceipt({ patchSha, writer, lamport, ops }); } /** From 2c1dc32f26567fea959112ce178d0b5b9a939f5c Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 16:57:02 -0700 Subject: [PATCH 59/73] fix: CBOR codec now sorts class instance keys for canonical encoding MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Both CborCodec and defaultCodec previously skipped key sorting for class instances (checked constructor === Object). This coupled class field declaration order to CBOR wire format — a fragile constraint that forced alphabetical field ordering in every domain class. 
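The subclass pitfall motivating the switch from constructor equality to `instanceof` can be seen directly in Node:

```javascript
// Why constructor equality is the wrong test: Buffer subclasses
// Uint8Array, so `constructor === Uint8Array` misses it while
// `instanceof` catches the subclass relationship.
const buf = Buffer.from([1, 2, 3]);

console.log(buf.constructor === Uint8Array); // false — constructor is Buffer
console.log(buf instanceof Uint8Array);      // true
```

Under the old check, a `Buffer` would have fallen through to key sorting (or been treated as a plain object) instead of keeping its native CBOR byte-string encoding.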
Now both codecs sort keys for ALL objects except built-in types with their own CBOR representation (Uint8Array, Date, RegExp, Set, Map). Uses instanceof checks (not constructor equality) to correctly handle Buffer and other subclasses. This matches Echo's Rust canonical CBOR encoder which sorts all map keys by encoded byte representation at encode time, making source property order irrelevant. Reverted alphabetical field ordering in Dot (writerId, counter) and EventId (lamport, writerId, patchSha, opIndex) to natural domain order — the codec handles canonicalization now. No wire format compatibility concern: no released version has class instances in CBOR (all class promotions are on this unreleased branch). Closes asap/PROTO_cbor-codec-class-key-sorting. --- .../PROTO_cbor-codec-class-key-sorting.md | 30 ------------------- src/domain/crdt/Dot.js | 6 ++-- src/domain/utils/EventId.js | 8 ++--- src/domain/utils/defaultCodec.js | 19 +++++++++--- src/infrastructure/codecs/CborCodec.js | 21 +++++++++++-- 5 files changed, 40 insertions(+), 44 deletions(-) delete mode 100644 docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md diff --git a/docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md b/docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md deleted file mode 100644 index 19718d92..00000000 --- a/docs/method/backlog/asap/PROTO_cbor-codec-class-key-sorting.md +++ /dev/null @@ -1,30 +0,0 @@ -# CBOR codec skips key sorting for class instances - -**Effort:** M - -## Problem - -Both `CborCodec.js` and `defaultCodec.js` only sort keys for plain -objects (`constructor === Object`). Class instances pass through -unsorted, meaning their CBOR output depends on field declaration -order. This forces all domain classes to declare fields in -alphabetical order — coupling class design to serialization. - -Echo's canonical CBOR encoder (Rust) sorts ALL map keys by encoded -byte representation at encode time, making source property order -irrelevant. 
git-warp's codec should do the same. - -## Fix - -Update `isPlainObject`/`sortObjectKeys` in both codecs to sort keys -for all object types except built-in CBOR-native types (Uint8Array, -Date, Set, Map, RegExp). This is a wire format change — existing -CBOR data was encoded with unsorted class instance keys. Requires -schema version bump or migration path for persisted data. - -## Why not now - -Changing the codec changes the wire format for all CBOR-encoded data -(patches, checkpoints, BTRs). Existing HMAC-verified data and -content-addressed SHAs depend on byte-identical encoding. A codec -change requires a schema migration (version 4 → 5 or similar). diff --git a/src/domain/crdt/Dot.js b/src/domain/crdt/Dot.js index 0d956350..2d1734f9 100644 --- a/src/domain/crdt/Dot.js +++ b/src/domain/crdt/Dot.js @@ -61,12 +61,12 @@ * for each CRDT operation. */ export class Dot { - /** @type {number} Monotonic counter (positive integer) */ - counter; - /** @type {string} Writer identifier (non-empty string) */ writerId; + /** @type {number} Monotonic counter (positive integer) */ + counter; + /** * Creates a validated Dot. * diff --git a/src/domain/utils/EventId.js b/src/domain/utils/EventId.js index 87aceecb..883cdb5b 100644 --- a/src/domain/utils/EventId.js +++ b/src/domain/utils/EventId.js @@ -8,14 +8,14 @@ export class EventId { /** @type {number} Monotonic counter (positive integer) */ lamport; - /** @type {number} Operation index within patch (non-negative integer) */ - opIndex; + /** @type {string} Writer identifier (non-empty string) */ + writerId; /** @type {string} Patch commit SHA (hex OID, 4-64 chars) */ patchSha; - /** @type {string} Writer identifier (non-empty string) */ - writerId; + /** @type {number} Operation index within patch (non-negative integer) */ + opIndex; /** * Creates a validated EventId. 
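The property the field-order revert relies on — declaration order no longer affecting wire bytes once keys are sorted at encode time — can be illustrated with a JSON stand-in for the CBOR encoder (objects only; arrays and native types omitted for brevity):

```javascript
// JSON stand-in for canonical encoding: sort keys at encode time so
// source declaration order is irrelevant. Mirrors the CBOR fix; the
// real codec sorts recursively and skips CBOR-native types.
function canonicalJson(value) {
  if (typeof value !== 'object' || value === null) {
    return JSON.stringify(value);
  }
  const parts = Object.keys(value)
    .sort()
    .map((k) => `${JSON.stringify(k)}:${canonicalJson(value[k])}`);
  return `{${parts.join(',')}}`;
}

// Dot-shaped objects declared in different orders encode identically.
canonicalJson({ writerId: 'w1', counter: 3 }) ===
  canonicalJson({ counter: 3, writerId: 'w1' }); // true
```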
diff --git a/src/domain/utils/defaultCodec.js b/src/domain/utils/defaultCodec.js index 0be10a4a..ad1d64cc 100644 --- a/src/domain/utils/defaultCodec.js +++ b/src/domain/utils/defaultCodec.js @@ -55,15 +55,26 @@ function sortMapKeys(map) { return sorted; } +/** @type {ReadonlyArray} */ +const CBOR_NATIVE = [Uint8Array, Date, RegExp, Set]; + +/** + * Returns true if the value is a built-in type with its own CBOR encoding. + * @param {object} value + * @returns {boolean} + */ +function isCborNative(value) { + return CBOR_NATIVE.some((T) => value instanceof T); +} + /** - * Sorts keys of a plain object and recursively sorts nested values. + * Sorts keys of any object and recursively sorts nested values. + * Skips built-in types that have their own CBOR representation. * @param {Record} obj * @returns {Record} */ function sortObjectKeys(obj) { - if (obj.constructor !== Object && obj.constructor !== undefined) { - return obj; - } + if (isCborNative(obj)) { return obj; } /** @type {Record} */ const sorted = {}; for (const key of Object.keys(obj).sort()) { diff --git a/src/infrastructure/codecs/CborCodec.js b/src/infrastructure/codecs/CborCodec.js index 14b2fa9c..47f5629c 100644 --- a/src/infrastructure/codecs/CborCodec.js +++ b/src/infrastructure/codecs/CborCodec.js @@ -75,15 +75,30 @@ const encoder = new Encoder({ mapsAsObjects: true, }); +/** @type {ReadonlyArray} */ +const CBOR_NATIVE_TYPES = [Uint8Array, Date, RegExp, Set, Map]; + +/** + * Returns true if the value is a built-in type with its own CBOR encoding. + * @param {object} value + * @returns {boolean} + * @private + */ +function isCborNative(value) { + return CBOR_NATIVE_TYPES.some((T) => value instanceof T); +} + /** - * Checks if a value is a plain object (constructed via Object or Object.create(null)). + * Checks if a value should have its keys sorted for canonical CBOR. + * Returns true for plain objects AND domain class instances. + * Returns false for built-in types with their own CBOR representation. 
* * @param {unknown} value - The value to check - * @returns {boolean} True if value is a plain object + * @returns {boolean} True if value's keys should be sorted * @private */ function isPlainObject(value) { - return typeof value === 'object' && value !== null && (value.constructor === Object || value.constructor === undefined); + return typeof value === 'object' && value !== null && !isCborNative(/** @type {object} */ (value)); } /** From 2e47de97a4973c275b1bc350a560aca3eacd4d3a Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 17:04:21 -0700 Subject: [PATCH 60/73] refactor(NDNM): promote LWWRegister and BTR to classes MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - LWWRegister: generic class with eventId + value fields. Factory lwwSet() delegates to constructor. Merge (lwwMax), coalesce, and value extraction remain as module-level functions. - BTR: frozen class with 7 fields. createBTR() and deserializeBTR() now return class instances. No field ordering concern — the CBOR codec now handles canonical key sorting for class instances. Closes bad-code/PROTO_typedef-{lww,btr}-to-class. 
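The `lwwMax` merge semantics mentioned above amount to taking the register whose event id compares greater. A hedged sketch, with an illustrative lamport-only comparator standing in for the real `compareEventIds`:

```javascript
// Sketch of last-writer-wins merge: the register whose event id
// compares greater wins; ties keep the left operand. The comparator
// is a stand-in, not the real compareEventIds.
function lwwMaxSketch(a, b, compare) {
  return compare(a.eventId, b.eventId) >= 0 ? a : b;
}

const byLamport = (x, y) => x.lamport - y.lamport;
const older = { eventId: { lamport: 1 }, value: 'first' };
const newer = { eventId: { lamport: 5 }, value: 'second' };

lwwMaxSketch(older, newer, byLamport).value; // 'second'
```

Because the winner depends only on the event-id order, the merge is commutative and idempotent — the semilattice property that makes the register a CRDT.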
--- .../bad-code/PROTO_typedef-btr-to-class.md | 9 --- .../bad-code/PROTO_typedef-lww-to-class.md | 9 --- src/domain/crdt/LWW.js | 24 ++++++-- .../services/BoundaryTransitionRecord.js | 59 +++++++++++++------ 4 files changed, 60 insertions(+), 41 deletions(-) delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md deleted file mode 100644 index e057bf44..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-btr-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote BTR from @typedef to class - -**Effort:** S - -## Problem - -`src/domain/services/BoundaryTransitionRecord.js` defines `BTR` as a -`@typedef {Object}`. Tamper-evident package — constructed, frozen, -verified, serialized. Should be a class. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md deleted file mode 100644 index 80841e78..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-lww-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote LWWRegister from @typedef to class - -**Effort:** S - -## Problem - -`src/domain/crdt/LWW.js` defines `LWWRegister` as a `@typedef {Object}` -but it has semilattice merge semantics (`lwwMax`) and setter logic (`lwwSet`). -Should be a class. diff --git a/src/domain/crdt/LWW.js b/src/domain/crdt/LWW.js index 3612fae6..df95d4b6 100644 --- a/src/domain/crdt/LWW.js +++ b/src/domain/crdt/LWW.js @@ -69,12 +69,26 @@ import { compareEventIds } from '../utils/EventId.js'; */ /** - * LWW Register - stores value with EventId for conflict resolution + * LWW Register — stores value with EventId for conflict resolution. 
* @template T - * @typedef {Object} LWWRegister - * @property {import('../utils/EventId.js').EventId} eventId - * @property {T} value */ +export class LWWRegister { + /** @type {import('../utils/EventId.js').EventId} */ + eventId; + + /** @type {T} */ + value; + + /** + * Creates an LWW register. + * @param {import('../utils/EventId.js').EventId} eventId + * @param {T} value + */ + constructor(eventId, value) { + this.eventId = eventId; + this.value = value; + } +} /** * Creates an LWW register with the given EventId and value. @@ -84,7 +98,7 @@ import { compareEventIds } from '../utils/EventId.js'; * @returns {LWWRegister} */ export function lwwSet(eventId, value) { - return { eventId, value }; + return new LWWRegister(eventId, value); } /** diff --git a/src/domain/services/BoundaryTransitionRecord.js b/src/domain/services/BoundaryTransitionRecord.js index d88585dc..a16fad61 100644 --- a/src/domain/services/BoundaryTransitionRecord.js +++ b/src/domain/services/BoundaryTransitionRecord.js @@ -126,15 +126,46 @@ async function computeHmac(fields, key, { crypto, codec }) { } /** - * @typedef {Object} BTR - * @property {number} version - BTR format version - * @property {string} h_in - Hash of input state (hex SHA-256) - * @property {string} h_out - Hash of output state (hex SHA-256) - * @property {Uint8Array} U_0 - Serialized initial state (CBOR) - * @property {Array} P - Serialized provenance payload - * @property {string} t - ISO 8601 timestamp - * @property {string} kappa - Authentication tag (hex HMAC-SHA256) + * BTR — Boundary Transition Record. Tamper-evident package binding + * initial state, provenance payload, and output state hash. 
*/ +export class BTR { + /** @type {string} Hash of input state (hex SHA-256) */ + h_in; + + /** @type {string} Hash of output state (hex SHA-256) */ + h_out; + + /** @type {string} Authentication tag (hex HMAC-SHA256) */ + kappa; + + /** @type {Array} Serialized provenance payload */ + P; + + /** @type {string} ISO 8601 timestamp */ + t; + + /** @type {Uint8Array} Serialized initial state (CBOR) */ + U_0; + + /** @type {number} BTR format version */ + version; + + /** + * Creates a BTR from field values. + * @param {{ version: number, h_in: string, h_out: string, U_0: Uint8Array, P: Array, t: string, kappa: string }} fields + */ + constructor({ version, h_in, h_out, U_0, P, t, kappa }) { + this.version = version; + this.h_in = h_in; + this.h_out = h_out; + this.U_0 = U_0; + this.P = P; + this.t = t; + this.kappa = kappa; + Object.freeze(this); + } +} /** * @typedef {Object} VerificationResult @@ -197,7 +228,7 @@ export async function createBTR(initialState, payload, options) { const fields = { version: BTR_VERSION, h_in, h_out, U_0, P, t: timestamp }; const kappa = await computeHmac(fields, key, /** @type {{ crypto: import('../../ports/CryptoPort.js').default, codec?: import('../../ports/CodecPort.js').default }} */ (deps)); - return { ...fields, kappa }; + return new BTR({ ...fields, kappa }); } /** @@ -491,15 +522,7 @@ export function deserializeBTR(bytes, { codec } = {}) { } const typed = /** @type {{ version: number, h_in: string, h_out: string, U_0: Uint8Array, P: Array, t: string, kappa: string }} */ (obj); - return /** @type {BTR} */ ({ - version: typed.version, - h_in: typed.h_in, - h_out: typed.h_out, - U_0: typed.U_0, - P: typed.P, - t: typed.t, - kappa: typed.kappa, - }); + return new BTR(typed); } /** From bab1d0c1725d449e6aaa1ef6d52d50a85704506a Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 17:07:38 -0700 Subject: [PATCH 61/73] refactor(NDNM): promote TrustState to class, keep TrustRecord as typedef MIME-Version: 1.0 Content-Type: 
text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - TrustState: frozen class constructed by buildState(). Holds Maps of active/revoked keys, writer bindings, errors, and record count. - TrustRecord: remains a @typedef. It is Zod-parsed from external data and consumed as Record throughout the trust chain — making it a class would break the index-signature compatibility that Zod parsing and signature verification depend on. Closes bad-code/PROTO_typedef-truststate-to-class. --- .../PROTO_typedef-truststate-to-class.md | 9 ------ src/domain/trust/TrustStateBuilder.js | 32 ++++++++++++++----- 2 files changed, 24 insertions(+), 17 deletions(-) delete mode 100644 docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md deleted file mode 100644 index 8abe3a13..00000000 --- a/docs/method/backlog/bad-code/PROTO_typedef-truststate-to-class.md +++ /dev/null @@ -1,9 +0,0 @@ -# Promote TrustState from @typedef to class - -**Effort:** S - -## Problem - -`src/domain/trust/TrustStateBuilder.js` defines `TrustState` as a -`@typedef {Object}`. Built by `buildState`, frozen, queried by -TrustEvaluator. Maps of bindings/keys. Should be a class. 
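One caveat worth noting for a frozen class holding Maps: `Object.freeze` is shallow, so the field bindings are locked but the Maps themselves remain mutable. A quick sketch (`FrozenHolder` is illustrative, not the real `TrustState`):

```javascript
// Object.freeze is shallow: the field binding is locked, but the Map
// it points to can still be mutated after construction.
class FrozenHolder {
  constructor(activeKeys) {
    this.activeKeys = activeKeys;
    Object.freeze(this);
  }
}

const state = new FrozenHolder(new Map());
state.activeKeys.set('k1', { status: 'active' }); // succeeds — freeze stops at the binding
// state.activeKeys = new Map();                  // TypeError in strict mode
```

This is fine for `buildState`, which finishes populating its Maps before constructing the frozen instance, but consumers should not assume deep immutability.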
diff --git a/src/domain/trust/TrustStateBuilder.js b/src/domain/trust/TrustStateBuilder.js index 8db56180..511a43fa 100644 --- a/src/domain/trust/TrustStateBuilder.js +++ b/src/domain/trust/TrustStateBuilder.js @@ -26,14 +26,30 @@ import { TrustRecordSchema } from './schemas.js'; */ /** - * @typedef {Object} TrustState - * @property {Map} activeKeys - keyId → key info - * @property {Map} revokedKeys - * @property {Map} writerBindings - "writerId\0keyId" → binding - * @property {Map} revokedBindings - * @property {Array<{recordId: string, error: string}>} errors - * @property {number} recordsProcessed - Total number of records fed to the builder + * TrustState — frozen aggregate of all trust chain records. */ +export class TrustState { + /** @type {Map} */ activeKeys; + /** @type {Map} */ revokedKeys; + /** @type {Map} */ writerBindings; + /** @type {Map} */ revokedBindings; + /** @type {Array<{recordId: string, error: string}>} */ errors; + /** @type {number} */ recordsProcessed; + + /** + * Creates a frozen TrustState. 
+ * @param {{ activeKeys: Map, revokedKeys: Map, writerBindings: Map, revokedBindings: Map, errors: Array<{recordId: string, error: string}>, recordsProcessed: number }} fields + */ + constructor(fields) { + this.activeKeys = fields.activeKeys; + this.revokedKeys = fields.revokedKeys; + this.writerBindings = fields.writerBindings; + this.revokedBindings = fields.revokedBindings; + this.errors = fields.errors; + this.recordsProcessed = fields.recordsProcessed; + Object.freeze(this); + } +} /** * @typedef {Object} TrustBuildOptions @@ -89,7 +105,7 @@ export function buildState(records, options = {}) { processRecord(rec, ctx); } - return Object.freeze({ + return new TrustState({ activeKeys: ctx.activeKeys, revokedKeys: ctx.revokedKeys, writerBindings: ctx.writerBindings, From 8e961e52e031f4cfdd9f4258dcc1a102cb86a68f Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 17:09:44 -0700 Subject: [PATCH 62/73] chore: update TrustRecord backlog item with promotion notes MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Not permanently blocked — Zod parse wrapping + consumer signature widening is straightforward S-tier work, just more touch points than the XS vassals. --- .../bad-code/PROTO_typedef-trustrecord-to-class.md | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md index 5d982e48..811aa2ac 100644 --- a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md +++ b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md @@ -7,3 +7,13 @@ `src/domain/trust/TrustStateBuilder.js` defines `TrustRecord` as a `@typedef {Object}`. Parsed, validated, and chained — full entity lifecycle. Should be a class. + +## Notes + +TrustRecord is Zod-parsed from external data. Functions like +`computeSignaturePayload` and `buildState` accept +`Record`. 
Promoting to a class requires: +1. Wrap Zod parse output in `new TrustRecord(parsed.data)` +2. Widen consumer signatures or add `toRecord()` on the class + +Not blocked — just more touch points than the XS vassals. From a215aa3c51ae5a8d33281e6d3c33206f3474b602 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 17:11:40 -0700 Subject: [PATCH 63/73] =?UTF-8?q?chore:=20upgrade=20TrustRecord=20backlog?= =?UTF-8?q?=20item=20=E2=80=94=20root=20cause=20is=20Record=20pipeline?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit The typedef is the symptom. The disease is 20+ function signatures across 5 trust files that accept Record because nobody introduced a concrete type at the CBOR decode boundary. Effort upgraded from S to M. --- .../PROTO_typedef-trustrecord-to-class.md | 47 ++++++++++++++----- 1 file changed, 36 insertions(+), 11 deletions(-) diff --git a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md index 811aa2ac..45a6346a 100644 --- a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md +++ b/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md @@ -1,19 +1,44 @@ # Promote TrustRecord from @typedef to class -**Effort:** S +**Effort:** M (upgraded from S — root cause is deeper than the typedef) ## Problem -`src/domain/trust/TrustStateBuilder.js` defines `TrustRecord` as a -`@typedef {Object}`. Parsed, validated, and chained — full entity -lifecycle. Should be a class. +The entire trust pipeline operates on `Record` — the +JavaScript equivalent of `any` in a trench coat. 
Trust records are +CBOR-decoded to `unknown`, cast to `Record`, and +passed through 20+ function signatures in that form across 5 files: -## Notes +- `TrustRecordService.js` — 10 occurrences +- `TrustCanonical.js` — 3 occurrences +- `TrustStateBuilder.js` — 1 occurrence +- `TrustEvaluator.js` — 1 occurrence +- `schemas.js` — 4 occurrences -TrustRecord is Zod-parsed from external data. Functions like -`computeSignaturePayload` and `buildState` accept -`Record`. Promoting to a class requires: -1. Wrap Zod parse output in `new TrustRecord(parsed.data)` -2. Widen consumer signatures or add `toRecord()` on the class +The `TrustRecord` typedef exists but is never enforced at the decode +boundary. Every consumer does bracket access and manual casting +because the type system says "bag of unknowns." -Not blocked — just more touch points than the XS vassals. +## Root cause + +`codec.decode()` returns `unknown`. The trust pipeline casts to +`Record` and never narrows further. The Zod schema +(`TrustRecordSchema`) validates the shape but doesn't produce a +typed output that propagates — the parse result is immediately +consumed and the validated shape is lost. + +## Fix + +1. Create `TrustRecord` class in `TrustStateBuilder.js` (or own file) +2. At the CBOR decode boundary in `TrustRecordService.js`, Zod-parse + then wrap: `new TrustRecord(parsed.data)` +3. Replace all `Record` signatures downstream with + `TrustRecord` +4. `computeSignaturePayload`, `computeRecordId`, `verifyRecordId` in + `TrustCanonical.js` — accept `TrustRecord` instead of + `Record` +5. `buildState` in `TrustStateBuilder.js` — accept `TrustRecord[]` +6. Schema validators in `schemas.js` — accept `TrustRecord` + +The class eliminates bracket access, manual casts, and the pretense +that we don't know what a trust record looks like. 
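Steps 1–2 of the fix above can be sketched as follows. `TrustRecordSketch`, `decodeTrustRecord`, and the stub codec/schema are hypothetical names; the real schema is `TrustRecordSchema` and the real boundary is in `TrustRecordService.js`:

```javascript
// Hypothetical sketch of the decode-boundary fix: validate the shape,
// then wrap in a concrete class so downstream signatures can accept a
// trust record type instead of Record<string, unknown>.
class TrustRecordSketch {
  constructor(fields) {
    Object.assign(this, fields); // fields already schema-validated
    Object.freeze(this);
  }
}

function decodeTrustRecord(bytes, { codec, schema }) {
  const raw = codec.decode(bytes);      // unknown
  const parsed = schema.parse(raw);     // throws on bad shape (Zod-style)
  return new TrustRecordSketch(parsed); // concrete type from here on
}
```

From the wrap point onward, consumers like `computeSignaturePayload` can take the class, dropping bracket access and the per-call-site casts.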
From 17346645727337b66dd5695fb8ede9c628d71fd9 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 17:25:30 -0700 Subject: [PATCH 64/73] chore: update CHANGELOG, promote 3 backlog items to ASAP MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit CHANGELOG updated with: - 11 typedef-to-class promotions - CBOR codec canonical key sorting fix - WarpStateV5 class as new public export Moved to ASAP for next cycle: - PROTO_typedef-trustrecord-to-class (M — Record pipeline cleanup) - PROTO_typedef-orset-to-class (M — 10+ CRDT operations) - PROTO_typedef-patchv2-to-class (M — core entity, deep integration) --- CHANGELOG.md | 5 ++++- .../{bad-code => asap}/PROTO_typedef-orset-to-class.md | 0 .../{bad-code => asap}/PROTO_typedef-patchv2-to-class.md | 0 .../{bad-code => asap}/PROTO_typedef-trustrecord-to-class.md | 0 4 files changed, 4 insertions(+), 1 deletion(-) rename docs/method/backlog/{bad-code => asap}/PROTO_typedef-orset-to-class.md (100%) rename docs/method/backlog/{bad-code => asap}/PROTO_typedef-patchv2-to-class.md (100%) rename docs/method/backlog/{bad-code => asap}/PROTO_typedef-trustrecord-to-class.md (100%) diff --git a/CHANGELOG.md b/CHANGELOG.md index fc6363ea..88dc460c 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -31,12 +31,15 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **`ProvenanceController`** (242 LOC) — patch lookups, backward causal cone, slice materialization - **`ForkController`** (274 LOC) — fork creation, wormhole compression, backfill rejection - **`QueryController`** (964 LOC) — all read queries, observer/worldline factories, content access -- **`AuditReceipt` promoted to class** — replaced `@typedef {Object}` with a real JavaScript class. Constructor validates and freezes. Fields declared in alphabetical order for canonical CBOR serialization. +- **`AuditReceipt` promoted to class** — replaced `@typedef {Object}` with a real JavaScript class. 
Constructor validates and freezes. - **WarpApp/WarpCore content methods** — replaced direct function imports from `query.methods.js` with `callInternalRuntimeMethod()` delegation, which correctly resolves dynamically wired prototype methods. +- **11 typedef-to-class promotions (NO_DOGS_NO_MASTERS)** — replaced phantom `@typedef {Object}` shapes with real JavaScript classes: `WarpStateV5`, `Dot`, `EventId`, `EffectEmission`, `EffectCoordinate`, `DeliveryObservation`, `TickReceipt`, `PatchDiff`, `LWWRegister`, `BTR`, `TrustState`. Each class has a constructor, validates inputs where applicable, and supports `instanceof`. Factory functions retained for backward compatibility. +- **CBOR codec canonical key sorting for class instances** — both `CborCodec` and `defaultCodec` now sort keys for all object types (not just plain objects), using `instanceof` checks to skip built-in CBOR-native types (Uint8Array, Date, Set, Map, RegExp). This decouples class field declaration order from wire format, matching Echo's Rust canonical encoder behavior. ### Added - **`AuditError`** — domain error class for audit receipt validation and persistence failures. Exported from package root with four static error codes. +- **`WarpStateV5` class** — core CRDT materialized state promoted from typedef to its own module (`src/domain/services/WarpStateV5.js`). Provides `static empty()` factory and `clone()` method. Re-exported from `JoinReducer.js` for backward compatibility. - **`NO_DOGS_NO_MASTERS` legend** — backlog legend for god object decomposition and typedef-to-class liberation. Code: `NDNM_`. - **Effect emission & delivery observation substrate slice** — new receipt families for outbound effects and their delivery lifecycle. `EffectEmission` records that the system produced an outbound effect candidate at a causal coordinate. `DeliveryObservation` records how a sink handled that emission (delivered, suppressed, failed, skipped). 
`ExternalizationPolicy` provides execution context (live/replay/inspect) that shapes delivery behavior. Preset lenses `LIVE_LENS`, `REPLAY_LENS`, and `INSPECT_LENS` cover common modes. diff --git a/docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md b/docs/method/backlog/asap/PROTO_typedef-orset-to-class.md similarity index 100% rename from docs/method/backlog/bad-code/PROTO_typedef-orset-to-class.md rename to docs/method/backlog/asap/PROTO_typedef-orset-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md b/docs/method/backlog/asap/PROTO_typedef-patchv2-to-class.md similarity index 100% rename from docs/method/backlog/bad-code/PROTO_typedef-patchv2-to-class.md rename to docs/method/backlog/asap/PROTO_typedef-patchv2-to-class.md diff --git a/docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md b/docs/method/backlog/asap/PROTO_typedef-trustrecord-to-class.md similarity index 100% rename from docs/method/backlog/bad-code/PROTO_typedef-trustrecord-to-class.md rename to docs/method/backlog/asap/PROTO_typedef-trustrecord-to-class.md From 0662e68935f797f33c159a6e7186b3c98248cf22 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 18:06:23 -0700 Subject: [PATCH 65/73] fix: address self-code-review findings (7 of 12 resolved) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Critical: - CONTRIBUTING.md: fixed broken cycle path (docs/design/cycles/ → docs/design//), stale BACKLOG/ refs, and retro path mismatch (docs/archive/retrospectives/ → docs/method/retro/) Major: - ROADMAP.md: updated stale BACKLOG/ references to docs/method/backlog/ - AuditReceiptService: writer mismatch now throws with explicit E_AUDIT_WRITER_MISMATCH code instead of defaulting to E_AUDIT_INVALID Minor: - TickReceipt: added @throws annotation to constructor JSDoc - METHOD.md: clarified structure diagram with pattern - DeliveryObservation: documented that reason field is omitted (not null) 
when absent Nits acknowledged but deferred: - WarpRuntime delegation boilerplate (DRY) — future cleanup - WarpCore single-line JSDoc — cosmetic - Em-dash inconsistency — cosmetic - WarpStateV5 edgeBirthEvent default — intentional for old checkpoints - BTR computeHmac guard — validated upstream --- .github/CONTRIBUTING.md | 14 +++++++------- METHOD.md | 4 ++-- docs/ROADMAP.md | 12 ++++++------ index.d.ts | 1 + src/domain/errors/AuditError.js | 4 ++++ src/domain/services/AuditReceiptService.js | 2 +- src/domain/types/DeliveryObservation.js | 2 +- src/domain/types/TickReceipt.js | 1 + 8 files changed, 23 insertions(+), 17 deletions(-) diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md index 40fb8132..fa11943d 100644 --- a/.github/CONTRIBUTING.md +++ b/.github/CONTRIBUTING.md @@ -2,20 +2,20 @@ ## Planning Sources Of Truth -- `BACKLOG/` — individual markdown files, one per item (`B{number}.md`) -- `docs/design/cycles//` — active work; backlog file moves here +- `docs/method/backlog/` — lane-organized backlog (inbox/, asap/, up-next/, cool-ideas/, bad-code/) +- `docs/design//` — active cycle work; backlog item promotes here - `CHANGELOG.md` — what has landed -- `docs/archive/retrospectives/` — closed cycle audits +- `docs/method/retro//` — closed cycle retrospectives No milestones. No ROADMAP. Cycles are the unit of work. +See [METHOD.md](../../METHOD.md) for the full process. ## Cycle Process A cycle is one backlog item, start to finish: -1. **Pull** — move `BACKLOG/B{number}.md` to `docs/design/cycles//` -2. **Design** — the backlog file becomes the design doc; add hills, non-goals, - invariants as needed +1. **Pull** — promote a backlog item to `docs/design//` +2. **Design** — write the design doc; add hills, playback questions, non-goals 3. **Spec** — write failing tests as executable spec 4. **Implement** — make the tests pass 5. 
**Close** — retrospective, drift audit, CHANGELOG, tech debt journal, @@ -23,7 +23,7 @@ A cycle is one backlog item, start to finish: ### Retrospectives -Every closed cycle gets a retrospective in `docs/archive/retrospectives/`. +Every closed cycle gets a retrospective in `docs/method/retro//`. At minimum: 1. Governing design docs and backlog IDs diff --git a/METHOD.md b/METHOD.md index e62da7cd..3dfb33d1 100644 --- a/METHOD.md +++ b/METHOD.md @@ -38,12 +38,12 @@ docs/ bad-code/ tech debt *.md shaped work not in a named lane legends/ named domains - retro//.md retrospectives + retro// cycle retrospectives graveyard/ rejected ideas process.md how cycles run release.md how releases work design/ - /.md cycle design docs + / cycle design docs *.md living documents ``` diff --git a/docs/ROADMAP.md b/docs/ROADMAP.md index 09f5f1ff..b0784894 100644 --- a/docs/ROADMAP.md +++ b/docs/ROADMAP.md @@ -522,9 +522,9 @@ All milestones are complete: M10 → M12 → M13 (internal) → M11 → M14. M13 The active roadmap is **26 standalone items** sorted into **8 priority tiers** (P0–P7) with **6 execution waves**. The GitHub issue queue is clear; Wave 1 is complete, and Wave 2 now starts at B88 in the CI & Tooling pack, with the roaring benchmark investigation queued in the performance lane. See [Execution Order](#execution-order) for the full sequence. -Rejected items live in `GRAVEYARD.md`. Resurrections require an RFC. -Promotable pre-design intake now lives in `BACKLOG/`. This file remains the -committed milestone/release inventory. +Rejected items live in `docs/method/graveyard/`. Resurrections must address the rejection note. +Promotable pre-design intake lives in `docs/method/backlog/` with lane organization. +This file remains the committed milestone/release inventory. --- @@ -715,13 +715,13 @@ Exploratory concepts captured during PR hardening. These are intentionally fully - Golden output tests for deterministic summary formatting. 
- Smoke test ensuring script exits non-zero on API/auth failures. -## Concern 4 — Documentation Drift: `ROADMAP.md` vs `BACKLOG/` +## Concern 4 — Documentation Drift: `ROADMAP.md` vs Backlog The roles are now split explicitly: - `ROADMAP.md` owns committed milestone/release inventory -- `BACKLOG/` owns promotable pre-design items -- `docs/design/` owns active design docs +- `docs/method/backlog/` owns promotable pre-design items (lane-organized) +- `docs/design/` owns active cycle design docs Backlog items should be promoted into `docs/design/` before tests and implementation begin. diff --git a/index.d.ts b/index.d.ts index 22f7c04a..edecec31 100644 --- a/index.d.ts +++ b/index.d.ts @@ -1222,6 +1222,7 @@ export class AuditError extends Error { static readonly E_AUDIT_CAS_FAILED: 'E_AUDIT_CAS_FAILED'; static readonly E_AUDIT_DEGRADED: 'E_AUDIT_DEGRADED'; static readonly E_AUDIT_CHAIN_GAP: 'E_AUDIT_CHAIN_GAP'; + static readonly E_AUDIT_WRITER_MISMATCH: 'E_AUDIT_WRITER_MISMATCH'; constructor(message: string, options?: { code?: string; diff --git a/src/domain/errors/AuditError.js b/src/domain/errors/AuditError.js index 59dd6d91..ca1f0a71 100644 --- a/src/domain/errors/AuditError.js +++ b/src/domain/errors/AuditError.js @@ -11,6 +11,7 @@ import WarpError from './WarpError.js'; * | `E_AUDIT_CAS_FAILED` | Compare-and-swap failed during audit commit | * | `E_AUDIT_DEGRADED` | Audit service degraded after exhausting retries | * | `E_AUDIT_CHAIN_GAP` | Audit chain has a gap (missing commit in ancestry) | + * | `E_AUDIT_WRITER_MISMATCH` | TickReceipt writer does not match the service's writerId | * * @class AuditError * @extends WarpError @@ -32,6 +33,9 @@ export default class AuditError extends WarpError { /** Audit chain has a gap (missing commit in ancestry). */ static E_AUDIT_CHAIN_GAP = 'E_AUDIT_CHAIN_GAP'; + /** TickReceipt writer does not match the service's writerId. 
*/ + static E_AUDIT_WRITER_MISMATCH = 'E_AUDIT_WRITER_MISMATCH'; + /** * Creates an AuditError with the given message and error code. * @param {string} message - Human-readable error description diff --git a/src/domain/services/AuditReceiptService.js b/src/domain/services/AuditReceiptService.js index 923ce539..968707c4 100644 --- a/src/domain/services/AuditReceiptService.js +++ b/src/domain/services/AuditReceiptService.js @@ -359,7 +359,7 @@ export class AuditReceiptService { }); throw new AuditError( `Audit writer mismatch: expected '${this._writerId}', got '${writer}'`, - { context: { expected: this._writerId, actual: writer, patchSha } }, + { code: AuditError.E_AUDIT_WRITER_MISMATCH, context: { expected: this._writerId, actual: writer, patchSha } }, ); } diff --git a/src/domain/types/DeliveryObservation.js b/src/domain/types/DeliveryObservation.js index dd621572..a69865a8 100644 --- a/src/domain/types/DeliveryObservation.js +++ b/src/domain/types/DeliveryObservation.js @@ -35,7 +35,7 @@ export class DeliveryObservation { /** @type {'delivered' | 'suppressed' | 'failed' | 'skipped'} */ outcome; - /** @type {string | undefined} Why (e.g., "replay mode") */ + /** @type {string | undefined} Why (e.g., "replay mode"). Omitted (not null) when absent. */ reason; /** @type {number} Wall-clock milliseconds */ diff --git a/src/domain/types/TickReceipt.js b/src/domain/types/TickReceipt.js index 436893c1..8143a982 100644 --- a/src/domain/types/TickReceipt.js +++ b/src/domain/types/TickReceipt.js @@ -183,6 +183,7 @@ export class TickReceipt { /** * Creates an immutable TickReceipt. 
* @param {{ patchSha: string, writer: string, lamport: number, ops: OpOutcome[] }} fields + * @throws {Error} If any field is invalid */ constructor({ patchSha, writer, lamport, ops }) { assertNonEmptyString(patchSha, 'patchSha'); From 110c7a6e591803e3bdaeb3ac7bc4567952fed32f Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 18:07:13 -0700 Subject: [PATCH 66/73] chore: log 2 deferred nits as backlog items - DX_warpruntime-delegation-dry (XS): extract delegation boilerplate helper - DX_warpcore-jsdoc-block-style (XS): expand single-line JSDoc to blocks --- .../bad-code/DX_warpcore-jsdoc-block-style.md | 9 +++++++++ .../bad-code/DX_warpruntime-delegation-dry.md | 16 ++++++++++++++++ 2 files changed, 25 insertions(+) create mode 100644 docs/method/backlog/bad-code/DX_warpcore-jsdoc-block-style.md create mode 100644 docs/method/backlog/bad-code/DX_warpruntime-delegation-dry.md diff --git a/docs/method/backlog/bad-code/DX_warpcore-jsdoc-block-style.md b/docs/method/backlog/bad-code/DX_warpcore-jsdoc-block-style.md new file mode 100644 index 00000000..ed341363 --- /dev/null +++ b/docs/method/backlog/bad-code/DX_warpcore-jsdoc-block-style.md @@ -0,0 +1,9 @@ +# WarpCore content methods use single-line JSDoc + +**Effort:** XS + +## Problem + +`WarpCore.js` lines 162-184 collapse content method JSDoc into +single-line comments. Functional but inconsistent with the block +JSDoc style used everywhere else in the codebase. 
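The delegation helper that patch 66 defers (`DX_warpruntime-delegation-dry`) could plausibly take this shape. This is a hedged sketch: `delegateToController`, `FakeRuntime`, and `FakeController` are illustrative stand-ins, not the real WarpRuntime wiring, which holds seven controller fields and longer method lists.

```javascript
// Hypothetical helper: install forwarding methods on a class prototype so
// each call is delegated to a controller instance held on `this`.
function delegateToController(Class, controllerField, methods) {
  for (const method of methods) {
    Object.defineProperty(Class.prototype, method, {
      value(...args) {
        // `this` is the runtime instance at call time; forward to its controller.
        return this[controllerField][method](...args);
      },
      writable: true,
      configurable: true,
    });
  }
}

// Illustrative stand-ins for a controller and its host runtime.
class FakeController {
  ping() { return 'pong'; }
}
class FakeRuntime {
  constructor() { this._strand = new FakeController(); }
}

// One call per controller replaces one hand-written defineProperty loop each.
delegateToController(FakeRuntime, '_strand', ['ping']);
```

With this shape, adding a controller's public surface becomes a single declarative call instead of another copy of the loop.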
diff --git a/docs/method/backlog/bad-code/DX_warpruntime-delegation-dry.md b/docs/method/backlog/bad-code/DX_warpruntime-delegation-dry.md new file mode 100644 index 00000000..f30ce32a --- /dev/null +++ b/docs/method/backlog/bad-code/DX_warpruntime-delegation-dry.md @@ -0,0 +1,16 @@ +# DRY up WarpRuntime delegation boilerplate + +**Effort:** XS + +## Problem + +`WarpRuntime.js` lines 646-813 repeat the same `Object.defineProperty` +delegation loop 7 times (StrandController, QueryController, +ForkController, ProvenanceController, SubscriptionController, +ComparisonController, SyncController). Each loop is identical except +for the controller field name and method list. + +## Fix + +Extract a helper: `delegateToController(Class, controllerField, methods)`. +One call per controller, zero boilerplate. From 23fbacb53a59ad2928800a42a51fb969f77e8eee Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 18:37:57 -0700 Subject: [PATCH 67/73] =?UTF-8?q?fix:=20address=20CodeRabbit=20review=20fe?= =?UTF-8?q?edback=20=E2=80=94=20all=20findings=20resolved?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Inline comments: - ForkController: fork creation now rolls back ref on WarpRuntime.open failure (was non-atomic — dangling ref on error) - ForkController._isAncestor: replaced MAX_WALK counter with visited Set for true cycle detection (was false-positive on deep histories) - ForkController._validatePatchAgainstCheckpoint: backfill rejection and divergence now throw typed ForkError with E_FORK_BACKFILL_REJECTED and E_FORK_WRITER_DIVERGED codes (was raw Error) - JoinReducer.cloneStateV5: restored structural fallback for plain objects from checkpoint deserialization (was strict instanceof-only) - JoinReducer: receiptName load-time validation now checks against TickReceipt OP_TYPES allowlist (catches typos at module load) Duplicate comments: - ComparisonController: introduced ComparisonSideResolver class to hold graph + scope + 
liveFrontier as instance state, with role tag ('left'/'right'/'source'/'target') preventing side swaps. Live frontier captured once and shared across both sides. - ComparisonController: added assertOptionsObject() validation for compareStrand() and planStrandTransfer() options params (was leaking raw TypeErrors on null/array/primitive input) --- src/domain/services/ComparisonController.js | 290 +++++++++++--------- src/domain/services/ForkController.js | 64 +++-- src/domain/services/JoinReducer.js | 25 +- 3 files changed, 219 insertions(+), 160 deletions(-) diff --git a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index f63f5d61..8dd2d9a6 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -646,146 +646,151 @@ async function finalizeComparisonSide(graph, params, scope) { * @param {import('../WarpRuntime.js').default} graph * @param {NormalizedSelector} selector * @param {VisibleStateScopeV1|null} scope - * @returns {Promise>} + * @param {{ liveFrontier?: Map|null }} [opts] - Resolution options * @private */ -async function resolveLiveComparisonSide(graph, selector, scope) { - const ceiling = selector.ceiling ?? null; - const requestedFrontier = /** @type {Map} */ (await graph.getFrontier()); - const requestedRecord = normalizeFrontierRecord(requestedFrontier, 'live.frontier'); - const state = await graph.materializeCoordinate({ - frontier: frontierRecordToMap(requestedRecord), - ...optionalCeiling(ceiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(graph, requestedRecord, ceiling); - return await finalizeComparisonSide(graph, { - requested: { kind: 'live', ...optionalCeiling(ceiling) }, - state, - patchEntries, - coordinateKind: 'frontier', - lamportCeiling: ceiling, - }, scope); -} /** - * Resolves an explicit 'coordinate' side. + * Resolves a comparison side for coordinate/strand comparisons. 
* - * @param {import('../WarpRuntime.js').default} graph - * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} scope - * @returns {Promise>} - * @private + * Holds the graph, scope, pre-captured live frontier, and the side's + * role ('left'/'right'/'source'/'target') as instance state. + * The role tag prevents accidentally swapping sides in downstream calls. */ -async function resolveCoordinateComparisonSide(graph, selector, scope) { - const ceiling = selector.ceiling ?? null; - const frontier = /** @type {Record} */ (selector.frontier ?? {}); - const state = await graph.materializeCoordinate({ - frontier: frontierRecordToMap(frontier), - ...optionalCeiling(ceiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(graph, frontier, ceiling); - return await finalizeComparisonSide(graph, { - requested: { ...buildCoordinateRequest(frontier, ceiling), kind: 'coordinate' }, - state, - patchEntries, - coordinateKind: 'frontier', - lamportCeiling: ceiling, - }, scope); -} +class ComparisonSideResolver { + /** @type {import('../WarpRuntime.js').default} */ + _graph; -/** - * Resolves a 'strand' coordinate side. - * - * @param {import('../WarpRuntime.js').default} graph - * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} scope - * @returns {Promise>} - * @private - */ -async function resolveStrandComparisonSide(graph, selector, scope) { - const ceiling = selector.ceiling ?? null; - const strandId = /** @type {string} */ (selector.strandId ?? ''); - const strands = new StrandService({ graph }); - const descriptor = await strands.getOrThrow(strandId); - const state = /** @type {WarpStateV5} */ (await callInternalRuntimeMethod( - graph, - 'materializeStrand', - strandId, - ceiling === null ? undefined : { ceiling }, - )); - const patchEntries = await strands.getPatchEntries( - strandId, - ceiling === null ? 
undefined : { ceiling }, - ); - return await finalizeComparisonSide(graph, { - requested: { kind: 'strand', strandId, ...optionalCeiling(ceiling) }, - state, - patchEntries, - coordinateKind: 'strand', - lamportCeiling: ceiling, - strand: buildStrandMetadata(strandId, descriptor), - }, scope); -} + /** @type {VisibleStateScopeV1|null} */ + _scope; -/** - * Resolves a 'strand_base' coordinate side. - * - * @param {import('../WarpRuntime.js').default} graph - * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} scope - * @returns {Promise>} - * @private - */ -async function resolveStrandBaseComparisonSide(graph, selector, scope) { - const ceiling = selector.ceiling ?? null; - const strandId = /** @type {string} */ (selector.strandId ?? ''); - const strands = new StrandService({ graph }); - const descriptor = await strands.getOrThrow(strandId); - const effectiveCeiling = combineCeilings(descriptor.baseObservation.lamportCeiling, ceiling); - const state = await graph.materializeCoordinate({ - frontier: descriptor.baseObservation.frontier, - ...optionalCeiling(effectiveCeiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(graph, descriptor.baseObservation.frontier, effectiveCeiling); - return await finalizeComparisonSide(graph, { - requested: { - kind: 'strand_base', - strandId, - frontier: { ...descriptor.baseObservation.frontier }, - baseLamportCeiling: descriptor.baseObservation.lamportCeiling, - ...optionalCeiling(ceiling), - }, - state, - patchEntries, - coordinateKind: 'strand_base', - lamportCeiling: effectiveCeiling, - strand: buildStrandMetadata(strandId, /** @type {StrandDescriptorV1} */ (descriptor)), - }, scope); -} + /** @type {Map|null} */ + _liveFrontier; -/** - * Dispatches coordinate side resolution based on selector kind. 
- * - * @param {import('../WarpRuntime.js').default} graph - * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} [scope] - * @returns {Promise>} - * @private - */ -async function resolveComparisonSide(graph, selector, scope = null) { - if (selector.kind === 'live') { - return await resolveLiveComparisonSide(graph, selector, scope); + /** @type {string} */ + role; + + /** + * Creates a resolver bound to a graph, scope, and role. + * @param {string} role - 'left', 'right', 'source', or 'target' + * @param {import('../WarpRuntime.js').default} graph + * @param {{ scope: VisibleStateScopeV1|null, liveFrontier?: Map|null }} opts + */ + constructor(role, graph, opts) { + this.role = role; + this._graph = graph; + this._scope = opts.scope; + this._liveFrontier = opts.liveFrontier ?? null; + } + + /** + * Dispatches to the appropriate resolver based on selector kind. + * @param {NormalizedSelector} selector + * @returns {Promise} + */ + async resolve(selector) { + if (selector.kind === 'live') { + return await this._resolveLive(selector); + } + if (selector.kind === 'coordinate') { + return await this._resolveCoordinate(selector); + } + if (selector.kind === 'strand') { + return await this._resolveStrand(selector); + } + return await this._resolveStrandBase(selector); } - if (selector.kind === 'coordinate') { - return await resolveCoordinateComparisonSide(graph, selector, scope); + /** + * Resolves a 'live' side using the captured or fresh frontier. + * @param {NormalizedSelector} selector + * @returns {Promise} + */ + async _resolveLive(selector) { + const ceiling = selector.ceiling ?? null; + const requestedFrontier = this._liveFrontier ?? 
/** @type {Map} */ (await this._graph.getFrontier()); + const requestedRecord = normalizeFrontierRecord(requestedFrontier, 'live.frontier'); + const state = await this._graph.materializeCoordinate({ + frontier: frontierRecordToMap(requestedRecord), + ...optionalCeiling(ceiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(this._graph, requestedRecord, ceiling); + return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { + requested: { kind: 'live', ...optionalCeiling(ceiling) }, + state, patchEntries, coordinateKind: 'frontier', lamportCeiling: ceiling, + }, this._scope)); } - if (selector.kind === 'strand') { - return await resolveStrandComparisonSide(graph, selector, scope); + /** + * Resolves an explicit 'coordinate' side. + * @param {NormalizedSelector} selector + * @returns {Promise} + */ + async _resolveCoordinate(selector) { + const ceiling = selector.ceiling ?? null; + const frontier = /** @type {Record} */ (selector.frontier ?? {}); + const state = await this._graph.materializeCoordinate({ + frontier: frontierRecordToMap(frontier), + ...optionalCeiling(ceiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(this._graph, frontier, ceiling); + return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { + requested: { ...buildCoordinateRequest(frontier, ceiling), kind: 'coordinate' }, + state, patchEntries, coordinateKind: 'frontier', lamportCeiling: ceiling, + }, this._scope)); } - return await resolveStrandBaseComparisonSide(graph, selector, scope); + /** + * Resolves a 'strand' coordinate side. + * @param {NormalizedSelector} selector + * @returns {Promise} + */ + async _resolveStrand(selector) { + const ceiling = selector.ceiling ?? null; + const strandId = /** @type {string} */ (selector.strandId ?? 
''); + const strands = new StrandService({ graph: this._graph }); + const descriptor = await strands.getOrThrow(strandId); + const state = /** @type {WarpStateV5} */ (await callInternalRuntimeMethod( + this._graph, 'materializeStrand', strandId, + ceiling === null ? undefined : { ceiling }, + )); + const patchEntries = await strands.getPatchEntries( + strandId, ceiling === null ? undefined : { ceiling }, + ); + return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { + requested: { kind: 'strand', strandId, ...optionalCeiling(ceiling) }, + state, patchEntries, coordinateKind: 'strand', lamportCeiling: ceiling, + strand: buildStrandMetadata(strandId, descriptor), + }, this._scope)); + } + + /** + * Resolves a 'strand_base' coordinate side. + * @param {NormalizedSelector} selector + * @returns {Promise} + */ + async _resolveStrandBase(selector) { + const ceiling = selector.ceiling ?? null; + const strandId = /** @type {string} */ (selector.strandId ?? ''); + const strands = new StrandService({ graph: this._graph }); + const descriptor = await strands.getOrThrow(strandId); + const effectiveCeiling = combineCeilings(descriptor.baseObservation.lamportCeiling, ceiling); + const state = await this._graph.materializeCoordinate({ + frontier: descriptor.baseObservation.frontier, + ...optionalCeiling(effectiveCeiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(this._graph, descriptor.baseObservation.frontier, effectiveCeiling); + return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { + requested: { + kind: 'strand_base', strandId, + frontier: { ...descriptor.baseObservation.frontier }, + baseLamportCeiling: descriptor.baseObservation.lamportCeiling, + ...optionalCeiling(ceiling), + }, + state, patchEntries, coordinateKind: 'strand_base', lamportCeiling: effectiveCeiling, + strand: buildStrandMetadata(strandId, /** @type {StrandDescriptorV1} */ (descriptor)), + }, this._scope)); + } } 
/** @@ -838,6 +843,7 @@ function normalizeAgainstSelector(normalizedStrandId, against, againstCeiling) { * @returns {Promise} */ async function compareStrandImpl(graph, strandId, options = {}) { + assertOptionsObject(options, 'compareStrand()'); const normalizedStrandId = normalizeRequiredString(strandId, 'strandId'); const ceiling = normalizeLamportCeiling(options.ceiling, 'ceiling'); const againstCeiling = normalizeLamportCeiling(options.againstCeiling, 'againstCeiling'); @@ -911,6 +917,7 @@ function normalizeIntoSelector(normalizedStrandId, into, intoCeiling) { * @returns {Promise} */ async function planStrandTransferImpl(graph, strandId, options = {}) { + assertOptionsObject(options, 'planStrandTransfer()'); const normalizedStrandId = normalizeRequiredString(strandId, 'strandId'); const ceiling = normalizeLamportCeiling(options.ceiling, 'ceiling'); const intoCeiling = normalizeLamportCeiling(options.intoCeiling, 'intoCeiling'); @@ -932,6 +939,23 @@ async function planStrandTransferImpl(graph, strandId, options = {}) { * @param {unknown} options * @returns {void} */ +/** + * Asserts that an options argument is a plain object (not null, array, or primitive). + * @param {unknown} options + * @param {string} callerName + * @returns {void} + */ +function assertOptionsObject(options, callerName) { + if (options !== null && options !== undefined && (typeof options !== 'object' || Array.isArray(options))) { + throw new QueryError(`${callerName} options must be an object`, { code: 'invalid_coordinate' }); + } +} + +/** + * Asserts that transfer options are valid. 
+ * @param {unknown} options + * @returns {void} + */ function assertTransferOptions(options) { const isInvalid = options === null || options === undefined || typeof options !== 'object' || Array.isArray(options); if (isInvalid) { @@ -996,13 +1020,17 @@ async function planCoordinateTransferImpl(graph, options) { const normalizedSource = /** @type {NormalizedSelector} */ (normalizeSelector(options.source, 'source')); const normalizedTarget = /** @type {NormalizedSelector} */ (normalizeSelector(options.target, 'target')); const scope = normalizeVisibleStateScopeV1(options.scope, 'scope'); + // Capture frontier once for consistency across comparison + transfer plan + const liveFrontier = (normalizedSource.kind === 'live' || normalizedTarget.kind === 'live') + ? /** @type {Map} */ (await graph.getFrontier()) + : null; const comp = await compareCoordinatesImpl(graph, { left: /** @type {CoordinateComparisonSelectorV1} */ (/** @type {unknown} */ (normalizedSource)), right: /** @type {CoordinateComparisonSelectorV1} */ (/** @type {unknown} */ (normalizedTarget)), ...(scope !== null && scope !== undefined ? { scope } : {}), }); - const sourceSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedSource, scope)); - const targetSide = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedTarget, scope)); + const sourceSide = await new ComparisonSideResolver('source', graph, { scope, liveFrontier }).resolve(normalizedSource); + const targetSide = await new ComparisonSideResolver('target', graph, { scope, liveFrontier }).resolve(normalizedTarget); /** Loads node content blob by OID. @type {(nodeId: string, meta: { oid: string }) => Promise} */ const loadNodeContent = async (_nodeId, meta) => await readContentBlobByOid(graph, meta.oid); /** Loads edge content blob by OID. 
@type {(edge: unknown, meta: { oid: string }) => Promise} */ @@ -1063,8 +1091,12 @@ function assertComparisonOptions(options) { async function compareCoordinatesImpl(graph, options) { const { normalizedLeft, normalizedRight, targetId, scope } = extractComparisonInputs(options); - const left = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedLeft, scope)); - const right = /** @type {ResolvedComparisonSide} */ (await resolveComparisonSide(graph, normalizedRight, scope)); + // Capture the live frontier ONCE so both sides see the same snapshot + const liveFrontier = (normalizedLeft.kind === 'live' || normalizedRight.kind === 'live') + ? /** @type {Map} */ (await graph.getFrontier()) + : null; + const left = await new ComparisonSideResolver('left', graph, { scope, liveFrontier }).resolve(normalizedLeft); + const right = await new ComparisonSideResolver('right', graph, { scope, liveFrontier }).resolve(normalizedRight); const visiblePatchDivergence = buildPatchDivergenceImpl(left.patchEntries, right.patchEntries, targetId); const visibleState = compareVisibleStateV5(left.state, right.state, { targetId }); diff --git a/src/domain/services/ForkController.js b/src/domain/services/ForkController.js index b4b30ddf..a3bb0256 100644 --- a/src/domain/services/ForkController.js +++ b/src/domain/services/ForkController.js @@ -129,20 +129,32 @@ export default class ForkController { // Dynamic import to avoid circular dependency const { default: WarpRuntime } = await import('../WarpRuntime.js'); - const forkGraph = await WarpRuntime.open({ - persistence: host._persistence, - graphName: resolvedForkName, - writerId: resolvedForkWriterId, - gcPolicy: host._gcPolicy, - adjacencyCacheSize: host._adjacencyCache?.maxSize ?? DEFAULT_ADJACENCY_CACHE_SIZE, - ...(host._checkpointPolicy ? { checkpointPolicy: host._checkpointPolicy } : {}), - autoMaterialize: host._autoMaterialize, - onDeleteWithData: host._onDeleteWithData, - ...(host._logger ? 
{ logger: host._logger } : {}), - clock: host._clock, - crypto: host._crypto, - codec: host._codec, - }); + /** @type {import('../WarpRuntime.js').default} */ + let forkGraph; + try { + forkGraph = await WarpRuntime.open({ + persistence: host._persistence, + graphName: resolvedForkName, + writerId: resolvedForkWriterId, + gcPolicy: host._gcPolicy, + adjacencyCacheSize: host._adjacencyCache?.maxSize ?? DEFAULT_ADJACENCY_CACHE_SIZE, + ...(host._checkpointPolicy ? { checkpointPolicy: host._checkpointPolicy } : {}), + autoMaterialize: host._autoMaterialize, + onDeleteWithData: host._onDeleteWithData, + ...(host._logger ? { logger: host._logger } : {}), + clock: host._clock, + crypto: host._crypto, + codec: host._codec, + }); + } catch (openErr) { + // Rollback: delete the ref we just created to avoid a dangling fork + try { + await host._persistence.deleteRef(forkWriterRef); + } catch { + // Best-effort rollback: swallow the cleanup error so the original open failure propagates + } + throw openErr; + } host._logTiming('fork', t0, { metrics: `from=${from} at=${at.slice(0, 7)} name=${resolvedForkName}`, @@ -203,12 +215,16 @@ export default class ForkController { /** @type {string | null} */ let cur = descendantSha; - const MAX_WALK = 100_000; - let steps = 0; + /** @type {Set} */ + const visited = new Set(); while (cur !== null) { - if (++steps > MAX_WALK) { - throw new Error(`_isAncestor: exceeded ${MAX_WALK} steps — possible cycle`); + if (visited.has(cur)) { + throw new ForkError('Cycle detected in commit graph', { + code: 'E_FORK_CYCLE_DETECTED', + context: { sha: cur }, + }); } + visited.add(cur); const nodeInfo = await this._host._persistence.getNodeInfo(cur); const parent = nodeInfo.parents?.[0] ?? 
null; if (parent === ancestorSha) { @@ -260,16 +276,16 @@ export default class ForkController { const relation = await this._relationToCheckpointHead(ckHead, incomingSha); if (relation === 'same' || relation === 'behind') { - throw new Error( - `Backfill rejected for writer ${writerId}: ` + - `incoming patch is ${relation} checkpoint frontier` + throw new ForkError( + `Backfill rejected for writer ${writerId}: incoming patch is ${relation} checkpoint frontier`, + { code: 'E_FORK_BACKFILL_REJECTED', context: { writerId, incomingSha, relation, ckHead } }, ); } if (relation === 'diverged') { - throw new Error( - `Writer fork detected for ${writerId}: ` + - `incoming patch does not extend checkpoint head` + throw new ForkError( + `Writer fork detected for ${writerId}: incoming patch does not extend checkpoint head`, + { code: 'E_FORK_WRITER_DIVERGED', context: { writerId, incomingSha, ckHead } }, ); } } diff --git a/src/domain/services/JoinReducer.js b/src/domain/services/JoinReducer.js index e1d24155..8cc4592b 100644 --- a/src/domain/services/JoinReducer.js +++ b/src/domain/services/JoinReducer.js @@ -9,8 +9,8 @@ * } */ -import { orsetAdd, orsetRemove, orsetJoin, orsetContains } from '../crdt/ORSet.js'; -import { vvMerge, vvDeserialize } from '../crdt/VersionVector.js'; +import { orsetAdd, orsetRemove, orsetJoin, orsetContains, orsetClone } from '../crdt/ORSet.js'; +import { vvMerge, vvClone, vvDeserialize } from '../crdt/VersionVector.js'; import { lwwSet, lwwMax } from '../crdt/LWW.js'; import { createEventId, compareEventIds } from '../utils/EventId.js'; import { createTickReceipt, OP_TYPES } from '../types/TickReceipt.js'; @@ -471,6 +471,9 @@ for (const [type, strategy] of OP_STRATEGIES) { if (typeof strategy.receiptName !== 'string' || strategy.receiptName.length === 0) { throw new Error(`OpStrategy '${type}' missing required property 'receiptName'`); } + if (!OP_TYPES.includes(strategy.receiptName)) { + throw new Error(`OpStrategy '${type}' receiptName 
'${strategy.receiptName}' is not in TickReceipt OP_TYPES`); + } } /** @@ -1088,10 +1091,18 @@ export function reduceV5(patches, initialState, options) { * @returns {WarpStateV5} A new state with identical contents but independent data structures */ export function cloneStateV5(state) { - if (!(state instanceof WarpStateV5)) { - throw new PatchError('cloneStateV5: expected WarpStateV5 instance', { - context: { actual: typeof state }, - }); + if (state instanceof WarpStateV5) { + return state.clone(); } - return state.clone(); + // Structural fallback: normalize plain/deserialized objects into WarpStateV5. + // This handles checkpoint deserialization and test fixtures that construct + // state as plain objects. + const s = /** @type {Record} */ (/** @type {unknown} */ (state)); + return new WarpStateV5({ + nodeAlive: orsetClone(/** @type {import('../crdt/ORSet.js').ORSet} */ (s['nodeAlive'])), + edgeAlive: orsetClone(/** @type {import('../crdt/ORSet.js').ORSet} */ (s['edgeAlive'])), + prop: new Map(/** @type {Map>} */ (s['prop'])), + observedFrontier: vvClone(/** @type {import('../crdt/VersionVector.js').VersionVector} */ (s['observedFrontier'])), + edgeBirthEvent: new Map(/** @type {Map} */ (s['edgeBirthEvent'] ?? [])), + }); } From fa5421c9e547cd890780d37b3e2ef2bcc94b7370 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 19:01:53 -0700 Subject: [PATCH 68/73] refactor(NDNM): selector subclasses, OpOutcome subclasses, ResolvedComparisonSide class MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit ComparisonController: - NormalizedSelector is now a base class with 4 subclasses: LiveSelector, CoordinateSelector, StrandSelector, StrandBaseSelector. Each implements resolve() directly — no dispatch switch, no resolver intermediary. The selector IS the resolver. - ResolvedComparisonSide promoted from typedef to class. - ComparisonSideResolver deleted (selectors resolve themselves). 
- Live frontier captured once and shared via resolve() param. - assertOptionsObject() guards on compareStrand/planStrandTransfer. JoinReducer: - OpOutcomeResult is now a base class with 3 subclasses: OpApplied, OpSuperseded, OpRedundant. OpSuperseded carries the winning EventId as a structured field instead of a formatted string. - cloneStateV5 restored structural fallback for plain objects. - receiptName load-time validation checks OP_TYPES allowlist. BoundaryTransitionRecord: - VerificationResult promoted from typedef to class. - BTR.deserializeBTR returns class instance. ForkController: - _isAncestor uses visited Set for true cycle detection. - Fork ref creation rolls back on WarpRuntime.open failure. - Backfill rejection uses typed ForkError codes. Logged asap/NDNM_comparison-pipeline-class-hierarchy for remaining value object work (LamportCeiling, StrandId, WriterId, etc.). --- ...DNM_comparison-pipeline-class-hierarchy.md | 25 + .../services/BoundaryTransitionRecord.js | 33 +- src/domain/services/ComparisonController.js | 430 +++++++++--------- src/domain/services/JoinReducer.js | 100 +++- 4 files changed, 347 insertions(+), 241 deletions(-) create mode 100644 docs/method/backlog/asap/NDNM_comparison-pipeline-class-hierarchy.md diff --git a/docs/method/backlog/asap/NDNM_comparison-pipeline-class-hierarchy.md b/docs/method/backlog/asap/NDNM_comparison-pipeline-class-hierarchy.md new file mode 100644 index 00000000..0fb1d02e --- /dev/null +++ b/docs/method/backlog/asap/NDNM_comparison-pipeline-class-hierarchy.md @@ -0,0 +1,25 @@ +# Comparison pipeline: proper class hierarchy + +**Effort:** L + +## Problem + +ComparisonController's comparison pipeline uses `unknown` params, +validator functions, and string-switched dispatch that should be +class hierarchies with constructors. 
+ +Partially addressed in this PR: +- NormalizedSelector → LiveSelector, CoordinateSelector, StrandSelector, + StrandBaseSelector subclasses (each implements resolve()) +- OpOutcomeResult → OpApplied, OpSuperseded, OpRedundant subclasses +- ResolvedComparisonSide class +- ComparisonSideResolver eliminated (selectors resolve themselves) + +Still needed: +- LamportCeiling value object (validates non-negative int in constructor) +- StrandId value object (validates non-empty string in constructor) +- WriterId value object (same pattern) +- Remove all `normalizeX(unknown)` validator functions — these become + constructors +- Remove all `assertX(unknown)` guard functions — same +- Replace `Record` options bags with typed classes diff --git a/src/domain/services/BoundaryTransitionRecord.js b/src/domain/services/BoundaryTransitionRecord.js index a16fad61..fdfafcd5 100644 --- a/src/domain/services/BoundaryTransitionRecord.js +++ b/src/domain/services/BoundaryTransitionRecord.js @@ -168,10 +168,27 @@ export class BTR { } /** - * @typedef {Object} VerificationResult - * @property {boolean} valid - Whether the BTR is valid - * @property {string} [reason] - Reason for failure (if invalid) + * VerificationResult — outcome of BTR HMAC/replay verification. */ +export class VerificationResult { + /** @type {boolean} */ + valid; + + /** @type {string|undefined} Reason for failure (if invalid) */ + reason; + + /** + * Creates a VerificationResult. + * @param {boolean} valid + * @param {string} [reason] + */ + constructor(valid, reason) { + this.valid = valid; + if (reason !== undefined) { + this.reason = reason; + } + } +} /** * Creates a Boundary Transition Record from an initial state and payload. 
@@ -376,7 +393,7 @@ async function verifyReplayHash(btr, deps = {}) { export async function verifyBTR(btr, key, options = {}) { const structureError = validateBTRStructure(btr); if (structureError !== null) { - return { valid: false, reason: structureError }; + return new VerificationResult(false, structureError); } const hmacDeps = /** @type {{ crypto: import('../../ports/CryptoPort.js').default, codec?: import('../../ports/CodecPort.js').default }} */ (buildDeps({ crypto: options.crypto, codec: options.codec })); @@ -400,10 +417,10 @@ async function verifyReplayIfRequested(btr, options) { const replayDeps = buildDeps({ crypto: options.crypto, codec: options.codec }); const replayError = await verifyReplayHash(btr, replayDeps); if (replayError !== null) { - return { valid: false, reason: replayError }; + return new VerificationResult(false, replayError); } } - return { valid: true }; + return new VerificationResult(true); } /** @@ -420,12 +437,12 @@ async function verifyHmacSafe(btr, key, deps) { hmacValid = await verifyHmac(btr, key, deps); } catch (err) { if (err instanceof RangeError) { - return { valid: false, reason: `Invalid hex in authentication tag: ${err.message}` }; + return new VerificationResult(false, `Invalid hex in authentication tag: ${err.message}`); } throw err; } if (!hmacValid) { - return { valid: false, reason: 'Authentication tag mismatch' }; + return new VerificationResult(false, 'Authentication tag mismatch'); } return null; } diff --git a/src/domain/services/ComparisonController.js b/src/domain/services/ComparisonController.js index 8dd2d9a6..a5d43f21 100644 --- a/src/domain/services/ComparisonController.js +++ b/src/domain/services/ComparisonController.js @@ -40,24 +40,190 @@ const COORDINATE_TRANSFER_PLAN_VERSION = 'coordinate-transfer-plan/v1'; */ /** - * Internal normalized selector shape after validation. 
- *
- * @typedef {Object} NormalizedSelector
- * @property {string} kind - Selector kind (live, strand, strand_base, coordinate)
- * @property {number|null} [ceiling] - Optional lamport ceiling
- * @property {string} [strandId] - Strand identifier (for strand/strand_base kinds)
- * @property {Record<string, number>} [frontier] - Frontier record (for coordinate kind)
+ * NormalizedSelector — base class for validated comparison selectors.
+ * Each subclass implements `resolve()` with the resolution logic for
+ * its kind, eliminating dispatch switches.
 */
+class NormalizedSelector {
+  /** @type {string} */
+  kind;
+
+  /** @type {number|null} */
+  ceiling;
+
+  /**
+   * Creates a NormalizedSelector.
+   * @param {string} kind
+   * @param {number|null} ceiling
+   */
+  constructor(kind, ceiling) {
+    this.kind = kind;
+    this.ceiling = ceiling;
+  }
+
+  /**
+   * Resolves this selector into a ResolvedComparisonSide.
+   * @param {import('../WarpRuntime.js').default} _graph
+   * @param {VisibleStateScopeV1|null} _scope
+   * @param {Map<string, number>|null} _liveFrontier
+   * @returns {Promise<ResolvedComparisonSide>}
+   */
+  resolve(_graph, _scope, _liveFrontier) {
+    throw new QueryError(`NormalizedSelector.resolve() must be overridden by ${this.kind} subclass`, { code: 'invalid_coordinate' });
+  }
+}
+
+/** Live frontier selector. */
+class LiveSelector extends NormalizedSelector {
+  /** Creates a LiveSelector.
+   * @param {number|null} ceiling
+   */
+  constructor(ceiling) {
+    super('live', ceiling);
+  }
+
+  /** Resolves live frontier to a comparison side. @param {import('../WarpRuntime.js').default} graph @param {VisibleStateScopeV1|null} scope @param {Map<string, number>|null} liveFrontier @returns {Promise<ResolvedComparisonSide>} */
+  async resolve(graph, scope, liveFrontier) {
+    const requestedFrontier = liveFrontier ??
/** @type {Map} */ (await graph.getFrontier()); + const requestedRecord = normalizeFrontierRecord(requestedFrontier, 'live.frontier'); + const state = await graph.materializeCoordinate({ + frontier: frontierRecordToMap(requestedRecord), + ...optionalCeiling(this.ceiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(graph, requestedRecord, this.ceiling); + return await finalizeSide(graph, { + requested: { kind: 'live', ...optionalCeiling(this.ceiling) }, + state, patchEntries, coordinateKind: 'frontier', lamportCeiling: this.ceiling, + }, scope); + } +} + +/** Explicit coordinate (frontier) selector. */ +class CoordinateSelector extends NormalizedSelector { + /** @type {Record} */ + frontier; + + /** Creates a CoordinateSelector. + * @param {Record} frontier + * @param {number|null} ceiling + */ + constructor(frontier, ceiling) { + super('coordinate', ceiling); + this.frontier = frontier; + } + + /** Resolves explicit coordinate frontier to a comparison side. @param {import('../WarpRuntime.js').default} graph @param {VisibleStateScopeV1|null} scope @returns {Promise} */ + async resolve(graph, scope) { + const state = await graph.materializeCoordinate({ + frontier: frontierRecordToMap(this.frontier), + ...optionalCeiling(this.ceiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(graph, this.frontier, this.ceiling); + return await finalizeSide(graph, { + requested: { ...buildCoordinateRequest(this.frontier, this.ceiling), kind: 'coordinate' }, + state, patchEntries, coordinateKind: 'frontier', lamportCeiling: this.ceiling, + }, scope); + } +} + +/** Strand overlay selector. */ +class StrandSelector extends NormalizedSelector { + /** @type {string} */ + strandId; + + /** Creates a StrandSelector. + * @param {string} strandId + * @param {number|null} ceiling + */ + constructor(strandId, ceiling) { + super('strand', ceiling); + this.strandId = strandId; + } + + /** Resolves strand overlay to a comparison side. 
@param {import('../WarpRuntime.js').default} graph @param {VisibleStateScopeV1|null} scope @returns {Promise} */ + async resolve(graph, scope) { + const strands = new StrandService({ graph }); + const descriptor = await strands.getOrThrow(this.strandId); + const state = /** @type {WarpStateV5} */ (await callInternalRuntimeMethod( + graph, 'materializeStrand', this.strandId, + this.ceiling === null ? undefined : { ceiling: this.ceiling }, + )); + const patchEntries = await strands.getPatchEntries( + this.strandId, this.ceiling === null ? undefined : { ceiling: this.ceiling }, + ); + return await finalizeSide(graph, { + requested: { kind: 'strand', strandId: this.strandId, ...optionalCeiling(this.ceiling) }, + state, patchEntries, coordinateKind: 'strand', lamportCeiling: this.ceiling, + strand: buildStrandMetadata(this.strandId, descriptor), + }, scope); + } +} + +/** Strand base observation selector. */ +class StrandBaseSelector extends NormalizedSelector { + /** @type {string} */ + strandId; + + /** Creates a StrandBaseSelector. + * @param {string} strandId + * @param {number|null} ceiling + */ + constructor(strandId, ceiling) { + super('strand_base', ceiling); + this.strandId = strandId; + } + + /** Resolves strand base observation to a comparison side. 
@param {import('../WarpRuntime.js').default} graph @param {VisibleStateScopeV1|null} scope @returns {Promise} */ + async resolve(graph, scope) { + const strands = new StrandService({ graph }); + const descriptor = await strands.getOrThrow(this.strandId); + const effectiveCeiling = combineCeilings(descriptor.baseObservation.lamportCeiling, this.ceiling); + const state = await graph.materializeCoordinate({ + frontier: descriptor.baseObservation.frontier, + ...optionalCeiling(effectiveCeiling), + }); + const patchEntries = await collectPatchEntriesForFrontier(graph, descriptor.baseObservation.frontier, effectiveCeiling); + return await finalizeSide(graph, { + requested: { + kind: 'strand_base', strandId: this.strandId, + frontier: { ...descriptor.baseObservation.frontier }, + baseLamportCeiling: descriptor.baseObservation.lamportCeiling, + ...optionalCeiling(this.ceiling), + }, + state, patchEntries, coordinateKind: 'strand_base', lamportCeiling: effectiveCeiling, + strand: buildStrandMetadata(this.strandId, /** @type {StrandDescriptorV1} */ (descriptor)), + }, scope); + } +} /** - * Resolved comparison side with state, entries, and metadata. - * - * @typedef {Object} ResolvedComparisonSide - * @property {Record} requested - Original requested selector - * @property {WarpStateV5} state - Materialized state - * @property {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} patchEntries - Patch entries - * @property {Record} resolved - Resolved metadata with digests + * ResolvedComparisonSide — materialized state + metadata for one side of a comparison. */ +class ResolvedComparisonSide { + /** @type {Record} Original requested selector */ + requested; + + /** @type {Record} Resolved metadata with digests */ + resolved; + + /** @type {WarpStateV5} Materialized state */ + state; + + /** @type {Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>} */ + patchEntries; + + /** + * Creates a ResolvedComparisonSide. 
+ * @param {{ requested: Record, state: WarpStateV5, patchEntries: Array<{ patch: import('../types/WarpTypesV2.js').PatchV2, sha: string }>, resolved: Record }} fields + */ + constructor({ requested, state, patchEntries, resolved }) { + this.requested = requested; + this.resolved = resolved; + this.state = state; + this.patchEntries = patchEntries; + } +} /** * Deterministically compares two strings. @@ -485,22 +651,19 @@ async function collectPatchEntriesForFrontier(graph, frontierRecord, ceiling) { * @param {string} field - Field name for error context * @returns {Record} */ +/** + * Normalizes a raw selector into a NormalizedSelector. + * @param {Record} selector + * @param {string} field + * @returns {NormalizedSelector} + */ function normalizeSelector(selector, field) { const raw = /** @type {Record} */ (selector); const kind = extractSelectorKind(raw); - /** @type {Record, f: string) => Record>} */ - const handlers = { - live: normalizeLiveSelector, - coordinate: normalizeCoordinateSelector, - }; - const handler = handlers[kind]; - if (handler !== undefined) { - return handler(raw, field); - } - if (kind === 'strand' || kind === 'strand_base') { - return normalizeStrandSelector(raw, kind, field); - } + if (kind === 'live') { return normalizeLiveSelector(raw, field); } + if (kind === 'coordinate') { return normalizeCoordinateSelector(raw, field); } + if (kind === 'strand' || kind === 'strand_base') { return normalizeStrandSelector(raw, kind, field); } throw new QueryError(`${field}.kind is unsupported`, { code: 'invalid_coordinate', context: { field, kind } }); } @@ -522,43 +685,46 @@ function extractSelectorKind(raw) { * @param {string} field - Field name for error context * @returns {Record} */ +/** + * Normalizes a 'live' selector. 
+ * @param {Record} raw + * @param {string} field + * @returns {LiveSelector} + */ function normalizeLiveSelector(raw, field) { const r = /** @type {{ ceiling?: unknown }} */ (raw); - return { kind: 'live', ceiling: normalizeLamportCeiling(r.ceiling, `${field}.ceiling`) }; + return new LiveSelector(normalizeLamportCeiling(r.ceiling, `${field}.ceiling`)); } /** - * Normalizes a 'strand' or 'strand_base' kind selector. - * - * @param {Record} raw - Parsed selector record - * @param {string} kind - The selector kind - * @param {string} field - Field name for error context - * @returns {Record} + * Normalizes a 'strand' or 'strand_base' selector. + * @param {Record} raw + * @param {string} kind + * @param {string} field + * @returns {StrandSelector|StrandBaseSelector} */ function normalizeStrandSelector(raw, kind, field) { const r = /** @type {{ strandId?: unknown, ceiling?: unknown }} */ (raw); - return { - kind, - strandId: normalizeRequiredString(r.strandId, `${field}.strandId`), - ceiling: normalizeLamportCeiling(r.ceiling, `${field}.ceiling`), - }; + const strandId = normalizeRequiredString(r.strandId, `${field}.strandId`); + const ceiling = normalizeLamportCeiling(r.ceiling, `${field}.ceiling`); + return kind === 'strand_base' + ? new StrandBaseSelector(strandId, ceiling) + : new StrandSelector(strandId, ceiling); } /** - * Normalizes a 'coordinate' kind selector. - * - * @param {Record} raw - Parsed selector record - * @param {string} field - Field name for error context - * @returns {Record} + * Normalizes a 'coordinate' selector. 
+ * @param {Record} raw + * @param {string} field + * @returns {CoordinateSelector} */ function normalizeCoordinateSelector(raw, field) { const r = /** @type {{ frontier?: unknown, ceiling?: unknown }} */ (raw); const f = /** @type {Map|Record} */ (r.frontier); - return { - kind: 'coordinate', - frontier: normalizeFrontierRecord(f, `${field}.frontier`), - ceiling: normalizeLamportCeiling(r.ceiling, `${field}.ceiling`), - }; + return new CoordinateSelector( + normalizeFrontierRecord(f, `${field}.frontier`), + normalizeLamportCeiling(r.ceiling, `${field}.ceiling`), + ); } /** @@ -608,9 +774,9 @@ function buildStrandMetadata(strandId, descriptor) { * strand?: Record * }} params * @param {VisibleStateScopeV1|null} scope - * @returns {Promise>} + * @returns {Promise} */ -async function finalizeComparisonSide(graph, params, scope) { +async function finalizeSide(graph, params, scope) { const { requested, state, patchEntries, coordinateKind, lamportCeiling, strand } = params; const scopedState = scopeMaterializedStateV5(state, scope); const scopedPatchEntries = scopePatchEntriesV1(patchEntries, scope); @@ -621,7 +787,7 @@ async function finalizeComparisonSide(graph, params, scope) { const stateHash = await computeStateHashV5(scopedState, { crypto: graph._crypto, codec: graph._codec }); const patchShas = uniqueSortedPatchShas(scopedPatchEntries); - return { + return new ResolvedComparisonSide({ requested, state: scopedState, patchEntries: scopedPatchEntries, @@ -637,161 +803,9 @@ async function finalizeComparisonSide(graph, params, scope) { summary: summarizeVisibleState(reader, scopedPatchEntries.length), ...(strand !== undefined ? { strand } : {}), }, - }; + }); } -/** - * Resolves the 'live' coordinate side. 
- * - * @param {import('../WarpRuntime.js').default} graph - * @param {NormalizedSelector} selector - * @param {VisibleStateScopeV1|null} scope - * @param {{ liveFrontier?: Map|null }} [opts] - Resolution options - * @private - */ - -/** - * Resolves a comparison side for coordinate/strand comparisons. - * - * Holds the graph, scope, pre-captured live frontier, and the side's - * role ('left'/'right'/'source'/'target') as instance state. - * The role tag prevents accidentally swapping sides in downstream calls. - */ -class ComparisonSideResolver { - /** @type {import('../WarpRuntime.js').default} */ - _graph; - - /** @type {VisibleStateScopeV1|null} */ - _scope; - - /** @type {Map|null} */ - _liveFrontier; - - /** @type {string} */ - role; - - /** - * Creates a resolver bound to a graph, scope, and role. - * @param {string} role - 'left', 'right', 'source', or 'target' - * @param {import('../WarpRuntime.js').default} graph - * @param {{ scope: VisibleStateScopeV1|null, liveFrontier?: Map|null }} opts - */ - constructor(role, graph, opts) { - this.role = role; - this._graph = graph; - this._scope = opts.scope; - this._liveFrontier = opts.liveFrontier ?? null; - } - - /** - * Dispatches to the appropriate resolver based on selector kind. - * @param {NormalizedSelector} selector - * @returns {Promise} - */ - async resolve(selector) { - if (selector.kind === 'live') { - return await this._resolveLive(selector); - } - if (selector.kind === 'coordinate') { - return await this._resolveCoordinate(selector); - } - if (selector.kind === 'strand') { - return await this._resolveStrand(selector); - } - return await this._resolveStrandBase(selector); - } - - /** - * Resolves a 'live' side using the captured or fresh frontier. - * @param {NormalizedSelector} selector - * @returns {Promise} - */ - async _resolveLive(selector) { - const ceiling = selector.ceiling ?? null; - const requestedFrontier = this._liveFrontier ?? 
/** @type {Map} */ (await this._graph.getFrontier()); - const requestedRecord = normalizeFrontierRecord(requestedFrontier, 'live.frontier'); - const state = await this._graph.materializeCoordinate({ - frontier: frontierRecordToMap(requestedRecord), - ...optionalCeiling(ceiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(this._graph, requestedRecord, ceiling); - return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { - requested: { kind: 'live', ...optionalCeiling(ceiling) }, - state, patchEntries, coordinateKind: 'frontier', lamportCeiling: ceiling, - }, this._scope)); - } - - /** - * Resolves an explicit 'coordinate' side. - * @param {NormalizedSelector} selector - * @returns {Promise} - */ - async _resolveCoordinate(selector) { - const ceiling = selector.ceiling ?? null; - const frontier = /** @type {Record} */ (selector.frontier ?? {}); - const state = await this._graph.materializeCoordinate({ - frontier: frontierRecordToMap(frontier), - ...optionalCeiling(ceiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(this._graph, frontier, ceiling); - return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { - requested: { ...buildCoordinateRequest(frontier, ceiling), kind: 'coordinate' }, - state, patchEntries, coordinateKind: 'frontier', lamportCeiling: ceiling, - }, this._scope)); - } - - /** - * Resolves a 'strand' coordinate side. - * @param {NormalizedSelector} selector - * @returns {Promise} - */ - async _resolveStrand(selector) { - const ceiling = selector.ceiling ?? null; - const strandId = /** @type {string} */ (selector.strandId ?? ''); - const strands = new StrandService({ graph: this._graph }); - const descriptor = await strands.getOrThrow(strandId); - const state = /** @type {WarpStateV5} */ (await callInternalRuntimeMethod( - this._graph, 'materializeStrand', strandId, - ceiling === null ? 
undefined : { ceiling }, - )); - const patchEntries = await strands.getPatchEntries( - strandId, ceiling === null ? undefined : { ceiling }, - ); - return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { - requested: { kind: 'strand', strandId, ...optionalCeiling(ceiling) }, - state, patchEntries, coordinateKind: 'strand', lamportCeiling: ceiling, - strand: buildStrandMetadata(strandId, descriptor), - }, this._scope)); - } - - /** - * Resolves a 'strand_base' coordinate side. - * @param {NormalizedSelector} selector - * @returns {Promise} - */ - async _resolveStrandBase(selector) { - const ceiling = selector.ceiling ?? null; - const strandId = /** @type {string} */ (selector.strandId ?? ''); - const strands = new StrandService({ graph: this._graph }); - const descriptor = await strands.getOrThrow(strandId); - const effectiveCeiling = combineCeilings(descriptor.baseObservation.lamportCeiling, ceiling); - const state = await this._graph.materializeCoordinate({ - frontier: descriptor.baseObservation.frontier, - ...optionalCeiling(effectiveCeiling), - }); - const patchEntries = await collectPatchEntriesForFrontier(this._graph, descriptor.baseObservation.frontier, effectiveCeiling); - return /** @type {ResolvedComparisonSide} */ (await finalizeComparisonSide(this._graph, { - requested: { - kind: 'strand_base', strandId, - frontier: { ...descriptor.baseObservation.frontier }, - baseLamportCeiling: descriptor.baseObservation.lamportCeiling, - ...optionalCeiling(ceiling), - }, - state, patchEntries, coordinateKind: 'strand_base', lamportCeiling: effectiveCeiling, - strand: buildStrandMetadata(strandId, /** @type {StrandDescriptorV1} */ (descriptor)), - }, this._scope)); - } -} /** * Checks whether a value is a strand-shaped object with kind 'strand'. 
@@ -1029,8 +1043,8 @@ async function planCoordinateTransferImpl(graph, options) { right: /** @type {CoordinateComparisonSelectorV1} */ (/** @type {unknown} */ (normalizedTarget)), ...(scope !== null && scope !== undefined ? { scope } : {}), }); - const sourceSide = await new ComparisonSideResolver('source', graph, { scope, liveFrontier }).resolve(normalizedSource); - const targetSide = await new ComparisonSideResolver('target', graph, { scope, liveFrontier }).resolve(normalizedTarget); + const sourceSide = await normalizedSource.resolve(graph, scope, liveFrontier); + const targetSide = await normalizedTarget.resolve(graph, scope, liveFrontier); /** Loads node content blob by OID. @type {(nodeId: string, meta: { oid: string }) => Promise} */ const loadNodeContent = async (_nodeId, meta) => await readContentBlobByOid(graph, meta.oid); /** Loads edge content blob by OID. @type {(edge: unknown, meta: { oid: string }) => Promise} */ @@ -1095,8 +1109,8 @@ async function compareCoordinatesImpl(graph, options) { const liveFrontier = (normalizedLeft.kind === 'live' || normalizedRight.kind === 'live') ? 
/** @type {Map} */ (await graph.getFrontier()) : null; - const left = await new ComparisonSideResolver('left', graph, { scope, liveFrontier }).resolve(normalizedLeft); - const right = await new ComparisonSideResolver('right', graph, { scope, liveFrontier }).resolve(normalizedRight); + const left = await normalizedLeft.resolve(graph, scope, liveFrontier); + const right = await normalizedRight.resolve(graph, scope, liveFrontier); const visiblePatchDivergence = buildPatchDivergenceImpl(left.patchEntries, right.patchEntries, targetId); const visibleState = compareVisibleStateV5(left.state, right.state, { targetId }); diff --git a/src/domain/services/JoinReducer.js b/src/domain/services/JoinReducer.js index 8cc4592b..661356e5 100644 --- a/src/domain/services/JoinReducer.js +++ b/src/domain/services/JoinReducer.js @@ -222,11 +222,65 @@ function requireDot(op) { // ============================================================================ /** - * @typedef {Object} OpOutcomeResult - * @property {string} target - The entity ID or key affected - * @property {'applied'|'superseded'|'redundant'} result - Outcome - * @property {string} [reason] - Explanation when superseded + * OpOutcomeResult — base class for CRDT operation outcomes. + * Subclasses carry outcome-specific data instead of fragile reason strings. */ +export class OpOutcomeResult { + /** @type {string} The entity ID or key affected */ + target; + + /** @type {'applied'|'superseded'|'redundant'} */ + result; + + /** + * Creates an OpOutcomeResult. + * @param {string} target + * @param {'applied'|'superseded'|'redundant'} result + */ + constructor(target, result) { + this.target = target; + this.result = result; + } +} + +/** The operation was applied to the state. */ +export class OpApplied extends OpOutcomeResult { + /** Creates an OpApplied. + * @param {string} target + */ + constructor(target) { + super(target, 'applied'); + } +} + +/** The operation was overridden by a concurrent write with a higher EventId. 
*/ +export class OpSuperseded extends OpOutcomeResult { + /** @type {import('../utils/EventId.js').EventId} The winning EventId */ + winner; + + /** @type {string} Human-readable explanation */ + reason; + + /** Creates an OpSuperseded. + * @param {string} target + * @param {import('../utils/EventId.js').EventId} winner + */ + constructor(target, winner) { + super(target, 'superseded'); + this.winner = winner; + this.reason = `LWW: writer ${winner.writerId} at lamport ${winner.lamport} wins`; + } +} + +/** The operation had no effect (already present in state). */ +export class OpRedundant extends OpOutcomeResult { + /** Creates an OpRedundant. + * @param {string} target + */ + constructor(target) { + super(target, 'redundant'); + } +} /** * @typedef {Object} OpStrategy @@ -439,7 +493,7 @@ const blobValueStrategy = { const blobOp = /** @type {{ oid?: string }} */ (op); const blobOid = blobOp.oid; const blobTarget = (typeof blobOid === 'string' && blobOid.length > 0) ? blobOid : '*'; - return { target: blobTarget, result: /** @type {'applied'} */ ('applied') }; + return new OpApplied(blobTarget); }, snapshot() { return {}; }, accumulate() { /* no-op */ }, @@ -512,15 +566,15 @@ const VALID_RECEIPT_OPS = new Set(OP_TYPES); * * @param {import('../crdt/ORSet.js').ORSet} orset - The node OR-Set containing alive nodes * @param {{node: string, dot: import('../crdt/Dot.js').Dot}} op - The NodeAdd operation - * @returns {{target: string, result: 'applied'|'redundant'}} Outcome with node ID as target + * @returns {OpApplied|OpRedundant} Outcome with node ID as target */ function nodeAddOutcome(orset, op) { const encoded = encodeDot(op.dot); const existingDots = orset.entries.get(op.node); if (existingDots && existingDots.has(encoded)) { - return { target: op.node, result: 'redundant' }; + return new OpRedundant(op.node); } - return { target: op.node, result: 'applied' }; + return new OpApplied(op.node); } /** @@ -533,7 +587,7 @@ function nodeAddOutcome(orset, op) { * * @param 
{import('../crdt/ORSet.js').ORSet} orset - The node OR-Set containing alive nodes * @param {{node?: string, observedDots: string[] | Set}} op - The NodeRemove operation - * @returns {{target: string, result: 'applied'|'redundant'}} Outcome with node ID (or '*') as target + * @returns {OpApplied|OpRedundant} Outcome with node ID (or '*') as target */ function nodeRemoveOutcome(orset, op) { // Build a reverse index (dot → elementId) for the observed dots to avoid @@ -564,15 +618,15 @@ function nodeRemoveOutcome(orset, op) { * @param {import('../crdt/ORSet.js').ORSet} orset - The edge OR-Set containing alive edges * @param {{from: string, to: string, label: string, dot: import('../crdt/Dot.js').Dot}} op - The EdgeAdd operation * @param {string} edgeKey - Pre-encoded edge key (from\0to\0label format) - * @returns {{target: string, result: 'applied'|'redundant'}} Outcome with encoded edge key as target + * @returns {OpApplied|OpRedundant} Outcome with encoded edge key as target */ function edgeAddOutcome(orset, op, edgeKey) { const encoded = encodeDot(op.dot); const existingDots = orset.entries.get(edgeKey); if (existingDots && existingDots.has(encoded)) { - return { target: edgeKey, result: 'redundant' }; + return new OpRedundant(edgeKey); } - return { target: edgeKey, result: 'applied' }; + return new OpApplied(edgeKey); } /** @@ -588,7 +642,7 @@ function edgeAddOutcome(orset, op, edgeKey) { * * @param {import('../crdt/ORSet.js').ORSet} orset - The edge OR-Set containing alive edges * @param {{from?: string, to?: string, label?: string, observedDots: string[] | Set}} op - The EdgeRemove operation - * @returns {{target: string, result: 'applied'|'redundant'}} Outcome with encoded edge key (or '*') as target + * @returns {OpApplied|OpRedundant} Outcome with encoded edge key (or '*') as target */ function edgeRemoveOutcome(orset, op) { // Build a reverse index (dot → elementId) for the observed dots to avoid @@ -632,29 +686,25 @@ function edgeRemoveOutcome(orset, op) { * 
@param {Map>} propMap - The properties map keyed by encoded prop keys * @param {string} key - Pre-encoded property key (node or edge) * @param {import('../utils/EventId.js').EventId} eventId - The event ID for this operation, used for LWW comparison - * @returns {{target: string, result: 'applied'|'superseded'|'redundant', reason?: string}} + * @returns {OpOutcomeResult} * Outcome with encoded prop key as target; includes reason when superseded */ function propOutcomeForKey(propMap, key, eventId) { const current = propMap.get(key); if (!current) { - return { target: key, result: 'applied' }; + return new OpApplied(key); } const cmp = compareEventIds(eventId, current.eventId); if (cmp > 0) { - return { target: key, result: 'applied' }; + return new OpApplied(key); } if (cmp < 0) { const winner = current.eventId; - return { - target: key, - result: 'superseded', - reason: `LWW: writer ${winner.writerId} at lamport ${winner.lamport} wins`, - }; + return new OpSuperseded(key, winner); } - return { target: key, result: 'redundant' }; + return new OpRedundant(key); } /** @@ -663,7 +713,7 @@ function propOutcomeForKey(propMap, key, eventId) { * @param {Map>} propMap * @param {{node: string, key: string}} op - The PropSet or NodePropSet operation * @param {import('../utils/EventId.js').EventId} eventId - * @returns {{target: string, result: 'applied'|'superseded'|'redundant', reason?: string}} + * @returns {OpOutcomeResult} */ function propSetOutcome(propMap, op, eventId) { return propOutcomeForKey(propMap, encodePropKey(op.node, op.key), eventId); @@ -675,7 +725,7 @@ function propSetOutcome(propMap, op, eventId) { * @param {Map>} propMap * @param {{from: string, to: string, label: string, key: string}} op - The EdgePropSet operation * @param {import('../utils/EventId.js').EventId} eventId - * @returns {{target: string, result: 'applied'|'superseded'|'redundant', reason?: string}} + * @returns {OpOutcomeResult} */ function edgePropSetOutcome(propMap, op, eventId) { return 
propOutcomeForKey(propMap, encodeEdgePropKey(op.from, op.to, op.label, op.key), eventId); @@ -892,7 +942,7 @@ export function applyWithReceipt(state, patch, patchSha) { } /** @type {import('../types/TickReceipt.js').OpOutcome} */ const entry = { op: receiptOp, target: outcome.target, result: /** @type {'applied'|'superseded'|'redundant'} */ (outcome.result) }; - if (typeof outcome.reason === 'string' && outcome.reason.length > 0) { + if (outcome instanceof OpSuperseded && outcome.reason.length > 0) { entry.reason = outcome.reason; } opResults.push(entry); From fe34c3babefefa1526d8a057ac12ec24f27c7c75 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 19:06:25 -0700 Subject: [PATCH 69/73] docs: update CHANGELOG with selector hierarchy, OpOutcome subclasses, ForkController hardening --- CHANGELOG.md | 3 +++ 1 file changed, 3 insertions(+) diff --git a/CHANGELOG.md b/CHANGELOG.md index 88dc460c..a1d3b047 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -35,6 +35,9 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - **WarpApp/WarpCore content methods** — replaced direct function imports from `query.methods.js` with `callInternalRuntimeMethod()` delegation, which correctly resolves dynamically wired prototype methods. - **11 typedef-to-class promotions (NO_DOGS_NO_MASTERS)** — replaced phantom `@typedef {Object}` shapes with real JavaScript classes: `WarpStateV5`, `Dot`, `EventId`, `EffectEmission`, `EffectCoordinate`, `DeliveryObservation`, `TickReceipt`, `PatchDiff`, `LWWRegister`, `BTR`, `TrustState`. Each class has a constructor, validates inputs where applicable, and supports `instanceof`. Factory functions retained for backward compatibility. - **CBOR codec canonical key sorting for class instances** — both `CborCodec` and `defaultCodec` now sort keys for all object types (not just plain objects), using `instanceof` checks to skip built-in CBOR-native types (Uint8Array, Date, Set, Map, RegExp). 
This decouples class field declaration order from wire format, matching Echo's Rust canonical encoder behavior. +- **Comparison pipeline class hierarchy** — `NormalizedSelector` is now a base class with 4 subclasses (`LiveSelector`, `CoordinateSelector`, `StrandSelector`, `StrandBaseSelector`), each implementing `resolve()` directly. Eliminates kind-switch dispatch. `ResolvedComparisonSide` promoted to class. Live frontier captured once for consistency across both sides. +- **OpOutcome subclass hierarchy** — `OpOutcomeResult` base class with `OpApplied`, `OpSuperseded`, `OpRedundant` subclasses. `OpSuperseded` carries the winning `EventId` as a structured field instead of a formatted string. `VerificationResult` promoted to class. +- **ForkController hardening** — fork ref creation rolls back on `WarpRuntime.open` failure; `_isAncestor` uses visited-Set cycle detection instead of false-positive MAX_WALK counter; backfill rejection throws typed `ForkError` with `E_FORK_BACKFILL_REJECTED` and `E_FORK_WRITER_DIVERGED` codes. ### Added From 742c0016f90dd52c50a6677832f867e7500f9aa6 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 19:07:54 -0700 Subject: [PATCH 70/73] docs: add Systems-Style JavaScript manifesto Engineering standard for how JavaScript is written in git-warp. Ten rules: classes not typedefs, validation in constructors, subclasses not switches, instanceof for runtime checks, unknown means the class hasn't been built yet, factory functions are shims, serialization is the codec's problem, structured data not strings, errors are domain classes, modules are the encapsulation boundary. This is infrastructure. The code must be honest at runtime. 
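The visited-Set walk named in the ForkController entry can be sketched as follows. This is an illustrative sketch, not the actual `_isAncestor` implementation: the `parents` map representation and all of the names are assumptions for the sake of a runnable example.

```javascript
// Hypothetical sketch of a visited-Set ancestor walk. A bounded MAX_WALK
// counter has to guess a depth limit and can misreport on deep or cyclic
// histories; tracking visited nodes terminates correctly on any finite graph.
function isAncestor(parents, candidate, head) {
  const visited = new Set();
  const stack = [head];
  while (stack.length > 0) {
    const current = stack.pop();
    if (current === candidate) {
      return true; // found the candidate on a parent path
    }
    if (visited.has(current)) {
      continue; // already explored: cycle or diamond merge
    }
    visited.add(current);
    for (const parent of parents.get(current) ?? []) {
      stack.push(parent);
    }
  }
  return false; // exhausted every reachable parent
}
```

Unlike a walk counter, the visited set needs no tuning and cannot loop forever: each node is expanded at most once, so a cycle simply gets skipped on the second visit.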
--- docs/SYSTEMS_STYLE_JAVASCRIPT.md | 303 +++++++++++++++++++++++++++++++ 1 file changed, 303 insertions(+) create mode 100644 docs/SYSTEMS_STYLE_JAVASCRIPT.md diff --git a/docs/SYSTEMS_STYLE_JAVASCRIPT.md b/docs/SYSTEMS_STYLE_JAVASCRIPT.md new file mode 100644 index 00000000..ce5a334b --- /dev/null +++ b/docs/SYSTEMS_STYLE_JAVASCRIPT.md @@ -0,0 +1,303 @@ +# Systems-Style JavaScript + +How to write JavaScript for infrastructure that lasts. + +This document is the engineering standard for `@git-stunts/git-warp` +and all `git-stunts` / `flyingrobots` repositories. It is not a style +guide — it is a set of structural decisions that determine whether the +code is honest or lying. + +--- + +## The core premise + +JavaScript is a real programming language. It has classes, inheritance, +`instanceof`, constructors, and proper encapsulation via module scope. +It does not need TypeScript's phantom type system to be safe. It needs +discipline. + +**Every domain concept is a class.** If you're writing a `@typedef`, +stop. If you're returning a plain object `{ target, result }`, stop. +If you're writing `normalizeX(unknown)`, stop. You are building a +class. Build the class. + +--- + +## The rules + +### 1. Classes, not typedefs + +A `@typedef {Object}` is a lie. It exists only at type-check time. It +provides no runtime validation, no `instanceof`, no constructor, no +methods. It is a comment pretending to be a contract. 
+ +```javascript +// BAD — phantom type, vanishes at runtime +/** @typedef {Object} Dot + * @property {string} writerId + * @property {number} counter */ +function createDot(writerId, counter) { + return { writerId, counter }; +} + +// GOOD — real class, validates, exists at runtime +class Dot { + constructor(writerId, counter) { + if (typeof writerId !== 'string' || writerId.length === 0) { + throw new Error('writerId must be a non-empty string'); + } + if (!Number.isInteger(counter) || counter <= 0) { + throw new Error('counter must be a positive integer'); + } + this.writerId = writerId; + this.counter = counter; + } +} +``` + +The class IS the validation. The constructor IS the normalizer. The +instance IS the proof that the data is good. + +### 2. Validation lives in constructors + +If you have a function called `normalizeX()`, `assertX()`, or +`validateX()` that takes `unknown` and returns a known type — that +function is a constructor. It validates input, produces a trusted +output, and the caller uses the output downstream. That is what +constructors do. + +```javascript +// BAD — standalone validator, trusted output is a plain object +function normalizeLamportCeiling(value, field) { + if (value === null || value === undefined) { return null; } + if (!Number.isInteger(value) || value < 0) { + throw new Error(`${field} must be non-negative integer`); + } + return value; +} + +// GOOD — value object, validated on construction +class LamportCeiling { + constructor(value) { + if (value !== null && (!Number.isInteger(value) || value < 0)) { + throw new Error('LamportCeiling must be null or non-negative integer'); + } + this.value = value; + } +} +``` + +After `new LamportCeiling(x)` succeeds, you never check the value +again. The constructor did the work. Every consumer trusts the instance. + +### 3. Subclasses, not switches + +If you're writing `if (x.kind === 'live') ... 
else if (x.kind === +'strand') ...`, you have a class hierarchy hiding behind a string +discriminant. The dispatch logic belongs in the class, not in every +consumer. + +```javascript +// BAD — every consumer switches on kind +function resolve(selector) { + if (selector.kind === 'live') { + return resolveLive(selector); + } + if (selector.kind === 'strand') { + return resolveStrand(selector); + } +} + +// GOOD — the selector resolves itself +class LiveSelector extends NormalizedSelector { + async resolve(graph, scope, liveFrontier) { + // resolution logic lives here + } +} + +class StrandSelector extends NormalizedSelector { + async resolve(graph, scope) { + // resolution logic lives here + } +} + +// Consumer just calls: +const result = await selector.resolve(graph, scope, liveFrontier); +``` + +One call. No switch. The subclass knows what it is. If you add a new +kind, you add a new subclass — you don't hunt for every switch +statement in the codebase. + +### 4. `instanceof` is the runtime type check + +JavaScript has `instanceof`. Use it. It works at runtime. It survives +serialization boundaries when you reconstruct from the right class. It +is honest. + +```javascript +// BAD — checking a string tag +if (outcome.result === 'superseded') { + console.log(outcome.reason); +} + +// GOOD — checking the actual type +if (outcome instanceof OpSuperseded) { + console.log(outcome.winner.writerId); +} +``` + +`instanceof` tells you what the object IS. String comparison tells you +what the object CLAIMS to be. + +### 5. `unknown` means you haven't built the class yet + +If a function parameter is typed `unknown`, that function is admitting +it doesn't know what it's working with. At system boundaries (CBOR +decode, network input, user input), `unknown` is honest. Everywhere +else, it is a sign that the class hasn't been written yet. + +The fix is always the same: define the class, construct it at the +boundary, and pass the instance downstream. 
+ +```javascript +// BAD — unknown flows through the whole pipeline +function processRecord(record) { // record is unknown + const type = record['recordType']; // bracket access, no safety + const id = record['recordId']; // more bracket access + // ... +} + +// GOOD — construct at boundary, trust everywhere after +const record = new TrustRecord(zodParsed.data); // validates +processRecord(record); // record.recordType is a real field +``` + +### 6. Factory functions are backward-compat shims + +If a factory function (`createDot`, `createEventId`) exists alongside +a class, the factory is a shim for callers that haven't updated yet. +New code uses the constructor directly. Factories delegate to +constructors — they never contain logic. + +```javascript +// Factory is a one-liner that delegates +function createDot(writerId, counter) { + return new Dot(writerId, counter); +} +``` + +If the factory contains validation, transformation, or branching that +the constructor doesn't, the constructor is incomplete. + +### 7. Serialization is the codec's problem + +CBOR key ordering, JSON canonicalization, and wire format concerns do +not belong in class field declarations. The codec sorts keys at encode +time. Classes declare fields in whatever order makes domain sense. + +```javascript +// BAD — fields ordered alphabetically for CBOR +class Dot { + counter; // alphabetical, not logical + writerId; // alphabetical, not logical +} + +// GOOD — fields ordered by domain meaning +class Dot { + writerId; // who created this operation + counter; // which operation from that writer +} +``` + +The CBOR codec runs `Object.keys(obj).sort()` before encoding. The +class doesn't know or care about serialization order. + +### 8. Structured data, not formatted strings + +If a string contains structured information (who won, at what lamport, +from which writer), that string is a class that got flattened. Extract +the structure. 
+ +```javascript +// BAD — structured data encoded as a string +return { + target: key, + result: 'superseded', + reason: `LWW: writer ${winner.writerId} at lamport ${winner.lamport} wins`, +}; + +// GOOD — structured data as fields +return new OpSuperseded(key, winner); +// winner is an EventId instance — structured, inspectable, programmatic +``` + +Strings are for humans. Fields are for machines. If a machine needs +to read the data, it should be a field. + +### 9. Errors are domain classes + +`new Error('something went wrong')` is a raw error. It has no code, +no context, no machine-readable identity. Domain errors are classes +that extend `WarpError` with a `code` field, a `context` object, and +a `name` that supports `instanceof`. + +```javascript +// BAD +throw new Error('Backfill rejected'); + +// GOOD +throw new ForkError('Backfill rejected', { + code: 'E_FORK_BACKFILL_REJECTED', + context: { writerId, relation, ckHead }, +}); +``` + +Every `new Error()` in domain code is a bug. Every catch site that +parses `err.message` is a bug. Use the class. + +### 10. The module is the encapsulation boundary + +JavaScript doesn't have `private` at the language level (private +fields `#x` exist but have proxy/testing friction). The module is +the encapsulation boundary. If a function or class is not exported, +it is private. If it is exported, it is public. + +Don't fake privacy with naming conventions (`_privateMethod`) when +module scope provides it for free. Export what consumers need. Keep +everything else module-private. 
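Rule 10 can be sketched as follows. The helper name echoes `encodePropKey` from the diff earlier in this series, but its body and the `PropIndex` class are illustrative assumptions; in the real module the class would carry an `export` keyword, shown here as a comment so the snippet stays self-contained.

```javascript
// Module scope is the privacy boundary: encodePropKey is never exported,
// so no consumer outside this file can call it or depend on its format.
function encodePropKey(node, key) {
  return `${node}\u0000${key}`; // NUL separator to avoid key collisions (assumed)
}

// Public surface: `export class PropIndex` in the real module.
class PropIndex {
  constructor() {
    this.entries = new Map();
  }

  set(node, key, value) {
    this.entries.set(encodePropKey(node, key), value);
  }

  get(node, key) {
    return this.entries.get(encodePropKey(node, key));
  }
}
```

No underscore prefix, no convention to police: the encoding scheme can change freely because only this file can see it.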
+ +--- + +## What this eliminates + +When every rule is followed, the following patterns disappear from +the codebase: + +- `@typedef {Object}` for any constructable concept +- `normalizeX(unknown)` standalone validator functions +- `assertX(unknown)` standalone guard functions +- `if (x.kind === 'foo')` dispatch switches (subclasses handle it) +- `Record` as a function parameter type +- `/** @type {X} */ (plainObject)` cast-to-shut-up patterns +- Raw `new Error()` in domain code +- Formatted strings carrying structured data +- Bracket access `obj['field']` on known shapes + +What remains is classes, constructors, `instanceof`, and module scope. +That's JavaScript. That's enough. + +--- + +## This is infrastructure + +`git-warp` is a multi-writer CRDT graph database. It stores data as +Git commits. It runs on Node, Bun, and Deno. It handles cryptographic +verification, distributed replication, and deterministic replay. + +This is not a React component. This is not a REST API handler. This +is infrastructure. The code must be honest, inspectable, and safe at +runtime — not just at type-check time. + +TypeScript's phantom types vanish at runtime. JavaScript classes exist +at runtime. For infrastructure, that difference is everything. From 83fa6dc0c7563bd5ef9be73098b9168dbd0f0d45 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 20:37:52 -0700 Subject: [PATCH 71/73] =?UTF-8?q?docs:=20System-Style=20JavaScript=20?= =?UTF-8?q?=E2=80=94=20the=20real=20manifesto?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Complete rewrite of the engineering standard. Not a style guide — a doctrine for writing JavaScript infrastructure that remains honest under execution, replay, migration, debugging, replication, failure, and time. Rule 0: Runtime Truth Wins. The Hierarchy of Truth. The Object Model. Seven Principles. The Anti-Shape-Soup Doctrine. The Review Checklist. 
Co-authored-by: James Ross --- docs/SYSTEMS_STYLE_JAVASCRIPT.md | 624 +++++++++++++++++++++---------- 1 file changed, 425 insertions(+), 199 deletions(-) diff --git a/docs/SYSTEMS_STYLE_JAVASCRIPT.md b/docs/SYSTEMS_STYLE_JAVASCRIPT.md index ce5a334b..383615f7 100644 --- a/docs/SYSTEMS_STYLE_JAVASCRIPT.md +++ b/docs/SYSTEMS_STYLE_JAVASCRIPT.md @@ -1,303 +1,529 @@ -# Systems-Style JavaScript +# System-Style JavaScript -How to write JavaScript for infrastructure that lasts. +**How to write JavaScript infrastructure that lasts.** -This document is the engineering standard for `@git-stunts/git-warp` -and all `git-stunts` / `flyingrobots` repositories. It is not a style -guide — it is a set of structural decisions that determine whether the -code is honest or lying. +This is the engineering standard for `git-stunts` and all `flyingrobots` repositories. It is not about semicolons, quotes, or formatting trivia. It is a doctrine for writing JavaScript infrastructure code that remains honest under execution, replay, migration, debugging, replication, failure, and time. --- -## The core premise +## Rule 0: Runtime Truth Wins -JavaScript is a real programming language. It has classes, inheritance, -`instanceof`, constructors, and proper encapsulation via module scope. -It does not need TypeScript's phantom type system to be safe. It needs -discipline. +When the program is running, one question matters above all others: -**Every domain concept is a class.** If you're writing a `@typedef`, -stop. If you're returning a plain object `{ target, result }`, stop. -If you're writing `normalizeX(unknown)`, stop. You are building a -class. Build the class. +> What is actually true right now, in memory, under execution? + +If the answer depends on comments, conventions, vanished types, wishful thinking, or editor vibes, the code is lying. + +Trusted domain values must be created through runtime construction, parsing, or validation that establishes their invariants. 
Once established, those invariants must be preserved for as long as the value remains trusted. + +This rule outranks documentation, build steps, editor hints, static overlays, compile-time tooling, team folklore, and "but the linter said it was fine." + +### What This Means in Practice + +Infrastructure cannot afford fake contracts: + +- A type that vanishes at runtime is not an authoritative contract. +- A comment describing a shape is not an authoritative contract. +- A plain object that "should" have valid fields is not an authoritative contract. +- An IDE tooltip is not an authoritative contract. +- A compile step is not an authoritative contract. + +These tools can be useful. None of them outrank the runtime. + +### Why It Matters Here + +Infrastructure code touches persistence, replication, cryptographic verification, conflict resolution, deterministic replay, failure handling, system boundaries, long-lived state, version migration, and auditability. This is not view-layer glue. Mushy assumptions here turn into real bugs with long half-lives. + +--- + +## The Hierarchy of Truth + +When layers disagree, authority flows in this order: + +1. **Runtime domain model** — constructors, invariants, methods, error types +2. **Boundary schemas and parsers** — Zod, CBOR decoders, protocol validators +3. **Tests** — the executable specification +4. **JSDoc and design docs** — human-facing explanations of the runtime model +5. **IDE and static tooling** — editor navigation, refactoring support +6. **TypeScript** — useful dialect, not final authority + +--- + +## Scope + +This standard is optimized for: + +- Infrastructure code with strong invariants +- Long-lived systems with explicit boundaries +- Direct execution workflows portable across hosts +- Browser-capable cores +- JavaScript-first repositories +- Code that must be teachable, legible, and publishable + +It is not a claim that every JavaScript project should follow this exact approach. 
It is a claim that, for this family of repositories, runtime-backed domain modeling beats soft shape trust. + +--- + +## Language Policy + +### JavaScript Is the Default + +JavaScript is chosen deliberately. It is not perfect — parts of it are cursed and deserve open mockery — but it offers a rare combination: + +- Fast to write and change +- Highly portable +- Backed by a flexible object model +- Direct to execute +- Expressive enough for serious infrastructure +- Widely understood + +Many of these projects are built not just to run, but to be read, explained, taught from, and used as reference implementations. JavaScript lowers the barrier to entry for readers in a way few other languages can match. That readability is not a side benefit — it is part of the design. + +Fun matters too. A language that feels pleasant to iterate in yields tighter feedback loops, more experiments, and more finished work. That is sound engineering economics. + +### TypeScript: Allowed, Not Authoritative + +TypeScript is a useful typed dialect that improves editor workflows, refactoring, and external compatibility. What this standard rejects is elevating TypeScript to the role of final authority. + +TypeScript may help with editor navigation, consumer ergonomics, and static checks. It does not replace runtime validation, preserve runtime invariants, or excuse weak domain modeling. + +The true sources of truth remain the runtime domain types, boundary parsing, and tests. TypeScript is allowed. TypeScript is not king. + +Use TypeScript where it helps. Never confuse it with the source of truth. + +### Escape Hatch: Rust via WebAssembly + +When JavaScript is insufficient — tight CPU-bound loops, memory-sensitive systems, unsafe parsing of hostile binary inputs, cryptographic kernels — use Rust. + +Rust provides memory safety without garbage collection, explicit ownership, excellent performance, and strong WebAssembly support. 
It is the recommended companion when the problem outgrows JavaScript. + +**The architecture split:** + +| Layer | Language | Role | +|-------|----------|------| +| Core domain logic | JavaScript | Default. Portable. Browser-ready. | +| Performance-critical kernels | Rust → Wasm | When safety/speed constraints justify it | +| Host adapters | JavaScript | Node, Deno, browser — behind ports | +| Orchestration | JavaScript | Glue between cores and hosts | --- -## The rules +## Architecture -### 1. Classes, not typedefs +### Browser-First Portability -A `@typedef {Object}` is a lie. It exists only at type-check time. It -provides no runtime validation, no `instanceof`, no constructor, no -methods. It is a comment pretending to be a contract. +The browser is the most universal deployment platform and the ultimate portability test. Core logic prefers web-platform-friendly primitives: ```javascript -// BAD — phantom type, vanishes at runtime -/** @typedef {Object} Dot - * @property {string} writerId - * @property {number} counter */ -function createDot(writerId, counter) { - return { writerId, counter }; +// ✅ Portable +const bytes = new TextEncoder().encode(text); +const arr = new Uint8Array(buffer); +const url = new URL(path, base); + +// ❌ Node-only — belongs in adapters +const buf = Buffer.from(text, 'utf8'); +const resolved = require('path').resolve(p); +``` + +### Hexagonal Architecture Is Mandatory + +Core domain logic must never depend directly on Node globals, filesystem APIs, `process`, `Buffer`, or host-specific calls. Those belong behind adapter ports. 
+ +```javascript +// ✅ Core speaks in portable terms +class ReplicaEngine { + constructor(storage, clock, codec) { + // storage, clock, codec are ports — capabilities, not implementations + this._storage = storage; + this._clock = clock; + this._codec = codec; + } + + async applyOp(op) { + const timestamp = this._clock.now(); + const bytes = this._codec.encode(op); + await this._storage.put(op.key, bytes, timestamp); + } } -// GOOD — real class, validates, exists at runtime -class Dot { - constructor(writerId, counter) { - if (typeof writerId !== 'string' || writerId.length === 0) { - throw new Error('writerId must be a non-empty string'); - } - if (!Number.isInteger(counter) || counter <= 0) { - throw new Error('counter must be a positive integer'); - } - this.writerId = writerId; - this.counter = counter; +// ✅ Adapter implements the port for a specific host +class NodeFsStorageAdapter { + async put(key, bytes, timestamp) { + const filePath = path.join(this._root, key); + await fs.writeFile(filePath, bytes); + } +} + +// ✅ Browser adapter implements the same port +class IndexedDbStorageAdapter { + async put(key, bytes, timestamp) { + const tx = this._db.transaction('store', 'readwrite'); + await tx.objectStore('store').put({ key, bytes, timestamp }); } } ``` -The class IS the validation. The constructor IS the normalizer. The -instance IS the proof that the data is good. +**Core rule:** Core logic should not know that Node exists. -### 2. Validation lives in constructors +--- -If you have a function called `normalizeX()`, `assertX()`, or -`validateX()` that takes `unknown` and returns a known type — that -function is a constructor. It validates input, produces a trusted -output, and the caller uses the output downstream. That is what -constructors do. 
+## The Object Model -```javascript -// BAD — standalone validator, trusted output is a plain object -function normalizeLamportCeiling(value, field) { - if (value === null || value === undefined) { return null; } - if (!Number.isInteger(value) || value < 0) { - throw new Error(`${field} must be non-negative integer`); - } - return value; -} +System-style JavaScript organizes code around four categories of runtime-backed objects: -// GOOD — value object, validated on construction -class LamportCeiling { - constructor(value) { - if (value !== null && (!Number.isInteger(value) || value < 0)) { - throw new Error('LamportCeiling must be null or non-negative integer'); +### Value Objects — Meaningful domain values with invariants + +```javascript +class ObjectId { + constructor(hex) { + if (typeof hex !== 'string' || !/^[0-9a-f]{40,64}$/.test(hex)) { + throw new InvalidObjectId(hex); } - this.value = value; + this._hex = hex; + Object.freeze(this); } + + toString() { return this._hex; } + equals(other) { return other instanceof ObjectId && other._hex === this._hex; } } ``` -After `new LamportCeiling(x)` succeeds, you never check the value -again. The constructor did the work. Every consumer trusts the instance. +### Entities — Identity and lifecycle + +```javascript +class Replica { + constructor(id, clock) { + this._id = ReplicaId.from(id); + this._clock = clock; + this._log = []; + } -### 3. Subclasses, not switches + append(op) { + const validated = Op.from(op); // boundary validation + this._log.push(validated); + return this._clock.tick(); + } +} +``` -If you're writing `if (x.kind === 'live') ... else if (x.kind === -'strand') ...`, you have a class hierarchy hiding behind a string -discriminant. The dispatch logic belongs in the class, not in every -consumer. 
+### Results and Outcomes — Runtime-backed domain types, not tagged unions ```javascript -// BAD — every consumer switches on kind -function resolve(selector) { - if (selector.kind === 'live') { - return resolveLive(selector); - } - if (selector.kind === 'strand') { - return resolveStrand(selector); +class OpApplied { + constructor(op, timestamp) { + this.op = op; + this.timestamp = timestamp; + Object.freeze(this); } } -// GOOD — the selector resolves itself -class LiveSelector extends NormalizedSelector { - async resolve(graph, scope, liveFrontier) { - // resolution logic lives here +class OpSuperseded { + constructor(op, winner) { + this.op = op; + this.winner = winner; + Object.freeze(this); } } -class StrandSelector extends NormalizedSelector { - async resolve(graph, scope) { - // resolution logic lives here +// Runtime dispatch — not tag switching +if (outcome instanceof OpSuperseded) { + return outcome.winner; +} +``` + +### Errors — Domain failures are first-class objects + +```javascript +class InvalidObjectId extends DomainError { + constructor(value) { + super(`Invalid commit hash: ${typeof value === 'string' ? value.slice(0, 16) + '…' : typeof value}`); + this.name = 'InvalidObjectId'; + this.value = value; } } -// Consumer just calls: -const result = await selector.resolve(graph, scope, liveFrontier); +// ✅ Branch on type +if (err instanceof InvalidObjectId) { /* ... */ } + +// ❌ Never parse messages +if (err.message.includes('invalid')) { /* raccoon-in-a-dumpster energy */ } ``` -One call. No switch. The subclass knows what it is. If you add a new -kind, you add a new subclass — you don't hunt for every switch -statement in the codebase. +--- -### 4. `instanceof` is the runtime type check +## Principles -JavaScript has `instanceof`. Use it. It works at runtime. It survives -serialization boundaries when you reconstruct from the right class. It -is honest. +These are the load-bearing architectural commitments. 
Violating any of these is a design-level issue. + +### P1: Domain Concepts Require Runtime-Backed Forms + +If a concept has invariants, identity, or behavior, it must have a runtime-backed representation — usually a class. A typedef or plain object is insufficient. ```javascript -// BAD — checking a string tag -if (outcome.result === 'superseded') { - console.log(outcome.reason); +// ❌ Shape trust — nothing enforces this at runtime +/** @typedef {{ writerId: string, lamport: number }} EventId */ + +// ✅ Runtime-backed — invariants established on construction +class EventId { + constructor(writerId, lamport) { + this._writerId = WriterId.from(writerId); + this._lamport = Lamport.from(lamport); + Object.freeze(this); + } } +``` -// GOOD — checking the actual type -if (outcome instanceof OpSuperseded) { - console.log(outcome.winner.writerId); -} +### P2: Validation Happens at Boundaries and Construction Points + +Untrusted input becomes trusted data only through constructors or dedicated parse methods. Constructors establish invariants; they perform no I/O or async work. + +```javascript +// Boundary: raw bytes → validated domain object +const decoded = cborDecode(bytes); +const parsed = EventIdSchema.parse(decoded); // schema rejects malformed input +const eventId = new EventId(parsed.writerId, parsed.lamport); // constructor establishes invariants ``` -`instanceof` tells you what the object IS. String comparison tells you -what the object CLAIMS to be. +### P3: Behavior Belongs on the Type That Owns It + +Avoid switching on `kind`/`type` tags. Put behavior on the owning type. + +```javascript +// ❌ External switch on tags +function describe(outcome) { + switch (outcome.type) { + case 'applied': return `Applied at ${outcome.timestamp}`; + case 'superseded': return `Beaten by ${outcome.winner}`; + } +} + +// ✅ Behavior lives on the type +class OpApplied { + describe() { return `Applied at ${this.timestamp}`; } +} -### 5. 
`unknown` means you haven't built the class yet +class OpSuperseded { + describe() { return `Beaten by ${this.winner}`; } +} +``` -If a function parameter is typed `unknown`, that function is admitting -it doesn't know what it's working with. At system boundaries (CBOR -decode, network input, user input), `unknown` is honest. Everywhere -else, it is a sign that the class hasn't been written yet. +### P4: Schemas Belong at Boundaries, Not in the Core -The fix is always the same: define the class, construct it at the -boundary, and pass the instance downstream. +Use schemas (e.g., Zod) to reject malformed input at the edge. Domain types own behavior and invariants inside the boundary. ```javascript -// BAD — unknown flows through the whole pipeline -function processRecord(record) { // record is unknown - const type = record['recordType']; // bracket access, no safety - const id = record['recordId']; // more bracket access - // ... +// ✅ Edge: schema validates untrusted input +const ReplicaConfigSchema = z.object({ + id: z.string().uuid(), + maxLogSize: z.number().int().positive(), +}); + +// ✅ Core: domain type provides behavior +class ReplicaConfig { + constructor(id, maxLogSize) { + this._id = ReplicaId.from(id); + this._maxLogSize = maxLogSize; + Object.freeze(this); + } + + allowsAppend(currentSize) { + return currentSize < this._maxLogSize; + } } -// GOOD — construct at boundary, trust everywhere after -const record = new TrustRecord(zodParsed.data); // validates -processRecord(record); // record.recordType is a real field +// ✅ Boundary glue +function parseReplicaConfig(raw) { + const data = ReplicaConfigSchema.parse(raw); + return new ReplicaConfig(data.id, data.maxLogSize); +} ``` -### 6. Factory functions are backward-compat shims +### P5: Serialization Is the Codec's Problem -If a factory function (`createDot`, `createEventId`) exists alongside -a class, the factory is a shim for callers that haven't updated yet. -New code uses the constructor directly. 
Factories delegate to -constructors — they never contain logic. +The byte layer (CBOR/JSON/etc.) stays separate from the meaning layer. Domain types do not know how they are encoded. ```javascript -// Factory is a one-liner that delegates -function createDot(writerId, counter) { - return new Dot(writerId, counter); +// ✅ Codec handles the wire format +class EventCodec { + encode(event) { + return cborEncode({ + writerId: event.writerId.toString(), + lamport: event.lamport.value, + payload: event.payload, + }); + } + + decode(bytes) { + const raw = cborDecode(bytes); + return new Event( + WriterId.from(raw.writerId), + Lamport.from(raw.lamport), + raw.payload + ); + } } ``` -If the factory contains validation, transformation, or branching that -the constructor doesn't, the constructor is incomplete. +### P6: Single Source of Truth -### 7. Serialization is the codec's problem +Do not duplicate the same contract across JSDoc, TypeScript, and validators. Define the runtime model first. Everything else derives from or documents it. -CBOR key ordering, JSON canonicalization, and wire format concerns do -not belong in class field declarations. The codec sorts keys at encode -time. Classes declare fields in whatever order makes domain sense. +### P7: Runtime Dispatch Over Tag Switching + +Inside a coherent runtime, `instanceof` is often the correct dispatch mechanism. 
```javascript
// ✅ Direct dispatch
if (outcome instanceof OpSuperseded) {
  return outcome.winner;
}

// ✅ Policy objects instead of option flags
const replayPolicy = ReplayPolicy.speculativeForkAllowed();
const result = await replayer.replaySegment(segment, replayPolicy);
```

**Cross-realm caveat:** `instanceof` breaks across realm boundaries (iframes, web workers, multiple module instances). When values cross realms, use a protocol-based check (e.g., `Symbol.for`-keyed brand, or a static `.is()` method) instead:

```javascript
// The brand lives at module scope: a computed member key cannot reference
// the class's own binding (still in its temporal dead zone while the class
// body is being evaluated), and static fields have not run yet either.
const EVENT_ID_BRAND = Symbol.for('flyingrobots.EventId');

class EventId {
  get [EVENT_ID_BRAND]() { return true; }

  static is(value) {
    return value != null && value[EVENT_ID_BRAND] === true;
  }
}

// Works across realms
if (EventId.is(maybeEventId)) { /* ... */ }
```

---

## Practices

These are concrete coding disciplines. Most are linter-enforceable. Violations should fail CI.

### `any` Is Banished; `unknown` Is Quarantined

`any` is surrender. `unknown` is acceptable only at raw edges and must be eliminated through parsing immediately.

### Trusted Values Must Preserve Integrity

Use `Object.freeze()`, private fields, or defensive copying to protect invariants after construction.

### Error Type Is Primary; Codes Are Optional Metadata

Use specific error classes. Never branch on `err.message`. 
Error codes are fine as boundary metadata for consumers, but internal logic dispatches on type. + +### Parameter Objects Must Add Semantic Value + +Public APIs should not accept anonymous bags of options. If you're reaching for an options object, consider whether it should be a named policy or config type. ```javascript -// BAD — structured data encoded as a string -return { - target: key, - result: 'superseded', - reason: `LWW: writer ${winner.writerId} at lamport ${winner.lamport} wins`, -}; - -// GOOD — structured data as fields -return new OpSuperseded(key, winner); -// winner is an EventId instance — structured, inspectable, programmatic +// ❌ Options sludge +await replayer.replay(segment, { allowFork: true, maxRetries: 3, strict: false }); + +// ✅ Named policy +const policy = ReplayPolicy.speculativeForkAllowed({ maxRetries: 3 }); +await replayer.replaySegment(segment, policy); ``` -Strings are for humans. Fields are for machines. If a machine needs -to read the data, it should be a field. +### Raw Objects May Carry Bytes, Not Meaning -### 9. Errors are domain classes +Plain objects are for decoded payloads or logging. If something has invariants or behavior, it gets a class. -`new Error('something went wrong')` is a raw error. It has no code, -no context, no machine-readable identity. Domain errors are classes -that extend `WarpError` with a `code` field, a `context` object, and -a `name` that supports `instanceof`. +### Magic Numbers and Strings Are Banished + +Give semantic numbers a named constant. Centralize strings used for identifiers, events, or config keys. ```javascript -// BAD -throw new Error('Backfill rejected'); +// ❌ +if (log.length > 8192) { /* ... */ } -// GOOD -throw new ForkError('Backfill rejected', { - code: 'E_FORK_BACKFILL_REJECTED', - context: { writerId, relation, ckHead }, -}); +// ✅ +const MAX_LOG_ENTRIES = 8192; +if (log.length > MAX_LOG_ENTRIES) { /* ... */ } ``` -Every `new Error()` in domain code is a bug. 
Every catch site that -parses `err.message` is a bug. Use the class. +### Boolean Trap Parameters Are Banished -### 10. The module is the encapsulation boundary +Use named parameter objects or separate methods instead of positional booleans. -JavaScript doesn't have `private` at the language level (private -fields `#x` exist but have proxy/testing friction). The module is -the encapsulation boundary. If a function or class is not exported, -it is private. If it is exported, it is public. +```javascript +// ❌ What does `true` mean here? +engine.compact(log, true); + +// ✅ Intention is legible +engine.compact(log, { preserveTombstones: true }); +// or +engine.compactPreservingTombstones(log); +``` + +### Structured Data Stays Structured + +Machines must not be forced to parse prose to recover data. If something is structured, keep it structured. -Don't fake privacy with naming conventions (`_privateMethod`) when -module scope provides it for free. Export what consumers need. Keep -everything else module-private. +### Module Scope Is the First Privacy Boundary + +If it is not exported, it is private. Use module boundaries deliberately. + +### JSDoc Documents the Runtime Model; It Does Not Replace It + +JSDoc exists to explain the actual runtime behavior and contracts. It must never substitute for runtime-backed types, validation, or invariants. --- -## What this eliminates +## Tooling Discipline + +### Lint Is Law -When every rule is followed, the following patterns disappear from -the codebase: +- Lint errors fail CI. +- Suppressions require a documented justification. +- Enforce hardest on: unsafe coercion, floating promises, raw `Error` objects, and host-specific API leakage into core code. 
-- `@typedef {Object}` for any constructable concept -- `normalizeX(unknown)` standalone validator functions -- `assertX(unknown)` standalone guard functions -- `if (x.kind === 'foo')` dispatch switches (subclasses handle it) -- `Record` as a function parameter type -- `/** @type {X} */ (plainObject)` cast-to-shut-up patterns -- Raw `new Error()` in domain code -- Formatted strings carrying structured data -- Bracket access `obj['field']` on known shapes +### TypeScript Rules -What remains is classes, constructors, `instanceof`, and module scope. -That's JavaScript. That's enough. +When TypeScript is used: + +- It remains subordinate to runtime validation. +- It must not be treated as a substitute for domain modeling. +- `any` is banned. `unknown` at raw edges only, eliminated immediately. +- Type-only constructs must not create a false sense of safety that the runtime does not back up. + +--- + +## The Anti-Shape-Soup Doctrine + +Most bad JavaScript infrastructure stems from weak modeling. The discipline is: + +1. **Name** the concept. +2. **Construct** the concept — with validated invariants. +3. **Protect** the invariant — freeze, encapsulate, defend. +4. **Attach** the behavior — on the type that owns it. +5. **Guard** the boundary — schemas at the edge, domain types inside. +6. **Separate** the codec — serialization is not the domain's problem. +7. **Isolate** the host — Node behind adapters, core stays portable. +8. **Document** the runtime — JSDoc explains what actually exists. +9. **Test** the truth — executable specification, not wishful coverage. --- -## This is infrastructure +## Review Checklist -`git-warp` is a multi-writer CRDT graph database. It stores data as -Git commits. It runs on Node, Bun, and Deno. It handles cryptographic -verification, distributed replication, and deterministic replay. +Before merging, ask: -This is not a React component. This is not a REST API handler. This -is infrastructure. 
The code must be honest, inspectable, and safe at -runtime — not just at type-check time. +- Is this a real domain concept? Where is its runtime-backed form? +- Where is `unknown` eliminated? +- Does construction establish trust? +- Does behavior live on the type that owns it? +- Is anyone parsing `err.message` like a raccoon in a dumpster? +- Are there magic numbers or strings? +- Could this logic run in a browser? +- Is tooling fiction being mistaken for architecture? + +--- -TypeScript's phantom types vanish at runtime. JavaScript classes exist -at runtime. For infrastructure, that difference is everything. +This is infrastructure. Code cannot rely on costumes or pretend that comments are contracts. JavaScript is enough — not because it is magical, but because runtime truth beats phantom certainty every time. From 9b35171d302e1670c9bd2be2b6fd480fd8ab5d12 Mon Sep 17 00:00:00 2001 From: James Ross Date: Thu, 2 Apr 2026 20:42:12 -0700 Subject: [PATCH 72/73] =?UTF-8?q?docs:=20System-Style=20JavaScript=20?= =?UTF-8?q?=E2=80=94=20final=20draft?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Refined structure, tightened language, added Hierarchy of Truth, Language Policy (JS default, TS allowed not authoritative, Rust escape hatch), Architecture section (browser-first, hexagonal mandatory), four-category Object Model, cross-realm branding note, Anti-Shape-Soup Doctrine, Review Checklist. Co-authored-by: James Ross --- docs/SYSTEMS_STYLE_JAVASCRIPT.md | 217 ++++++++++--------------------- 1 file changed, 70 insertions(+), 147 deletions(-) diff --git a/docs/SYSTEMS_STYLE_JAVASCRIPT.md b/docs/SYSTEMS_STYLE_JAVASCRIPT.md index 383615f7..5f9a79d3 100644 --- a/docs/SYSTEMS_STYLE_JAVASCRIPT.md +++ b/docs/SYSTEMS_STYLE_JAVASCRIPT.md @@ -2,15 +2,13 @@ **How to write JavaScript infrastructure that lasts.** -This is the engineering standard for `git-stunts` and all `flyingrobots` repositories. 
It is not about semicolons, quotes, or formatting trivia. It is a doctrine for writing JavaScript infrastructure code that remains honest under execution, replay, migration, debugging, replication, failure, and time. +This is the engineering standard for **`git-stunts`** and all **`flyingrobots`** repositories. It is **not** a conventional style guide about semicolons, quotes, or formatting trivia. It is a doctrine for writing JavaScript infrastructure code that remains honest under execution, replay, migration, debugging, replication, failure, and time. ---- - -## Rule 0: Runtime Truth Wins +### Rule 0: Runtime Truth Wins When the program is running, one question matters above all others: -> What is actually true right now, in memory, under execution? +**What is actually true right now, in memory, under execution?** If the answer depends on comments, conventions, vanished types, wishful thinking, or editor vibes, the code is lying. @@ -34,9 +32,7 @@ These tools can be useful. None of them outrank the runtime. Infrastructure code touches persistence, replication, cryptographic verification, conflict resolution, deterministic replay, failure handling, system boundaries, long-lived state, version migration, and auditability. This is not view-layer glue. Mushy assumptions here turn into real bugs with long half-lives. ---- - -## The Hierarchy of Truth +### The Hierarchy of Truth When layers disagree, authority flows in this order: @@ -47,9 +43,7 @@ When layers disagree, authority flows in this order: 5. **IDE and static tooling** — editor navigation, refactoring support 6. **TypeScript** — useful dialect, not final authority ---- - -## Scope +### Scope This standard is optimized for: @@ -60,13 +54,11 @@ This standard is optimized for: - JavaScript-first repositories - Code that must be teachable, legible, and publishable -It is not a claim that every JavaScript project should follow this exact approach. 
It is a claim that, for this family of repositories, runtime-backed domain modeling beats soft shape trust. - ---- +It is not a claim that every JavaScript project should follow this exact approach. It **is** a claim that, for this family of repositories, runtime-backed domain modeling beats soft shape trust. -## Language Policy +### Language Policy -### JavaScript Is the Default +#### JavaScript Is the Default JavaScript is chosen deliberately. It is not perfect — parts of it are cursed and deserve open mockery — but it offers a rare combination: @@ -81,36 +73,34 @@ Many of these projects are built not just to run, but to be read, explained, tau Fun matters too. A language that feels pleasant to iterate in yields tighter feedback loops, more experiments, and more finished work. That is sound engineering economics. -### TypeScript: Allowed, Not Authoritative +#### TypeScript: Allowed, Not Authoritative TypeScript is a useful typed dialect that improves editor workflows, refactoring, and external compatibility. What this standard rejects is elevating TypeScript to the role of final authority. -TypeScript may help with editor navigation, consumer ergonomics, and static checks. It does not replace runtime validation, preserve runtime invariants, or excuse weak domain modeling. +TypeScript may help with editor navigation, consumer ergonomics, and static checks. It does **not** replace runtime validation, preserve runtime invariants, or excuse weak domain modeling. -The true sources of truth remain the runtime domain types, boundary parsing, and tests. TypeScript is allowed. TypeScript is not king. +The true sources of truth remain the runtime domain types, boundary parsing, and tests. **TypeScript is allowed. TypeScript is not king.** Use TypeScript where it helps. Never confuse it with the source of truth. 
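A minimal sketch of the distinction (the `PortNumber` value object and `InvalidPort` error are hypothetical, not from this codebase): a TypeScript cast asserts a shape and then vanishes at compile time, while a validating constructor enforces the same claim under execution.

```javascript
// In TypeScript, `decoded as PortNumber` satisfies the checker and then
// compiles away, guarding nothing at runtime. A constructor that validates
// and freezes makes the invariant real in memory.
class InvalidPort extends Error {
  constructor(value) {
    super(`Invalid port: ${value}`);
    this.name = 'InvalidPort';
    this.value = value;
  }
}

class PortNumber {
  constructor(value) {
    if (!Number.isInteger(value) || value < 1 || value > 65535) {
      throw new InvalidPort(value);
    }
    this.value = value;
    Object.freeze(this); // trusted values preserve integrity after construction
  }
}

const http = new PortNumber(8080); // trusted from here on

let rejected = false;
try {
  new PortNumber(-1); // a cast would have waved this through
} catch (err) {
  rejected = err instanceof InvalidPort;
}
```

The type annotation is still welcome as editor tooling on top of this class; it just never gets to be the thing that makes the value safe.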
-### Escape Hatch: Rust via WebAssembly +#### Escape Hatch: Rust via WebAssembly When JavaScript is insufficient — tight CPU-bound loops, memory-sensitive systems, unsafe parsing of hostile binary inputs, cryptographic kernels — use Rust. Rust provides memory safety without garbage collection, explicit ownership, excellent performance, and strong WebAssembly support. It is the recommended companion when the problem outgrows JavaScript. -**The architecture split:** +**Preferred architecture split:** -| Layer | Language | Role | -|-------|----------|------| -| Core domain logic | JavaScript | Default. Portable. Browser-ready. | -| Performance-critical kernels | Rust → Wasm | When safety/speed constraints justify it | -| Host adapters | JavaScript | Node, Deno, browser — behind ports | -| Orchestration | JavaScript | Glue between cores and hosts | +| Layer | Language | Role | +|--------------------------|-------------------|-------------------------------------------| +| Core domain logic | JavaScript | Default. Portable. Browser-ready. | +| Performance-critical kernels | Rust → Wasm | When safety/speed constraints justify it | +| Host adapters | JavaScript | Node, Deno, browser — behind ports | +| Orchestration | JavaScript | Glue between cores and hosts | ---- +### Architecture -## Architecture - -### Browser-First Portability +#### Browser-First Portability The browser is the most universal deployment platform and the ultimate portability test. Core logic prefers web-platform-friendly primitives: @@ -125,10 +115,12 @@ const buf = Buffer.from(text, 'utf8'); const resolved = require('path').resolve(p); ``` -### Hexagonal Architecture Is Mandatory +#### Hexagonal Architecture Is Mandatory Core domain logic must never depend directly on Node globals, filesystem APIs, `process`, `Buffer`, or host-specific calls. Those belong behind adapter ports. +**Core rule:** Core logic should not know that Node exists. 
Node-only facilities must remain exclusively in adapter implementations. + ```javascript // ✅ Core speaks in portable terms class ReplicaEngine { @@ -163,15 +155,11 @@ class IndexedDbStorageAdapter { } ``` -**Core rule:** Core logic should not know that Node exists. - ---- +### The Object Model -## The Object Model +System-style JavaScript organizes code around four categories of **runtime-backed** objects: -System-style JavaScript organizes code around four categories of runtime-backed objects: - -### Value Objects — Meaningful domain values with invariants +**Value Objects** — Meaningful domain values with invariants ```javascript class ObjectId { @@ -188,7 +176,7 @@ class ObjectId { } ``` -### Entities — Identity and lifecycle +**Entities** — Identity and lifecycle ```javascript class Replica { @@ -206,7 +194,7 @@ class Replica { } ``` -### Results and Outcomes — Runtime-backed domain types, not tagged unions +**Results and Outcomes** — Runtime-backed domain types, not tagged unions ```javascript class OpApplied { @@ -231,12 +219,12 @@ if (outcome instanceof OpSuperseded) { } ``` -### Errors — Domain failures are first-class objects +**Errors** — Domain failures are first-class objects ```javascript class InvalidObjectId extends DomainError { constructor(value) { - super(`Invalid commit hash: ${typeof value === 'string' ? value.slice(0, 16) + '…' : typeof value}`); + super(`Invalid object ID: ${typeof value === 'string' ? value.slice(0, 16) + '…' : typeof value}`); this.name = 'InvalidObjectId'; this.value = value; } @@ -249,14 +237,11 @@ if (err instanceof InvalidObjectId) { /* ... */ } if (err.message.includes('invalid')) { /* raccoon-in-a-dumpster energy */ } ``` ---- - -## Principles +### Principles These are the load-bearing architectural commitments. Violating any of these is a design-level issue. 
-### P1: Domain Concepts Require Runtime-Backed Forms - +**P1: Domain Concepts Require Runtime-Backed Forms** If a concept has invariants, identity, or behavior, it must have a runtime-backed representation — usually a class. A typedef or plain object is insufficient. ```javascript @@ -273,8 +258,7 @@ class EventId { } ``` -### P2: Validation Happens at Boundaries and Construction Points - +**P2: Validation Happens at Boundaries and Construction Points** Untrusted input becomes trusted data only through constructors or dedicated parse methods. Constructors establish invariants; they perform no I/O or async work. ```javascript @@ -284,8 +268,7 @@ const parsed = EventIdSchema.parse(decoded); // schema rejects malformed inp const eventId = new EventId(parsed.writerId, parsed.lamport); // constructor establishes invariants ``` -### P3: Behavior Belongs on the Type That Owns It - +**P3: Behavior Belongs on the Type That Owns It** Avoid switching on `kind`/`type` tags. Put behavior on the owning type. ```javascript @@ -307,8 +290,7 @@ class OpSuperseded { } ``` -### P4: Schemas Belong at Boundaries, Not in the Core - +**P4: Schemas Belong at Boundaries, Not in the Core** Use schemas (e.g., Zod) to reject malformed input at the edge. Domain types own behavior and invariants inside the boundary. ```javascript @@ -338,8 +320,7 @@ function parseReplicaConfig(raw) { } ``` -### P5: Serialization Is the Codec's Problem - +**P5: Serialization Is the Codec's Problem** The byte layer (CBOR/JSON/etc.) stays separate from the meaning layer. Domain types do not know how they are encoded. ```javascript @@ -364,12 +345,10 @@ class EventCodec { } ``` -### P6: Single Source of Truth - +**P6: Single Source of Truth** Do not duplicate the same contract across JSDoc, TypeScript, and validators. Define the runtime model first. Everything else derives from or documents it. 
-### P7: Runtime Dispatch Over Tag Switching - +**P7: Runtime Dispatch Over Tag Switching** Inside a coherent runtime, `instanceof` is often the correct dispatch mechanism. ```javascript @@ -383,44 +362,23 @@ const replayPolicy = ReplayPolicy.speculativeForkAllowed(); const result = await replayer.replaySegment(segment, replayPolicy); ``` -**Cross-realm caveat:** `instanceof` breaks across realm boundaries (iframes, web workers, multiple module instances). When values cross realms, use a protocol-based check (e.g., `Symbol.for`-keyed brand, or a static `.is()` method) instead: +**Cross-realm note:** `instanceof` breaks across realm boundaries (iframes, web workers, multiple module instances). When values cross realms, use branding instead: ```javascript -class EventId { - static _brand = Symbol.for('flyingrobots.EventId'); - - get [EventId._brand]() { return true; } - - static is(value) { - return value != null && value[EventId._brand] === true; - } -} - -// Works across realms -if (EventId.is(maybeEventId)) { /* ... */ } +// When values cross realm boundaries, brand instead of instanceof +static _brand = Symbol.for('flyingrobots.EventId'); +get [EventId._brand]() { return true; } +static is(v) { return v != null && v[EventId._brand] === true; } ``` ---- - -## Practices +### Practices These are concrete coding disciplines. Most are linter-enforceable. Violations should fail CI. -### `any` Is Banished; `unknown` Is Quarantined - -`any` is surrender. `unknown` is acceptable only at raw edges and must be eliminated through parsing immediately. - -### Trusted Values Must Preserve Integrity - -Use `Object.freeze()`, private fields, or defensive copying to protect invariants after construction. - -### Error Type Is Primary; Codes Are Optional Metadata - -Use specific error classes. Never branch on `err.message`. Error codes are fine as boundary metadata for consumers, but internal logic dispatches on type. 
- -### Parameter Objects Must Add Semantic Value - -Public APIs should not accept anonymous bags of options. If you're reaching for an options object, consider whether it should be a named policy or config type. +- **`any` is banished; `unknown` is quarantined** — `any` is surrender. `unknown` is acceptable only at raw edges and must be eliminated through parsing immediately. +- **Trusted values must preserve integrity** — Use `Object.freeze()`, private fields, or defensive copying to protect invariants after construction. +- **Error type is primary; codes are optional metadata** — Use specific error classes. Never branch on `err.message`. Error codes are fine as boundary metadata. +- **Parameter objects must add semantic value** — Public APIs should not accept anonymous bags of options. ```javascript // ❌ Options sludge @@ -431,26 +389,9 @@ const policy = ReplayPolicy.speculativeForkAllowed({ maxRetries: 3 }); await replayer.replaySegment(segment, policy); ``` -### Raw Objects May Carry Bytes, Not Meaning - -Plain objects are for decoded payloads or logging. If something has invariants or behavior, it gets a class. - -### Magic Numbers and Strings Are Banished - -Give semantic numbers a named constant. Centralize strings used for identifiers, events, or config keys. - -```javascript -// ❌ -if (log.length > 8192) { /* ... */ } - -// ✅ -const MAX_LOG_ENTRIES = 8192; -if (log.length > MAX_LOG_ENTRIES) { /* ... */ } -``` - -### Boolean Trap Parameters Are Banished - -Use named parameter objects or separate methods instead of positional booleans. +- **Raw objects may carry bytes, not meaning** — Plain objects are for decoded payloads or logging only. +- **Magic numbers and strings are banished** — Give semantic numbers a named constant. Centralize strings used for identifiers, events, or config keys. +- **Boolean trap parameters are banished** — Use named parameter objects or separate methods. ```javascript // ❌ What does `true` mean here? 
@@ -462,56 +403,40 @@ engine.compact(log, { preserveTombstones: true }); engine.compactPreservingTombstones(log); ``` -### Structured Data Stays Structured - -Machines must not be forced to parse prose to recover data. If something is structured, keep it structured. - -### Module Scope Is the First Privacy Boundary - -If it is not exported, it is private. Use module boundaries deliberately. - -### JSDoc Documents the Runtime Model; It Does Not Replace It +- **Structured data stays structured** — Machines must not be forced to parse prose to recover data. +- **Module scope is the first privacy boundary** — If it is not exported, it is private. +- **JSDoc documents the runtime model; it does not replace it** — JSDoc explains actual runtime behavior and contracts. It must never substitute for runtime-backed types or validation. -JSDoc exists to explain the actual runtime behavior and contracts. It must never substitute for runtime-backed types, validation, or invariants. +### Tooling Discipline ---- - -## Tooling Discipline - -### Lint Is Law +**Lint is law.** - Lint errors fail CI. - Suppressions require a documented justification. - Enforce hardest on: unsafe coercion, floating promises, raw `Error` objects, and host-specific API leakage into core code. -### TypeScript Rules - -When TypeScript is used: +**When TypeScript is used:** - It remains subordinate to runtime validation. - It must not be treated as a substitute for domain modeling. - `any` is banned. `unknown` at raw edges only, eliminated immediately. - Type-only constructs must not create a false sense of safety that the runtime does not back up. ---- - -## The Anti-Shape-Soup Doctrine +### The Anti-Shape-Soup Doctrine Most bad JavaScript infrastructure stems from weak modeling. The discipline is: -1. **Name** the concept. -2. **Construct** the concept — with validated invariants. -3. **Protect** the invariant — freeze, encapsulate, defend. -4. **Attach** the behavior — on the type that owns it. -5. 
**Guard** the boundary — schemas at the edge, domain types inside. -6. **Separate** the codec — serialization is not the domain's problem. -7. **Isolate** the host — Node behind adapters, core stays portable. -8. **Document** the runtime — JSDoc explains what actually exists. -9. **Test** the truth — executable specification, not wishful coverage. - ---- +1. Name the concept. +2. Construct the concept — with validated invariants. +3. Protect the invariant — freeze, encapsulate, defend. +4. Attach the behavior — on the type that owns it. +5. Guard the boundary — schemas at the edge, domain types inside. +6. Separate the codec — serialization is not the domain's problem. +7. Isolate the host — Node behind adapters, core stays portable. +8. Document the runtime — JSDoc explains what actually exists. +9. Test the truth — executable specification, not wishful coverage. -## Review Checklist +### Review Checklist Before merging, ask: @@ -524,6 +449,4 @@ Before merging, ask: - Could this logic run in a browser? - Is tooling fiction being mistaken for architecture? ---- - -This is infrastructure. Code cannot rely on costumes or pretend that comments are contracts. JavaScript is enough — not because it is magical, but because runtime truth beats phantom certainty every time. +**This is infrastructure.** Code cannot rely on costumes or pretend that comments are contracts. JavaScript is enough — not because it is magical, but because runtime truth beats phantom certainty every time. 
From dd63b5ba6574426b94c3572a7b8e8a1c16dfe9e7 Mon Sep 17 00:00:00 2001
From: James Ross
Date: Thu, 2 Apr 2026 20:47:50 -0700
Subject: [PATCH 73/73] fix: repair CONTRIBUTING.md link path and manifesto
 code sample syntax
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

- CONTRIBUTING.md: ../../METHOD.md → ../METHOD.md (one level up from .github/, not two)
- SYSTEMS_STYLE_JAVASCRIPT.md: hoisted the brand symbol to a module-level const and wrapped the cross-realm branding snippet in a class body — the class name is still in its temporal dead zone while computed member keys are evaluated, so `get [EventId._brand]()` would throw at class definition time
---
 .github/CONTRIBUTING.md          | 2 +-
 docs/SYSTEMS_STYLE_JAVASCRIPT.md | 9 +++++----
 2 files changed, 6 insertions(+), 5 deletions(-)

diff --git a/.github/CONTRIBUTING.md b/.github/CONTRIBUTING.md
index fa11943d..8fe8677f 100644
--- a/.github/CONTRIBUTING.md
+++ b/.github/CONTRIBUTING.md
@@ -8,7 +8,7 @@
 - `docs/method/retro//` — closed cycle retrospectives
 
 No milestones. No ROADMAP. Cycles are the unit of work.
-See [METHOD.md](../../METHOD.md) for the full process.
+See [METHOD.md](../METHOD.md) for the full process.
 
 ## Cycle Process
 
diff --git a/docs/SYSTEMS_STYLE_JAVASCRIPT.md b/docs/SYSTEMS_STYLE_JAVASCRIPT.md
index 5f9a79d3..0fa01546 100644
--- a/docs/SYSTEMS_STYLE_JAVASCRIPT.md
+++ b/docs/SYSTEMS_STYLE_JAVASCRIPT.md
@@ -365,10 +365,11 @@ const result = await replayer.replaySegment(segment, replayPolicy);
 **Cross-realm note:** `instanceof` breaks across realm boundaries (iframes, web workers, multiple module instances). When values cross realms, use branding instead:
 
 ```javascript
-// When values cross realm boundaries, brand instead of instanceof
-static _brand = Symbol.for('flyingrobots.EventId');
-get [EventId._brand]() { return true; }
-static is(v) { return v != null && v[EventId._brand] === true; }
+const EVENT_ID_BRAND = Symbol.for('flyingrobots.EventId');
+class EventId {
+  get [EVENT_ID_BRAND]() { return true; }
+  static is(v) { return v != null && v[EVENT_ID_BRAND] === true; }
+}
 ```