diff --git a/README.md b/README.md
index f2557bc..c1d6ff4 100644
--- a/README.md
+++ b/README.md
@@ -1,329 +1,138 @@
# @versatly/workgraph
-Agent-first workgraph workspace for multi-agent collaboration.
-
-`@versatly/workgraph` is the standalone coordination core for multi-agent execution. It focuses only on:
-
-- Dynamic primitive registry (`thread`, `space`, `decision`, `lesson`, `fact`, `agent`, plus custom types)
-- Append-only event ledger (`.workgraph/ledger.jsonl`)
-- Ledger claim index (`.workgraph/ledger-index.json`) for fast ownership queries
-- Tamper-evident ledger hash-chain (`.workgraph/ledger-chain.json`)
-- Markdown-native primitive store
-- Thread lifecycle coordination (claim/release/block/unblock/done/decompose)
-- Space-scoped thread scheduling (`--space`)
-- Generated markdown command center (`workgraph command-center`)
-- Native skill primitive lifecycle (`workgraph skill write/load/propose/promote`)
-- Primitive-registry manifest + auto-generated `.base` files
-- Orientation loop commands (`workgraph status/brief/checkpoint/intake`)
-- Deterministic context lenses (`workgraph lens list/show`) for real-time situational awareness
-- Multi-filter primitive query (`workgraph query ...`)
-- Core + QMD-compatible keyword search (`workgraph search ...`)
-- Obsidian Kanban board generation/sync (`workgraph board generate|sync`)
-- Wiki-link graph intelligence (`workgraph graph index|hygiene|neighborhood|impact|context|edges|export`)
-- Policy party registry and sensitive transition gates
-- Programmatic dispatch contract (`workgraph dispatch ...`) with explicit status transitions, lease heartbeats, and timeout-aware adapter cancellation
-- Programmable trigger engine with composable conditions, idempotent dispatch bridging, and safety-gated high-impact actions
-- MCP write surface for trigger CRUD/fire, dispatch, autonomy, and mission orchestration
-- JSON-friendly CLI for agent orchestration
-
-No memory-category scaffolding, no qmd dependency, no observational-memory pipeline.
+`@versatly/workgraph` is a focused multi-agent coordination workspace built around four pillars:
-## Install
-
-```bash
-npm install @versatly/workgraph
-```
+1. **context graph**
+2. **thread collaboration**
+3. **MCP exposure**
+4. **actor registration**
-Or global CLI:
+The codebase has been narrowed to favor a smaller, more coherent system over a broad control-plane product surface.
-```bash
-npm install -g @versatly/workgraph
-```
+## What it does
-## Agent-first CLI
+- stores coordination primitives as markdown with frontmatter
+- maintains an append-only ledger for thread and actor activity
+- models collaboration through:
+ - `thread`
+ - `conversation`
+ - `plan-step`
+ - thread-scoped context entries
+- exposes read/write/collaboration operations over MCP
+- supports actor registration, registration requests/reviews, credentials, and presence heartbeats
+- builds context graph views from registry-backed primitives and wiki-link graph analysis
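+
+As an illustration (the field names here are hypothetical, not a schema reference), a stored thread primitive looks like:
+
+```markdown
+---
+type: thread
+status: open
+actor: agent-lead
+---
+
+# Ship collaboration flow
+
+Goal: Implement the retained MCP collaboration surface.
+```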
-```bash
-# Initialize pure workgraph workspace
-workgraph init ./wg-space --json
+## Workspace packages
-# Define custom primitive
-workgraph primitive define command-center \
- --description "Agent ops cockpit" \
- --fields owner:string \
- --fields panel_refs:list \
- --json
-
-# Create and route thread work
-workgraph thread create "Ship command center" \
- --goal "Production-ready multi-agent command center" \
- --priority high \
- --actor agent-lead \
- --json
-
-workgraph thread next --claim --actor agent-worker --json
-workgraph status --json
-workgraph brief --actor agent-worker --json
-workgraph lens list --json
-workgraph lens show my-work --actor agent-worker --json
-workgraph query --type thread --status open --limit 10 --json
-workgraph search "auth" --mode auto --json
-workgraph checkpoint "Completed API layer" --next "implement tests" --actor agent-worker --json
-workgraph board generate --output "ops/Workgraph Board.md" --json
-workgraph graph hygiene --json
-workgraph graph neighborhood ship-feature --depth 2 --json
-workgraph graph impact ship-feature --json
-workgraph graph context ship-feature --budget 2000 --json
-workgraph graph edges ship-feature --json
-workgraph graph export ship-feature --depth 2 --format md --json
-workgraph dispatch create "Review blockers" --actor agent-lead --json
-workgraph dispatch mark run_123 --status succeeded --output "Review complete" --actor agent-lead --json
-workgraph dispatch create-execute "Close all ready threads in platform space" \
- --actor agent-lead \
- --agents agent-a,agent-b,agent-c \
- --space spaces/platform \
- --json
-workgraph trigger fire triggers/escalate-blocked.md --event-key "thread-blocked-001" --actor agent-lead --json
-workgraph onboarding update onboarding/onboarding-for-agent-architect.md --status paused --actor agent-lead --json
-workgraph mcp serve -w /path/to/workspace --actor agent-ops --read-only
-workgraph ledger show --count 20 --json
-workgraph command-center --output "ops/Command Center.md" --json
-workgraph bases generate --refresh-registry --json
-```
+- `packages/kernel` — context graph, thread collaboration, auth, and registration domain logic
+- `packages/cli` — focused CLI over the retained kernel workflows
+- `packages/mcp-server` — stdio + HTTP MCP server
+- `packages/sdk` — curated public exports
-### JSON contract
-
-All commands support `--json` and emit:
-
-- Success: `{ "ok": true, "data": ... }`
-- Failure: `{ "ok": false, "error": "..." }` (non-zero exit)
-
-This is intended for robust parsing by autonomous agents.
-
-### Monorepo layout
-
-The repository is now fully organized as a pnpm workspaces monorepo while preserving
-the published `@versatly/workgraph` package compatibility surface.
-
-Legacy root `src/` compatibility wrappers have been removed. Package-owned modules
-under `packages/*` are the only implementation source of truth.
-
-Key workspace packages:
-
-- `packages/kernel` — domain state machine and coordination core
-- `packages/cli` — command surface over kernel workflows
-- `packages/sdk` — curated public package surface
-- `packages/control-api` — REST, SSE, webhook gateway, and HTTP MCP hosting
-- `packages/runtime-adapter-core` — reusable dispatch contracts and generic transports
-- `packages/adapter-claude-code` — Claude Code-specific execution adapter
-- `packages/adapter-cursor-cloud` — Cursor Cloud-style execution adapter
-- `packages/mcp-server` — stdio + HTTP MCP transport and tool registration
-- `packages/testkit` — contract fixtures and schema validation helpers
-- `packages/search-qmd-adapter` — search compatibility seam
-- `packages/obsidian-integration` — editor-facing projections and exports
-- `packages/skills` — package-level skill distribution surface
-
-Package ownership and layering are documented in `docs/PACKAGE_BOUNDARIES.md`.
-
-Migration notes: see `docs/MIGRATION.md`.
-Live workspace repair runbook: see `docs/INVARIANT_REPAIR_PLAYBOOK.md`.
-Realtime control-api SSE contract: see `docs/SSE_EVENTS.md`.
-Current architecture execution roadmap: see `docs/ARCHITECTURE_ROADMAP.md`.
-
-### Reliability and autonomy hardening
-
-Recent hardening focused on making unattended operation safer rather than just
-adding more commands:
-
-- dispatch runs now maintain leases while executing and propagate timeout/cancel
- intent into adapter execution contracts
-- autonomy cycles now repair dispatch state, reconcile expired leases, recover
- thread claim/reference drift, and run mission orchestration passes as part of
- the same control loop
-- trigger actions can now express composable boolean conditions (`all` / `any`
- / `not`) and route risky `shell` / `update-primitive` actions through safety
- rails
-- MCP now exposes trigger create/update/delete/fire tools in addition to the
- trigger engine cycle surface
-
-### Development workflow (contributors)
+## Install
```bash
-pnpm install
-pnpm run ci
+npm install @versatly/workgraph
```
-The default `pnpm run test` script now uses `scripts/run-tests.mjs`, a hardened
-Vitest wrapper that enforces deterministic process exit in CI (especially on
-Windows where lingering `esbuild` children can keep `vitest run` alive after
-all test files report complete).
-
-- `pnpm run test`: hardened runner (recommended for CI/local reliability)
-- `pnpm run test:vitest`: raw Vitest invocation (useful for debugging Vitest itself)
-
-Optional tuning knobs:
-
-- `WORKGRAPH_TEST_EXIT_GRACE_MS`: grace period after all file results are
- observed before forced process-tree cleanup (default `15000`)
-- `WORKGRAPH_TEST_MAX_RUNTIME_MS`: hard timeout for the full run (default
- `1200000`)
-
-### Demo vault generator
-
-Generate the large Obsidian demo workspace used for stress-testing:
+Global CLI:
```bash
-pnpm run demo:workspace
-pnpm run demo:obsidian-setup
+npm install -g @versatly/workgraph
```
-Runbook: `docs/OBSIDIAN_DEMO.md`.
-
-### Space-scoped scheduling
+## Quick start
+
+From a repository checkout, install dependencies and build:
```bash
-workgraph thread create "Implement auth middleware" \
- --goal "Protect private routes" \
- --space spaces/backend.md \
- --actor agent-api \
- --json
-
-workgraph thread list --space spaces/backend --ready --json
-workgraph thread next --space spaces/backend --claim --actor agent-api --json
+pnpm install
+pnpm run build
```
-### Auto-generate `.base` files from primitive registry
+Initialize a workspace:
```bash
-# Sync .workgraph/primitive-registry.yaml
-workgraph bases sync-registry --json
-
-# Generate canonical primitive .base files
-workgraph bases generate --json
-
-# Include non-canonical (agent-defined) primitives
-workgraph bases generate --all --refresh-registry --json
+workgraph init ./wg-space --json
```
-### Graph intelligence workflows
+Create and work a thread:
```bash
-# Build/refresh graph index first (optional but useful)
-workgraph graph index --json
-
-# Multi-hop neighborhood around a primitive slug/path
-workgraph graph neighborhood ship-feature --depth 2 --json
-
-# Reverse-link blast radius (what references this primitive)
-workgraph graph impact ship-feature --json
-
-# Auto-assemble markdown context bundle within token budget (chars/4)
-workgraph graph context ship-feature --budget 2000 --json
-
-# Inspect typed relationship edges for one primitive
-workgraph graph edges ship-feature --json
+workgraph thread create "Ship collaboration flow" \
+ --goal "Implement the retained MCP collaboration surface" \
+ --actor agent-lead \
+ --json
-# Export a markdown subgraph for handoff/sharing
-workgraph graph export ship-feature --depth 2 --format md --json
+workgraph thread next --claim --actor agent-worker --json
+workgraph thread done threads/ship-collaboration-flow.md \
+ --actor agent-worker \
+ --output "Completed https://github.com/Versatly/workgraph/pull/123" \
+ --json
```
-### Ledger query, blame, and tamper detection
+Explore the context graph:
```bash
-workgraph ledger query --actor agent-worker --op claim --json
-workgraph ledger blame threads/auth.md --json
-workgraph ledger verify --strict --json
+workgraph graph index --json
+workgraph graph neighborhood ship-collaboration-flow --depth 2 --json
+workgraph graph context ship-collaboration-flow --budget 2000 --json
+workgraph query --type thread --status open --json
+workgraph search "registration" --json
```
-### Native skill lifecycle (shared vault / Tailscale)
+Register and review actors:
```bash
-# with shared vault env (e.g. tailscale-mounted path)
-export WORKGRAPH_SHARED_VAULT=/mnt/tailscale/company-workgraph
-
-workgraph skill write "workgraph-manual" \
- --body-file ./skills/workgraph-manual.md \
- --owner agent-architect \
- --actor agent-architect \
+workgraph agent request agent-1 \
+ --role roles/contributor.md \
+ --actor agent-1 \
--json
-workgraph skill propose workgraph-manual --actor agent-reviewer --space spaces/platform --json
-workgraph skill promote workgraph-manual --actor agent-lead --json
-workgraph skill load workgraph-manual --json
-workgraph skill list --updated-since 2026-02-27T00:00:00.000Z --json
-workgraph skill history workgraph-manual --limit 10 --json
-workgraph skill diff workgraph-manual --json
-```
-
-### Optional Clawdapus integration
-
-List supported optional integrations:
+workgraph agent review agent-registration-requests/agent-1-123.md \
+ --decision approved \
+ --actor admin-reviewer \
+ --json
-```bash
-workgraph integration list --json
+workgraph agent heartbeat agent-1 --status online --actor agent-1 --json
```
-Install by integration ID (extensible pattern for future integrations):
+Run MCP:
```bash
-workgraph integration install clawdapus \
- --actor agent-architect \
- --json
-```
+# stdio transport
+workgraph mcp serve -w ./wg-space --actor agent-ops
-Refresh from upstream later (or use the `integration clawdapus` alias):
-
-```bash
-workgraph integration install clawdapus --force --actor agent-architect --json
+# HTTP transport
+workgraph serve -w ./wg-space --actor agent-ops --port 8787
```
-## Legacy memory stacks vs Workgraph primitives
+## JSON contract
-`@versatly/workgraph` is **execution coordination only**.
+All CLI commands support `--json`:
-- Use it for: ownership, decomposition, dependency management, typed coordination primitives.
-- Do not use it for: long-term memory categories (`decisions/`, `people/`, `projects/` memory workflows), qmd semantic retrieval pipelines, observer/reflector memory compression.
+- success: `{ "ok": true, "data": ... }`
+- failure: `{ "ok": false, "error": "..." }` (non-zero exit)
-This split keeps the workgraph package focused, portable, and shell-agent-native.
-
-## Migrating from mixed memory/workgraph vaults
-
-1. Initialize a clean workgraph workspace:
- ```bash
- workgraph init ./coordination-space --json
- ```
-2. Recreate only coordination entities as workgraph primitives (`thread`, `space`, custom types).
-3. Move or archive memory-specific folders outside the coordination workspace.
-4. Generate a control plane note for humans/agents:
- ```bash
- workgraph command-center --output "ops/Command Center.md" --json
- ```
+This contract is intended for reliable automation.
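+
+For example, a successful query might emit (the payload shape below is illustrative):
+
+```json
+{ "ok": true, "data": { "count": 2, "results": [] } }
+```
+
+Scripts should branch on the `ok` field (or the process exit code) before reading `data`.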
## Programmatic API
```ts
-import { registry, thread, store, ledger, workspace } from '@versatly/workgraph';
+import { registry, thread, workspace } from '@versatly/workgraph';
workspace.initWorkspace('/tmp/wg');
-registry.defineType('/tmp/wg', 'milestone', 'Release checkpoint', {
- thread_refs: { type: 'list', default: [] },
- target_date: { type: 'date' },
+registry.defineType('/tmp/wg', 'note', 'Shared context note', {
+ context_refs: { type: 'list', default: [] },
}, 'agent-architect');
-const t = thread.createThread('/tmp/wg', 'Build Auth', 'JWT and refresh flow', 'agent-lead');
+const t = thread.createThread('/tmp/wg', 'Build auth', 'Ship actor registration flow', 'agent-lead');
thread.claim('/tmp/wg', t.path, 'agent-worker');
-thread.done('/tmp/wg', t.path, 'agent-worker', 'Shipped');
```
-## Publish (package-only)
-
-From this directory:
+## Development
```bash
-pnpm run ci
-pnpm publish --access public
+pnpm run typecheck
+pnpm run test
+pnpm run build
```
-## Skill guide
-
-See `SKILL.md` for the full operational playbook optimized for autonomous agents (including pi-mono compatibility guidance).
+`pnpm run test` runs through the hardened `scripts/run-tests.mjs` wrapper, which enforces deterministic Vitest process exit (notably on Windows, where lingering esbuild children can otherwise keep `vitest run` alive).
diff --git a/apps/web-control-plane/README.md b/apps/web-control-plane/README.md
deleted file mode 100644
index 81ad5a1..0000000
--- a/apps/web-control-plane/README.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# WorkGraph Web Control Plane (Planned)
-
-This app is intentionally scaffolded as a placeholder for later phases.
diff --git a/apps/web-control-plane/app.js b/apps/web-control-plane/app.js
deleted file mode 100644
index 5e6475e..0000000
--- a/apps/web-control-plane/app.js
+++ /dev/null
@@ -1,43 +0,0 @@
-async function loadProjection(name) {
- const response = await fetch(`/api/projections/${name}`);
- if (!response.ok) {
- throw new Error(`Failed to load projection ${name}: ${response.status}`);
- }
- const body = await response.json();
- if (!body.ok) {
- throw new Error(body.error || `Projection ${name} returned an error.`);
- }
- return body.projection;
-}
-
-function renderJson(target, value) {
- target.textContent = JSON.stringify(value, null, 2);
-}
-
-function renderSummaryCards(target, summary) {
- target.innerHTML = '';
- for (const [key, value] of Object.entries(summary || {})) {
- const card = document.createElement('div');
- card.className = 'card';
-    card.innerHTML = `${key} ${String(value)}`;
- target.appendChild(card);
- }
-}
-
-async function boot() {
- const root = document.getElementById('projection-root');
- const summaryRoot = document.getElementById('projection-summary');
- const projectionName = document.body.dataset.projection;
- if (!root || !summaryRoot || !projectionName) return;
- try {
- const projection = await loadProjection(projectionName);
- renderSummaryCards(summaryRoot, projection.summary || projection.projections || {});
- renderJson(root, projection);
- } catch (error) {
- root.textContent = error instanceof Error ? error.message : String(error);
- }
-}
-
-window.addEventListener('DOMContentLoaded', () => {
- void boot();
-});
diff --git a/apps/web-control-plane/autonomy-health.html b/apps/web-control-plane/autonomy-health.html
deleted file mode 100644
index 5437381..0000000
--- a/apps/web-control-plane/autonomy-health.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Autonomy Health
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/federation-status.html b/apps/web-control-plane/federation-status.html
deleted file mode 100644
index c160e7b..0000000
--- a/apps/web-control-plane/federation-status.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Federation Status
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/index.html b/apps/web-control-plane/index.html
deleted file mode 100644
index 172f9c5..0000000
--- a/apps/web-control-plane/index.html
+++ /dev/null
@@ -1,26 +0,0 @@
-
-
-
-
-
- WorkGraph Control Plane
-
-
-
-
- WorkGraph Operator Control Plane
- Operator-facing projections for dispatch, transport, federation, triggers, autonomy, and missions.
-
-
-
-
-      Active runs, stale runs, and failed reconciliations.
-      Blocked threads, escalations, and policy violations.
-      Mission completion and milestones.
-      Outbox depth, dead-letter state, and delivery success.
-      Remote workspace compatibility and sync status.
-      Trigger states, cooldowns, and errors.
-      Autonomy daemon status and heartbeat.
-
-
-
diff --git a/apps/web-control-plane/mission-progress.html b/apps/web-control-plane/mission-progress.html
deleted file mode 100644
index b4df69d..0000000
--- a/apps/web-control-plane/mission-progress.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Mission Progress
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/package.json b/apps/web-control-plane/package.json
deleted file mode 100644
index 38b0125..0000000
--- a/apps/web-control-plane/package.json
+++ /dev/null
@@ -1,6 +0,0 @@
-{
- "name": "@versatly/workgraph-web-control-plane",
- "version": "0.1.0",
- "private": true,
- "type": "module"
-}
diff --git a/apps/web-control-plane/risk-dashboard.html b/apps/web-control-plane/risk-dashboard.html
deleted file mode 100644
index 6c4450b..0000000
--- a/apps/web-control-plane/risk-dashboard.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Risk Dashboard
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/run-health.html b/apps/web-control-plane/run-health.html
deleted file mode 100644
index 138a88e..0000000
--- a/apps/web-control-plane/run-health.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Run Health
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/style.css b/apps/web-control-plane/style.css
deleted file mode 100644
index 98b630f..0000000
--- a/apps/web-control-plane/style.css
+++ /dev/null
@@ -1,49 +0,0 @@
-body {
- font-family: Inter, ui-sans-serif, system-ui, -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif;
- margin: 0;
- background: #0b1020;
- color: #eef2ff;
-}
-
-a {
- color: #93c5fd;
-}
-
-header, main {
- max-width: 1200px;
- margin: 0 auto;
- padding: 24px;
-}
-
-.cards {
- display: grid;
- grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
- gap: 16px;
-}
-
-.card {
- background: #16203a;
- border: 1px solid #334155;
- border-radius: 12px;
- padding: 16px;
-}
-
-.nav {
- display: flex;
- flex-wrap: wrap;
- gap: 12px;
- margin-bottom: 24px;
-}
-
-pre {
- background: #020617;
- border: 1px solid #334155;
- border-radius: 12px;
- padding: 16px;
- overflow: auto;
- white-space: pre-wrap;
-}
-
-.muted {
- color: #94a3b8;
-}
diff --git a/apps/web-control-plane/transport-health.html b/apps/web-control-plane/transport-health.html
deleted file mode 100644
index 283d695..0000000
--- a/apps/web-control-plane/transport-health.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Transport Health
-
-
-
-
-
-
-
-
diff --git a/apps/web-control-plane/trigger-health.html b/apps/web-control-plane/trigger-health.html
deleted file mode 100644
index 2b3c92c..0000000
--- a/apps/web-control-plane/trigger-health.html
+++ /dev/null
@@ -1,14 +0,0 @@
-
-
-
-
-
- Trigger Health
-
-
-
-
-
-
-
-
diff --git a/examples/multi-agent-showcase/README.md b/examples/multi-agent-showcase/README.md
deleted file mode 100644
index 8d9eb2f..0000000
--- a/examples/multi-agent-showcase/README.md
+++ /dev/null
@@ -1,89 +0,0 @@
-# OBJ-09: Signature Multi-Agent Showcase
-
-This showcase demonstrates a full WorkGraph collaboration lifecycle with three agents:
-
-- `governance-admin` (governance + approvals)
-- `agent-intake` (triage + routing)
-- `agent-builder` (implementation)
-- `agent-reviewer` (self-assembly + QA closure)
-
-The flow is intentionally end-to-end and reproducible from a fresh workspace. Every WorkGraph CLI invocation uses `--json`.
-
-## What this demonstrates
-
-1. **Agent registration and governance**
- - Bootstrap admin registration
- - Approval-based registration requests for agents
- - Credential issuance and heartbeat publication
-
-2. **Thread lifecycle and plan-step coordination**
- - Multi-thread objective decomposition
- - Conversation and plan-step creation
- - Claim/start/progress/done transitions across multiple actors
-
-3. **Self-assembly**
- - Capability advertisement + requirements matching
- - `assembleAgent()` claims the next suitable thread
- - Existing plan-step is automatically activated for the assembled agent
-
-4. **Trigger -> run -> evidence loop**
- - Trigger creation with a structured `dispatch-run` action
- - Trigger engine cycle executes runs automatically
- - Dispatch run evidence chain is validated from CLI output
- - Ledger hash-chain integrity is verified
-
-## Run it
-
-From repo root:
-
-```bash
-node examples/multi-agent-showcase/run.mjs --json
-```
-
-Optional arguments:
-
-- `--workspace `: use a specific workspace directory
-- `--skip-build`: skip `pnpm run build` (useful in tests)
-- `--json`: emit machine-readable summary only
-
-Unix shells can still use the wrapper:
-
-```bash
-bash examples/multi-agent-showcase/run.sh --json
-```
-
-## Script breakdown
-
-- `scripts/01-governance.mjs`
- - Initializes workspace
- - Registers `governance-admin` with bootstrap token
- - Runs request/review approval flow for all collaborating agents
- - Outputs issued API keys and governance snapshot
-
-- `scripts/02-collaboration.mjs`
- - Creates threads, conversation, and plan-steps
- - Drives intake + builder thread lifecycle transitions
- - Runs self-assembly for reviewer via SDK
- - Completes reviewer plan-step and closes conversation
-
-- `scripts/03-trigger-loop.mjs`
- - Creates active trigger via SDK with `dispatch-run` action
- - Executes trigger engine loop with run execution
- - Validates run status/evidence and ledger integrity
-
-- `scripts/run-showcase.mjs`
- - Orchestrates all phases
- - Collects rollup metrics and boolean capability checks
- - Returns one final JSON report
-
-## Expected outcome
-
-The final JSON output contains:
-
-- `checks.governance`
-- `checks.selfAssemblyClaimedReviewerThread`
-- `checks.planStepCoordinated`
-- `checks.triggerRunEvidence`
-- `checks.ledgerActivity`
-
-When all checks are `true`, the showcase has completed successfully.
diff --git a/examples/multi-agent-showcase/run.mjs b/examples/multi-agent-showcase/run.mjs
deleted file mode 100644
index 2786aeb..0000000
--- a/examples/multi-agent-showcase/run.mjs
+++ /dev/null
@@ -1,13 +0,0 @@
-#!/usr/bin/env node
-
-import path from 'node:path';
-import { fileURLToPath } from 'node:url';
-import { execFileSync } from 'node:child_process';
-
-const scriptDir = path.dirname(fileURLToPath(import.meta.url));
-const scriptPath = path.join(scriptDir, 'scripts', 'run-showcase.mjs');
-
-execFileSync('node', [scriptPath, ...process.argv.slice(2)], {
- stdio: 'inherit',
- env: process.env,
-});
diff --git a/examples/multi-agent-showcase/run.sh b/examples/multi-agent-showcase/run.sh
deleted file mode 100755
index 2dcb187..0000000
--- a/examples/multi-agent-showcase/run.sh
+++ /dev/null
@@ -1,58 +0,0 @@
-#!/usr/bin/env bash
-set -euo pipefail
-
-SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
-REPO_ROOT="$(cd "${SCRIPT_DIR}/../.." && pwd)"
-
-WORKSPACE=""
-SKIP_BUILD=0
-JSON_MODE=0
-
-while [[ $# -gt 0 ]]; do
- case "$1" in
- --workspace|-w)
- if [[ $# -lt 2 ]]; then
- echo "Missing value for $1" >&2
- exit 1
- fi
- WORKSPACE="$2"
- shift 2
- ;;
- --skip-build)
- SKIP_BUILD=1
- shift
- ;;
- --json)
- JSON_MODE=1
- shift
- ;;
- *)
- echo "Unknown argument: $1" >&2
- echo "Usage: run.sh [--workspace ] [--skip-build] [--json]" >&2
- exit 1
- ;;
- esac
-done
-
-if [[ -z "${WORKSPACE}" ]]; then
- WORKSPACE="$(mktemp -d /tmp/workgraph-obj09-showcase-XXXXXX)"
-fi
-
-if [[ "${SKIP_BUILD}" -ne 1 ]]; then
- echo "[obj-09] building repository artifacts..." >&2
- (
- cd "${REPO_ROOT}"
- pnpm run build >/dev/null
- )
-fi
-
-SHOWCASE_ARGS=(--workspace "${WORKSPACE}" --json)
-if [[ "${SKIP_BUILD}" -eq 1 ]]; then
- SHOWCASE_ARGS+=(--skip-build)
-fi
-
-if [[ "${JSON_MODE}" -ne 1 ]]; then
- echo "[obj-09] running showcase in ${WORKSPACE}" >&2
-fi
-
-node "${SCRIPT_DIR}/scripts/run-showcase.mjs" "${SHOWCASE_ARGS[@]}"
diff --git a/examples/multi-agent-showcase/scripts/01-governance.mjs b/examples/multi-agent-showcase/scripts/01-governance.mjs
deleted file mode 100755
index 7215647..0000000
--- a/examples/multi-agent-showcase/scripts/01-governance.mjs
+++ /dev/null
@@ -1,165 +0,0 @@
-#!/usr/bin/env node
-
-import path from 'node:path';
-import {
- ensureBuild,
- logLine,
- resolveRepoRoot,
- resolveWorkspace,
- runCliJson,
-} from './lib/demo-utils.mjs';
-
-const AGENTS = {
- admin: 'governance-admin',
- intake: 'agent-intake',
- builder: 'agent-builder',
- reviewer: 'agent-reviewer',
-};
-
-const roleByAgent = {
- [AGENTS.intake]: 'roles/contributor.md',
- [AGENTS.builder]: 'roles/contributor.md',
- [AGENTS.reviewer]: 'roles/viewer.md',
-};
-
-async function main() {
- const repoRoot = resolveRepoRoot(import.meta.url);
- const resolved = resolveWorkspace(process.argv.slice(2));
- if (!resolved.skipBuild) {
- logLine('building dist artifacts', resolved.json);
- await ensureBuild(repoRoot);
- }
- const workspacePath = resolved.workspacePath;
-
- logLine('initializing workspace', resolved.json);
- const init = await runCliJson(repoRoot, ['init', workspacePath, '--json']);
- const bootstrapTrustToken = String(init.data.bootstrapTrustToken);
-
- logLine('registering governance admin', resolved.json);
- const adminRegistration = await runCliJson(repoRoot, [
- 'agent',
- 'register',
- AGENTS.admin,
- '-w',
- workspacePath,
- '--token',
- bootstrapTrustToken,
- '--role',
- 'roles/admin.md',
- '--capabilities',
- 'policy:manage,agent:approve-registration,agent:register,dispatch:run,thread:claim,thread:manage',
- '--actor',
- AGENTS.admin,
- '--json',
- ]);
- const adminApiKey = String(adminRegistration.data.apiKey ?? '');
-
- const approvals = [];
- for (const agent of [AGENTS.intake, AGENTS.builder, AGENTS.reviewer]) {
- logLine(`requesting registration for ${agent}`, resolved.json);
- const request = await runCliJson(
- repoRoot,
- [
- 'agent',
- 'request',
- agent,
- '-w',
- workspacePath,
- '--actor',
- agent,
- '--role',
- roleByAgent[agent],
- '--capabilities',
- 'thread:claim,thread:manage,dispatch:run,agent:heartbeat',
- '--note',
- `OBJ-09 demo onboarding for ${agent}`,
- '--json',
- ],
- { env: adminApiKey ? { WORKGRAPH_API_KEY: adminApiKey } : undefined },
- );
- const requestPath = String(request.data.request.path);
-
- logLine(`approving registration for ${agent}`, resolved.json);
- const review = await runCliJson(
- repoRoot,
- [
- 'agent',
- 'review',
- requestPath,
- '-w',
- workspacePath,
- '--decision',
- 'approved',
- '--actor',
- AGENTS.admin,
- '--role',
- roleByAgent[agent],
- '--capabilities',
- 'thread:claim,thread:manage,dispatch:run,agent:heartbeat',
- '--json',
- ],
- { env: adminApiKey ? { WORKGRAPH_API_KEY: adminApiKey } : undefined },
- );
- approvals.push({
- agent,
- requestPath,
- approvalPath: String(review.data.approval.path),
- apiKey: String(review.data.apiKey ?? ''),
- });
- }
-
- logLine('publishing initial agent heartbeats', resolved.json);
- for (const approval of approvals) {
- await runCliJson(
- repoRoot,
- [
- 'agent',
- 'heartbeat',
- approval.agent,
- '-w',
- workspacePath,
- '--actor',
- approval.agent,
- '--status',
- 'online',
- '--capabilities',
- 'thread:claim,thread:manage,dispatch:run,agent:heartbeat',
- '--json',
- ],
- { env: approval.apiKey ? { WORKGRAPH_API_KEY: approval.apiKey } : undefined },
- );
- }
-
- const agents = await runCliJson(
- repoRoot,
- ['agent', 'list', '-w', workspacePath, '--json'],
- { env: adminApiKey ? { WORKGRAPH_API_KEY: adminApiKey } : undefined },
- );
- const credentials = await runCliJson(
- repoRoot,
- ['agent', 'credential-list', '-w', workspacePath, '--json'],
- { env: adminApiKey ? { WORKGRAPH_API_KEY: adminApiKey } : undefined },
- );
-
- const output = {
- workspacePath,
- bootstrapTrustToken,
- admin: {
- actor: AGENTS.admin,
- apiKey: adminApiKey,
- credentialId: String(adminRegistration.data.credential?.id ?? ''),
- },
- approvals,
- governanceSnapshot: {
- agentCount: Number(agents.data.count ?? 0),
- credentialCount: Number(credentials.data.count ?? 0),
- },
- };
- process.stdout.write(`${JSON.stringify(output, null, 2)}\n`);
-}
-
-main().catch((error) => {
- const message = error instanceof Error ? error.message : String(error);
- process.stderr.write(`${message}\n`);
- process.exit(1);
-});
diff --git a/examples/multi-agent-showcase/scripts/02-collaboration.mjs b/examples/multi-agent-showcase/scripts/02-collaboration.mjs
deleted file mode 100755
index a863735..0000000
--- a/examples/multi-agent-showcase/scripts/02-collaboration.mjs
+++ /dev/null
@@ -1,471 +0,0 @@
-#!/usr/bin/env node
-
-import {
- ensureBuild,
- loadSdk,
- logLine,
- resolveRepoRoot,
- runCliJson,
-} from './lib/demo-utils.mjs';
-
-async function main() {
- const args = parseArgs(process.argv.slice(2));
- if (!args.workspacePath) {
- throw new Error('Missing required --workspace argument.');
- }
-
- const repoRoot = resolveRepoRoot(import.meta.url);
- if (!args.skipBuild) {
- logLine('building dist artifacts', args.json);
- await ensureBuild(repoRoot);
- }
- const sdk = await loadSdk(repoRoot);
-
- const workspacePath = args.workspacePath;
- const apiKeyEnvByActor = {
- [args.adminActor]: args.adminApiKey,
- [args.intakeActor]: args.intakeApiKey,
- [args.builderActor]: args.builderApiKey,
- [args.reviewerActor]: args.reviewerApiKey,
- };
-
- logLine('creating lifecycle threads', args.json);
- const intakeThread = await runCliJson(
- repoRoot,
- [
- 'thread',
- 'create',
- 'OBJ-09 intake triage',
- '-w',
- workspacePath,
- '--goal',
- 'Collect triage context and route implementation work',
- '--priority',
- 'high',
- '--actor',
- args.adminActor,
- '--tags',
- 'obj-09,intake',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const intakeThreadPath = String(intakeThread.data.thread.path);
-
- const builderThread = await runCliJson(
- repoRoot,
- [
- 'thread',
- 'create',
- 'OBJ-09 implementation',
- '-w',
- workspacePath,
- '--goal',
- 'Implement coordinated fix and capture build evidence',
- '--priority',
- 'high',
- '--deps',
- intakeThreadPath,
- '--actor',
- args.adminActor,
- '--tags',
- 'obj-09,implementation',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const builderThreadPath = String(builderThread.data.thread.path);
-
- const reviewerThread = await runCliJson(
- repoRoot,
- [
- 'thread',
- 'create',
- 'OBJ-09 verification',
- '-w',
- workspacePath,
- '--goal',
- 'Verify the fix and close the coordination loop',
- '--priority',
- 'medium',
- '--deps',
- builderThreadPath,
- '--actor',
- args.adminActor,
- '--tags',
- 'obj-09,verification',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const reviewerThreadPath = String(reviewerThread.data.thread.path);
-
- logLine('creating conversation and plan steps', args.json);
- const conversation = await runCliJson(
- repoRoot,
- [
- 'conversation',
- 'create',
- 'OBJ-09 execution room',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--threads',
- `${intakeThreadPath},${builderThreadPath},${reviewerThreadPath}`,
- '--tags',
- 'obj-09,multi-agent',
- '--status',
- 'active',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const conversationPath = String(conversation.data.conversation.path);
-
- const intakePlanStep = await runCliJson(
- repoRoot,
- [
- 'plan-step',
- 'create',
- conversationPath,
- 'Triage incoming issue and hand off implementation',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--thread',
- intakeThreadPath,
- '--assignee',
- args.intakeActor,
- '--order',
- '1',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const intakeStepPath = String(intakePlanStep.data.step.path);
-
- const builderPlanStep = await runCliJson(
- repoRoot,
- [
- 'plan-step',
- 'create',
- conversationPath,
- 'Implement and validate coordinated fix',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--thread',
- builderThreadPath,
- '--assignee',
- args.builderActor,
- '--order',
- '2',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const builderStepPath = String(builderPlanStep.data.step.path);
-
- const reviewerPlanStep = await runCliJson(
- repoRoot,
- [
- 'plan-step',
- 'create',
- conversationPath,
- 'Run independent QA verification',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--thread',
- reviewerThreadPath,
- '--assignee',
- args.reviewerActor,
- '--order',
- '3',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const reviewerStepPath = String(reviewerPlanStep.data.step.path);
-
- logLine('running intake and builder lifecycle', args.json);
- await runCliJson(
- repoRoot,
- ['dispatch', 'claim', intakeThreadPath, '-w', workspacePath, '--actor', args.intakeActor, '--json'],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'start', intakeStepPath, '-w', workspacePath, '--actor', args.intakeActor, '--json'],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'progress', intakeStepPath, '100', '-w', workspacePath, '--actor', args.intakeActor, '--json'],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'thread',
- 'done',
- intakeThreadPath,
- '-w',
- workspacePath,
- '--actor',
- args.intakeActor,
- '--output',
- 'Triage completed with evidence https://github.com/versatly/workgraph/pull/obj-09-intake',
- '--json',
- ],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'done', intakeStepPath, '-w', workspacePath, '--actor', args.intakeActor, '--json'],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
-
- await runCliJson(
- repoRoot,
- ['dispatch', 'claim', builderThreadPath, '-w', workspacePath, '--actor', args.builderActor, '--json'],
- { env: toApiKeyEnv(args.builderApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'start', builderStepPath, '-w', workspacePath, '--actor', args.builderActor, '--json'],
- { env: toApiKeyEnv(args.builderApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'progress', builderStepPath, '75', '-w', workspacePath, '--actor', args.builderActor, '--json'],
- { env: toApiKeyEnv(args.builderApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'thread',
- 'done',
- builderThreadPath,
- '-w',
- workspacePath,
- '--actor',
- args.builderActor,
- '--output',
- 'Implementation completed with verification logs https://github.com/versatly/workgraph/pull/obj-09-build',
- '--json',
- ],
- { env: toApiKeyEnv(args.builderApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'done', builderStepPath, '-w', workspacePath, '--actor', args.builderActor, '--json'],
- { env: toApiKeyEnv(args.builderApiKey) },
- );
-
- logLine('advertising reviewer capabilities and running self-assembly', args.json);
- await runCliJson(
- repoRoot,
- [
- 'primitive',
- 'update',
- reviewerThreadPath,
- '-w',
- workspacePath,
- '--actor',
- args.reviewerActor,
- '--set',
- 'required_capabilities=quality:review',
- '--set',
- 'required_skills=qa-verification',
- '--set',
- 'required_adapters=shell-worker',
- '--json',
- ],
- { env: toApiKeyEnv(args.reviewerApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'agent',
- 'heartbeat',
- args.reviewerActor,
- '-w',
- workspacePath,
- '--actor',
- args.reviewerActor,
- '--status',
- 'online',
- '--current-task',
- reviewerThreadPath,
- '--capabilities',
- 'thread:claim,thread:manage,dispatch:run,quality:review,skill:qa-verification,adapter:shell-worker',
- '--json',
- ],
- { env: toApiKeyEnv(args.reviewerApiKey) },
- );
-
- const selfAssembly = sdk.agentSelfAssembly.assembleAgent(
- workspacePath,
- args.reviewerActor,
- {
- credentialToken: args.reviewerApiKey,
- advertise: {
- capabilities: ['quality:review'],
- skills: ['qa-verification'],
- adapters: ['shell-worker'],
- },
- createPlanStepIfMissing: true,
- recoverStaleClaims: true,
- },
- );
-
- await runCliJson(
- repoRoot,
- ['plan-step', 'progress', reviewerStepPath, '100', '-w', workspacePath, '--actor', args.reviewerActor, '--json'],
- { env: toApiKeyEnv(args.reviewerApiKey) },
- );
- await runCliJson(
- repoRoot,
- ['plan-step', 'done', reviewerStepPath, '-w', workspacePath, '--actor', args.reviewerActor, '--json'],
- { env: toApiKeyEnv(args.reviewerApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'thread',
- 'done',
- reviewerThreadPath,
- '-w',
- workspacePath,
- '--actor',
- args.reviewerActor,
- '--output',
- 'QA sign-off completed with green checks https://github.com/versatly/workgraph/pull/obj-09-qa',
- '--json',
- ],
- { env: toApiKeyEnv(args.reviewerApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'conversation',
- 'message',
- conversationPath,
- 'All coordination plan-steps completed by intake, builder, and reviewer agents.',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--kind',
- 'decision',
- '--thread',
- reviewerThreadPath,
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
-
- const conversationState = await runCliJson(
- repoRoot,
- ['conversation', 'state', conversationPath, '-w', workspacePath, '--json'],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const readyThreads = await runCliJson(
- repoRoot,
- ['thread', 'list', '-w', workspacePath, '--ready', '--json'],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
-
- const output = {
- workspacePath,
- conversationPath,
- threadPaths: {
- intakeThreadPath,
- builderThreadPath,
- reviewerThreadPath,
- },
- planStepPaths: {
- intakeStepPath,
- builderStepPath,
- reviewerStepPath,
- },
- selfAssembly: {
- agentName: selfAssembly.agentName,
- claimedThreadPath: selfAssembly.claimedThread?.path,
- planStepPath: selfAssembly.planStep?.path,
- warnings: selfAssembly.warnings,
- },
- conversationSummary: conversationState.data.summary,
- readyThreadCount: Number(readyThreads.data.count ?? 0),
- actorApiKeys: apiKeyEnvByActor,
- };
- process.stdout.write(`${JSON.stringify(output, null, 2)}\n`);
-}
-
-function parseArgs(args) {
- const parsed = {
- workspacePath: '',
- adminActor: 'governance-admin',
- intakeActor: 'agent-intake',
- builderActor: 'agent-builder',
- reviewerActor: 'agent-reviewer',
- adminApiKey: '',
- intakeApiKey: '',
- builderApiKey: '',
- reviewerApiKey: '',
- skipBuild: false,
- json: false,
- };
- for (let idx = 0; idx < args.length; idx += 1) {
- const arg = String(args[idx] ?? '');
- if ((arg === '--workspace' || arg === '-w') && idx + 1 < args.length) {
- parsed.workspacePath = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--admin-api-key' && idx + 1 < args.length) {
- parsed.adminApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--intake-api-key' && idx + 1 < args.length) {
- parsed.intakeApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--builder-api-key' && idx + 1 < args.length) {
- parsed.builderApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--reviewer-api-key' && idx + 1 < args.length) {
- parsed.reviewerApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--skip-build') {
- parsed.skipBuild = true;
- continue;
- }
- if (arg === '--json') {
- parsed.json = true;
- }
- }
- return parsed;
-}
-
-function toApiKeyEnv(apiKey) {
- if (!apiKey) return undefined;
- return { WORKGRAPH_API_KEY: apiKey };
-}
-
-main().catch((error) => {
- const message = error instanceof Error ? error.message : String(error);
- process.stderr.write(`${message}\n`);
- process.exit(1);
-});
diff --git a/examples/multi-agent-showcase/scripts/03-trigger-loop.mjs b/examples/multi-agent-showcase/scripts/03-trigger-loop.mjs
deleted file mode 100755
index 654e391..0000000
--- a/examples/multi-agent-showcase/scripts/03-trigger-loop.mjs
+++ /dev/null
@@ -1,247 +0,0 @@
-#!/usr/bin/env node
-
-import {
- ensureBuild,
- loadSdk,
- logLine,
- resolveRepoRoot,
- runCliJson,
-} from './lib/demo-utils.mjs';
-
-async function main() {
- const args = parseArgs(process.argv.slice(2));
- if (!args.workspacePath) {
- throw new Error('Missing required --workspace argument.');
- }
-
- const repoRoot = resolveRepoRoot(import.meta.url);
- if (!args.skipBuild) {
- logLine('building dist artifacts', args.json);
- await ensureBuild(repoRoot);
- }
- const sdk = await loadSdk(repoRoot);
- const workspacePath = args.workspacePath;
-
- logLine('creating active trigger for thread-complete events', args.json);
- const shellCommand = `"${process.execPath}" -e "console.log('obj09-trigger-ok'); console.log('https://github.com/versatly/workgraph/pull/obj-09-trigger');"`;
- const trigger = sdk.store.create(
- workspacePath,
- 'trigger',
- {
- title: 'OBJ-09 thread completion trigger',
- status: 'active',
- condition: {
- type: 'event',
- event: 'thread-complete',
- },
- action: {
- type: 'dispatch-run',
- objective: 'React to completed thread {{matched_event_latest_target}}',
- adapter: 'shell-worker',
- context: {
- shell_command: shellCommand,
- },
- },
- cooldown: 0,
- tags: ['obj-09', 'trigger'],
- },
- '# OBJ-09 Trigger\n\nDispatches a shell-worker run after thread completion events.\n',
- args.adminActor,
- );
-
- // First cycle initializes event cursor for deterministic behavior.
- await runCliJson(
- repoRoot,
- [
- 'trigger',
- 'engine',
- 'run',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--execute-runs',
- '--agents',
- `${args.intakeActor},${args.builderActor},${args.reviewerActor}`,
- '--max-steps',
- '40',
- '--step-delay-ms',
- '0',
- '--timeout-ms',
- '30000',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
-
- logLine('creating a source thread and completing it', args.json);
- const sourceThread = await runCliJson(
- repoRoot,
- [
- 'thread',
- 'create',
- 'OBJ-09 trigger source',
- '-w',
- workspacePath,
- '--goal',
- 'Emit one completion event for trigger execution',
- '--actor',
- args.adminActor,
- '--priority',
- 'high',
- '--tags',
- 'obj-09,trigger-source',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const sourceThreadPath = String(sourceThread.data.thread.path);
-
- await runCliJson(
- repoRoot,
- ['thread', 'claim', sourceThreadPath, '-w', workspacePath, '--actor', args.intakeActor, '--json'],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
- await runCliJson(
- repoRoot,
- [
- 'thread',
- 'done',
- sourceThreadPath,
- '-w',
- workspacePath,
- '--actor',
- args.intakeActor,
- '--output',
- 'Trigger source completed for OBJ-09 evidence loop https://github.com/versatly/workgraph/pull/obj-09-trigger-source',
- '--json',
- ],
- { env: toApiKeyEnv(args.intakeApiKey) },
- );
-
- logLine('running trigger-run-evidence loop', args.json);
- const secondCycle = await runCliJson(
- repoRoot,
- [
- 'trigger',
- 'engine',
- 'run',
- '-w',
- workspacePath,
- '--actor',
- args.adminActor,
- '--execute-runs',
- '--agents',
- `${args.intakeActor},${args.builderActor},${args.reviewerActor}`,
- '--max-steps',
- '40',
- '--step-delay-ms',
- '0',
- '--timeout-ms',
- '30000',
- '--json',
- ],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
-
- const executedRuns = Array.isArray(secondCycle.data.executedRuns) ? secondCycle.data.executedRuns : [];
- const triggeredRun = executedRuns[0];
- if (!triggeredRun || !triggeredRun.runId) {
- throw new Error('Expected at least one executed run from trigger engine.');
- }
- const runId = String(triggeredRun.runId);
-
- const runStatus = await runCliJson(
- repoRoot,
- ['dispatch', 'status', runId, '-w', workspacePath, '--json'],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const runLogs = await runCliJson(
- repoRoot,
- ['dispatch', 'logs', runId, '-w', workspacePath, '--json'],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
- const ledgerSnapshot = await runCliJson(
- repoRoot,
- ['ledger', 'show', '-w', workspacePath, '--count', '20', '--json'],
- { env: toApiKeyEnv(args.adminApiKey) },
- );
-
- const output = {
- workspacePath,
- triggerPath: trigger.path,
- sourceThreadPath,
- triggerLoop: {
- runId,
- status: String(runStatus.data.run.status),
- evidenceCount: Number(runStatus.data.run?.evidenceChain?.count ?? 0),
- logEntries: Array.isArray(runLogs.data.logs) ? runLogs.data.logs.length : 0,
- cycleFired: Number(secondCycle.data.cycle?.fired ?? 0),
- },
- ledgerSnapshotCount: Number(ledgerSnapshot.data.count ?? 0),
- };
- process.stdout.write(`${JSON.stringify(output, null, 2)}\n`);
-}
-
-function parseArgs(args) {
- const parsed = {
- workspacePath: '',
- adminActor: 'governance-admin',
- intakeActor: 'agent-intake',
- builderActor: 'agent-builder',
- reviewerActor: 'agent-reviewer',
- adminApiKey: '',
- intakeApiKey: '',
- builderApiKey: '',
- reviewerApiKey: '',
- skipBuild: false,
- json: false,
- };
- for (let idx = 0; idx < args.length; idx += 1) {
- const arg = String(args[idx] ?? '');
- if ((arg === '--workspace' || arg === '-w') && idx + 1 < args.length) {
- parsed.workspacePath = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--admin-api-key' && idx + 1 < args.length) {
- parsed.adminApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--intake-api-key' && idx + 1 < args.length) {
- parsed.intakeApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--builder-api-key' && idx + 1 < args.length) {
- parsed.builderApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--reviewer-api-key' && idx + 1 < args.length) {
- parsed.reviewerApiKey = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--skip-build') {
- parsed.skipBuild = true;
- continue;
- }
- if (arg === '--json') {
- parsed.json = true;
- }
- }
- return parsed;
-}
-
-function toApiKeyEnv(apiKey) {
- if (!apiKey) return undefined;
- return { WORKGRAPH_API_KEY: apiKey };
-}
-
-main().catch((error) => {
- const message = error instanceof Error ? error.message : String(error);
- process.stderr.write(`${message}\n`);
- process.exit(1);
-});
diff --git a/examples/multi-agent-showcase/scripts/lib/demo-utils.mjs b/examples/multi-agent-showcase/scripts/lib/demo-utils.mjs
deleted file mode 100644
index 8ea5e9f..0000000
--- a/examples/multi-agent-showcase/scripts/lib/demo-utils.mjs
+++ /dev/null
@@ -1,130 +0,0 @@
-#!/usr/bin/env node
-
-import fs from 'node:fs';
-import os from 'node:os';
-import path from 'node:path';
-import { execFile } from 'node:child_process';
-import { promisify } from 'node:util';
-import { fileURLToPath, pathToFileURL } from 'node:url';
-
-const execFileAsync = promisify(execFile);
-
-export function resolveRepoRoot(fromImportMetaUrl) {
- let current = path.resolve(path.dirname(fileURLToPath(fromImportMetaUrl)));
- for (let depth = 0; depth < 8; depth += 1) {
- const pkgPath = path.join(current, 'package.json');
- if (fs.existsSync(pkgPath)) {
- try {
- const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf-8'));
- if (pkg && pkg.name === '@versatly/workgraph') {
- return current;
- }
- } catch {
- // Keep traversing upward.
- }
- }
- const parent = path.dirname(current);
- if (parent === current) break;
- current = parent;
- }
- throw new Error('Unable to resolve WorkGraph repository root from showcase script location.');
-}
-
-export function resolveWorkspace(args) {
- const parsed = parseArgs(args);
- if (parsed.workspace) {
- return {
- workspacePath: path.resolve(parsed.workspace),
- providedByUser: true,
- json: parsed.json,
- skipBuild: parsed.skipBuild,
- };
- }
- const workspacePath = fs.mkdtempSync(path.join(os.tmpdir(), 'workgraph-obj09-showcase-'));
- return {
- workspacePath,
- providedByUser: false,
- json: parsed.json,
- skipBuild: parsed.skipBuild,
- };
-}
-
-export async function runCliJson(repoRoot, args, options = {}) {
- const cliPath = path.join(repoRoot, 'bin', 'workgraph.js');
- const fullArgs = args.includes('--json') ? [...args] : [...args, '--json'];
- const env = {
- ...process.env,
- ...(options.env ?? {}),
- };
- const { stdout, stderr } = await execFileAsync('node', [cliPath, ...fullArgs], {
- cwd: repoRoot,
- env,
- maxBuffer: 10 * 1024 * 1024,
- });
- const output = String(stdout ?? '').trim();
- let parsed;
- try {
- parsed = JSON.parse(output);
- } catch (error) {
- const detail = error instanceof Error ? error.message : String(error);
- throw new Error(`CLI output was not valid JSON (${fullArgs.join(' ')}): ${detail}\n${output}`);
- }
- if (!parsed || parsed.ok !== true) {
- const rendered = JSON.stringify(parsed, null, 2);
- const err = String(stderr ?? '').trim();
- throw new Error(`CLI command failed (${fullArgs.join(' ')}): ${rendered}${err ? `\n${err}` : ''}`);
- }
- return parsed;
-}
-
-export async function ensureBuild(repoRoot) {
- const distCli = path.join(repoRoot, 'dist', 'cli.js');
- const distIndex = path.join(repoRoot, 'dist', 'index.js');
- if (fs.existsSync(distCli) && fs.existsSync(distIndex)) {
- return;
- }
- await execFileAsync(resolvePnpmCommand(), ['run', 'build'], {
- cwd: repoRoot,
- env: process.env,
- maxBuffer: 20 * 1024 * 1024,
- });
-}
-
-export async function loadSdk(repoRoot) {
- const sdkUrl = pathToFileURL(path.join(repoRoot, 'dist', 'index.js')).href;
- return import(sdkUrl);
-}
-
-export function logLine(message, jsonMode) {
- if (!jsonMode) {
- process.stderr.write(`${message}\n`);
- }
-}
-
-function parseArgs(args) {
- const parsed = {
- workspace: '',
- json: false,
- skipBuild: false,
- };
- for (let idx = 0; idx < args.length; idx += 1) {
- const arg = String(args[idx] ?? '');
- if ((arg === '--workspace' || arg === '-w') && idx + 1 < args.length) {
- parsed.workspace = String(args[idx + 1]);
- idx += 1;
- continue;
- }
- if (arg === '--json') {
- parsed.json = true;
- continue;
- }
- if (arg === '--skip-build') {
- parsed.skipBuild = true;
- }
- }
- return parsed;
-}
-
-function resolvePnpmCommand() {
- return process.platform === 'win32' ? 'pnpm.cmd' : 'pnpm';
-}
diff --git a/examples/multi-agent-showcase/scripts/run-showcase.mjs b/examples/multi-agent-showcase/scripts/run-showcase.mjs
deleted file mode 100755
index 10bcaa8..0000000
--- a/examples/multi-agent-showcase/scripts/run-showcase.mjs
+++ /dev/null
@@ -1,151 +0,0 @@
-#!/usr/bin/env node
-
-import path from 'node:path';
-import { execFile } from 'node:child_process';
-import { promisify } from 'node:util';
-import { fileURLToPath } from 'node:url';
-import {
- ensureBuild,
- logLine,
- resolveRepoRoot,
- resolveWorkspace,
- runCliJson,
-} from './lib/demo-utils.mjs';
-
-const execFileAsync = promisify(execFile);
-
-async function main() {
- const repoRoot = resolveRepoRoot(import.meta.url);
- const resolved = resolveWorkspace(process.argv.slice(2));
- const workspacePath = resolved.workspacePath;
-
- if (!resolved.skipBuild) {
- logLine('building dist artifacts', resolved.json);
- await ensureBuild(repoRoot);
- }
-
- const scriptDir = path.resolve(path.dirname(fileURLToPath(import.meta.url)));
- logLine('phase 1/3: governance and registration', resolved.json);
- const governance = await runScriptJson(scriptDir, '01-governance.mjs', [
- '--workspace',
- workspacePath,
- '--json',
- ...(resolved.skipBuild ? ['--skip-build'] : []),
- ]);
-
- const approvalByAgent = new Map();
- for (const approval of governance.approvals ?? []) {
- approvalByAgent.set(String(approval.agent), String(approval.apiKey ?? ''));
- }
-
- logLine('phase 2/3: collaborative execution with self-assembly', resolved.json);
- const collaboration = await runScriptJson(scriptDir, '02-collaboration.mjs', [
- '--workspace',
- workspacePath,
- '--admin-api-key',
- String(governance.admin?.apiKey ?? ''),
- '--intake-api-key',
- String(approvalByAgent.get('agent-intake') ?? ''),
- '--builder-api-key',
- String(approvalByAgent.get('agent-builder') ?? ''),
- '--reviewer-api-key',
- String(approvalByAgent.get('agent-reviewer') ?? ''),
- '--json',
- ...(resolved.skipBuild ? ['--skip-build'] : []),
- ]);
-
- logLine('phase 3/3: trigger -> run -> evidence loop', resolved.json);
- const triggerLoop = await runScriptJson(scriptDir, '03-trigger-loop.mjs', [
- '--workspace',
- workspacePath,
- '--admin-api-key',
- String(governance.admin?.apiKey ?? ''),
- '--intake-api-key',
- String(approvalByAgent.get('agent-intake') ?? ''),
- '--builder-api-key',
- String(approvalByAgent.get('agent-builder') ?? ''),
- '--reviewer-api-key',
- String(approvalByAgent.get('agent-reviewer') ?? ''),
- '--json',
- ...(resolved.skipBuild ? ['--skip-build'] : []),
- ]);
-
- const threadList = await runCliJson(
- repoRoot,
- ['thread', 'list', '-w', workspacePath, '--json'],
- {
- env: governance.admin?.apiKey ? { WORKGRAPH_API_KEY: String(governance.admin.apiKey) } : undefined,
- },
- );
- const dispatchRuns = await runCliJson(
- repoRoot,
- ['dispatch', 'list', '-w', workspacePath, '--json'],
- {
- env: governance.admin?.apiKey ? { WORKGRAPH_API_KEY: String(governance.admin.apiKey) } : undefined,
- },
- );
- const ledgerRecent = await runCliJson(
- repoRoot,
- ['ledger', 'show', '-w', workspacePath, '--count', '25', '--json'],
- {
- env: governance.admin?.apiKey ? { WORKGRAPH_API_KEY: String(governance.admin.apiKey) } : undefined,
- },
- );
-
- const demoChecks = {
- governance: Number(governance.governanceSnapshot?.agentCount ?? 0) >= 4,
- selfAssemblyClaimedReviewerThread:
- String(collaboration.selfAssembly?.claimedThreadPath ?? '') === String(collaboration.threadPaths?.reviewerThreadPath ?? ''),
- planStepCoordinated:
- String(collaboration.selfAssembly?.planStepPath ?? '') === String(collaboration.planStepPaths?.reviewerStepPath ?? ''),
- triggerRunEvidence:
- String(triggerLoop.triggerLoop?.status ?? '') === 'succeeded'
- && Number(triggerLoop.triggerLoop?.evidenceCount ?? 0) > 0,
- ledgerActivity:
- Number(triggerLoop.ledgerSnapshotCount ?? 0) > 0,
- };
- const pass = Object.values(demoChecks).every(Boolean);
-
- const output = {
- ok: pass,
- workspacePath,
- providedWorkspacePath: resolved.providedByUser,
- checks: demoChecks,
- phases: {
- governance,
- collaboration,
- triggerLoop,
- },
- rollup: {
- threadCount: Number(threadList.data.count ?? 0),
- runCount: Array.isArray(dispatchRuns.data.runs) ? dispatchRuns.data.runs.length : 0,
- ledgerEntryCount: Number(ledgerRecent.data.count ?? 0),
- },
- };
-
- process.stdout.write(`${JSON.stringify(output, null, 2)}\n`);
- if (!pass) {
- process.exitCode = 1;
- }
-}
-
-async function runScriptJson(scriptDir, scriptName, args) {
- const scriptPath = path.join(scriptDir, scriptName);
- const { stdout, stderr } = await execFileAsync('node', [scriptPath, ...args], {
- maxBuffer: 10 * 1024 * 1024,
- env: process.env,
- });
- const output = String(stdout ?? '').trim();
- try {
- return JSON.parse(output);
- } catch (error) {
- const detail = error instanceof Error ? error.message : String(error);
- throw new Error(`Script ${scriptName} did not emit valid JSON: ${detail}\nstdout:\n${output}\nstderr:\n${String(stderr ?? '')}`);
- }
-}
-
-main().catch((error) => {
- const message = error instanceof Error ? error.message : String(error);
- process.stderr.write(`${message}\n`);
- process.exit(1);
-});
diff --git a/package.json b/package.json
index b21f90f..44c3671 100644
--- a/package.json
+++ b/package.json
@@ -1,10 +1,12 @@
{
"name": "@versatly/workgraph",
"version": "3.2.2",
- "description": "Agent-first workgraph workspace for multi-agent coordination with dynamic primitives, append-only ledger, and markdown-native storage.",
+ "description": "Context graph, thread collaboration, MCP exposure, and actor registration for multi-agent workspaces.",
"workspaces": [
- "packages/*",
- "apps/*"
+ "packages/kernel",
+ "packages/cli",
+ "packages/mcp-server",
+ "packages/sdk"
],
"packageManager": "pnpm@10.26.0",
"type": "module",
@@ -24,13 +26,6 @@
"types": "./dist/mcp-http-server.d.ts",
"import": "./dist/mcp-http-server.js"
},
- "./server": {
- "types": "./dist/server.d.ts",
- "import": "./dist/server.js"
- },
- "./server-entry": {
- "import": "./dist/server-entry.js"
- },
"./cli": {
"types": "./dist/cli.d.ts",
"import": "./dist/cli.js"
@@ -54,18 +49,16 @@
"test": "node scripts/run-tests.mjs",
"test:vitest": "vitest run --config vitest.config.ts",
"test:packages": "pnpm -r --if-present run test",
- "demo:workspace": "pnpm run --silent build && node scripts/generate-demo-workspace.mjs /tmp/workgraph-obsidian-demo",
- "demo:obsidian-setup": "pnpm run --silent build && node scripts/setup-obsidian-demo.mjs /tmp/workgraph-obsidian-demo",
"ci": "pnpm run typecheck && pnpm run typecheck:packages && pnpm run test && pnpm run build",
"prepublishOnly": "pnpm run ci"
},
"keywords": [
"workgraph",
+ "context-graph",
"multi-agent",
- "agent-coordination",
- "ledger",
- "markdown",
- "primitives"
+ "thread-collaboration",
+ "mcp",
+ "actor-registration"
],
"author": "Versatly",
"license": "MIT",
@@ -89,7 +82,6 @@
"zod": "^4.3.6"
},
"devDependencies": {
- "@versatly/workgraph-mcp-server": "workspace:*",
"@types/node": "^20.11.0",
"ajv": "^8.18.0",
"ajv-formats": "^3.0.1",
diff --git a/packages/adapter-claude-code/package.json b/packages/adapter-claude-code/package.json
deleted file mode 100644
index 0da999f..0000000
--- a/packages/adapter-claude-code/package.json
+++ /dev/null
@@ -1,15 +0,0 @@
-{
- "name": "@versatly/workgraph-adapter-claude-code",
- "version": "0.1.0",
- "private": true,
- "type": "module",
- "scripts": {
- "typecheck": "tsc --noEmit -p tsconfig.json"
- },
- "main": "src/index.ts",
- "types": "src/index.ts",
- "dependencies": {
- "@versatly/workgraph-adapter-shell-worker": "workspace:*",
- "@versatly/workgraph-runtime-adapter-core": "workspace:*"
- }
-}
diff --git a/packages/adapter-claude-code/src/adapter.ts b/packages/adapter-claude-code/src/adapter.ts
deleted file mode 100644
index 63bca5e..0000000
--- a/packages/adapter-claude-code/src/adapter.ts
+++ /dev/null
@@ -1,136 +0,0 @@
-import {
- ShellWorkerAdapter,
-} from '@versatly/workgraph-adapter-shell-worker';
-import type {
- DispatchAdapter,
- DispatchAdapterCreateInput,
- DispatchAdapterExecutionInput,
- DispatchAdapterExecutionResult,
- DispatchAdapterLogEntry,
- DispatchAdapterRunStatus,
-} from '@versatly/workgraph-runtime-adapter-core';
-
-/**
- * Claude Code adapter backed by the shell worker transport.
- *
- * This keeps runtime orchestration in-kernel while allowing concrete execution
- * through a production command template configured per environment.
- */
-export class ClaudeCodeAdapter implements DispatchAdapter {
- name = 'claude-code';
- private readonly shellAdapter = new ShellWorkerAdapter();
-
- async create(input: DispatchAdapterCreateInput): Promise {
- return this.shellAdapter.create(input);
- }
-
- async status(runId: string): Promise<DispatchAdapterRunStatus> {
- return this.shellAdapter.status(runId);
- }
-
- async followup(runId: string, actor: string, input: string): Promise {
- return this.shellAdapter.followup(runId, actor, input);
- }
-
- async stop(runId: string, actor: string): Promise {
- return this.shellAdapter.stop(runId, actor);
- }
-
- async logs(runId: string): Promise<DispatchAdapterLogEntry[]> {
- return this.shellAdapter.logs(runId);
- }
-
- async execute(input: DispatchAdapterExecutionInput): Promise<DispatchAdapterExecutionResult> {
- const template = readString(input.context?.claude_command_template)
- ?? process.env.WORKGRAPH_CLAUDE_COMMAND_TEMPLATE;
-
- if (!template) {
- return {
- status: 'failed',
- error: [
- 'claude-code adapter requires a command template.',
- 'Set context.claude_command_template or WORKGRAPH_CLAUDE_COMMAND_TEMPLATE.',
- 'Template tokens: {workspace}, {run_id}, {actor}, {objective}, {prompt}, {prompt_shell}.',
- 'Example: claude -p {prompt_shell}',
- ].join(' '),
- logs: [
- {
- ts: new Date().toISOString(),
- level: 'error',
- message: 'Missing Claude command template.',
- },
- ],
- };
- }
-
- const prompt = buildPrompt(input);
- const command = applyTemplate(template, {
- workspace: input.workspacePath,
- run_id: input.runId,
- actor: input.actor,
- objective: input.objective,
- prompt,
- prompt_shell: quoteForShell(prompt),
- });
-
- const context = {
- ...input.context,
- shell_command: command,
- shell_cwd: readString(input.context?.shell_cwd) ?? input.workspacePath,
- shell_timeout_ms: input.context?.shell_timeout_ms ?? process.env.WORKGRAPH_CLAUDE_TIMEOUT_MS,
- };
-
- const result = await this.shellAdapter.execute({
- ...input,
- context,
- });
- const logs = [
- {
- ts: new Date().toISOString(),
- level: 'info' as const,
- message: 'claude-code adapter dispatched shell execution from command template.',
- },
- ...(result.logs ?? []),
- ];
- return {
- ...result,
- logs,
- metrics: {
- ...(result.metrics ?? {}),
- adapter: 'claude-code',
- },
- };
- }
-}
-
-function buildPrompt(input: DispatchAdapterExecutionInput): string {
- const extraInstructions = readString(input.context?.claude_instructions);
- const sections = [
- `Workgraph run id: ${input.runId}`,
- `Actor: ${input.actor}`,
- `Objective: ${input.objective}`,
- `Workspace: ${input.workspacePath}`,
- ];
- if (extraInstructions) {
- sections.push(`Instructions: ${extraInstructions}`);
- }
- return sections.join('\n');
-}
-
-function applyTemplate(template: string, values: Record<string, string>): string {
- let rendered = template;
- for (const [key, value] of Object.entries(values)) {
- rendered = rendered.replaceAll(`{${key}}`, value);
- }
- return rendered;
-}
-
-function quoteForShell(value: string): string {
- return `'${value.replace(/'/g, `'\\''`)}'`;
-}
-
-function readString(value: unknown): string | undefined {
- if (typeof value !== 'string') return undefined;
- const trimmed = value.trim();
- return trimmed.length > 0 ? trimmed : undefined;
-}
diff --git a/packages/adapter-claude-code/src/index.ts b/packages/adapter-claude-code/src/index.ts
deleted file mode 100644
index ddec7b5..0000000
--- a/packages/adapter-claude-code/src/index.ts
+++ /dev/null
@@ -1 +0,0 @@
-export * from './adapter.js';
diff --git a/packages/adapter-claude-code/tsconfig.json b/packages/adapter-claude-code/tsconfig.json
deleted file mode 100644
index 79e486b..0000000
--- a/packages/adapter-claude-code/tsconfig.json
+++ /dev/null
@@ -1,8 +0,0 @@
-{
- "extends": "../../tsconfig.base.json",
- "compilerOptions": {
- "composite": true,
- "noEmit": true
- },
- "include": ["src/**/*"]
-}
diff --git a/packages/adapter-cursor-cloud/package.json b/packages/adapter-cursor-cloud/package.json
deleted file mode 100644
index 46e3070..0000000
--- a/packages/adapter-cursor-cloud/package.json
+++ /dev/null
@@ -1,15 +0,0 @@
-{
- "name": "@versatly/workgraph-adapter-cursor-cloud",
- "version": "0.1.0",
- "private": true,
- "type": "module",
- "scripts": {
- "typecheck": "tsc --noEmit -p tsconfig.json"
- },
- "main": "src/index.ts",
- "types": "src/index.ts",
- "dependencies": {
- "@versatly/workgraph-kernel": "workspace:*",
- "@versatly/workgraph-runtime-adapter-core": "workspace:*"
- }
-}
diff --git a/packages/adapter-cursor-cloud/src/adapter.test.ts b/packages/adapter-cursor-cloud/src/adapter.test.ts
deleted file mode 100644
index 146c0cb..0000000
--- a/packages/adapter-cursor-cloud/src/adapter.test.ts
+++ /dev/null
@@ -1,155 +0,0 @@
-import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
-import { CursorCloudAdapter } from './adapter.js';
-import type {
- DispatchAdapterCancelInput,
- DispatchAdapterDispatchInput,
- DispatchAdapterPollInput,
-} from '@versatly/workgraph-runtime-adapter-core';
-
-function makeDispatchInput(overrides: Partial<DispatchAdapterDispatchInput> = {}): DispatchAdapterDispatchInput {
- return {
- workspacePath: '/workspace/demo',
- runId: 'run_cursor_external_1',
- actor: 'agent-cursor',
- objective: 'Dispatch cursor task externally',
- context: {
- cursor_cloud_api_base_url: 'https://cursor.example/api',
- },
- followups: [],
- ...overrides,
- };
-}
-
-function makePollInput(overrides: Partial<DispatchAdapterPollInput> = {}): DispatchAdapterPollInput {
- return {
- workspacePath: '/workspace/demo',
- runId: 'run_cursor_external_1',
- actor: 'agent-cursor',
- objective: 'Poll cursor task externally',
- context: {
- cursor_cloud_api_base_url: 'https://cursor.example/api',
- },
- external: {
- provider: 'cursor-cloud',
- externalRunId: 'cursor-agent-123',
- correlationKeys: ['run_cursor_external_1'],
- },
- ...overrides,
- };
-}
-
-function makeCancelInput(overrides: Partial<DispatchAdapterCancelInput> = {}): DispatchAdapterCancelInput {
- return {
- workspacePath: '/workspace/demo',
- runId: 'run_cursor_external_1',
- actor: 'agent-cursor',
- objective: 'Cancel cursor task externally',
- context: {
- cursor_cloud_api_base_url: 'https://cursor.example/api',
- },
- external: {
- provider: 'cursor-cloud',
- externalRunId: 'cursor-agent-123',
- correlationKeys: ['run_cursor_external_1'],
- },
- ...overrides,
- };
-}
-
-function mockResponse(options: { ok: boolean; status: number; text: string; statusText?: string }): Response {
- return {
- ok: options.ok,
- status: options.status,
- statusText: options.statusText ?? '',
- text: async () => options.text,
- } as Response;
-}
-
-describe('CursorCloudAdapter external broker mode', () => {
- const fetchMock = vi.fn();
-
- beforeEach(() => {
- vi.restoreAllMocks();
- fetchMock.mockReset();
- vi.stubGlobal('fetch', fetchMock);
- });
-
- afterEach(() => {
- vi.unstubAllGlobals();
- });
-
- it('dispatches external runs and returns provider correlation metadata', async () => {
- fetchMock.mockResolvedValueOnce(mockResponse({
- ok: true,
- status: 202,
- text: JSON.stringify({
- id: 'cursor-agent-123',
- status: 'queued',
- agentId: 'cursor-agent-primary',
- }),
- }));
-
- const adapter = new CursorCloudAdapter();
- const result = await adapter.dispatch!(makeDispatchInput());
-
- expect(fetchMock).toHaveBeenCalledWith(
- 'https://cursor.example/api/runs',
- expect.objectContaining({
- method: 'POST',
- headers: expect.objectContaining({
- 'content-type': 'application/json',
- }),
- }),
- );
- expect(result.acknowledged).toBe(true);
- expect(result.status).toBe('queued');
- expect(result.external).toMatchObject({
- provider: 'cursor-cloud',
- externalRunId: 'cursor-agent-123',
- externalAgentId: 'cursor-agent-primary',
- });
- });
-
- it('polls and cancels external runs using provider endpoints', async () => {
- fetchMock
- .mockResolvedValueOnce(mockResponse({
- ok: true,
- status: 200,
- text: JSON.stringify({
- status: 'running',
- updatedAt: '2026-03-11T10:00:00.000Z',
- }),
- }))
- .mockResolvedValueOnce(mockResponse({
- ok: true,
- status: 202,
- text: JSON.stringify({
- status: 'cancelled',
- }),
- }));
-
- const adapter = new CursorCloudAdapter();
- const polled = await adapter.poll!(makePollInput());
- const cancelled = await adapter.cancel!(makeCancelInput());
-
- expect(fetchMock).toHaveBeenNthCalledWith(
- 1,
- 'https://cursor.example/api/runs/cursor-agent-123',
- expect.objectContaining({
- method: 'GET',
- }),
- );
- expect(polled?.status).toBe('running');
- expect(polled?.external?.externalRunId).toBe('cursor-agent-123');
-
- expect(fetchMock).toHaveBeenNthCalledWith(
- 2,
- 'https://cursor.example/api/runs/cursor-agent-123/cancel',
- expect.objectContaining({
- method: 'POST',
- }),
- );
- expect(cancelled.acknowledged).toBe(true);
- expect(cancelled.status).toBe('cancelled');
- });
-});
diff --git a/packages/adapter-cursor-cloud/src/adapter.ts b/packages/adapter-cursor-cloud/src/adapter.ts
deleted file mode 100644
index 0fcb3c5..0000000
--- a/packages/adapter-cursor-cloud/src/adapter.ts
+++ /dev/null
@@ -1,642 +0,0 @@
-import {
- orientation as orientationModule,
- store as storeModule,
- thread as threadModule,
-} from '@versatly/workgraph-kernel';
-import type {
- DispatchAdapter,
- DispatchAdapterCancelInput,
- DispatchAdapterCreateInput,
- DispatchAdapterDispatchInput,
- DispatchAdapterExecutionInput,
- DispatchAdapterExecutionResult,
- DispatchAdapterExternalUpdate,
- DispatchAdapterLogEntry,
- DispatchAdapterPollInput,
- DispatchAdapterRunStatus,
- RunStatus,
-} from '@versatly/workgraph-runtime-adapter-core';
-
-const orientation = orientationModule;
-const store = storeModule;
-const thread = threadModule;
-
-const DEFAULT_MAX_STEPS = 200;
-const DEFAULT_STEP_DELAY_MS = 25;
-const DEFAULT_AGENT_COUNT = 3;
-const DEFAULT_EXTERNAL_TIMEOUT_MS = 30_000;
-
-export class CursorCloudAdapter implements DispatchAdapter {
- name = 'cursor-cloud';
-
- async create(_input: DispatchAdapterCreateInput): Promise<DispatchAdapterRunStatus> {
- return {
- runId: 'adapter-managed',
- status: 'queued',
- };
- }
-
- async status(runId: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async followup(runId: string, _actor: string, _input: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async stop(runId: string, _actor: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'cancelled' };
- }
-
- async logs(_runId: string): Promise<DispatchAdapterLogEntry[]> {
- return [];
- }
-
- async dispatch(input: DispatchAdapterDispatchInput): Promise<DispatchAdapterExternalUpdate> {
- const config = resolveCursorBrokerConfig(input.context);
- if (!config) {
- throw new Error('cursor-cloud external broker requires cursor_cloud_api_base_url or cursor_cloud_dispatch_url.');
- }
- const now = new Date().toISOString();
- const payload = {
- runId: input.runId,
- actor: input.actor,
- objective: input.objective,
- workspacePath: input.workspacePath,
- context: input.context ?? {},
- followups: input.followups ?? [],
- external: input.external ?? null,
- ts: now,
- };
- const response = await fetchJson(config.dispatchUrl, {
- method: 'POST',
- headers: buildCursorHeaders(config),
- body: JSON.stringify(payload),
- signal: input.abortSignal,
- }, config.timeoutMs);
- const externalRunId = readExternalRunId(response.json);
- if (!response.ok || !externalRunId) {
- throw new Error(`cursor-cloud dispatch failed (${response.status}): ${response.text || 'missing external run id'}`);
- }
- return {
- acknowledged: true,
- acknowledgedAt: now,
- status: normalizeRunStatus(response.json?.status) ?? 'queued',
- external: {
- provider: 'cursor-cloud',
- externalRunId,
- externalAgentId: readString(response.json?.agentId) ?? readString(response.json?.agent_id),
- externalThreadId: readString(response.json?.threadId) ?? readString(response.json?.thread_id),
- correlationKeys: compactStrings([
- input.runId,
- readString(input.context?.cursor_correlation_key),
- readString(response.json?.correlationKey),
- ]),
- metadata: {
- response: response.json ?? response.text,
- },
- },
- lastKnownAt: now,
- logs: [
- {
- ts: now,
- level: 'info',
- message: `cursor-cloud dispatched external run ${externalRunId}.`,
- },
- ],
- metrics: {
- adapter: 'cursor-cloud',
- httpStatus: response.status,
- },
- metadata: {
- httpStatus: response.status,
- },
- };
- }
-
- async poll(input: DispatchAdapterPollInput): Promise<DispatchAdapterExternalUpdate | null> {
- const config = resolveCursorBrokerConfig(input.context);
- if (!config) return null;
- const response = await fetchJson(resolveTemplate(config.statusUrlTemplate, input.external.externalRunId), {
- method: 'GET',
- headers: buildCursorHeaders(config),
- signal: input.abortSignal,
- }, config.timeoutMs);
- if (!response.ok) {
- throw new Error(`cursor-cloud poll failed (${response.status}): ${response.text || response.statusText}`);
- }
- return {
- status: normalizeRunStatus(response.json?.status),
- output: readString(response.json?.output),
- error: readString(response.json?.error),
- external: {
- provider: 'cursor-cloud',
- externalRunId: input.external.externalRunId,
- externalAgentId: readString(response.json?.agentId) ?? readString(response.json?.agent_id) ?? input.external.externalAgentId,
- externalThreadId: readString(response.json?.threadId) ?? readString(response.json?.thread_id) ?? input.external.externalThreadId,
- correlationKeys: compactStrings([
- ...(input.external.correlationKeys ?? []),
- readString(response.json?.correlationKey),
- ]),
- metadata: {
- response: response.json ?? response.text,
- },
- },
- lastKnownAt: readString(response.json?.updatedAt) ?? readString(response.json?.updated_at) ?? new Date().toISOString(),
- logs: [],
- metadata: {
- httpStatus: response.status,
- },
- };
- }
-
- async cancel(input: DispatchAdapterCancelInput): Promise<DispatchAdapterExternalUpdate> {
- const config = resolveCursorBrokerConfig(input.context);
- if (!config || !input.external?.externalRunId) {
- return {
- status: 'cancelled',
- acknowledged: true,
- acknowledgedAt: new Date().toISOString(),
- external: input.external,
- };
- }
- const now = new Date().toISOString();
- const response = await fetchJson(resolveTemplate(config.cancelUrlTemplate, input.external.externalRunId), {
- method: 'POST',
- headers: buildCursorHeaders(config),
- body: JSON.stringify({
- runId: input.runId,
- actor: input.actor,
- objective: input.objective,
- externalRunId: input.external.externalRunId,
- ts: now,
- }),
- signal: input.abortSignal,
- }, config.timeoutMs);
- if (!response.ok) {
- throw new Error(`cursor-cloud cancel failed (${response.status}): ${response.text || response.statusText}`);
- }
- return {
- status: normalizeRunStatus(response.json?.status),
- acknowledged: true,
- acknowledgedAt: now,
- external: {
- provider: 'cursor-cloud',
- externalRunId: input.external.externalRunId,
- externalAgentId: input.external.externalAgentId,
- externalThreadId: input.external.externalThreadId,
- correlationKeys: input.external.correlationKeys,
- metadata: {
- response: response.json ?? response.text,
- },
- },
- lastKnownAt: now,
- metadata: {
- httpStatus: response.status,
- },
- };
- }
-
- async health(): Promise<Record<string, unknown>> {
- return {
- adapter: this.name,
- mode: 'dual',
- };
- }
-
- async execute(input: DispatchAdapterExecutionInput): Promise<DispatchAdapterExecutionResult> {
- const start = Date.now();
- const logs: DispatchAdapterLogEntry[] = [];
- const agentPool = normalizeAgents(input.agents, input.actor);
- const maxSteps = normalizeInt(input.maxSteps, DEFAULT_MAX_STEPS, 1, 5000);
- const stepDelayMs = normalizeInt(input.stepDelayMs, DEFAULT_STEP_DELAY_MS, 0, 5000);
- const claimedByAgent: Record<string, number> = {};
- const completedByAgent: Record<string, number> = {};
- let stepsExecuted = 0;
- let completionCount = 0;
- let failureCount = 0;
- let cancelled = false;
-
- for (const agent of agentPool) {
- claimedByAgent[agent] = 0;
- completedByAgent[agent] = 0;
- }
-
- pushLog(logs, 'info', `Run ${input.runId} started with agents: ${agentPool.join(', ')}`);
- pushLog(logs, 'info', `Objective: ${input.objective}`);
-
- while (stepsExecuted < maxSteps) {
- if (input.isCancelled?.()) {
- cancelled = true;
- pushLog(logs, 'warn', `Run ${input.runId} received cancellation signal.`);
- break;
- }
-
- const claimedThisRound: Array<{ agent: string; threadPath: string; goal: string }> = [];
- for (const agent of agentPool) {
- try {
- const claimed = input.space
- ? thread.claimNextReadyInSpace(input.workspacePath, agent, input.space)
- : thread.claimNextReady(input.workspacePath, agent);
- if (!claimed) {
- continue;
- }
- const path = claimed.path;
- const goal = String(claimed.fields.goal ?? claimed.fields.title ?? path);
- claimedThisRound.push({ agent, threadPath: path, goal });
- claimedByAgent[agent] += 1;
- pushLog(logs, 'info', `${agent} claimed ${path}`);
- } catch (error) {
- // Races are expected in multi-agent scheduling; recover and keep moving.
- pushLog(logs, 'warn', `${agent} claim skipped: ${errorMessage(error)}`);
- }
- }
-
- if (claimedThisRound.length === 0) {
- const readyRemaining = listReady(input.workspacePath, input.space).length;
- if (readyRemaining === 0) {
- pushLog(logs, 'info', 'No ready threads remaining; autonomous loop complete.');
- break;
- }
- if (stepDelayMs > 0) {
- await sleep(stepDelayMs);
- }
- continue;
- }
-
- await Promise.all(claimedThisRound.map(async (claimed) => {
- if (input.isCancelled?.()) {
- cancelled = true;
- return;
- }
- if (stepDelayMs > 0) {
- await sleep(stepDelayMs);
- }
- try {
- thread.done(
- input.workspacePath,
- claimed.threadPath,
- claimed.agent,
- `Completed by ${claimed.agent} during dispatch run ${input.runId}. Goal: ${claimed.goal}`,
- {
- evidence: [
- { type: 'thread-ref', value: claimed.threadPath },
- { type: 'reply-ref', value: `thread:${input.runId}` },
- ],
- },
- );
- completionCount += 1;
- completedByAgent[claimed.agent] += 1;
- pushLog(logs, 'info', `${claimed.agent} completed ${claimed.threadPath}`);
- } catch (error) {
- failureCount += 1;
- pushLog(logs, 'error', `${claimed.agent} failed to complete ${claimed.threadPath}: ${errorMessage(error)}`);
- }
- }));
-
- stepsExecuted += claimedThisRound.length;
- if (cancelled) break;
- }
-
- const readyAfter = listReady(input.workspacePath, input.space);
- const activeAfter = input.space
- ? store.threadsInSpace(input.workspacePath, input.space).filter((candidate) => candidate.fields.status === 'active')
- : store.activeThreads(input.workspacePath);
- const openAfter = input.space
- ? store.threadsInSpace(input.workspacePath, input.space).filter((candidate) => candidate.fields.status === 'open')
- : store.openThreads(input.workspacePath);
- const blockedAfter = input.space
- ? store.threadsInSpace(input.workspacePath, input.space).filter((candidate) => candidate.fields.status === 'blocked')
- : store.blockedThreads(input.workspacePath);
-
- const elapsedMs = Date.now() - start;
- const summary = renderSummary({
- objective: input.objective,
- runId: input.runId,
- completed: completionCount,
- failed: failureCount,
- stepsExecuted,
- readyRemaining: readyAfter.length,
- openRemaining: openAfter.length,
- blockedRemaining: blockedAfter.length,
- activeRemaining: activeAfter.length,
- elapsedMs,
- claimedByAgent,
- completedByAgent,
- cancelled,
- });
-
- if (input.createCheckpoint !== false) {
- try {
- orientation.checkpoint(
- input.workspacePath,
- input.actor,
- `Dispatch run ${input.runId} completed autonomous execution.`,
- {
- next: readyAfter.slice(0, 10).map((entry) => entry.path),
- blocked: blockedAfter.slice(0, 10).map((entry) => entry.path),
- tags: ['dispatch', 'autonomous-run'],
- },
- );
- pushLog(logs, 'info', `Checkpoint recorded for run ${input.runId}.`);
- } catch (error) {
- // Checkpoint creation is helpful but should not fail a completed run.
- pushLog(logs, 'warn', `Checkpoint creation skipped: ${errorMessage(error)}`);
- }
- }
-
- if (cancelled) {
- return {
- status: 'cancelled',
- output: summary,
- logs,
- metrics: {
- completed: completionCount,
- failed: failureCount,
- readyRemaining: readyAfter.length,
- openRemaining: openAfter.length,
- blockedRemaining: blockedAfter.length,
- elapsedMs,
- claimedByAgent,
- completedByAgent,
- },
- };
- }
-
- if (failureCount > 0) {
- return {
- status: 'failed',
- error: summary,
- logs,
- metrics: {
- completed: completionCount,
- failed: failureCount,
- readyRemaining: readyAfter.length,
- openRemaining: openAfter.length,
- blockedRemaining: blockedAfter.length,
- elapsedMs,
- claimedByAgent,
- completedByAgent,
- },
- };
- }
-
- const status = readyAfter.length === 0 && activeAfter.length === 0 ? 'succeeded' : 'failed';
- if (status === 'failed') {
- pushLog(logs, 'warn', 'Execution stopped with actionable work still remaining.');
- }
-
- return {
- status,
- output: summary,
- logs,
- metrics: {
- completed: completionCount,
- failed: failureCount,
- readyRemaining: readyAfter.length,
- openRemaining: openAfter.length,
- blockedRemaining: blockedAfter.length,
- elapsedMs,
- claimedByAgent,
- completedByAgent,
- },
- };
- }
-}
-
-function normalizeAgents(agents: string[] | undefined, actor: string): string[] {
- const fromInput = (agents ?? []).map((entry) => String(entry).trim()).filter(Boolean);
- if (fromInput.length > 0) return [...new Set(fromInput)];
- return Array.from({ length: DEFAULT_AGENT_COUNT }, (_, idx) => `${actor}-worker-${idx + 1}`);
-}
-
-function normalizeInt(
- rawValue: number | undefined,
- fallback: number,
- min: number,
- max: number,
-): number {
- const value = Number.isFinite(rawValue) ? Number(rawValue) : fallback;
- return Math.min(max, Math.max(min, Math.trunc(value)));
-}
-
-function pushLog(target: DispatchAdapterLogEntry[], level: DispatchAdapterLogEntry['level'], message: string): void {
- target.push({
- ts: new Date().toISOString(),
- level,
- message,
- });
-}
-
-interface CursorBrokerConfig {
- dispatchUrl: string;
- statusUrlTemplate: string;
- cancelUrlTemplate: string;
- token?: string;
- headers: Record<string, string>;
- timeoutMs: number;
-}
-
-async function fetchJson(
- url: string,
- init: RequestInit,
- timeoutMs: number,
-): Promise<{
- ok: boolean;
- status: number;
- statusText: string;
- text: string;
- json: Record<string, unknown> | null;
-}> {
- const controller = new AbortController();
- const timeout = setTimeout(() => controller.abort(), timeoutMs);
- try {
- const response = await fetch(url, {
- ...init,
- signal: init.signal ?? controller.signal,
- });
- const text = await response.text();
- return {
- ok: response.ok,
- status: response.status,
- statusText: response.statusText,
- text,
- json: safeParseJson(text),
- };
- } finally {
- clearTimeout(timeout);
- }
-}
-
-function resolveCursorBrokerConfig(context: Record<string, unknown> | undefined): CursorBrokerConfig | null {
- const baseUrl = resolveUrl(
- context?.cursor_cloud_api_base_url,
- process.env.WORKGRAPH_CURSOR_CLOUD_API_BASE_URL,
- );
- const dispatchUrl = resolveUrl(
- context?.cursor_cloud_dispatch_url,
- baseUrl ? `${baseUrl}/runs` : undefined,
- );
- if (!dispatchUrl) return null;
- const statusUrlTemplate = readString(context?.cursor_cloud_status_url_template)
- ?? (baseUrl ? `${baseUrl}/runs/{externalRunId}` : undefined)
- ?? `${dispatchUrl.replace(/\/+$/, '')}/{externalRunId}`;
- const cancelUrlTemplate = readString(context?.cursor_cloud_cancel_url_template)
- ?? (baseUrl ? `${baseUrl}/runs/{externalRunId}/cancel` : undefined)
- ?? `${dispatchUrl.replace(/\/+$/, '')}/{externalRunId}/cancel`;
- return {
- dispatchUrl,
- statusUrlTemplate,
- cancelUrlTemplate,
- token: readString(context?.cursor_cloud_api_token) ?? readString(process.env.WORKGRAPH_CURSOR_CLOUD_API_TOKEN),
- headers: readHeaders(context?.cursor_cloud_headers),
- timeoutMs: normalizeInt(readNumber(context?.cursor_cloud_timeout_ms), DEFAULT_EXTERNAL_TIMEOUT_MS, 1_000, 120_000),
- };
-}
-
-function buildCursorHeaders(config: CursorBrokerConfig): Record<string, string> {
- return {
- 'content-type': 'application/json',
- ...config.headers,
- ...(config.token ? { authorization: `Bearer ${config.token}` } : {}),
- };
-}
-
-function resolveTemplate(template: string, externalRunId: string): string {
- return template.replaceAll('{externalRunId}', externalRunId);
-}
-
-function readHeaders(value: unknown): Record<string, string> {
- if (!value || typeof value !== 'object' || Array.isArray(value)) return {};
- const record = value as Record<string, unknown>;
- const headers: Record<string, string> = {};
- for (const [key, raw] of Object.entries(record)) {
- if (!key) continue;
- if (raw === undefined || raw === null) continue;
- headers[key.toLowerCase()] = String(raw);
- }
- return headers;
-}
-
-function safeParseJson(value: string): Record<string, unknown> | null {
- if (!value.trim()) return null;
- try {
- const parsed = JSON.parse(value) as unknown;
- if (!parsed || typeof parsed !== 'object' || Array.isArray(parsed)) return null;
- return parsed as Record<string, unknown>;
- } catch {
- return null;
- }
-}
-
-function readExternalRunId(value: Record<string, unknown> | null): string | undefined {
- return readString(value?.externalRunId)
- ?? readString(value?.external_run_id)
- ?? readString(value?.runId)
- ?? readString(value?.run_id)
- ?? readString(value?.id)
- ?? readString(value?.agentId)
- ?? readString(value?.agent_id);
-}
-
-function resolveUrl(...values: unknown[]): string | undefined {
- for (const value of values) {
- const candidate = readString(value);
- if (!candidate) continue;
- try {
- const url = new URL(candidate);
- if (url.protocol === 'http:' || url.protocol === 'https:') {
- return url.toString();
- }
- } catch {
- continue;
- }
- }
- return undefined;
-}
-
-function normalizeRunStatus(value: unknown): RunStatus | undefined {
- const normalized = String(value ?? '').trim().toLowerCase();
- if (
- normalized === 'queued'
- || normalized === 'running'
- || normalized === 'succeeded'
- || normalized === 'failed'
- || normalized === 'cancelled'
- ) {
- return normalized;
- }
- return undefined;
-}
-
-function compactStrings(values: Array<string | undefined>): string[] {
- return [...new Set(values.filter((entry): entry is string => Boolean(entry && entry.trim())).map((entry) => entry.trim()))];
-}
-
-function listReady(workspacePath: string, space: string | undefined) {
- return space
- ? thread.listReadyThreadsInSpace(workspacePath, space)
- : thread.listReadyThreads(workspacePath);
-}
-
-function errorMessage(error: unknown): string {
- return error instanceof Error ? error.message : String(error);
-}
-
-function sleep(ms: number): Promise<void> {
- return new Promise((resolve) => {
- setTimeout(resolve, ms);
- });
-}
-
-function readString(value: unknown): string | undefined {
- if (typeof value !== 'string') return undefined;
- const trimmed = value.trim();
- return trimmed.length > 0 ? trimmed : undefined;
-}
-
-function readNumber(value: unknown): number | undefined {
- if (typeof value === 'number' && Number.isFinite(value)) return value;
- if (typeof value === 'string' && value.trim().length > 0) {
- const parsed = Number(value);
- if (Number.isFinite(parsed)) return parsed;
- }
- return undefined;
-}
-
-function renderSummary(data: {
- objective: string;
- runId: string;
- completed: number;
- failed: number;
- stepsExecuted: number;
- readyRemaining: number;
- openRemaining: number;
- blockedRemaining: number;
- activeRemaining: number;
- elapsedMs: number;
- claimedByAgent: Record<string, number>;
- completedByAgent: Record<string, number>;
- cancelled: boolean;
-}): string {
- const lines = [
- `Autonomous dispatch summary for ${data.runId}`,
- `Objective: ${data.objective}`,
- `Completed threads: ${data.completed}`,
- `Failed completions: ${data.failed}`,
- `Scheduler steps executed: ${data.stepsExecuted}`,
- `Ready remaining: ${data.readyRemaining}`,
- `Open remaining: ${data.openRemaining}`,
- `Blocked remaining: ${data.blockedRemaining}`,
- `Active remaining: ${data.activeRemaining}`,
- `Elapsed ms: ${data.elapsedMs}`,
- `Cancelled: ${data.cancelled ? 'yes' : 'no'}`,
- '',
- 'Claims by agent:',
- ...Object.entries(data.claimedByAgent).map(([agent, count]) => `- ${agent}: ${count}`),
- '',
- 'Completions by agent:',
- ...Object.entries(data.completedByAgent).map(([agent, count]) => `- ${agent}: ${count}`),
- ];
- return lines.join('\n');
-}
diff --git a/packages/adapter-cursor-cloud/src/index.ts b/packages/adapter-cursor-cloud/src/index.ts
deleted file mode 100644
index ddec7b5..0000000
--- a/packages/adapter-cursor-cloud/src/index.ts
+++ /dev/null
@@ -1 +0,0 @@
-export * from './adapter.js';
diff --git a/packages/adapter-cursor-cloud/tsconfig.json b/packages/adapter-cursor-cloud/tsconfig.json
deleted file mode 100644
index 79e486b..0000000
--- a/packages/adapter-cursor-cloud/tsconfig.json
+++ /dev/null
@@ -1,8 +0,0 @@
-{
- "extends": "../../tsconfig.base.json",
- "compilerOptions": {
- "composite": true,
- "noEmit": true
- },
- "include": ["src/**/*"]
-}
diff --git a/packages/adapter-http-webhook/package.json b/packages/adapter-http-webhook/package.json
deleted file mode 100644
index f9db226..0000000
--- a/packages/adapter-http-webhook/package.json
+++ /dev/null
@@ -1,14 +0,0 @@
-{
- "name": "@versatly/workgraph-adapter-http-webhook",
- "version": "0.1.0",
- "private": true,
- "type": "module",
- "scripts": {
- "typecheck": "tsc --noEmit -p tsconfig.json"
- },
- "main": "src/index.ts",
- "types": "src/index.ts",
- "dependencies": {
- "@versatly/workgraph-runtime-adapter-core": "workspace:*"
- }
-}
diff --git a/packages/adapter-http-webhook/src/adapter.ts b/packages/adapter-http-webhook/src/adapter.ts
deleted file mode 100644
index d4a9cb7..0000000
--- a/packages/adapter-http-webhook/src/adapter.ts
+++ /dev/null
@@ -1,242 +0,0 @@
-import type {
- DispatchAdapter,
- DispatchAdapterCreateInput,
- DispatchAdapterExecutionInput,
- DispatchAdapterExecutionResult,
- DispatchAdapterLogEntry,
- DispatchAdapterRunStatus,
-} from '@versatly/workgraph-runtime-adapter-core';
-
-const DEFAULT_POLL_MS = 1000;
-const DEFAULT_MAX_WAIT_MS = 90_000;
-
-export class HttpWebhookAdapter implements DispatchAdapter {
- name = 'http-webhook';
-
- async create(_input: DispatchAdapterCreateInput): Promise<DispatchAdapterRunStatus> {
- return { runId: 'http-webhook-managed', status: 'queued' };
- }
-
- async status(runId: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async followup(runId: string, _actor: string, _input: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async stop(runId: string, _actor: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'cancelled' };
- }
-
- async logs(_runId: string): Promise<DispatchAdapterLogEntry[]> {
- return [];
- }
-
- async execute(input: DispatchAdapterExecutionInput): Promise<DispatchAdapterExecutionResult> {
- const logs: DispatchAdapterLogEntry[] = [];
- const webhookUrl = resolveUrl(input.context?.webhook_url, process.env.WORKGRAPH_DISPATCH_WEBHOOK_URL);
- if (!webhookUrl) {
- return {
- status: 'failed',
- error: 'http-webhook adapter requires context.webhook_url or WORKGRAPH_DISPATCH_WEBHOOK_URL.',
- logs,
- };
- }
-
- const token = readString(input.context?.webhook_token) ?? process.env.WORKGRAPH_DISPATCH_WEBHOOK_TOKEN;
- const headers = {
- 'content-type': 'application/json',
- ...extractHeaders(input.context?.webhook_headers),
- ...(token ? { authorization: `Bearer ${token}` } : {}),
- };
-
- const payload = {
- runId: input.runId,
- actor: input.actor,
- objective: input.objective,
- workspacePath: input.workspacePath,
- context: input.context ?? {},
- ts: new Date().toISOString(),
- };
-
- pushLog(logs, 'info', `http-webhook posting run ${input.runId} to ${webhookUrl}`);
- const response = await fetch(webhookUrl, {
- method: 'POST',
- headers,
- body: JSON.stringify(payload),
- });
- const rawText = await response.text();
- const parsed = safeParseJson(rawText);
- pushLog(logs, response.ok ? 'info' : 'error', `http-webhook response status: ${response.status}`);
-
- if (!response.ok) {
- return {
- status: 'failed',
- error: `http-webhook request failed (${response.status}): ${rawText || response.statusText}`,
- logs,
- };
- }
-
- const immediateStatus = normalizeRunStatus(parsed?.status);
- if (immediateStatus && isTerminalStatus(immediateStatus)) {
- return {
- status: immediateStatus,
- output: typeof parsed?.output === 'string' ? parsed.output : rawText,
- error: typeof parsed?.error === 'string' ? parsed.error : undefined,
- logs,
- metrics: {
- adapter: 'http-webhook',
- httpStatus: response.status,
- },
- };
- }
-
- const pollUrl = resolveUrl(parsed?.pollUrl, input.context?.webhook_status_url, process.env.WORKGRAPH_DISPATCH_WEBHOOK_STATUS_URL);
- if (!pollUrl) {
- return {
- status: 'succeeded',
- output: rawText || 'http-webhook acknowledged run successfully.',
- logs,
- metrics: {
- adapter: 'http-webhook',
- httpStatus: response.status,
- },
- };
- }
-
- const pollMs = clampInt(readNumber(input.context?.webhook_poll_ms), DEFAULT_POLL_MS, 200, 30_000);
- const maxWaitMs = clampInt(readNumber(input.context?.webhook_max_wait_ms), DEFAULT_MAX_WAIT_MS, 1000, 15 * 60_000);
- const startedAt = Date.now();
- pushLog(logs, 'info', `http-webhook polling status from ${pollUrl}`);
-
- while (Date.now() - startedAt < maxWaitMs) {
- if (input.isCancelled?.()) {
- pushLog(logs, 'warn', 'http-webhook run cancelled while polling');
- return {
- status: 'cancelled',
- output: 'http-webhook polling cancelled by dispatcher.',
- logs,
- };
- }
-
- const pollResponse = await fetch(pollUrl, {
- method: 'GET',
- headers: {
- ...headers,
- },
- });
- const pollText = await pollResponse.text();
- const pollJson = safeParseJson(pollText);
- const pollStatus = normalizeRunStatus(pollJson?.status);
- pushLog(logs, 'info', `poll status=${pollResponse.status} run_status=${pollStatus ?? 'unknown'}`);
-
- if (pollStatus && isTerminalStatus(pollStatus)) {
- return {
- status: pollStatus,
- output: typeof pollJson?.output === 'string' ? pollJson.output : pollText,
- error: typeof pollJson?.error === 'string' ? pollJson.error : undefined,
- logs,
- metrics: {
- adapter: 'http-webhook',
- pollUrl,
- pollHttpStatus: pollResponse.status,
- elapsedMs: Date.now() - startedAt,
- },
- };
- }
-
- await sleep(pollMs);
- }
-
- return {
- status: 'failed',
- error: `http-webhook polling exceeded timeout (${maxWaitMs}ms) for run ${input.runId}.`,
- logs,
- };
- }
-}
-
-function pushLog(target: DispatchAdapterLogEntry[], level: DispatchAdapterLogEntry['level'], message: string): void {
- target.push({
- ts: new Date().toISOString(),
- level,
- message,
- });
-}
-
-function readString(value: unknown): string | undefined {
- if (typeof value !== 'string') return undefined;
- const trimmed = value.trim();
- return trimmed.length > 0 ? trimmed : undefined;
-}
-
-function resolveUrl(...values: unknown[]): string | undefined {
- for (const value of values) {
- const parsed = readString(value);
- if (!parsed) continue;
- try {
- const url = new URL(parsed);
- if (url.protocol === 'http:' || url.protocol === 'https:') {
- return url.toString();
- }
- } catch {
- continue;
- }
- }
- return undefined;
-}
-
-function extractHeaders(input: unknown): Record<string, string> {
- if (!input || typeof input !== 'object' || Array.isArray(input)) return {};
- const record = input as Record<string, unknown>;
- const out: Record<string, string> = {};
- for (const [key, value] of Object.entries(record)) {
- if (!key || value === undefined || value === null) continue;
- out[key.toLowerCase()] = String(value);
- }
- return out;
-}
-
-function safeParseJson(value: string): Record<string, unknown> | null {
- if (!value || !value.trim()) return null;
- try {
- const parsed = JSON.parse(value) as unknown;
- if (!parsed || typeof parsed !== 'object' || Array.isArray(parsed)) return null;
- return parsed as Record<string, unknown>;
- } catch {
- return null;
- }
-}
-
-function normalizeRunStatus(value: unknown): DispatchAdapterRunStatus['status'] | undefined {
- const normalized = String(value ?? '').toLowerCase();
- if (normalized === 'queued' || normalized === 'running' || normalized === 'succeeded' || normalized === 'failed' || normalized === 'cancelled') {
- return normalized;
- }
- return undefined;
-}
-
-function isTerminalStatus(status: DispatchAdapterRunStatus['status']): boolean {
- return status === 'succeeded' || status === 'failed' || status === 'cancelled';
-}
-
-function readNumber(value: unknown): number | undefined {
- if (typeof value === 'number' && Number.isFinite(value)) return value;
- if (typeof value === 'string' && value.trim().length > 0) {
- const parsed = Number(value);
- if (Number.isFinite(parsed)) return parsed;
- }
- return undefined;
-}
-
-function clampInt(value: number | undefined, fallback: number, min: number, max: number): number {
- const raw = typeof value === 'number' ? Math.trunc(value) : fallback;
- return Math.min(max, Math.max(min, raw));
-}
-
-function sleep(ms: number): Promise<void> {
- return new Promise((resolve) => {
- setTimeout(resolve, ms);
- });
-}
diff --git a/packages/adapter-http-webhook/src/index.ts b/packages/adapter-http-webhook/src/index.ts
deleted file mode 100644
index ddec7b5..0000000
--- a/packages/adapter-http-webhook/src/index.ts
+++ /dev/null
@@ -1 +0,0 @@
-export * from './adapter.js';
diff --git a/packages/adapter-http-webhook/tsconfig.json b/packages/adapter-http-webhook/tsconfig.json
deleted file mode 100644
index 79e486b..0000000
--- a/packages/adapter-http-webhook/tsconfig.json
+++ /dev/null
@@ -1,8 +0,0 @@
-{
- "extends": "../../tsconfig.base.json",
- "compilerOptions": {
- "composite": true,
- "noEmit": true
- },
- "include": ["src/**/*"]
-}
diff --git a/packages/adapter-shell-worker/package.json b/packages/adapter-shell-worker/package.json
deleted file mode 100644
index 8f302cf..0000000
--- a/packages/adapter-shell-worker/package.json
+++ /dev/null
@@ -1,15 +0,0 @@
-{
- "name": "@versatly/workgraph-adapter-shell-worker",
- "version": "0.1.0",
- "private": true,
- "type": "module",
- "scripts": {
- "typecheck": "tsc --noEmit -p tsconfig.json"
- },
- "main": "src/index.ts",
- "types": "src/index.ts",
- "dependencies": {
- "@versatly/workgraph-adapter-cursor-cloud": "workspace:*",
- "@versatly/workgraph-runtime-adapter-core": "workspace:*"
- }
-}
diff --git a/packages/adapter-shell-worker/src/adapter.ts b/packages/adapter-shell-worker/src/adapter.ts
deleted file mode 100644
index 62146a2..0000000
--- a/packages/adapter-shell-worker/src/adapter.ts
+++ /dev/null
@@ -1,259 +0,0 @@
-import { spawn } from 'node:child_process';
-import { CursorCloudAdapter } from '../../adapter-cursor-cloud/src/adapter.js';
-import type {
- DispatchAdapter,
- DispatchAdapterCreateInput,
- DispatchAdapterExecutionInput,
- DispatchAdapterExecutionResult,
- DispatchAdapterLogEntry,
- DispatchAdapterRunStatus,
-} from '@versatly/workgraph-runtime-adapter-core';
-
-const DEFAULT_TIMEOUT_MS = 10 * 60 * 1000;
-const MAX_CAPTURE_CHARS = 12000;
-
-export class ShellWorkerAdapter implements DispatchAdapter {
- name = 'shell-worker';
- private readonly fallback = new CursorCloudAdapter();
-
- async create(_input: DispatchAdapterCreateInput): Promise<DispatchAdapterRunStatus> {
- return { runId: 'shell-worker-managed', status: 'queued' };
- }
-
- async status(runId: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async followup(runId: string, _actor: string, _input: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'running' };
- }
-
- async stop(runId: string, _actor: string): Promise<DispatchAdapterRunStatus> {
- return { runId, status: 'cancelled' };
- }
-
- async logs(_runId: string): Promise<DispatchAdapterLogEntry[]> {
- return [];
- }
-
- async execute(input: DispatchAdapterExecutionInput): Promise<DispatchAdapterExecutionResult> {
- const command = readString(input.context?.shell_command);
- if (!command) {
- return this.fallback.execute(input);
- }
-
- const shellCwd = readString(input.context?.shell_cwd) ?? input.workspacePath;
- const timeoutMs = clampInt(readNumber(input.context?.shell_timeout_ms), DEFAULT_TIMEOUT_MS, 1000, 60 * 60 * 1000);
- const shellEnv = readEnv(input.context?.shell_env);
- const logs: DispatchAdapterLogEntry[] = [];
- const startedAt = Date.now();
- const outputParts: string[] = [];
- const errorParts: string[] = [];
-
- pushLog(logs, 'info', `shell-worker starting command: ${command}`);
- pushLog(logs, 'info', `shell-worker cwd: ${shellCwd}`);
-
- const result = await runShellCommand({
- command,
- cwd: shellCwd,
- timeoutMs,
- env: shellEnv,
- isCancelled: input.isCancelled,
- onStdout: (chunk) => {
- outputParts.push(chunk);
- pushLog(logs, 'info', `[stdout] ${chunk.trimEnd()}`);
- },
- onStderr: (chunk) => {
- errorParts.push(chunk);
- pushLog(logs, 'warn', `[stderr] ${chunk.trimEnd()}`);
- },
- });
-
- const elapsedMs = Date.now() - startedAt;
- const stdout = truncateText(outputParts.join(''), MAX_CAPTURE_CHARS);
- const stderr = truncateText(errorParts.join(''), MAX_CAPTURE_CHARS);
-
- if (result.cancelled) {
- pushLog(logs, 'warn', `shell-worker command cancelled after ${elapsedMs}ms`);
- return {
- status: 'cancelled',
- output: formatShellOutput(command, result.exitCode, stdout, stderr, elapsedMs, true),
- logs,
- };
- }
-
- if (result.timedOut) {
- pushLog(logs, 'error', `shell-worker command timed out after ${elapsedMs}ms`);
- return {
- status: 'failed',
- error: formatShellOutput(command, result.exitCode, stdout, stderr, elapsedMs, false),
- logs,
- };
- }
-
- if (result.exitCode !== 0) {
- pushLog(logs, 'error', `shell-worker command failed with exit code ${result.exitCode}`);
- return {
- status: 'failed',
- error: formatShellOutput(command, result.exitCode, stdout, stderr, elapsedMs, false),
- logs,
- };
- }
-
- pushLog(logs, 'info', `shell-worker command succeeded in ${elapsedMs}ms`);
- return {
- status: 'succeeded',
- output: formatShellOutput(command, result.exitCode, stdout, stderr, elapsedMs, false),
- logs,
- metrics: {
- elapsedMs,
- exitCode: result.exitCode,
- adapter: 'shell-worker',
- },
- };
- }
-}
-
-interface RunShellCommandOptions {
- command: string;
- cwd: string;
- timeoutMs: number;
- env: Record<string, string>;
- isCancelled?: () => boolean;
- onStdout: (chunk: string) => void;
- onStderr: (chunk: string) => void;
-}
-
-interface RunShellCommandResult {
- exitCode: number;
- timedOut: boolean;
- cancelled: boolean;
-}
-
-async function runShellCommand(options: RunShellCommandOptions): Promise<RunShellCommandResult> {
- return new Promise((resolve) => {
- const child = spawn(options.command, {
- cwd: options.cwd,
- env: { ...process.env, ...options.env },
- shell: true,
- stdio: ['ignore', 'pipe', 'pipe'],
- });
-
- let resolved = false;
- let timedOut = false;
- let cancelled = false;
- const timeoutHandle = setTimeout(() => {
- timedOut = true;
- child.kill('SIGTERM');
- setTimeout(() => child.kill('SIGKILL'), 1500).unref();
- }, options.timeoutMs);
-
- const cancelWatcher = setInterval(() => {
- if (options.isCancelled?.()) {
- cancelled = true;
- child.kill('SIGTERM');
- }
- }, 200);
- cancelWatcher.unref();
-
- child.stdout.on('data', (chunk: Buffer) => {
- options.onStdout(chunk.toString('utf-8'));
- });
- child.stderr.on('data', (chunk: Buffer) => {
- options.onStderr(chunk.toString('utf-8'));
- });
-
- child.on('close', (code) => {
- if (resolved) return;
- resolved = true;
- clearTimeout(timeoutHandle);
- clearInterval(cancelWatcher);
- resolve({
- exitCode: typeof code === 'number' ? code : 1,
- timedOut,
- cancelled,
- });
- });
-
- child.on('error', () => {
- if (resolved) return;
- resolved = true;
- clearTimeout(timeoutHandle);
- clearInterval(cancelWatcher);
- resolve({
- exitCode: 1,
- timedOut,
- cancelled,
- });
- });
- });
-}
-
-function pushLog(target: DispatchAdapterLogEntry[], level: DispatchAdapterLogEntry['level'], message: string): void {
- target.push({
- ts: new Date().toISOString(),
- level,
- message,
- });
-}
-
-function readEnv(value: unknown): Record<string, string> {
- if (!value || typeof value !== 'object' || Array.isArray(value)) return {};
- const input = value as Record<string, unknown>;
- const result: Record<string, string> = {};
- for (const [key, raw] of Object.entries(input)) {
- if (!key) continue;
- if (raw === undefined || raw === null) continue;
- result[key] = String(raw);
- }
- return result;
-}
-
-function readString(value: unknown): string | undefined {
- if (typeof value !== 'string') return undefined;
- const trimmed = value.trim();
- return trimmed.length > 0 ? trimmed : undefined;
-}
-
-function readNumber(value: unknown): number | undefined {
- if (typeof value === 'number' && Number.isFinite(value)) return value;
- if (typeof value === 'string' && value.trim().length > 0) {
- const parsed = Number(value);
- if (Number.isFinite(parsed)) return parsed;
- }
- return undefined;
-}
-
-function clampInt(value: number | undefined, fallback: number, min: number, max: number): number {
- const raw = typeof value === 'number' ? Math.trunc(value) : fallback;
- return Math.min(max, Math.max(min, raw));
-}
-
-function truncateText(value: string, limit: number): string {
- if (value.length <= limit) return value;
- return `${value.slice(0, limit)}\n...[truncated]`;
-}
-
-function formatShellOutput(
- command: string,
- exitCode: number,
- stdout: string,
- stderr: string,
- elapsedMs: number,
- cancelled: boolean,
-): string {
- const lines = [
- 'Shell worker execution summary',
- `Command: ${command}`,
- `Exit code: ${exitCode}`,
- `Elapsed ms: ${elapsedMs}`,
- `Cancelled: ${cancelled ? 'yes' : 'no'}`,
- '',
- 'STDOUT:',
- stdout || '(empty)',
- '',
- 'STDERR:',
- stderr || '(empty)',
- ];
- return lines.join('\n');
-}
diff --git a/packages/adapter-shell-worker/src/index.ts b/packages/adapter-shell-worker/src/index.ts
deleted file mode 100644
index ddec7b5..0000000
--- a/packages/adapter-shell-worker/src/index.ts
+++ /dev/null
@@ -1 +0,0 @@
-export * from './adapter.js';
diff --git a/packages/adapter-shell-worker/tsconfig.json b/packages/adapter-shell-worker/tsconfig.json
deleted file mode 100644
index 92d3d69..0000000
--- a/packages/adapter-shell-worker/tsconfig.json
+++ /dev/null
@@ -1,11 +0,0 @@
-{
- "extends": "../../tsconfig.base.json",
- "compilerOptions": {
- "composite": true,
- "noEmit": true
- },
- "include": [
- "src/**/*",
- "../adapter-cursor-cloud/src/**/*"
- ]
-}
diff --git a/packages/cli/package.json b/packages/cli/package.json
index 8ad2153..4320691 100644
--- a/packages/cli/package.json
+++ b/packages/cli/package.json
@@ -12,8 +12,6 @@
"main": "src/index.ts",
"types": "src/index.ts",
"dependencies": {
- "@modelcontextprotocol/sdk": "^1.27.1",
- "@versatly/workgraph-control-api": "workspace:*",
"@versatly/workgraph-kernel": "workspace:*",
"@versatly/workgraph-mcp-server": "workspace:*",
"commander": "^12.1.0"
diff --git a/packages/cli/src/cli.ts b/packages/cli/src/cli.ts
index 722cab1..1fd09ef 100644
--- a/packages/cli/src/cli.ts
+++ b/packages/cli/src/cli.ts
@@ -1,2996 +1,1204 @@
-import fs from 'node:fs';
-import path from 'node:path';
import { Command } from 'commander';
import * as workgraph from '@versatly/workgraph-kernel';
-import { registerDefaultDispatchAdaptersIntoKernelRegistry } from '@versatly/workgraph-runtime-adapter-core';
-import { startWorkgraphServer, waitForShutdown } from '@versatly/workgraph-control-api';
-import { registerAdapterCommands } from './cli/commands/adapter.js';
-import { registerAutonomyCommands } from './cli/commands/autonomy.js';
-import { registerCapabilityCommands } from './cli/commands/capability.js';
+import { startWorkgraphMcpHttpServer } from '@versatly/workgraph-mcp-server';
import { registerConversationCommands } from './cli/commands/conversation.js';
-import { registerCursorCommands } from './cli/commands/cursor.js';
-import { registerDispatchCommands } from './cli/commands/dispatch.js';
import { registerMcpCommands } from './cli/commands/mcp.js';
-import { registerMissionCommands } from './cli/commands/mission.js';
-import { registerSafetyCommands } from './cli/commands/safety.js';
-import { registerPortabilityCommands } from './cli/commands/portability.js';
-import { registerFederationCommands } from './cli/commands/federation.js';
-import { registerWebhookCommands } from './cli/commands/webhook.js';
-import { registerTriggerCommands } from './cli/commands/trigger.js';
import {
addWorkspaceOption,
csv,
- installNamedIntegration,
parseNonNegativeIntOption,
- parsePortOption,
parsePositiveIntOption,
parsePositiveIntegerOption,
- parsePositiveNumberOption,
parseSetPairs,
- renderInstalledIntegrationResult,
- resolveInitTargetPath,
- resolveApiKey,
- resolveApiUrl,
+ parsePortOption,
resolveWorkspacePath,
+ resolveInitTargetPath,
runCommand,
- type JsonCapableOptions,
wantsJson,
} from './cli/core.js';
-import { WorkgraphRemoteClient } from './remote-client.js';
-
-const DEFAULT_ACTOR =
- process.env.WORKGRAPH_AGENT ||
- process.env.USER ||
- 'anonymous';
-
-type PrimitiveRecord = {
- path: string;
- type: string;
- fields: Record<string, unknown>;
-};
-
-registerDefaultDispatchAdaptersIntoKernelRegistry();
-
-const CLI_VERSION = (() => {
- try {
- const pkgUrl = new URL('../package.json', import.meta.url);
- const pkg = JSON.parse(fs.readFileSync(pkgUrl, 'utf-8')) as { version?: string };
- return pkg.version ?? '0.0.0';
- } catch {
- return '0.0.0';
- }
-})();
+
+const CLI_VERSION = '3.2.2';
+const DEFAULT_ACTOR = process.env.WORKGRAPH_ACTOR?.trim() || 'agent';
const program = new Command();
+
program
.name('workgraph')
- .description('Agent-first workgraph workspace for multi-agent collaboration.')
- .version(CLI_VERSION);
-
-program.showHelpAfterError();
+ .description('Context graph, thread collaboration, MCP exposure, and actor registration.')
+ .version(CLI_VERSION)
+ .showHelpAfterError();
addWorkspaceOption(
program
.command('init [path]')
- .description('Initialize or repair a workgraph workspace starter kit')
- .option('-n, --name <name>', 'Workspace name')
- .option('--no-type-dirs', 'Do not pre-create built-in type directories')
- .option('--no-bases', 'Do not generate .base files from primitive registry')
- .option('--no-readme', 'Do not create README.md/QUICKSTART.md')
- .option('--json', 'Emit structured JSON output')
+ .description('Initialize a workgraph workspace')
+ .option('--name <name>', 'Workspace name')
+ .option('--no-readme', 'Skip README/QUICKSTART generation')
+ .option('--no-bases', 'Skip base file generation')
+ .option('--json', 'Emit structured JSON output'),
).action((targetPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveInitTargetPath(targetPath, opts);
- const result = workgraph.workspace.initWorkspace(workspacePath, {
- name: opts.name,
- createTypeDirs: opts.typeDirs,
- createBases: opts.bases,
- createReadme: opts.readme,
- });
- return result;
- },
- (result) => {
- const roleSeeded = result.starterKit.roles.created.length + result.starterKit.roles.existing.length;
- const policySeeded = result.starterKit.policies.created.length + result.starterKit.policies.existing.length;
- const gateSeeded = result.starterKit.gates.created.length + result.starterKit.gates.existing.length;
- const spaceSeeded = result.starterKit.spaces.created.length + result.starterKit.spaces.existing.length;
- return [
- `${result.alreadyInitialized ? 'Updated' : 'Initialized'} workgraph workspace: ${result.workspacePath}`,
- `Seeded types: ${result.seededTypes.join(', ')}`,
- `Generated .base files: ${result.generatedBases.length}`,
- `Config: ${result.configPath}`,
- `Server config: ${result.serverConfigPath}`,
- `Starter kit primitives: roles=${roleSeeded} policies=${policySeeded} gates=${gateSeeded} spaces=${spaceSeeded}`,
- `Bootstrap trust token (${result.bootstrapTrustTokenPath}): ${result.bootstrapTrustToken}`,
- ...(result.quickstartPath ? [`Quickstart: ${result.quickstartPath}`] : []),
- '',
- 'Next steps:',
- `1) Start server: workgraph serve -w "${result.workspacePath}"`,
- `2) Preferred registration flow: workgraph agent request agent-1 -w "${result.workspacePath}" --role roles/admin.md`,
- ` Approve request: workgraph agent review agent-1 -w "${result.workspacePath}" --decision approved --actor admin-approver`,
- ` Bootstrap fallback: workgraph agent register agent-1 -w "${result.workspacePath}" --token ${result.bootstrapTrustToken}`,
- `3) Create first thread: workgraph thread create "First coordinated task" -w "${result.workspacePath}" --goal "Validate onboarding flow" --actor agent-1`,
- ];
- }
- )
+ () => workgraph.workspace.initWorkspace(resolveInitTargetPath(targetPath, opts), {
+ name: opts.name,
+ createReadme: opts.readme,
+ createBases: opts.bases,
+ }),
+ (result) => [
+ `Initialized workspace: ${result.workspacePath}`,
+ `Bootstrap trust token path: ${result.bootstrapTrustTokenPath}`,
+ `Server config: ${result.serverConfigPath}`,
+ ],
+ ),
);
-// ============================================================================
-// thread
-// ============================================================================
-
const threadCmd = program
.command('thread')
- .description('Coordinate work through claimable threads');
+ .description('Coordinate work through collaborative threads');
addWorkspaceOption(
threadCmd
.command('create <title>')
- .description('Create a new thread')
- .requiredOption('-g, --goal <goal>', 'What success looks like')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('-p, --priority <level>', 'urgent | high | medium | low', 'medium')
- .option('--deps <refs>', 'Comma-separated dependency thread paths')
- .option('--parent <ref>', 'Parent thread path')
- .option('--space <ref>', 'Optional space ref (e.g. spaces/backend.md)')
- .option('--context <refs>', 'Comma-separated workspace doc refs for context')
+ .description('Create a thread')
+ .requiredOption('--goal <goal>', 'Thread goal')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--priority <priority>', 'urgent|high|medium|low', 'medium')
+ .option('--deps <refs>', 'Comma-separated dependency thread refs')
+ .option('--parent <ref>', 'Parent thread ref')
+ .option('--space <ref>', 'Space ref')
+ .option('--context-refs <refs>', 'Comma-separated context refs')
.option('--tags <tags>', 'Comma-separated tags')
- .option('--json', 'Emit structured JSON output')
-).action((title, opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ thread: PrimitiveRecord }>('workgraph_thread_create', {
- title,
- goal: opts.goal,
- actor: opts.actor,
- priority: opts.priority,
- deps: csv(opts.deps),
- parent: opts.parent,
- space: opts.space,
- context_refs: csv(opts.context),
- tags: csv(opts.tags),
- })),
- (result) => [
- `Created thread: ${result.thread.path}`,
- `Status: ${String(result.thread.fields.status)}`,
- `Priority: ${String(result.thread.fields.priority)}`,
- ],
- );
- }
- return runCommand(
+ .option('--json', 'Emit structured JSON output'),
+).action((title, opts) =>
+ runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.createThread(workspacePath, title, opts.goal, opts.actor, {
- priority: opts.priority,
- deps: csv(opts.deps),
- parent: opts.parent,
- space: opts.space,
- context_refs: csv(opts.context),
- tags: csv(opts.tags),
- }),
- };
- },
+ () => workgraph.thread.createThread(resolveWorkspacePath(opts), title, opts.goal, opts.actor, {
+ priority: normalizePriority(opts.priority),
+ deps: csv(opts.deps),
+ parent: opts.parent,
+ space: opts.space,
+ context_refs: csv(opts.contextRefs),
+ tags: csv(opts.tags),
+ }),
(result) => [
- `Created thread: ${result.thread.path}`,
- `Status: ${String(result.thread.fields.status)}`,
- `Priority: ${String(result.thread.fields.priority)}`,
+ `Created thread: ${result.path}`,
+ `Status: ${String(result.fields.status)}`,
+ `Priority: ${String(result.fields.priority)}`,
],
- );
-});
+ ),
+);
addWorkspaceOption(
threadCmd
.command('list')
- .description('List threads (optionally by state/ready status)')
- .option('-s, --status <status>', 'open | active | blocked | done | cancelled')
- .option('--space <ref>', 'Filter threads by space ref')
- .option('--ready', 'Only include threads ready to be claimed now')
- .option('--json', 'Emit structured JSON output')
-).action((opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ threads: Array; count: number }>(
- 'workgraph_thread_list',
- {
- status: opts.status,
- readyOnly: !!opts.ready,
- space: opts.space,
- },
- )),
- (result) => {
- if (result.threads.length === 0) return ['No threads found.'];
- return [
- ...result.threads.map((t) => {
- const status = String(t.fields.status);
- const owner = t.fields.owner ? ` (${String(t.fields.owner)})` : '';
- const ready = t.ready ? ' ready' : '';
- return `[${status}]${ready} ${String(t.fields.title)}${owner} -> ${t.path}`;
- }),
- `${result.count} thread(s)`,
- ];
- },
- );
- }
- return runCommand(
+ .description('List threads')
+ .option('--status <status>', 'Filter by status')
+ .option('--space <ref>', 'Filter by space')
+ .option('--ready', 'Only show ready threads')
+ .option('--json', 'Emit structured JSON output'),
+).action((opts) =>
+ runCommand(
opts,
() => {
const workspacePath = resolveWorkspacePath(opts);
let threads = opts.space
? workgraph.store.threadsInSpace(workspacePath, opts.space)
: workgraph.store.list(workspacePath, 'thread');
- const readySet = new Set(
- (opts.space
- ? workgraph.thread.listReadyThreadsInSpace(workspacePath, opts.space)
- : workgraph.thread.listReadyThreads(workspacePath))
- .map(t => t.path)
- );
- if (opts.status) threads = threads.filter(t => t.fields.status === opts.status);
- if (opts.ready) threads = threads.filter(t => readySet.has(t.path));
- const enriched = threads.map(t => ({
- ...t,
- ready: readySet.has(t.path),
- }));
- return { threads: enriched, count: enriched.length };
+ if (opts.status) {
+ threads = threads.filter((entry) => String(entry.fields.status) === opts.status);
+ }
+ if (opts.ready) {
+ const readySet = new Set(
+ (opts.space
+ ? workgraph.thread.listReadyThreadsInSpace(workspacePath, opts.space)
+ : workgraph.thread.listReadyThreads(workspacePath)).map((entry) => entry.path),
+ );
+ threads = threads.filter((entry) => readySet.has(entry.path));
+ }
+ return { threads, count: threads.length };
},
(result) => {
if (result.threads.length === 0) return ['No threads found.'];
return [
- ...result.threads.map((t) => {
- const status = String(t.fields.status);
- const owner = t.fields.owner ? ` (${String(t.fields.owner)})` : '';
- const ready = t.ready ? ' ready' : '';
- return `[${status}]${ready} ${String(t.fields.title)}${owner} -> ${t.path}`;
- }),
+ ...result.threads.map((entry) =>
+ `[${String(entry.fields.status)}] ${String(entry.fields.priority)} ${String(entry.fields.title)} -> ${entry.path}`),
`${result.count} thread(s)`,
];
},
- );
-});
+ ),
+);
addWorkspaceOption(
threadCmd
.command('next')
- .description('Pick the next ready thread, optionally claim it')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--space <ref>', 'Restrict scheduling to one space')
- .option('--claim', 'Immediately claim the next ready thread')
- .option('--fail-on-empty', 'Exit non-zero if no ready thread exists')
- .option('--json', 'Emit structured JSON output')
-).action((opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, async (client) => {
- const readyResult = await client.callTool<{ threads: PrimitiveRecord[] }>(
- 'workgraph_thread_list',
- {
- readyOnly: true,
- space: opts.space,
- },
- );
- const nextThread = readyResult.threads[0];
- if (!nextThread) {
- if (opts.failOnEmpty) {
- throw new Error('No ready threads available.');
- }
- return { thread: null, claimed: false };
- }
- if (!opts.claim) {
- return { thread: nextThread, claimed: false };
- }
- const claimedResult = await client.callTool<{ thread: PrimitiveRecord }>(
- 'workgraph_thread_claim',
- {
- threadPath: nextThread.path,
- actor: opts.actor,
- },
- );
- return {
- thread: claimedResult.thread,
- claimed: true,
- };
- }),
- (result) => {
- if (!result.thread) return ['No ready thread available.'];
- return [
- `${result.claimed ? 'Claimed' : 'Selected'} thread: ${result.thread.path}`,
- `Title: ${String(result.thread.fields.title)}`,
- ...(result.thread.fields.space ? [`Space: ${String(result.thread.fields.space)}`] : []),
- ];
- },
- );
- }
- return runCommand(
+ .description('Show or claim the next ready thread')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--space <ref>', 'Limit to one space')
+ .option('--claim', 'Claim the next ready thread')
+ .option('--json', 'Emit structured JSON output'),
+).action((opts) =>
+ runCommand(
opts,
() => {
const workspacePath = resolveWorkspacePath(opts);
- const thread = opts.claim
- ? (opts.space
+ if (opts.claim) {
+ return {
+ thread: opts.space
? workgraph.thread.claimNextReadyInSpace(workspacePath, opts.actor, opts.space)
- : workgraph.thread.claimNextReady(workspacePath, opts.actor))
- : (opts.space
- ? workgraph.thread.pickNextReadyThreadInSpace(workspacePath, opts.space)
- : workgraph.thread.pickNextReadyThread(workspacePath));
- if (!thread && opts.failOnEmpty) {
- throw new Error('No ready threads available.');
+ : workgraph.thread.claimNextReady(workspacePath, opts.actor),
+ };
}
return {
- thread,
- claimed: !!opts.claim && !!thread,
+ thread: opts.space
+ ? workgraph.thread.pickNextReadyThreadInSpace(workspacePath, opts.space)
+ : workgraph.thread.pickNextReadyThread(workspacePath),
};
},
- (result) => {
- if (!result.thread) return ['No ready thread available.'];
- return [
- `${result.claimed ? 'Claimed' : 'Selected'} thread: ${result.thread.path}`,
- `Title: ${String(result.thread.fields.title)}`,
- ...(result.thread.fields.space ? [`Space: ${String(result.thread.fields.space)}`] : []),
- ];
- },
- );
-});
+ (result) => result.thread
+ ? [
+ `Thread: ${result.thread.path}`,
+ `Title: ${String(result.thread.fields.title)}`,
+ `Priority: ${String(result.thread.fields.priority)}`,
+ ]
+ : ['No ready thread found.'],
+ ),
+);
addWorkspaceOption(
threadCmd
.command('show <thread>')
- .description('Show thread details and ledger history')
- .option('--json', 'Emit structured JSON output')
+ .description('Show one thread and its ledger history')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
() => {
const workspacePath = resolveWorkspacePath(opts);
- const thread = workgraph.store.read(workspacePath, threadPath);
+ const thread = workgraph.store.read(workspacePath, normalizePath(threadPath));
if (!thread) throw new Error(`Thread not found: ${threadPath}`);
- const history = workgraph.ledger.historyOf(workspacePath, threadPath);
- return { thread, history };
+ return {
+ thread,
+ history: workgraph.ledger.historyOf(workspacePath, thread.path),
+ };
},
(result) => [
- `${String(result.thread.fields.title)} (${result.thread.path})`,
- `Status: ${String(result.thread.fields.status)} Owner: ${String(result.thread.fields.owner ?? 'unclaimed')}`,
- `History entries: ${result.history.length}`,
- ]
- )
+ `Thread: ${result.thread.path}`,
+ `Status: ${String(result.thread.fields.status)}`,
+ `Owner: ${String(result.thread.fields.owner ?? 'none')}`,
+ `Ledger entries: ${result.history.length}`,
+ ],
+ ),
);
addWorkspaceOption(
threadCmd
.command('participants <thread>')
- .description('List thread participants and roles')
- .option('--json', 'Emit structured JSON output')
+ .description('List thread participants')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const participants = workgraph.thread.listThreadParticipants(workspacePath, threadPath);
- return { threadPath, participants, count: participants.length };
- },
- (result) => {
- if (result.participants.length === 0) {
- return [`No participants recorded for ${result.threadPath}.`];
- }
- return [
- `Participants for ${result.threadPath}:`,
- ...result.participants.map((participant) =>
- `- ${participant.actor} [${participant.role}] joined=${participant.joined_at}`),
- ];
- },
- )
+ () => ({
+ participants: workgraph.thread.listThreadParticipants(resolveWorkspacePath(opts), normalizePath(threadPath)),
+ }),
+ (result) => result.participants.length > 0
+ ? result.participants.map((entry) => `${entry.actor} (${entry.role})`)
+ : ['No participants recorded.'],
+ ),
);
addWorkspaceOption(
threadCmd
.command('invite <thread>')
- .description('Invite or update a participant role on a thread')
- .requiredOption('--participant <actor>', 'Participant actor name')
- .option('--role <role>', 'owner | contributor | reviewer | observer', 'contributor')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--json', 'Emit structured JSON output')
+ .description('Invite another participant onto a thread')
+ .requiredOption('--participant <actor>', 'Participant actor')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--role <role>', 'owner|contributor|reviewer|observer', 'contributor')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.inviteThreadParticipant(
- workspacePath,
- threadPath,
- opts.actor,
- opts.participant,
- opts.role,
- ),
- };
- },
- (result) => [`Invited participant on: ${result.thread.path}`],
- )
+ () => workgraph.thread.inviteThreadParticipant(
+ resolveWorkspacePath(opts),
+ normalizePath(threadPath),
+ opts.actor,
+ opts.participant,
+ normalizeParticipantRole(opts.role),
+ ),
+ (result) => [`Updated participants for ${result.path}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('join <thread>')
- .description('Join a thread as participant')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--role <role>', 'contributor | reviewer | observer', 'contributor')
- .option('--json', 'Emit structured JSON output')
+ .description('Join a thread as a participant')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--role <role>', 'contributor|reviewer|observer', 'contributor')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.joinThread(workspacePath, threadPath, opts.actor, opts.role),
- };
- },
- (result) => [`Joined thread: ${result.thread.path}`],
- )
+ () => workgraph.thread.joinThread(
+ resolveWorkspacePath(opts),
+ normalizePath(threadPath),
+ opts.actor,
+ normalizeParticipantRole(opts.role),
+ ),
+ (result) => [`Joined ${result.path} as ${opts.actor}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('leave <thread>')
- .description('Leave a thread (or remove another participant if authorized)')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--participant <actor>', 'Participant actor to remove (defaults to --actor)')
- .option('--json', 'Emit structured JSON output')
+ .description('Leave a thread or remove another participant')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--participant <actor>', 'Optional participant to remove')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.leaveThread(workspacePath, threadPath, opts.actor, opts.participant),
- };
- },
- (result) => [`Updated participants on: ${result.thread.path}`],
- )
+ () => workgraph.thread.leaveThread(
+ resolveWorkspacePath(opts),
+ normalizePath(threadPath),
+ opts.actor,
+ opts.participant,
+ ),
+ (result) => [`Updated participants for ${result.path}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('claim <thread>')
- .description('Claim a thread for this agent')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--lease-ttl-minutes <minutes>', 'Claim lease TTL in minutes', '30')
- .option('--json', 'Emit structured JSON output')
-).action((threadPath, opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ thread: PrimitiveRecord }>('workgraph_thread_claim', {
- threadPath,
- actor: opts.actor,
- })),
- (result) => [`Claimed: ${result.thread.path}`, `Owner: ${String(result.thread.fields.owner)}`],
- );
- }
- return runCommand(
+ .description('Claim a thread')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--lease-ttl-minutes <minutes>', 'Lease TTL minutes')
+ .option('--json', 'Emit structured JSON output'),
+).action((threadPath, opts) =>
+ runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.claim(workspacePath, threadPath, opts.actor, {
- leaseTtlMinutes: Number.parseFloat(String(opts.leaseTtlMinutes)),
- }),
- };
- },
- (result) => [`Claimed: ${result.thread.path}`, `Owner: ${String(result.thread.fields.owner)}`],
- );
-});
+ () => workgraph.thread.claim(resolveWorkspacePath(opts), normalizePath(threadPath), opts.actor, {
+ leaseTtlMinutes: opts.leaseTtlMinutes ? parsePositiveIntOption(opts.leaseTtlMinutes, 'lease-ttl-minutes') : undefined,
+ }),
+ (result) => [`Claimed ${result.path} as ${opts.actor}.`],
+ ),
+);
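The rewritten claim action calls a `parsePositiveIntOption` helper that is not defined in this hunk. A minimal sketch of the assumed behavior (name and signature taken from the call sites; the real implementation may differ) would be:

```typescript
// Hypothetical sketch of parsePositiveIntOption, assumed by the new
// --lease-ttl-minutes / --ttl-minutes / --limit handling: coerce the raw CLI
// string to a number and reject anything that is not a positive integer,
// naming the offending flag in the error.
function parsePositiveIntOption(value: unknown, flag: string): number {
  const parsed = Number(String(value).trim());
  if (!Number.isInteger(parsed) || parsed <= 0) {
    throw new Error(`Invalid --${flag} value "${String(value)}". Expected a positive integer.`);
  }
  return parsed;
}
```

Unlike the removed `Number.parseFloat` path, this surfaces bad input as an error instead of silently passing `NaN` downstream.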
addWorkspaceOption(
threadCmd
.command('release <threadPath>')
- .description('Release a claimed thread back to open')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--reason <reason>', 'Why you are releasing')
- .option('--json', 'Emit structured JSON output')
+ .description('Release a claimed thread')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--reason <reason>', 'Release reason')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return { thread: workgraph.thread.release(workspacePath, threadPath, opts.actor, opts.reason) };
- },
- (result) => [`Released: ${result.thread.path}`, `Status: ${String(result.thread.fields.status)}`]
- )
+ () => workgraph.thread.release(resolveWorkspacePath(opts), normalizePath(threadPath), opts.actor, opts.reason),
+ (result) => [`Released ${result.path} as ${opts.actor}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('done <threadPath>')
.description('Mark a thread done')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('-o, --output <text>', 'Output/result summary')
- .option('--evidence <list>', 'Comma-separated evidence values (url/path/reply/thread refs)')
- .option('--json', 'Emit structured JSON output')
-).action((threadPath, opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ thread: PrimitiveRecord }>('workgraph_thread_done', {
- threadPath,
- actor: opts.actor,
- output: opts.output,
- evidence: csv(opts.evidence),
- })),
- (result) => [`Done: ${result.thread.path}`],
- );
- }
- return runCommand(
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--output <text>', 'Completion output')
+ .option('--evidence <list>', 'Comma-separated evidence items')
+ .option('--json', 'Emit structured JSON output'),
+).action((threadPath, opts) =>
+ runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.done(workspacePath, threadPath, opts.actor, opts.output, {
- evidence: csv(opts.evidence),
- }),
- };
- },
- (result) => [`Done: ${result.thread.path}`],
- );
-});
+ () => workgraph.thread.done(resolveWorkspacePath(opts), normalizePath(threadPath), opts.actor, opts.output, {
+ evidence: csv(opts.evidence),
+ }),
+ (result) => [`Completed ${result.path} as ${opts.actor}.`],
+ ),
+);
addWorkspaceOption(
threadCmd
.command('reopen <threadPath>')
- .description('Reopen a done/cancelled thread via compensating ledger op')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--reason <reason>', 'Why the thread is being reopened')
- .option('--json', 'Emit structured JSON output')
+ .description('Reopen a done or cancelled thread')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--reason <reason>', 'Reopen reason')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return { thread: workgraph.thread.reopen(workspacePath, threadPath, opts.actor, opts.reason) };
- },
- (result) => [`Reopened: ${result.thread.path}`, `Status: ${String(result.thread.fields.status)}`]
- )
+ () => workgraph.thread.reopen(resolveWorkspacePath(opts), normalizePath(threadPath), opts.actor, opts.reason),
+ (result) => [`Reopened ${result.path} as ${opts.actor}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('block <threadPath>')
- .description('Mark a thread blocked')
- .requiredOption('-b, --blocked-by <dep>', 'Dependency blocking this thread')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--reason <reason>', 'Why it is blocked')
- .option('--json', 'Emit structured JSON output')
+ .description('Block a thread')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--blocked-by <ref>', 'Blocking dependency', 'external/manual')
+ .option('--reason <reason>', 'Blocking reason')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- thread: workgraph.thread.block(workspacePath, threadPath, opts.actor, opts.blockedBy, opts.reason),
- };
- },
- (result) => [`Blocked: ${result.thread.path}`]
- )
+ () => workgraph.thread.block(
+ resolveWorkspacePath(opts),
+ normalizePath(threadPath),
+ opts.actor,
+ opts.blockedBy,
+ opts.reason,
+ ),
+ (result) => [`Blocked ${result.path} as ${opts.actor}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('unblock <threadPath>')
.description('Unblock a thread')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--json', 'Emit structured JSON output')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return { thread: workgraph.thread.unblock(workspacePath, threadPath, opts.actor) };
- },
- (result) => [`Unblocked: ${result.thread.path}`]
- )
+ () => workgraph.thread.unblock(resolveWorkspacePath(opts), normalizePath(threadPath), opts.actor),
+ (result) => [`Unblocked ${result.path} as ${opts.actor}.`],
+ ),
);
addWorkspaceOption(
threadCmd
.command('heartbeat [threadPath]')
- .description('Refresh thread claim lease heartbeat (one thread or all active claims for actor)')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--ttl-minutes <minutes>', 'Lease TTL in minutes', '30')
- .option('--json', 'Emit structured JSON output')
+ .description('Refresh one or more claim heartbeats')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--ttl-minutes <minutes>', 'Lease TTL minutes')
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return workgraph.thread.heartbeatClaim(
- workspacePath,
- opts.actor,
- threadPath,
- {
- ttlMinutes: Number.parseFloat(String(opts.ttlMinutes)),
- },
- );
- },
+ () => workgraph.thread.heartbeatClaim(resolveWorkspacePath(opts), opts.actor, threadPath ? normalizePath(threadPath) : undefined, {
+ ttlMinutes: opts.ttlMinutes ? parsePositiveIntOption(opts.ttlMinutes, 'ttl-minutes') : undefined,
+ }),
(result) => [
- `Heartbeat actor: ${result.actor}`,
- `Touched leases: ${result.touched.length}`,
- ...(result.touched.length > 0
- ? result.touched.map((entry) => `- ${entry.threadPath} expires=${entry.expiresAt}`)
- : []),
- ...(result.skipped.length > 0
- ? result.skipped.map((entry) => `SKIP ${entry.threadPath}: ${entry.reason}`)
- : []),
+ `Touched: ${result.touched.length}`,
+ `Skipped: ${result.skipped.length}`,
],
- )
+ ),
);
addWorkspaceOption(
threadCmd
.command('reap-stale')
- .description('Reopen/release stale claimed threads whose leases expired')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--limit <n>', 'Max stale leases to reap this run')
- .option('--json', 'Emit structured JSON output')
+ .description('Reap stale claims')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--limit <n>', 'Maximum claims to reap')
+ .option('--json', 'Emit structured JSON output'),
).action((opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return workgraph.thread.reapStaleClaims(workspacePath, opts.actor, {
- limit: opts.limit ? Number.parseInt(String(opts.limit), 10) : undefined,
- });
- },
+ () => workgraph.thread.reapStaleClaims(resolveWorkspacePath(opts), opts.actor, {
+ limit: opts.limit ? parsePositiveIntOption(opts.limit, 'limit') : undefined,
+ }),
(result) => [
- `Reaper actor: ${result.actor}`,
- `Scanned stale leases: ${result.scanned}`,
+ `Scanned: ${result.scanned}`,
`Reaped: ${result.reaped.length}`,
- ...(result.reaped.length > 0
- ? result.reaped.map((entry) => `- ${entry.threadPath} (prev=${entry.previousOwner})`)
- : []),
- ...(result.skipped.length > 0
- ? result.skipped.map((entry) => `SKIP ${entry.threadPath}: ${entry.reason}`)
- : []),
+ `Skipped: ${result.skipped.length}`,
],
- )
+ ),
);
addWorkspaceOption(
threadCmd
.command('leases')
- .description('List claim leases and staleness state')
- .option('--json', 'Emit structured JSON output')
+ .description('List claim lease status')
+ .option('--json', 'Emit structured JSON output'),
).action((opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const leases = workgraph.thread.listClaimLeaseStatus(workspacePath);
- return { leases, count: leases.length };
- },
- (result) => result.leases.map((lease) =>
- `${lease.stale ? 'STALE' : 'LIVE'} ${lease.owner} -> ${lease.target} expires=${lease.expiresAt}`)
- )
+ () => ({ leases: workgraph.thread.listClaimLeaseStatus(resolveWorkspacePath(opts)) }),
+ (result) => result.leases.length > 0
+ ? result.leases.map((lease) => `${lease.target} owner=${lease.owner} stale=${lease.stale}`)
+ : ['No claim leases found.'],
+ ),
);
addWorkspaceOption(
threadCmd
.command('decompose <threadPath>')
- .description('Break a thread into sub-threads')
- .requiredOption('--sub <spec...>', 'Sub-thread specs as "title|goal"')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--json', 'Emit structured JSON output')
+ .description('Create child threads under one parent thread')
+ .requiredOption('--subthread <spec>', 'Repeatable child thread spec', collectSubthreadSpecs, [])
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--json', 'Emit structured JSON output'),
).action((threadPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const subthreads = opts.sub.map((spec: string) => {
- const [title, ...goalParts] = spec.split('|');
- const goal = goalParts.join('|').trim() || title.trim();
- return { title: title.trim(), goal };
- });
- return { children: workgraph.thread.decompose(workspacePath, threadPath, subthreads, opts.actor) };
- },
- (result) => [`Created ${result.children.length} sub-thread(s).`]
- )
+ () => ({
+ threads: workgraph.thread.decompose(resolveWorkspacePath(opts), normalizePath(threadPath), opts.subthread, opts.actor),
+ }),
+ (result) => result.threads.map((entry) => `Created child thread: ${entry.path}`),
+ ),
);
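The new `--subthread` flag delegates to a `collectSubthreadSpecs` collector that is not shown in this hunk. A sketch of what it presumably does, reusing the `"title|goal"` parsing the diff removes from the inline action (the interface name is an assumption):

```typescript
// Hypothetical sketch of collectSubthreadSpecs, the repeatable-option collector
// assumed by `thread decompose --subthread`. Each spec is "title|goal"; later
// pipes stay in the goal, and the goal defaults to the title when omitted.
interface SubthreadSpec {
  title: string;
  goal: string;
}

function collectSubthreadSpecs(value: string, previous: SubthreadSpec[]): SubthreadSpec[] {
  const [title, ...goalParts] = value.split('|');
  const trimmedTitle = title.trim();
  if (!trimmedTitle) {
    throw new Error(`Invalid --subthread spec "${value}". Expected "title|goal".`);
  }
  const goal = goalParts.join('|').trim() || trimmedTitle;
  return [...previous, { title: trimmedTitle, goal }];
}
```

Moving the parse into a commander collector means malformed specs fail at option-parsing time rather than inside the action body.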
-// ============================================================================
-// agent presence
-// ============================================================================
-
const agentCmd = program
.command('agent')
- .description('Track agent presence heartbeats');
+ .description('Manage actor registration, credentials, and presence');
addWorkspaceOption(
agentCmd
.command('heartbeat <name>')
- .description('Create/update an agent presence heartbeat')
- .option('-a, --actor <name>', 'Actor writing the heartbeat', DEFAULT_ACTOR)
- .option('--status <status>', 'online | busy | offline', 'online')
- .option('--current-task <task>', 'Current task/thread slug for this agent')
- .option('--capabilities <list>', 'Comma-separated capability tags')
- .option('--json', 'Emit structured JSON output')
-).action((name, opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ presence: PrimitiveRecord }>('workgraph_agent_heartbeat', {
- name,
- actor: opts.actor,
- status: normalizeAgentPresenceStatus(opts.status),
- currentTask: opts.currentTask,
- capabilities: csv(opts.capabilities),
- })),
- (result) => [
- `Heartbeat: ${String(result.presence.fields.name)} [${String(result.presence.fields.status)}]`,
- `Last seen: ${String(result.presence.fields.last_seen)}`,
- `Current task: ${String(result.presence.fields.current_task ?? 'none')}`,
- ],
- );
- }
- return runCommand(
+ .description('Write an actor presence heartbeat')
+ .option('-a, --actor <actor>', 'Actor performing the update', DEFAULT_ACTOR)
+ .option('--status <status>', 'online|busy|offline', 'online')
+ .option('--current-task <task>', 'Current task')
+ .option('--capabilities <list>', 'Comma-separated capabilities')
+ .option('--json', 'Emit structured JSON output'),
+).action((name, opts) =>
+ runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- presence: workgraph.agent.heartbeat(workspacePath, name, {
- actor: opts.actor,
- status: normalizeAgentPresenceStatus(opts.status),
- currentTask: opts.currentTask,
- capabilities: csv(opts.capabilities),
- }),
- };
- },
- (result) => [
- `Heartbeat: ${String(result.presence.fields.name)} [${String(result.presence.fields.status)}]`,
- `Last seen: ${String(result.presence.fields.last_seen)}`,
- `Current task: ${String(result.presence.fields.current_task ?? 'none')}`,
- ],
- );
-});
+ () => workgraph.agent.heartbeat(resolveWorkspacePath(opts), name, {
+ actor: opts.actor,
+ status: normalizePresenceStatus(opts.status),
+ currentTask: opts.currentTask,
+ capabilities: csv(opts.capabilities),
+ }),
+ (result) => [`Heartbeated ${String(result.fields.name)} (${String(result.fields.status)}).`],
+ ),
+);
addWorkspaceOption(
agentCmd
.command('register <name>')
- .description('Register an agent using bootstrap token fallback (legacy/hybrid mode)')
- .option('--token <token>', 'Bootstrap trust token (or WORKGRAPH_TRUST_TOKEN env)')
- .option('--role <role>', 'Role slug/path override (default from trust token)')
- .option('--capabilities <list>', 'Comma-separated extra capabilities')
- .option('--status <status>', 'online | busy | offline', 'online')
- .option('--current-task <task>', 'Optional current task/thread ref')
- .option('-a, --actor <name>', 'Actor writing registration artifacts')
- .option('--json', 'Emit structured JSON output')
-).action((name, opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<Record<string, unknown>>('workgraph_agent_register', {
- name,
- token: opts.token,
- role: opts.role,
- capabilities: csv(opts.capabilities),
- status: normalizeAgentPresenceStatus(opts.status),
- currentTask: opts.currentTask,
- actor: opts.actor,
- })),
- (result) => [
- `Registered agent: ${result.agentName}`,
- `Role: ${result.role} (${result.rolePath})`,
- `Capabilities: ${result.capabilities.join(', ') || 'none'}`,
- `Presence: ${result.presence.path}`,
- `Policy party: ${result.policyParty.id}`,
- `Bootstrap token: ${result.trustTokenPath} [${result.trustTokenStatus}]`,
- ...(result.credential ? [`Credential: ${result.credential.id} [${result.credential.status}]`] : []),
- ...(result.apiKey ? [`API key (store securely, shown once): ${result.apiKey}`] : []),
- ],
- );
- }
- return runCommand(
+ .description('Register an actor using a trust token')
+ .option('-a, --actor <actor>', 'Actor performing the update', DEFAULT_ACTOR)
+ .option('--token <token>', 'Trust token (or WORKGRAPH_TRUST_TOKEN env)')
+ .option('--role <role>', 'Role ref')
+ .option('--capabilities <list>', 'Comma-separated capabilities')
+ .option('--status <status>', 'online|busy|offline', 'online')
+ .option('--current-task <task>', 'Current task')
+ .option('--json', 'Emit structured JSON output'),
+).action((name, opts) =>
+ runCommand(
opts,
() => {
- const workspacePath = resolveWorkspacePath(opts);
- const token = String(opts.token ?? process.env.WORKGRAPH_TRUST_TOKEN ?? '').trim();
+ const token = readNonEmptyString(opts.token) ?? process.env.WORKGRAPH_TRUST_TOKEN;
if (!token) {
throw new Error('Missing trust token. Provide --token or set WORKGRAPH_TRUST_TOKEN.');
}
- return workgraph.agent.registerAgent(workspacePath, name, {
+ return workgraph.agent.registerAgent(resolveWorkspacePath(opts), name, {
token,
+ actor: opts.actor,
role: opts.role,
capabilities: csv(opts.capabilities),
- status: normalizeAgentPresenceStatus(opts.status),
+ status: normalizePresenceStatus(opts.status),
currentTask: opts.currentTask,
- actor: opts.actor,
});
},
(result) => [
- `Registered agent: ${result.agentName}`,
- `Role: ${result.role} (${result.rolePath})`,
- `Capabilities: ${result.capabilities.join(', ') || 'none'}`,
+ `Registered actor: ${result.agentName}`,
+ `Role: ${result.role}`,
`Presence: ${result.presence.path}`,
- `Policy party: ${result.policyParty.id}`,
- `Bootstrap token: ${result.trustTokenPath} [${result.trustTokenStatus}]`,
- ...(result.credential ? [`Credential: ${result.credential.id} [${result.credential.status}]`] : []),
- ...(result.apiKey ? [`API key (store securely, shown once): ${result.apiKey}`] : []),
],
- );
-});
+ ),
+);
addWorkspaceOption(
agentCmd
.command('request <name>')
- .description('Submit an approval-based agent registration request')
- .option('--role <role>', 'Requested role slug/path (default: roles/contributor.md)')
- .option('--capabilities <list>', 'Comma-separated requested extra capabilities')
- .option('-a, --actor <name>', 'Actor submitting the request')
- .option('--note <note>', 'Optional request note')
- .option('--json', 'Emit structured JSON output')
+ .description('Submit an actor registration request')
+ .option('-a, --actor <actor>', 'Actor performing the update', DEFAULT_ACTOR)
+ .option('--role <role>', 'Requested role ref')
+ .option('--capabilities <list>', 'Comma-separated capabilities')
+ .option('--note <note>', 'Request note')
+ .option('--json', 'Emit structured JSON output'),
).action((name, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return workgraph.agent.submitRegistrationRequest(workspacePath, name, {
- role: opts.role,
- capabilities: csv(opts.capabilities),
- actor: opts.actor,
- note: opts.note,
- });
- },
+ () => workgraph.agent.submitRegistrationRequest(resolveWorkspacePath(opts), name, {
+ actor: opts.actor,
+ role: opts.role,
+ capabilities: csv(opts.capabilities),
+ note: opts.note,
+ }),
(result) => [
- `Submitted registration request for ${result.agentName}`,
- `Request: ${result.request.path}`,
+ `Submitted request: ${result.request.path}`,
`Requested role: ${result.requestedRolePath}`,
- `Requested capabilities: ${result.requestedCapabilities.join(', ') || 'none'}`,
],
- )
+ ),
);
addWorkspaceOption(
agentCmd
.command('review <requestRef>')
- .description('Approve or reject a pending registration request')
- .requiredOption('--decision <decision>', 'approved | rejected')
- .option('-a, --actor <name>', 'Reviewer actor', DEFAULT_ACTOR)
- .option('--role <role>', 'Approved role slug/path (for approved decisions)')
- .option('--capabilities <list>', 'Comma-separated approved extra capabilities')
- .option('--scopes <list>', 'Comma-separated credential scopes (defaults to approved capabilities)')
- .option('--expires-at <date>', 'Optional credential expiry ISO date')
- .option('--note <note>', 'Optional review note')
- .option('--json', 'Emit structured JSON output')
+ .description('Approve or reject a registration request')
+ .requiredOption('--decision <decision>', 'approved|rejected')
+ .option('-a, --actor <actor>', 'Reviewer actor', DEFAULT_ACTOR)
+ .option('--role <role>', 'Approved role ref')
+ .option('--capabilities <list>', 'Comma-separated capabilities')
+ .option('--scopes <list>', 'Comma-separated credential scopes')
+ .option('--expires-at <date>', 'Credential expiry')
+ .option('--note <note>', 'Review note')
+ .option('--json', 'Emit structured JSON output'),
).action((requestRef, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const decision = String(opts.decision ?? '').trim().toLowerCase();
- if (decision !== 'approved' && decision !== 'rejected') {
- throw new Error('Invalid --decision value. Expected approved|rejected.');
- }
- return workgraph.agent.reviewRegistrationRequest(
- workspacePath,
- requestRef,
- opts.actor,
- decision,
- {
- role: opts.role,
- capabilities: csv(opts.capabilities),
- scopes: csv(opts.scopes),
- expiresAt: opts.expiresAt,
- note: opts.note,
- },
- );
- },
+ () => workgraph.agent.reviewRegistrationRequest(
+ resolveWorkspacePath(opts),
+ requestRef,
+ opts.actor,
+ normalizeRegistrationDecision(opts.decision),
+ {
+ role: opts.role,
+ capabilities: csv(opts.capabilities),
+ scopes: csv(opts.scopes),
+ expiresAt: opts.expiresAt,
+ note: opts.note,
+ },
+ ),
(result) => [
`Reviewed request: ${result.request.path}`,
`Decision: ${result.decision}`,
- `Approval record: ${result.approval.path}`,
- ...(result.policyParty
- ? [`Policy party: ${result.policyParty.id} (${result.policyParty.roles.join(', ')})`]
- : []),
- ...(result.credential ? [`Credential: ${result.credential.id} [${result.credential.status}]`] : []),
- ...(result.apiKey ? [`API key (store securely, shown once): ${result.apiKey}`] : []),
+ `Approval: ${result.approval.path}`,
],
- )
+ ),
);
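The review action now routes the flag through `normalizeRegistrationDecision`, a helper this hunk never defines. A sketch of the assumed behavior, lifted from the inline validation the diff deletes:

```typescript
// Hypothetical sketch of normalizeRegistrationDecision, assumed by the new
// `agent review` action: trim/lowercase the raw --decision value and accept
// only the two states the removed inline check allowed.
type RegistrationDecision = 'approved' | 'rejected';

function normalizeRegistrationDecision(value: unknown): RegistrationDecision {
  const decision = String(value ?? '').trim().toLowerCase();
  if (decision !== 'approved' && decision !== 'rejected') {
    throw new Error('Invalid --decision value. Expected approved|rejected.');
  }
  return decision;
}
```

The narrowed return type lets the `reviewRegistrationRequest` call site drop its own string checks.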
addWorkspaceOption(
agentCmd
.command('credential-list')
- .description('List issued agent credentials')
- .option('--actor <actor>', 'Filter by actor id')
- .option('--json', 'Emit structured JSON output')
+ .description('List actor credentials')
+ .option('--actor-filter <actor>', 'Optional actor filter')
+ .option('--json', 'Emit structured JSON output'),
).action((opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const credentials = workgraph.agent.listAgentCredentials(workspacePath, opts.actor);
- return {
- credentials,
- count: credentials.length,
- };
- },
- (result) => {
- if (result.credentials.length === 0) return ['No credentials found.'];
- return [
- ...result.credentials.map((credential) =>
- `${credential.id} actor=${credential.actor} status=${credential.status} scopes=${credential.scopes.join(', ') || 'none'}`
- ),
- `${result.count} credential(s)`,
- ];
- },
- )
+ () => ({
+ credentials: workgraph.agent.listAgentCredentials(resolveWorkspacePath(opts), opts.actorFilter),
+ }),
+ (result) => result.credentials.length > 0
+ ? result.credentials.map((entry) => `${entry.id} actor=${entry.actor} status=${entry.status}`)
+ : ['No credentials found.'],
+ ),
);
addWorkspaceOption(
agentCmd
.command('credential-revoke <credentialId>')
- .description('Revoke an issued credential')
- .option('-a, --actor <name>', 'Actor revoking the credential', DEFAULT_ACTOR)
- .option('--reason <reason>', 'Optional revocation reason')
- .option('--json', 'Emit structured JSON output')
+ .description('Revoke an actor credential')
+ .option('-a, --actor <actor>', 'Actor performing the update', DEFAULT_ACTOR)
+ .option('--reason <reason>', 'Revocation reason')
+ .option('--json', 'Emit structured JSON output'),
).action((credentialId, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return {
- credential: workgraph.agent.revokeAgentCredential(
- workspacePath,
- credentialId,
- opts.actor,
- opts.reason,
- ),
- };
- },
- (result) => [
- `Revoked credential: ${result.credential.id}`,
- `Actor: ${result.credential.actor}`,
- `Status: ${result.credential.status}`,
- ],
- )
+ () => workgraph.agent.revokeAgentCredential(resolveWorkspacePath(opts), credentialId, opts.actor, opts.reason),
+ (result) => [`Revoked credential ${result.id} for ${result.actor}.`],
+ ),
);
addWorkspaceOption(
agentCmd
.command('list')
- .description('List known agent presence entries')
- .option('--json', 'Emit structured JSON output')
-).action((opts) => {
- if (isRemoteMode(opts)) {
- return runCommand(
- opts,
- () => withRemoteClient(opts, (client) =>
- client.callTool<{ agents: PrimitiveRecord[]; count: number }>('workgraph_agent_list', {})),
- (result) => {
- if (result.agents.length === 0) return ['No agent presence entries found.'];
- return [
- ...result.agents.map((entry) => {
- const name = String(entry.fields.name ?? entry.path);
- const status = String(entry.fields.status ?? 'unknown');
- const task = String(entry.fields.current_task ?? 'none');
- const lastSeen = String(entry.fields.last_seen ?? 'unknown');
- return `${name} [${status}] task=${task} last_seen=${lastSeen}`;
- }),
- `${result.count} agent(s)`,
- ];
- },
- );
- }
- return runCommand(
+ .description('List actor presence entries')
+ .option('--json', 'Emit structured JSON output'),
+).action((opts) =>
+ runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const agents = workgraph.agent.list(workspacePath);
- return {
- agents,
- count: agents.length,
- };
- },
- (result) => {
- if (result.agents.length === 0) return ['No agent presence entries found.'];
- return [
- ...result.agents.map((entry) => {
- const name = String(entry.fields.name ?? entry.path);
- const status = String(entry.fields.status ?? 'unknown');
- const task = String(entry.fields.current_task ?? 'none');
- const lastSeen = String(entry.fields.last_seen ?? 'unknown');
- return `${name} [${status}] task=${task} last_seen=${lastSeen}`;
- }),
- `${result.count} agent(s)`,
- ];
- },
- );
-});
-
-// ============================================================================
-// primitive
-// ============================================================================
+ () => ({ agents: workgraph.agent.list(resolveWorkspacePath(opts)) }),
+ (result) => result.agents.length > 0
+ ? result.agents.map((entry) => `${String(entry.fields.name)} (${String(entry.fields.status)}) -> ${entry.path}`)
+ : ['No actors found.'],
+ ),
+);
const primitiveCmd = program
.command('primitive')
- .description('Manage primitive type definitions and instances');
+ .description('Manage primitive schemas and instances');
addWorkspaceOption(
primitiveCmd
.command('define <name>')
.description('Define a new primitive type')
- .requiredOption('-d, --description <text>', 'Type description')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--fields <spec...>', 'Field definitions as "name:type"')
- .option('--dir <path>', 'Storage directory override')
- .option('--json', 'Emit structured JSON output')
+ .requiredOption('--description <text>', 'Type description')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--directory <path>', 'Storage directory')
+ .option('--field <name:type>', 'Repeatable field definition', collectFieldSpecs, [])
+ .option('--json', 'Emit structured JSON output'),
).action((name, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const fields: Record<string, workgraph.FieldDefinition> = {};
- for (const spec of opts.fields ?? []) {
- const [fieldName, fieldType = 'string'] = String(spec).split(':');
- fields[fieldName.trim()] = { type: fieldType.trim() as workgraph.FieldDefinition['type'] };
- }
- const type = workgraph.registry.defineType(
- workspacePath,
- name,
- opts.description,
- fields,
- opts.actor,
- opts.dir
- );
- workgraph.bases.syncPrimitiveRegistryManifest(workspacePath);
- const baseResult = workgraph.bases.generateBasesFromPrimitiveRegistry(workspacePath, {
- includeNonCanonical: true,
- });
- return {
- type,
- basesGenerated: baseResult.generated.length,
- };
- },
+ () => workgraph.registry.defineType(
+ resolveWorkspacePath(opts),
+ name,
+ opts.description,
+ parseFieldDefinitions(opts.field),
+ opts.actor,
+ opts.directory,
+ ),
(result) => [
- `Defined type: ${result.type.name}`,
- `Directory: ${result.type.directory}/`,
- `Bases generated: ${result.basesGenerated}`,
- ]
- )
+ `Defined primitive type: ${result.name}`,
+ `Directory: ${result.directory}`,
+ ],
+ ),
);
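The refactored define action hands `opts.field` to a `parseFieldDefinitions` helper defined elsewhere. A sketch of what it presumably does, based on the `"name:type"` parsing the diff removes (the local `FieldDefinition` interface is a stand-in for the real registry type):

```typescript
// Hypothetical sketch of parseFieldDefinitions, assumed by the new
// `primitive define --field` flow: turn repeated "name:type" specs into the
// fields map defineType expects, defaulting the type to "string" as the
// removed inline loop did.
interface FieldDefinition {
  type: string;
}

function parseFieldDefinitions(specs: string[]): Record<string, FieldDefinition> {
  const fields: Record<string, FieldDefinition> = {};
  for (const spec of specs) {
    const [name, type = 'string'] = spec.split(':');
    if (!name.trim()) {
      throw new Error(`Invalid --field spec "${spec}". Expected "name:type".`);
    }
    fields[name.trim()] = { type: type.trim() || 'string' };
  }
  return fields;
}
```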
-registerPrimitiveSchemaCommand('schema', 'Show supported fields for a primitive type');
-registerPrimitiveSchemaCommand('fields', 'Alias for schema');
-
-// ============================================================================
-// bases
-// ============================================================================
-
-const basesCmd = program
- .command('bases')
- .description('Generate Obsidian .base files from primitive-registry.yaml');
-
addWorkspaceOption(
- basesCmd
- .command('sync-registry')
- .description('Sync .workgraph/primitive-registry.yaml from active registry')
- .option('--json', 'Emit structured JSON output')
+ primitiveCmd
+ .command('list')
+ .description('List registered primitive types')
+ .option('--json', 'Emit structured JSON output'),
).action((opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const manifest = workgraph.bases.syncPrimitiveRegistryManifest(workspacePath);
- return {
- primitiveCount: manifest.primitives.length,
- manifestPath: '.workgraph/primitive-registry.yaml',
- };
- },
- (result) => [
- `Synced primitive registry manifest: ${result.manifestPath}`,
- `Primitives: ${result.primitiveCount}`,
- ]
- )
+ () => ({ types: workgraph.registry.listTypes(resolveWorkspacePath(opts)) }),
+ (result) => result.types.map((type) => `${type.name} -> ${type.directory}`),
+ ),
);
-addWorkspaceOption(
- basesCmd
- .command('generate')
- .description('Generate .base files by reading primitive-registry.yaml')
- .option('--all', 'Include non-canonical primitives')
- .option('--refresh-registry', 'Refresh primitive-registry.yaml before generation')
- .option('--output-dir ', 'Output directory for .base files (default: .workgraph/bases)')
- .option('--json', 'Emit structured JSON output')
-).action((opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- if (opts.refreshRegistry) {
- workgraph.bases.syncPrimitiveRegistryManifest(workspacePath);
- }
- return workgraph.bases.generateBasesFromPrimitiveRegistry(workspacePath, {
- includeNonCanonical: !!opts.all,
- outputDirectory: opts.outputDir,
- });
- },
- (result) => [
- `Generated ${result.generated.length} .base file(s)`,
- `Directory: ${result.outputDirectory}`,
- ]
- )
-);
-
-addWorkspaceOption(
- primitiveCmd
- .command('list')
- .description('List primitive types')
- .option('--json', 'Emit structured JSON output')
-).action((opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const types = workgraph.registry.listTypes(workspacePath);
- return { types, count: types.length };
- },
- (result) => result.types.map(t => `${t.name} (${t.directory}/) ${t.builtIn ? '[built-in]' : ''}`)
- )
-);
-
-function registerPrimitiveSchemaCommand(commandName: string, description: string): void {
- addWorkspaceOption(
- primitiveCmd
- .command(`${commandName} <type>`)
- .description(description)
- .option('--json', 'Emit structured JSON output')
- ).action((typeName, opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const typeDef = workgraph.registry.getType(workspacePath, typeName);
- if (!typeDef) {
- throw new Error(`Unknown primitive type "${typeName}". Use \`workgraph primitive list\` to inspect available types.`);
- }
- const fields = Object.entries(typeDef.fields).map(([name, definition]) => ({
- name,
- type: definition.type,
- required: definition.required === true,
- default: definition.default,
- enum: definition.enum ?? [],
- description: definition.description ?? '',
- template: definition.template ?? undefined,
- pattern: definition.pattern ?? undefined,
- refTypes: definition.refTypes ?? [],
- }));
- return {
- type: typeDef.name,
- description: typeDef.description,
- directory: typeDef.directory,
- builtIn: typeDef.builtIn,
- fields,
- };
- },
- (result) => [
- `Type: ${result.type}`,
- `Directory: ${result.directory}/`,
- `Built-in: ${result.builtIn}`,
- ...result.fields.map((field) =>
- `- ${field.name}: ${field.type}${field.required ? ' (required)' : ''}${field.description ? ` — ${field.description}` : ''}`),
- ],
- )
- );
-}
-
addWorkspaceOption(
primitiveCmd
.command('create <type> <title>')
- .description('Create an instance of any primitive type')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--set <pair...>', 'Set fields as "key=value"')
- .option('--body <markdown>', 'Markdown body content', '')
- .option('--json', 'Emit structured JSON output')
+ .description('Create a primitive instance')
+ .option('-a, --actor <actor>', 'Actor', DEFAULT_ACTOR)
+ .option('--body <markdown>', 'Markdown body')
+ .option('--set <key=value>', 'Repeatable field assignment', collectSetPairs, [])
+ .option('--json', 'Emit structured JSON output'),
).action((type, title, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const fields: Record<string, unknown> = { title, ...parseSetPairs(opts.set ?? []) };
- return {
- instance: workgraph.store.create(workspacePath, type, fields, opts.body, opts.actor),
- };
- },
- (result) => [`Created ${result.instance.type}: ${result.instance.path}`]
- )
+ () => workgraph.store.create(
+ resolveWorkspacePath(opts),
+ type,
+ {
+ title,
+ ...mergeSetPairs(opts.set),
+ },
+ opts.body ?? '',
+ opts.actor,
+ ),
+ (result) => [`Created primitive: ${result.path}`],
+ ),
);
addWorkspaceOption(
primitiveCmd
.command('update <path>')
- .description('Update an existing primitive instance')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--set <key=value>', 'Set fields as "key=value"')
+ .description('Update a primitive instance')
+ .option('-a, --actor <name>', 'Actor', DEFAULT_ACTOR)
+ .option('--body <markdown>', 'Replace markdown body')
+ .option('--set <key=value>', 'Repeatable field assignment', collectSetPairs, [])
.option('--etag <value>', 'Expected etag for optimistic concurrency')
- .option('--body <content>', 'Replace markdown body content')
- .option('--body-file <path>', 'Read markdown body content from file')
- .option('--json', 'Emit structured JSON output')
+ .option('--json', 'Emit structured JSON output'),
).action((targetPath, opts) =>
runCommand(
opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const updates = parseSetPairs(opts.set ?? []);
- let body: string | undefined = opts.body;
- if (opts.bodyFile) {
- body = fs.readFileSync(path.resolve(opts.bodyFile), 'utf-8');
- }
- return {
- instance: workgraph.store.update(workspacePath, targetPath, updates, body, opts.actor, {
- expectedEtag: opts.etag,
- }),
- };
- },
- (result) => [`Updated ${result.instance.type}: ${result.instance.path}`]
- )
-);
-
-// ============================================================================
-// skill
-// ============================================================================
-
-const skillCmd = program
- .command('skill')
- .description('Manage native skill primitives in shared workgraph vaults');
-
-addWorkspaceOption(
- skillCmd
- .command('write <title>')
- .description('Create or update a skill primitive')
- .option('-a, --actor <name>', 'Agent name', DEFAULT_ACTOR)
- .option('--owner <name>', 'Skill owner')
- .option('--skill-version <version>', 'Skill version')
- .option('--status <status>', 'draft | proposed | active | deprecated | archived')
- .option('--distribution <mode>', 'Distribution mode', 'tailscale-shared-vault')
- .option('--tailscale-path <path>', 'Shared Tailscale workspace path')
- .option('--reviewers <names>', 'Comma-separated reviewer names')
- .option('--depends-on <refs>', 'Comma-separated skill dependencies (slug/path)')
- .option('--expected-updated-at <timestamp>', 'Optimistic concurrency guard for updates')
- .option('--tags <tags>', 'Comma-separated tags')
- .option('--body <content>', 'Skill markdown content')
- .option('--body-file <path>', 'Read markdown content from file')
-).action((title, opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- let body = opts.body ?? '';
- if (opts.bodyFile) {
- const absBodyFile = path.resolve(opts.bodyFile);
- body = fs.readFileSync(absBodyFile, 'utf-8');
- }
- const instance = workgraph.skill.writeSkill(
- workspacePath,
- title,
- body,
- opts.actor,
- {
- owner: opts.owner,
- version: opts.skillVersion,
- status: opts.status,
- distribution: opts.distribution,
- tailscalePath: opts.tailscalePath,
- reviewers: csv(opts.reviewers),
- dependsOn: csv(opts.dependsOn),
- expectedUpdatedAt: opts.expectedUpdatedAt,
- tags: csv(opts.tags),
- }
- );
- workgraph.bases.syncPrimitiveRegistryManifest(workspacePath);
- workgraph.bases.generateBasesFromPrimitiveRegistry(workspacePath, { includeNonCanonical: true });
- return { skill: instance };
- },
- (result) => [
- `Wrote skill: ${result.skill.path}`,
- `Status: ${String(result.skill.fields.status)} Version: ${String(result.skill.fields.version)}`,
- ]
- )
-);
-
-addWorkspaceOption(
- skillCmd
- .command('load <skill-ref>')
- .description('Load one skill primitive by slug or path')
- .option('--json', 'Emit structured JSON output')
-).action((skillRef, opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- return { skill: workgraph.skill.loadSkill(workspacePath, skillRef) };
- },
- (result) => [
- `Skill: ${String(result.skill.fields.title)}`,
- `Path: ${result.skill.path}`,
- `Status: ${String(result.skill.fields.status)}`,
- ]
- )
-);
-
-addWorkspaceOption(
- skillCmd
- .command('list')
- .description('List skills')
- .option('--status <status>', 'Filter by status')
- .option('--updated-since <timestamp>', 'Filter by updated timestamp (ISO-8601)')
- .option('--json', 'Emit structured JSON output')
-).action((opts) =>
- runCommand(
- opts,
- () => {
- const workspacePath = resolveWorkspacePath(opts);
- const skills = workgraph.skill.listSkills(workspacePath, {
- status: opts.status,
- updatedSince: opts.updatedSince,
- });
- return { skills, count: skills.length };
- },
- (result) => result.skills.map((skill) =>
- `${String(skill.fields.title)} [${String(skill.fields.status)}] -> ${skill.path}`)
- )
-);
-
-addWorkspaceOption(
- skillCmd
- .command('history