1 change: 1 addition & 0 deletions docs/README.skills.md
@@ -225,6 +225,7 @@ See [CONTRIBUTING.md](../CONTRIBUTING.md#adding-skills) for guidelines on how to
| [microsoft-skill-creator](../skills/microsoft-skill-creator/SKILL.md)<br />`gh skills install github/awesome-copilot microsoft-skill-creator` | Create agent skills for Microsoft technologies using Learn MCP tools. Use when users want to create a skill that teaches agents about any Microsoft technology, library, framework, or service (Azure, .NET, M365, VS Code, Bicep, etc.). Investigates topics deeply, then generates a hybrid skill storing essential knowledge locally while enabling dynamic deeper investigation. | `references/skill-templates.md` |
| [migrating-oracle-to-postgres-stored-procedures](../skills/migrating-oracle-to-postgres-stored-procedures/SKILL.md)<br />`gh skills install github/awesome-copilot migrating-oracle-to-postgres-stored-procedures` | Migrates Oracle PL/SQL stored procedures to PostgreSQL PL/pgSQL. Translates Oracle-specific syntax, preserves method signatures and type-anchored parameters, leverages orafce where appropriate, and applies COLLATE "C" for Oracle-compatible text sorting. Use when converting Oracle stored procedures or functions to PostgreSQL equivalents during a database migration. | None |
| [minecraft-plugin-development](../skills/minecraft-plugin-development/SKILL.md)<br />`gh skills install github/awesome-copilot minecraft-plugin-development` | Use this skill when building or modifying Minecraft server plugins for Paper, Spigot, or Bukkit, including plugin.yml setup, commands, listeners, schedulers, player state, team or arena systems, persistent progression, economy or profile data, configuration files, Adventure text, and version-safe API usage. Trigger for requests like "build a Minecraft plugin", "add a Paper command", "fix a Bukkit listener", "create plugin.yml", "implement a minigame mechanic", "add a perk or quest system", or "debug server plugin behavior". | `references/bootstrap-registration.md`<br />`references/build-test-and-runtime-validation.md`<br />`references/config-data-and-async.md`<br />`references/maps-heroes-and-feature-modules.md`<br />`references/minigame-instance-flow.md`<br />`references/persistent-progression-and-events.md`<br />`references/project-patterns.md`<br />`references/state-sessions-and-phases.md` |
| [mini-context-graph](../skills/mini-context-graph/SKILL.md)<br />`gh skills install github/awesome-copilot mini-context-graph` | A persistent, compounding knowledge base combining Karpathy's LLM Wiki pattern with a structured knowledge graph. Ingest documents once — the LLM writes wiki pages, extracts entities/relations into the graph, and stores raw content for evidence retrieval. Knowledge accumulates and cross-references; it is never re-derived from scratch. | `references/ingestion.md`<br />`references/lint.md`<br />`references/ontology.md`<br />`references/retrieval.md`<br />`scripts/config.py`<br />`scripts/contextgraph.py`<br />`scripts/template_agent_workflow.py`<br />`scripts/tools`<br />`skill.md` |
| [mkdocs-translations](../skills/mkdocs-translations/SKILL.md)<br />`gh skills install github/awesome-copilot mkdocs-translations` | Generate a language translation for a mkdocs documentation stack. | None |
| [model-recommendation](../skills/model-recommendation/SKILL.md)<br />`gh skills install github/awesome-copilot model-recommendation` | Analyze chatmode or prompt files and recommend optimal AI models based on task complexity, required capabilities, and cost-efficiency | None |
| [msstore-cli](../skills/msstore-cli/SKILL.md)<br />`gh skills install github/awesome-copilot msstore-cli` | Microsoft Store Developer CLI (msstore) for publishing Windows applications to the Microsoft Store. Use when asked to configure Store credentials, list Store apps, check submission status, publish submissions, manage package flights, set up CI/CD for Store publishing, or integrate with Partner Center. Supports Windows App SDK/WinUI, UWP, .NET MAUI, Flutter, Electron, React Native, and PWA applications. | None |
196 changes: 196 additions & 0 deletions skills/mini-context-graph/references/ingestion.md
@@ -0,0 +1,196 @@
# Ingestion Instructions

This file defines how the agent extracts entities and relations from a raw document.

---

## Step 1: Read the Document

Read the provided text carefully. Identify:
- **Entities**: noun phrases that refer to real-world objects, systems, components, actors, concepts, or events.
- **Relations**: verb phrases that describe how one entity affects, contains, causes, uses, or is related to another.

---

## Step 2: Extract Entities

For each entity:
- Record its **name** (normalized: lowercase, strip leading/trailing whitespace)
- Assign a **type**: a short label (1–3 words) that categorizes the entity

### Entity Type Examples

| Entity Name | Suggested Type |
|-------------|---------------|
| Python interpreter | software |
| memory leak | issue |
| operating system | system |
| database | infrastructure |
| user | actor |
| API endpoint | interface |
| server | infrastructure |

**Rules:**
- Types must be general enough to reuse across documents
- Do NOT create unique types per entity (e.g., avoid `python-interpreter-type`)
- Use `ontology.md` normalization rules to canonicalize types
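The normalization rules above can be sketched as a small helper (illustrative only; the authoritative canonicalization rules live in `ontology.md`):

```python
def normalize_name(raw: str) -> str:
    """Normalize an entity name: lowercase, strip leading/trailing whitespace."""
    return raw.strip().lower()

def normalize_type(raw: str) -> str:
    """Normalize a type label: lowercase, collapse internal whitespace.
    Keep labels short (1-3 words) and reusable across documents."""
    return " ".join(raw.split()).lower()

normalize_name("  Python Interpreter ")  # -> "python interpreter"
```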

---

## Step 3: Extract Relations

For each pair of entities with an explicit connection in the text:
- Record the **source** entity name
- Record the **target** entity name
- Record the **relation type**: a verb or verb phrase (normalized: lowercase)
- Assign a **confidence** score between 0 and 1:
- 1.0 = stated explicitly ("A causes B")
- 0.8 = strongly implied ("A is linked to B")
- 0.6 = weakly implied ("A may affect B")
- < 0.6 = do NOT include
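One way to apply the cutoff, as a minimal sketch:

```python
MIN_CONFIDENCE = 0.6  # relations below this are dropped, per the scale above

def filter_relations(relations: list[dict]) -> list[dict]:
    """Keep only relations at or above the confidence cutoff."""
    return [r for r in relations if r["confidence"] >= MIN_CONFIDENCE]
```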

---

## Step 4: Output Format

Produce a JSON object in this exact format:

```json
{
  "entities": [
    { "name": "entity name", "type": "entity type", "supporting_text": "exact quote mentioning this entity" }
  ],
  "relations": [
    {
      "source": "source entity name",
      "target": "target entity name",
      "type": "relation type",
      "confidence": 0.9,
      "supporting_text": "exact quote that justifies this relation"
    }
  ]
}
```

The `supporting_text` field is **required for provenance**. It must be a verbatim or near-verbatim quote from the document that mentions or supports the entity/relation. This is what links graph nodes and edges back to their source.

---

## Rules

- All names and types must be **lowercase**
- Only include relations where **both entities** are present in the entities list
- Do NOT invent entities or relations not supported by the text
- Prefer **reusing existing entity and relation types** from the ontology over creating new ones
- One entity can appear in multiple relations (as source or target)
- Always include `supporting_text` — this enables evidence retrieval and audit trails
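These rules can be checked mechanically. A minimal validator sketch (not part of the skill's scripts):

```python
def validate_extraction(output: dict) -> list[str]:
    """Return a list of rule violations in an extraction result (empty if clean)."""
    problems = []
    names = {e["name"] for e in output.get("entities", [])}
    for e in output.get("entities", []):
        # names and types must be lowercase and stripped
        if e["name"] != e["name"].strip().lower() or e["type"] != e["type"].strip().lower():
            problems.append(f"not lowercase/normalized: {e['name']!r}")
        if not e.get("supporting_text"):
            problems.append(f"entity missing supporting_text: {e['name']}")
    for r in output.get("relations", []):
        # both endpoints must appear in the entities list
        if r["source"] not in names or r["target"] not in names:
            problems.append(f"relation endpoint not in entities: {r['source']} -> {r['target']}")
        if r.get("confidence", 0) < 0.6:
            problems.append(f"confidence below cutoff: {r['source']} -> {r['target']}")
        if not r.get("supporting_text"):
            problems.append(f"relation missing supporting_text: {r['source']} -> {r['target']}")
    return problems
```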

---

## Step 5: Write Wiki Pages (Required)

After calling `skill.ingest_with_content(...)`, you MUST write wiki pages:

### 5a. Write a summary page for the document

```python
from scripts.tools import wiki_store

wiki_store.write_page(
    category="summary",
    title=f"{title} Summary",
    content=f"""---
title: {title}
source_document: {doc_id}
tags: [summary]
---

# {title}

**Source:** {source}

## Key Claims

{chr(10).join(f'- [[{r["source"].replace(" ", "-")}]] {r["type"]} [[{r["target"].replace(" ", "-")}]] (confidence: {r["confidence"]})' for r in relations)}

## Entities

{chr(10).join(f'- [[{e["name"].replace(" ", "-")}]] ({e["type"]})' for e in entities)}

## Open Questions

- (Add questions from reading the document here)
""",
    summary=f"Summary of {title}",
)
```

### 5b. Write or update entity pages

For each **new** entity not already in the wiki, write an entity page:

```python
wiki_store.write_page(
    category="entity",
    title=entity_name,
    content=f"""---
title: {entity_name}
type: {entity_type}
source_document: {doc_id}
tags: [{entity_type}]
---

# {entity_name}

(Description from the document or prior knowledge.)

## Relations

(List any wikilinks to related entities extracted from relations.)

## Mentioned in

- [[{doc_id}-summary]]
""",
    summary=f"{entity_name}: {entity_type}",
)
```

For **existing** entity pages, read the current page and append new information, updated relations, or flag contradictions.

---

## Example

**Input document:**
```
System crashes due to memory leaks.
Memory leaks occur when objects are not released.
```

**Expected extraction output:**
```json
{
  "entities": [
    { "name": "system crash", "type": "issue", "supporting_text": "system crashes due to memory leaks" },
    { "name": "memory leak", "type": "issue", "supporting_text": "memory leaks occur when objects are not released" },
    { "name": "object", "type": "component", "supporting_text": "objects are not released" }
  ],
  "relations": [
    {
      "source": "memory leak",
      "target": "system crash",
      "type": "causes",
      "confidence": 1.0,
      "supporting_text": "System crashes due to memory leaks."
    },
    {
      "source": "object",
      "target": "memory leak",
      "type": "contributes to",
      "confidence": 0.9,
      "supporting_text": "Memory leaks occur when objects are not released."
    }
  ]
}
```
163 changes: 163 additions & 0 deletions skills/mini-context-graph/references/lint.md
@@ -0,0 +1,163 @@
# Lint Instructions

This file defines the wiki health-check workflow.

Run this periodically (or after a large batch of ingests) to keep the wiki
clean and accurate. The pattern is from Karpathy's LLM Wiki: detect contradictions,
orphans, broken links, stale claims, and data gaps.

---

## When to Run

- After ingesting 5+ documents
- When the user asks "check the wiki" or "health check"
- When answers seem inconsistent or contradictory
- Before a major synthesis or presentation

---

## Step 1: Run the Automated Health Check

```python
from scripts.tools import wiki_store

issues = wiki_store.lint_wiki()
# Returns:
# {
#   "orphan_pages": [list of slugs in files but not in index],
#   "missing_pages": [list of slugs in index but file deleted],
#   "broken_wikilinks": {slug: [broken link targets]},
#   "isolated_pages": [slugs with no wikilinks at all],
# }
```

---

## Step 2: Triage Each Issue Type

### Orphan Pages
Pages exist on disk but are not in the index. They are invisible to search.
**Fix**: Add them to the index or delete if stale.

```python
# To add to index, re-write the page (this auto-updates the index):
wiki_store.write_page(category="...", title="...", content=existing_content)

# To delete (manual step — confirm with user first):
# rm wiki/{category}/{slug}.md
```

### Missing Pages
Slugs remain in the index but the file was deleted, leaving dangling references.
**Fix**: Either recreate the page from knowledge or remove from index.

### Broken Wikilinks
`[[slug]]` references that point to pages that don't exist.
**Fix**: Create the missing page, or correct the link.

### Isolated Pages
Pages with no `[[wikilinks]]` — they are unreachable via link traversal.
**Fix**: Add links from/to related pages.

---

## Step 3: Check for Contradictions

Read the wiki index and scan for pages that might contradict each other:

```python
pages = wiki_store.list_pages()
# Returns [{slug, category, summary, date}, ...]
```

Look for:
- Same entity with conflicting `type` in different pages
- Same relation with different direction in different pages
- Newer ingests that update/supersede older claims

**When you find a contradiction:**
- Add a `## Contradictions` section to the relevant entity/topic pages:
```markdown
## Contradictions
- doc_001 says X; doc_003 says not-X — unresolved
```
- Flag it in the log:
```python
# Handled by wiki_store.write_page which auto-appends to log.md
```
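The first check (same entity with conflicting `type`) can be partly automated. A sketch, assuming index entries are shaped like `wiki_store.list_pages()` output and entity-page summaries follow the `"name: type"` convention from `ingestion.md`:

```python
from collections import defaultdict

def find_type_conflicts(pages: list[dict]) -> dict[str, list[str]]:
    """Group entity-page summaries by name; report names with more than one type."""
    types_by_name = defaultdict(set)
    for page in pages:
        if page.get("category") != "entity" or ":" not in page.get("summary", ""):
            continue
        name, _, etype = page["summary"].partition(":")
        types_by_name[name.strip()].add(etype.strip())
    return {n: sorted(t) for n, t in types_by_name.items() if len(t) > 1}
```

This only sees the index summaries; conflicts buried in page bodies still require a manual read.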

---

## Step 4: Check for Stale Claims

Review pages ingested more than N days ago (use the `date` field from the index).
Ask: "Has any newer document superseded this claim?"

**When a claim is stale:**
- Update the page: add a `## Superseded` section or update the body.
- Mark the old claim with _(superseded by [[newer-doc-summary]])_.

---

## Step 5: Check for Missing Cross-References

For each entity page, check: does it link back to all summary pages that mention it?
For each summary page, check: does it link to all entity pages it extracted?

**Fix**: Read the page and add missing `[[slug]]` links.
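A sketch of the backlink check, assuming the standard `[[slug]]` wikilink syntax:

```python
import re

WIKILINK = re.compile(r"\[\[([^\]|]+)")  # captures the slug inside [[...]]

def missing_backlinks(page_text: str, expected_slugs: list[str]) -> list[str]:
    """Return slugs that should be wikilinked from this page but are not."""
    linked = set(WIKILINK.findall(page_text))
    return [s for s in expected_slugs if s not in linked]
```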

---

## Step 6: Identify Data Gaps

Review entity pages that lack:
- A proper description (just a stub)
- Any `## Relations` section
- Any `## Mentioned in` links

These are candidates for deeper research or new ingests.

---

## Step 7: Log the Lint Pass

```python
# wiki_store.write_page automatically logs the activity.
# For a manual lint summary, append to log.md via write_page on a topic:
wiki_store.write_page(
    category="topic",
    title="Lint Pass YYYY-MM-DD",
    content="# Lint Pass\n\n## Issues Found\n\n...\n\n## Fixed\n\n...",
    summary="Lint pass results",
)
```

---

## Quick Lint Commands

```python
from scripts.tools import wiki_store

# Full health check
issues = wiki_store.lint_wiki()

# Get recent history
log = wiki_store.get_log(last_n=10)

# List all pages
all_pages = wiki_store.list_pages()

# Search for a concept across wiki
results = wiki_store.search_wiki("memory leak")
```

---

## Rules

- NEVER delete pages without user confirmation
- NEVER auto-resolve a contradiction — flag it for human review
- File all lint results as a topic page in the wiki (so the history is visible)
- Prefer adding cross-references over rewriting existing content