Code visualization for non-programmers.
A docent is a guide who explains things to people who aren't experts. Codedocent does that for code.
You're staring at a codebase you didn't write — maybe thousands of files across dozens of directories — and you need to understand what it does. Reading every file isn't realistic. You need a way to visualize the code structure, get a high-level map of what's where, and drill into the parts that matter without losing context.
Codedocent parses the codebase into a navigable, visual block structure and explains each piece in plain English. It's an AI code analysis tool — use a cloud provider for speed or run locally through Ollama for full privacy. Point it at any codebase and get a structural overview you can explore interactively, understand quickly, and share as a static HTML file.
- Developers onboarding onto an unfamiliar codebase — get oriented in minutes instead of days
- Non-programmers (managers, designers, PMs) who need to understand what code does without reading it
- Solo developers inheriting legacy code — map out the structure before making changes
- Code reviewers who want a high-level overview before diving into details
- Security reviewers who need a structural map of an application
- Students learning to read and navigate real-world codebases
Nested, color-coded blocks representing directories, files, classes, and functions — the entire structure of a codebase laid out visually. Each block shows a plain English summary, a pseudocode translation, and quality warnings (green/yellow/red). Click any block to drill down; breadcrumbs take you back up. You can export code from any block or paste replacement code back into the source file. AI explanations come from your choice of cloud provider or local Ollama.
```
pip install codedocent
```

Requires Python 3.10+. Cloud AI needs an API key set in an env var (e.g. `OPENAI_API_KEY`). Local AI needs Ollama running. `--no-ai` skips AI entirely.
```
codedocent                                # setup wizard — walks you through everything
codedocent /path/to/code                  # interactive mode (recommended)
codedocent /path/to/code --full           # full analysis, static HTML output
codedocent --gui                          # graphical launcher
codedocent /path/to/code --cloud openai   # use OpenAI
codedocent /path/to/code --cloud groq     # use Groq
codedocent /path/to/code --cloud custom --endpoint https://my-llm/v1/chat/completions
```

Codedocent parses code structure with tree-sitter, scores quality with static analysis, and sends individual blocks to a cloud AI provider or local Ollama model for plain English summaries and pseudocode. Interactive mode analyzes on click — typically 1-2 seconds per block. Full mode analyzes everything upfront into a self-contained HTML file you can share.
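The block-extraction step can be pictured with Python's standard-library `ast` module. Codedocent itself uses tree-sitter, so this is only an illustrative sketch of the same idea: walk the syntax tree and record each function and class as a block with its line span.

```python
import ast

SOURCE = """
def greet(name):
    return f"Hello, {name}!"

class Store:
    def get(self, key):
        return self._data[key]
"""

def extract_blocks(source: str) -> list[dict]:
    """Walk the AST and record each function/class as a 'block' with its span."""
    blocks = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            blocks.append({
                "kind": type(node).__name__,
                "name": node.name,
                "start": node.lineno,
                "end": node.end_lineno,
            })
    return blocks

for b in extract_blocks(SOURCE):
    print(b["kind"], b["name"], f"lines {b['start']}-{b['end']}")
```

Each extracted block is what gets summarized: small enough to fit in a single AI request, with line numbers so the result can be mapped back onto the source file.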
- Cloud AI — send code to OpenAI, OpenRouter, Groq, or any OpenAI-compatible endpoint. Fast, no local setup. Your code is sent to that service. API keys are read from env vars (`OPENAI_API_KEY`, `OPENROUTER_API_KEY`, `GROQ_API_KEY`, or `CODEDOCENT_API_KEY` for custom endpoints).
- Local AI — Ollama on your machine. Code never leaves your laptop. No API keys, no accounts.
- No AI (`--no-ai`) — structure and quality scores only.
The setup wizard (`codedocent` with no arguments) walks you through choosing.
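All the cloud options speak the same OpenAI-compatible chat-completions wire format, which is what makes the `--cloud custom --endpoint` flag possible. The sketch below shows roughly what a per-block request could look like; the model name and prompt wording are illustrative assumptions, not Codedocent's actual values.

```python
import json
import os

def build_summary_request(code_block: str,
                          endpoint: str = "https://api.openai.com/v1/chat/completions",
                          model: str = "gpt-4o-mini") -> tuple[str, dict, dict]:
    """Build an OpenAI-compatible chat-completions request for one code block.
    The model name and prompt here are illustrative, not Codedocent's."""
    headers = {
        "Content-Type": "application/json",
        # Key comes from the environment, matching the env-var convention above.
        "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
    }
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Explain code in plain English for a non-programmer."},
            {"role": "user", "content": code_block},
        ],
    }
    return endpoint, headers, payload

url, headers, payload = build_summary_request("def add(a, b):\n    return a + b")
print(json.dumps(payload, indent=2))
```

Because the format is the same everywhere, pointing the request at a different base URL (OpenRouter, Groq, a self-hosted gateway) is all a custom endpoint needs.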
Full AST parsing for Python and JavaScript/TypeScript (functions, classes, methods, imports). File-level detection for 23 extensions including C, C++, Rust, Go, Java, Ruby, PHP, Swift, Kotlin, Scala, HTML, CSS, and config formats.
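File-level detection boils down to a suffix lookup. A minimal sketch, with the caveat that the mapping below is illustrative and not Codedocent's actual 23-extension table:

```python
from pathlib import Path

# Illustrative extension map: Codedocent's real table covers 23 extensions.
EXTENSION_LANGS = {
    ".py": "Python", ".js": "JavaScript", ".ts": "TypeScript",
    ".c": "C", ".cpp": "C++", ".rs": "Rust", ".go": "Go",
    ".java": "Java", ".rb": "Ruby", ".php": "PHP",
    ".swift": "Swift", ".kt": "Kotlin", ".scala": "Scala",
    ".html": "HTML", ".css": "CSS",
}

def detect_language(path: str) -> str:
    """Case-insensitive suffix lookup; unknown extensions fall through."""
    return EXTENSION_LANGS.get(Path(path).suffix.lower(), "Unknown")

print(detect_language("src/main.rs"))  # Rust
print(detect_language("notes.txt"))    # Unknown
```

Languages with file-level detection still appear as blocks in the visual map; only Python and JavaScript/TypeScript get the deeper function- and class-level breakdown.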
MIT