Static dashboard for CodeRabbit adoption across tracked GitHub repositories. The UI in index.html has two tabs—OpenShift and EEE—each loading its own JSON under data/. It renders KPIs, adoption breakdowns, Mermaid diagrams, and a filterable repository table.
Live site: https://anandkuma77.github.io/codeRabbitDashboard/
| Tab | JSON file | Updated by |
|---|---|---|
| OpenShift | `data/coderabbit-openshift-status.json` | `scripts/fetch_coderabbit_Openshift_data.py` |
| EEE | `data/coderabbit-EEE-status.json` | `scripts/fetch_coderabbit_EEE_data.py` |
Treat the summary fields and `last_updated` at the top of each JSON as authoritative. README numbers can drift between refreshes.
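When quoting numbers in a report, pull them straight from the JSON rather than the README. A minimal sketch of doing that with the stdlib (only `last_updated` is named above; the keys inside `summary` vary by dataset, so the helper just returns whatever is there):

```python
import json
from pathlib import Path

# Return the authoritative fields from a status file. "summary" is
# returned as-is because its inner keys differ between datasets.
def authoritative_fields(path: str) -> tuple[str, dict]:
    status = json.loads(Path(path).read_text())
    return status["last_updated"], status.get("summary", {})

# Example: authoritative_fields("data/coderabbit-openshift-status.json")
```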
```
codeRabbitDashboard/
├── index.html                # Dashboard UI (GitHub Pages entry; tab switcher)
├── data/
│   ├── coderabbit-openshift-status.json
│   └── coderabbit-EEE-status.json
├── scripts/
│   ├── fetch_coderabbit_Openshift_data.py
│   └── fetch_coderabbit_EEE_data.py
├── .github/workflows/
│   └── fetch-coderabbit-data.yml
└── README.md
```
Each Python script does not discover new repos on its own: it loads the existing `repos` array from its JSON file, then for each repository it:

- Looks for `.coderabbit.yaml` or `.coderabbit.yml` via the GitHub contents API.
- Fetches up to 50 recent PRs (closed and open, ordered by last update).
- Loads reviews, review comments, and issue comments for each PR and detects activity from `coderabbitai[bot]`.
- Computes per-repo and summary stats, including how often CodeRabbit commented before any human reviewer.
- Writes the file back in place.
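The bot-detection step above can be sketched as follows. The dict shapes mirror the GitHub API fields (`user.login`, `submitted_at`/`created_at`); the helper name and the choice to skip other bots are illustrative, not taken from the scripts:

```python
BOT = "coderabbitai[bot]"

# Given review/comment objects for one PR, decide whether CodeRabbit
# was the first reviewer to weigh in, ignoring other bot accounts.
def first_reviewer_was_bot(events: list[dict]) -> bool:
    ordered = sorted(
        events,
        key=lambda e: e.get("submitted_at") or e.get("created_at") or "",
    )
    for event in ordered:
        login = event.get("user", {}).get("login", "")
        if login.endswith("[bot]") and login != BOT:
            continue  # some other bot; keep looking for a human or CodeRabbit
        return login == BOT
    return False
```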
Requirements: Python 3.11+ (stdlib only). Set `GITHUB_TOKEN` so requests are authenticated and get higher rate limits. The scripts can fall back to an authenticated `gh` CLI when useful.
Token scope: a fine-grained or classic token able to read repository metadata and pull requests for the orgs/repos listed in the JSON (the same access you use to browse those repos on GitHub).
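How the token is attached to requests can be sketched with the stdlib only (consistent with the "stdlib only" requirement above; the function name is illustrative, and the header scheme follows GitHub's documented token auth):

```python
import os
import urllib.request

# Build a GitHub API request, adding the Authorization header only
# when GITHUB_TOKEN is set; unauthenticated requests still work but
# are subject to much lower rate limits.
def build_request(url: str) -> urllib.request.Request:
    req = urllib.request.Request(url)
    req.add_header("Accept", "application/vnd.github+json")
    token = os.environ.get("GITHUB_TOKEN")
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    return req

# Example: urllib.request.urlopen(build_request("https://api.github.com/rate_limit"))
```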
From the repository root:
```bash
export GITHUB_TOKEN="your-token"
python3 scripts/fetch_coderabbit_Openshift_data.py
python3 scripts/fetch_coderabbit_EEE_data.py
```

Commit and push when you want the site to reflect the change (GitHub Pages will pick it up after the next deploy).
Workflow: .github/workflows/fetch-coderabbit-data.yml
- Schedule: weekdays (Monday–Friday) at 01:00 and 13:00 UTC.
- Manual run: Actions → Fetch CodeRabbit Data → Run workflow.
- Runs both fetch scripts, then commits only if either JSON changed (commit message: `Update CodeRabbit adoption data [skip ci]`).
- Tab switcher for OpenShift vs EEE datasets.
- KPI strip driven by each JSON summary.
- Adoption overview bar and integration flow (Mermaid).
- Breakdowns by team, domain, and repo lifecycle status (Active / Maintenance / Deprecated from the JSON).
- Sortable table with filters (CodeRabbit presence, repo status, team) and search.
- To add or remove tracked repositories, edit the `repos` entries in the appropriate JSON file, then run the matching fetch script so metrics stay consistent.
- Prefer updating the README narrative here; copy exact counts from the JSON when sharing a formal status report.
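For orientation, a new `repos` entry might look something like the fragment below. The field names here are purely illustrative — copy the shape of an existing entry in the JSON file rather than this sketch, since the fetch script only reads the keys it was written against:

```json
{
  "org": "openshift",
  "name": "example-repo",
  "team": "Example Team"
}
```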