A GitHub follower analytics project with two distinct experiences: a polished ShadCN-style web dashboard for dense, modern analytics and a desktop Tkinter utility for local tracking, charts, and AI profile summaries.
- A real analytics surface, not just a counter. Track follower trends, 24-hour movement, churn, activity, profile context, and grounded insight narratives in one view.
- Two workflows in one repo. Use the web dashboard for a modern product experience and the Tkinter app for local utility workflows.
- Practical data pipeline. FastAPI backend, SQLite persistence, live GitHub API reads, and a responsive React frontend.
- Built for extension. The current structure supports future notifications, richer timeline analysis, segmentation, local natural-language querying, and explainable engineering-activity summaries.
| Experience | Best For | Stack | What You Get |
|---|---|---|---|
| Web dashboard | Daily monitoring, demos, polished analytics | Next.js, React, Tailwind CSS, custom SVG charts, FastAPI | KPI cards, trend chart, AI insights, dashboard query bar, follower investigation, GitHub profile panel, signal quality panel |
| Desktop utility | Local automation, direct controls, quick tracking | Python, Tkinter, Matplotlib, OpenAI SDK | Follower tracking, follower file output, segmentation, charts, AI profile summaries |
The recommended primary experience is the web dashboard.
- Follower intelligence dashboard
  - Total followers
  - 24-hour net movement
  - New followers
  - Lost followers
- Interactive follower growth chart
  - 7-day, 30-day, and all-time ranges
  - Snapshot-based growth tracking
  - Audience and delta modes
  - Hover tooltip with timestamp, follower count, delta, comparison, and event annotations
- AI insights and local query layer
  - Brief, Executive, and Technical summaries
  - 24-hour, 7-day, and 30-day insight windows
  - Natural-language questions grounded in local dashboard data
  - Evidence and recommended next actions returned with every generated answer
- Investigation workflow
  - New, lost, and high-signal follower drawers
  - Sort by newest, oldest, signal, reach, or repository count
  - Export filtered drawer data to CSV or JSON
- GitHub profile context
  - Avatar
  - Bio
  - Public repositories
  - Following count
  - Direct profile link
  - Per-profile summaries from cached dashboard data
- Signal quality panel
  - API health state
  - Freshness and cadence
  - Missed snapshots
  - Snapshot count
  - Partial data indicator
  - Recent sync run summary
- Desktop utility workflows
  - Local follower tracking
  - JSON history file output
  - Matplotlib analytics
  - OpenAI-generated profile summaries
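The 24-hour movement numbers above (new, lost, net) reduce to set arithmetic over two follower snapshots. A minimal sketch of the idea, assuming snapshots are sets of usernames; this is illustrative, not the dashboard's actual implementation:

```python
def net_movement(previous: set[str], current: set[str]) -> dict[str, int]:
    """Compare two follower snapshots (sets of usernames) and report movement."""
    gained = current - previous   # usernames present now but not before
    lost = previous - current     # usernames present before but not now
    return {"new": len(gained), "lost": len(lost), "net": len(gained) - len(lost)}
```

Because each snapshot is stored, the same comparison works over any pair of timestamps, which is what makes the longer 7-day and 30-day ranges possible.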
Create a `.env` file in the repository root:

```
GITHUB_USERNAME=your-github-username
GITHUB_TOKEN=your-github-token
OPENAI_API_KEY=your-openai-api-key
```

`OPENAI_API_KEY` is optional. When present, the web profile Summarize action reuses the desktop app's OpenAI profile-summary pattern with gpt-4o-mini; otherwise it falls back to a local grounded summary. Set `PROFILE_SUMMARY_PROVIDER=local` to force the local fallback, or `OPENAI_PROFILE_SUMMARY_MODEL` to change the OpenAI model.

The backend supports `backend/.env` as a local override if you want to keep web-dashboard credentials separate.
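The layered load order (root `.env` first, then `backend/.env` overriding it) can be sketched with a hand-rolled parser; this is an illustration of the override semantics only, and the real backend may use python-dotenv or similar:

```python
from pathlib import Path

def load_env_layers(*paths: str) -> dict[str, str]:
    """Merge KEY=VALUE files in order; later files override earlier ones."""
    merged: dict[str, str] = {}
    for p in paths:
        path = Path(p)
        if not path.exists():
            continue  # missing layers are simply skipped
        for line in path.read_text(encoding="utf-8").splitlines():
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                merged[key.strip()] = value.strip()
    return merged

# Root .env first, then backend/.env as the local override.
config = load_env_layers(".env", "backend/.env")
```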
Python:

```
python -m pip install -r requirements.txt
python -m pip install -r backend/requirements.txt
```

Frontend:

```
cd frontend
npm install
cd ..
```

Start the FastAPI backend:

```
python -m uvicorn app.main:app --app-dir backend --reload --host 127.0.0.1 --port 8000
```

Start the frontend in a second terminal:

```
cd frontend
npm run dev
```

Open:

```
http://localhost:3000
```

If 3000 is already in use, free it from the frontend folder with:

```
npm run kill
```

Or clear both common dev ports:

```
npm run kill:all
```

The web app primarily reads from:
```
GET /stats/dashboard
```

Sprint 2 intelligence endpoints:

- `POST /stats/insights`
  - Body: `{ "range": "24h" | "7d" | "30d", "mode": "brief" | "executive" | "technical", "refresh": false }`
  - Returns a grounded narrative, evidence list, confidence, data warnings, and recommended actions.
- `POST /stats/query`
  - Body: `{ "question": "What changed this month?", "range": "30d", "refresh": false }`
  - Returns an answer, interpreted intent, evidence, confidence, warnings, and next action.
- `POST /stats/profile-summary`
  - Body: `{ "profile": { "username": "octocat", "...": "visible cached fields" }, "event_type": "new" | "lost" | "high-signal" | "profile", "prefer_ai": true }`
  - Returns a grounded per-profile summary, evidence, confidence, source (`openai` or `local`), warnings, and recommended next action.

Supporting endpoints remain available:

- `GET /stats/profile`
- `GET /stats/followers`
- `GET /stats/trends`
- `GET /stats/history/new`
- `GET /stats/history/lost`

Base URL during development:

```
http://localhost:8000
```
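As an illustration, the intelligence endpoints can be exercised with nothing but the standard library. A minimal sketch assuming the backend is running on the development base URL; the payload shapes mirror the request bodies documented above:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"
VALID_RANGES = {"24h", "7d", "30d"}
VALID_MODES = {"brief", "executive", "technical"}

def insights_payload(range_: str = "7d", mode: str = "brief", refresh: bool = False) -> dict:
    """Build and validate a POST /stats/insights request body."""
    if range_ not in VALID_RANGES:
        raise ValueError(f"range must be one of {sorted(VALID_RANGES)}")
    if mode not in VALID_MODES:
        raise ValueError(f"mode must be one of {sorted(VALID_MODES)}")
    return {"range": range_, "mode": mode, "refresh": refresh}

def post_json(path: str, payload: dict) -> dict:
    """POST a JSON body to the backend and decode the JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}{path}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage (requires the backend to be running):
# insights = post_json("/stats/insights", insights_payload("30d", "executive"))
# answer = post_json("/stats/query",
#                    {"question": "What changed this month?", "range": "30d", "refresh": False})
```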
Run the legacy desktop application with:

```
python main.py
```

Use it to:
- Track followers into a local JSON file
- View follower and unfollower charts with Matplotlib
- Segment followers
- Generate GitHub profile summaries with OpenAI
The desktop application remains part of the repo because it is useful for local workflows and historical tracking, even though the dashboard is now the primary user experience.
```mermaid
flowchart LR
    A[GitHub API] --> B[FastAPI Backend]
    B --> C[(SQLite)]
    B --> D[Next.js Dashboard]
    B --> I[Grounded Insights + Query Service]
    A --> E[Tkinter Desktop Utility]
    E --> F[(JSON + SQLite)]
    E --> G[Matplotlib Charts]
    E --> H[OpenAI Summary]
```
```
.
├── backend/
│   └── app/
│       ├── api/
│       ├── services/
│       ├── main.py
│       └── models.py
├── archive/
│   ├── legacy-main-scripts/
│   │   ├── README.md
│   │   └── main1.py ... main12.py
│   └── legacy-follow-scripts/
│       ├── README.md
│       └── follow_unfollow_main*.py
├── frontend/
│   ├── app/
│   ├── components/
│   ├── lib/
│   └── package.json
├── scripts/
│   └── kill-frontend.ps1
├── docs/
│   └── screenshots/
│       ├── dashboard.png
│       ├── dashboard-detail.png
│       └── desktop/
├── main.py
├── analytics.py
├── requirements.txt
└── README.md
```
- Web dashboard snapshots: `backend/followers.db`
- Desktop utility data: `follower_data.db` and local follower JSON files

Database files and environment files are ignored by git.
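For quick local inspection, the SQLite snapshot store can be queried directly with the standard library. A sketch only: the table name `snapshots` here is hypothetical, so inspect the real schema first (for example with `.tables` in the sqlite3 CLI):

```python
import sqlite3

def snapshot_count(db_path: str = "backend/followers.db", table: str = "snapshots") -> int:
    """Count stored rows in a snapshot table.

    The default table name is an assumption for illustration; check the
    actual schema before relying on it.
    """
    with sqlite3.connect(db_path) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
```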
Frontend production build:

```
cd frontend
npm run build
npm run start
```

Repeatable dashboard e2e verification:

```
cd frontend
npm run test:e2e
```

The e2e test expects the backend on http://127.0.0.1:8000 and the frontend on http://localhost:3000. It verifies range, chart mode, density, insights, and query flow, and refreshes docs/screenshots/dashboard.png plus docs/screenshots/dashboard-detail.png.

Type and compile checks:

```
npx --yes pyright
python -m py_compile main.py analytics.py backend/app/api/stats.py backend/app/services/tracker.py
```

With the backend and frontend running:
```
npx --yes playwright screenshot --browser=chromium --viewport-size=1980,1250 --wait-for-timeout=3000 http://localhost:3000 docs/screenshots/dashboard.png
```

To refresh the focused detail image from the same capture on Windows PowerShell:

```
Add-Type -AssemblyName System.Drawing
$source = "docs/screenshots/dashboard.png"
$target = "docs/screenshots/dashboard-detail.png"
$bitmap = [System.Drawing.Bitmap]::new($source)
$rect = [System.Drawing.Rectangle]::new(150, 300, 1700, 760)
$crop = $bitmap.Clone($rect, $bitmap.PixelFormat)
$crop.Save($target, [System.Drawing.Imaging.ImageFormat]::Png)
$crop.Dispose()
$bitmap.Dispose()
```

The crop values above are tuned for the 1980x1250 dashboard capture command shown here.

If Playwright needs Chromium installed:

```
npx --yes playwright install chromium
```

The committed e2e spec uses Microsoft Edge (channel: "msedge") to avoid requiring a bundled Chromium install on Windows.
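On non-Windows machines, the same detail crop can be reproduced with Pillow (`pip install Pillow`). A cross-platform sketch whose default rectangle mirrors the PowerShell values and assumes the 1980x1250 Playwright capture:

```python
from PIL import Image

def crop_detail(source: str, target: str,
                left: int = 150, top: int = 300,
                width: int = 1700, height: int = 760) -> None:
    """Crop the focused detail region out of the full dashboard capture.

    Defaults mirror the PowerShell rectangle and assume the 1980x1250
    screenshot; Pillow's crop box is (left, upper, right, lower).
    """
    with Image.open(source) as img:
        img.crop((left, top, left + width, top + height)).save(target, "PNG")

# Example usage:
# crop_detail("docs/screenshots/dashboard.png",
#             "docs/screenshots/dashboard-detail.png")
```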
Sprint 2 includes only the architecture boundary for a future module named Engineering Activity Intelligence. This future module should produce explainable summaries from engineering activity signals, not employment decisioning.
Potential signals:
- PR throughput
- Review responsiveness
- Issue participation
- Commit cadence
- Collaboration patterns
- Repo ownership patterns
Guardrails:
- No compensation recommendations
- No promotion or pay-raise scoring
- No hidden employee ranking
- No black-box score used for employment decisions
- Summaries must be evidence-backed and inspectable
- Follower growth is based on stored snapshots, so the historical chart improves as the app is used over time.
- The dashboard is optimized for desktop analytics and becomes scrollable on smaller viewports.
- Historical numbered `main*.py` snapshots are archived under `archive/legacy-main-scripts/`.
- Legacy Selenium follow/unfollow scripts are archived under `archive/legacy-follow-scripts/`.
- Keep your `.env`, tokens, and local database files out of version control.
This project is licensed under the MIT License. See LICENSE.
If this project is useful, you can support it here: