Problem

Copilot code review drip-feeds comments across multiple rounds (~2 per pass). The orchestrator currently dispatches the responder on every round regardless of comment severity. This burns responder attempts on trivial nits (naming, style, "consider renaming") and often exhausts the 3-attempt limit, marking PRs as aw-pr-stuck:review when the remaining comments are cosmetic.
Proposed Solution
After the responder's first attempt, the orchestrator should classify NEW comments from Copilot's re-review before dispatching again.
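For example, the classification could be a simple severity heuristic. A minimal Python sketch, where the phrase lists, function names, and substantive-by-default behavior are illustrative assumptions rather than the orchestrator's actual logic:

```python
import re

# Hypothetical phrase lists; tune against real Copilot comment history.
NIT_PATTERNS = [
    r"\bconsider renaming\b",
    r"\btypo\b",
    r"\bnit\b",
    r"\bstyle\b",
    r"\bnaming\b",
]
SUBSTANTIVE_PATTERNS = [
    r"\brace condition\b",
    r"\bnone\b",
    r"\bsecurity\b",
    r"\bleak\b",
    r"\boff-by-one\b",
]

def classify_comment(body: str) -> str:
    """Classify a review comment as 'substantive' or 'nit'."""
    text = body.lower()
    if any(re.search(p, text) for p in SUBSTANTIVE_PATTERNS):
        return "substantive"
    if any(re.search(p, text) for p in NIT_PATTERNS):
        return "nit"
    # Unknown comments default to substantive so real issues are never skipped.
    return "substantive"

def should_dispatch(new_comments: list[str]) -> bool:
    """Dispatch the responder only if at least one NEW comment is substantive."""
    return any(classify_comment(c) == "substantive" for c in new_comments)
```

Defaulting unknown comments to substantive trades a few wasted dispatches for never silently skipping a real bug report.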
Research Findings (April 2026)

Deep investigation into Copilot code review behavior:
The drip-feed is by design, not a bug

- GitHub's own data: an average of 5.1 comments per review, and 29% of reviews produce zero comments
- Comment count is driven by internal confidence scoring and clustering, not a hard cap
- GitHub's blog explicitly states that "more comments don't necessarily mean a better review"
- The ~2 comments/pass pattern appears to be an internal behavioral limit that varies by PR size and risk assessment
No configuration exists to change it

- No API endpoint, repo setting, org setting, or ruleset option controls comment volume
- copilot-instructions.md controls what gets flagged but cannot override the internal batching algorithm
- GitHub's own blog warns against instructions like "be more thorough" or "identify all issues", saying it "adds noise"
- Instructions have a 4,000-character limit for code review (the limit does not apply to chat or the coding agent)
- Model switching is not supported: code review is a "purpose-built product with carefully tuned mix of models"
Community reports confirm the problem

- GitHub Community Discussion #189767, "Copilot Code Review generates new comments on every push": 5 rounds of review on a ~9-file PR, with ~21 of 24 total comments being low-value nits
- Discussion #152385: users report Copilot reviewing 120 files but generating only 1-2 comments, with re-requests surfacing comments on different files
- No official GitHub staff response addresses the drip-feed behavior specifically
Platform improvements (not user-configurable)

Date        Change
May 2025    "80% more comments per PR"
Jul 2025    Lifted file count/complexity limits for large PRs
Oct 2025    Agentic architecture with full project context + tool calling
Implementing the proposed gate requires the orchestrator to:

- Apply the aw-review-nits-only label and leave a summary comment
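A sketch of that gate in Python. The function and parameter names are illustrative, the classifier and responder hooks are passed in as assumptions, and the summary wording is invented; gh pr edit and gh pr comment are the standard GitHub CLI commands:

```python
import subprocess
from typing import Callable

def handle_review_round(
    pr_number: int,
    new_comments: list[str],
    is_substantive: Callable[[str], bool],
    dispatch_responder: Callable[[int, list[str]], None],
) -> None:
    """Dispatch the responder only for substantive comments; otherwise label and summarize."""
    substantive = [c for c in new_comments if is_substantive(c)]
    if substantive:
        dispatch_responder(pr_number, substantive)
        return
    # Only nits remain: label the PR and leave a summary comment instead of
    # burning another of the 3 responder attempts.
    subprocess.run(
        ["gh", "pr", "edit", str(pr_number),
         "--add-label", "aw-review-nits-only"],
        check=True,
    )
    subprocess.run(
        ["gh", "pr", "comment", str(pr_number), "--body",
         "Remaining Copilot comments are cosmetic; responder not dispatched."],
        check=True,
    )
```

Injecting the classifier and responder as callables keeps the gate testable without a live repository.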
Complexity

This is a non-trivial change.
Workarounds others have tried
- copilot-cli-codereviewer action (github.com/hancengiz/copilot-cli-codereviewer): bypasses native review entirely by piping git diff into gh copilot explain and posting one monolithic comment. Nuclear option.
- Scoped instruction files (*.instructions.md with applyTo: frontmatter): focus Copilot on what matters per file type
- excludeAgent: "coding-agent" in instruction frontmatter: create review-only instructions

Key documentation
Instructions file facts
- copilot-instructions.md applies to code review (first 4K chars, from the base branch, though we empirically disproved the base-branch claim in PR #479, "test: verify Copilot follows guidelines link (clean retest)")
- .github/instructions/NAME.instructions.md with applyTo: glob
- copilot-review-instructions.md exists as a recognized file
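For illustration, a scoped review-instructions file of the kind listed above might look like the following; the filename (e.g. .github/instructions/review-focus.instructions.md), glob, and guidance text are hypothetical, while applyTo and excludeAgent are the frontmatter keys named in this issue:

```markdown
---
applyTo: "src/**/*.ts"
excludeAgent: "coding-agent"
---

Focus review comments on correctness, security, and concurrency issues.
Do not comment on naming, formatting, or other style nits.
```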
Depends On

Acceptance Criteria
- aw-pr-stuck:review labels caused by nit exhaustion
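To measure this criterion, one could count stuck-labeled PRs over time. A small helper, assuming the JSON comes from the standard CLI query `gh pr list --label aw-pr-stuck:review --state open --json number` (the helper name is ours):

```python
import json

def count_stuck_prs(pr_list_json: str) -> int:
    """Count PRs in a `gh pr list --json number` result."""
    return len(json.loads(pr_list_json))
```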