diff --git a/.github/actions/linear-release-sync/README.md b/.github/actions/linear-release-sync/README.md
new file mode 100644
index 0000000..14406ee
--- /dev/null
+++ b/.github/actions/linear-release-sync/README.md
@@ -0,0 +1,100 @@
+# Linear Release Sync
+
+A GitHub Action that syncs Linear issues to the "Released" state when a GitHub release is published. It finds all PRs between two releases, extracts Linear issue IDs from PR descriptions and branch names, and moves matching issues from "Ready for Release" to "Released".
+
+## Features
+
+- Fetches all team keys from Linear to filter out false-positive issue IDs (e.g. `pr-3354`, `snap-1`)
+- Extracts Linear issue IDs from PR descriptions and branch names (e.g. `ENG-1234`, `DEVOPS-471`)
+- Strict time-based filtering: only includes PRs merged before the release was published
+- Moves issues from "Ready for Release" to "Released" state
+- Adds release comments with version and date
+- For stable releases on already-released issues, adds "Now available in stable release" comments
+- Skips CVE issues automatically
+- Supports dry-run mode for previewing changes
+
+## Usage
+
+Add to your release workflow:
+
+```yaml
+sync_linear:
+  if: ${{ !contains(needs.publish.outputs.semver_parsed, '-next') }}
+  needs: [publish]
+  runs-on: ubuntu-latest
+  steps:
+    - name: Sync Linear issues
+      uses: loft-sh/github-actions/.github/actions/linear-release-sync@linear-release-sync/v1
+      with:
+        release-tag: ${{ needs.publish.outputs.release_version }}
+        repo-name: vcluster
+        github-token: ${{ secrets.GH_ACCESS_TOKEN }}
+        linear-token: ${{ secrets.LINEAR_TOKEN }}
+```
+
+## Configuration
+
+### Required Inputs
+
+| Input        | Description                                     |
+|--------------|-------------------------------------------------|
+| release-tag  | The tag of the new release (e.g. `v1.2.0`)      |
+| repo-name    | The GitHub repository name                      |
+| github-token | GitHub token with read access to the repository |
+| linear-token | Linear API token for updating issues            |
+
+### Optional Inputs
+
+| Input                        | Default              | Description                                                |
+|------------------------------|----------------------|------------------------------------------------------------|
+| repo-owner                   | `loft-sh`            | The GitHub owner of the repository                         |
+| previous-tag                 | *(auto-detected)*    | The previous release tag                                   |
+| released-state-name          | `Released`           | The Linear workflow state name for released issues         |
+| ready-for-release-state-name | `Ready for Release`  | The Linear workflow state name for issues ready to release |
+| linear-teams                 | *(all teams)*        | Comma-separated list of Linear team names to process       |
+| linear-projects              | *(all projects)*     | Comma-separated list of Linear project names to process    |
+| strict-filtering             | `true`               | Only include PRs merged before the release was published   |
+| dry-run                      | `false`              | Preview changes without modifying Linear                   |
+| debug                        | `false`              | Enable debug logging                                       |
+
+## Development
+
+### Testing
+
+Run the included unit tests:
+
+```bash
+make test-linear-release-sync
+```
+
+### Building locally
+
+```bash
+make build-linear-release-sync
+```
+
+### Releasing
+
+The action runs a pre-built binary downloaded from a GitHub release at runtime (no Go toolchain needed in consumer workflows). The `release-linear-release-sync.yaml` workflow builds the binary and attaches it to a GitHub release.
+
+**New major/minor version** (e.g. first release, or `v2`):
+
+```bash
+git tag linear-release-sync/v1
+git push origin linear-release-sync/v1
+```
+
+This triggers the workflow automatically via `on: push: tags`.
+
+**Update an existing version** (e.g. rebuild `v1` after a source change):
+
+Force-pushing an existing tag does not trigger `on: push: tags` in GitHub Actions. Use `workflow_dispatch` instead:
+
+```bash
+# Via GitHub CLI
+gh workflow run release-linear-release-sync.yaml -f tag=linear-release-sync/v1
+
+# Or use the "Run workflow" button in the GitHub Actions UI
+```
+
+The workflow builds the binary from the branch it is dispatched against and uploads it to the existing release with `--clobber`.
diff --git a/.github/actions/linear-release-sync/REVIEW.md b/.github/actions/linear-release-sync/REVIEW.md
new file mode 100644
index 0000000..3dfe758
--- /dev/null
+++ b/.github/actions/linear-release-sync/REVIEW.md
@@ -0,0 +1,51 @@
+# Code Review: linear-release-sync
+
+## Blocking
+
+- [x] **#1 Supply-chain risk: no checksum verification for downloaded binary** (`action.yml:58-62`)
+  The binary download URL is hardcoded. If someone compromises the release asset, every consumer gets the malicious binary. Consider adding a `.sha256` checksum verification step.
+
+- [x] **#4 `os.Kill` cannot be caught** (`main.go:88`)
+  `signal.Notify` for `SIGKILL` is a no-op on Linux. Replace with `syscall.SIGTERM`:
+  ```go
+  ctx, stop := signal.NotifyContext(ctx, os.Interrupt, syscall.SIGTERM)
+  ```
+
+- [x] **#7 Nil-error wrapping in mutations** (`linear.go:316, 337`)
+  `updateIssueState` and `createComment` return `fmt.Errorf("mutation failed: %w", err)` when either `err != nil` OR `!Success`. If `err == nil` and `Success == false`, the message is `"mutation failed: "`. Split into two checks:
+  ```go
+  if err != nil {
+      return fmt.Errorf("mutation failed: %w", err)
+  }
+  if !mutation.IssueUpdate.Success {
+      return fmt.Errorf("mutation failed: issue update returned success=false")
+  }
+  ```
+
+## Warn
+
+- [x] **#2 Token in env var** (`action.yml:53`)
+  Already follows convention: "Secrets via `env:` preferred over `with:`" (CONVENTIONS.md). GHA auto-masks secrets in logs. No change needed.
+
+- [x] **#3 Linear auth header scheme** (`linear.go:41`)
+  Linear API docs recommend bare `Authorization: ` (no `Bearer` prefix). Current code is correct.
+
+- [x] **#5 Logger via context.WithValue is fragile** (`main.go:91-92, linear.go:249`)
+  Moved logger to a field on `LinearClient`. Removed `LoggerKey` and context injection.
+
+- [x] **#6 Unused `PageSize` constant** (`pr.go:14`)
+  `PageSize = 100` is defined in `pr.go` but unused there (also defined in `changelog/pull-requests/pr.go`).
+
+- [x] **#9 No tests for changelog packages** (`changelog/pull-requests/`, `changelog/releases/`)
+  Added tests using an httptest mock GraphQL server. Covers pagination, deduplication, unmerged PR filtering, time-based filtering, semver range matching, and edge cases.
+
+## Nit
+
+- [x] **#10 Misleading test file name** (`integration_test.go`)
+  Renamed to `flow_test.go`.
+
+- [x] **#12 Inconsistent Go cache setting** (`release-linear-release-sync.yaml:28` vs `test-linear-release-sync.yaml:24`)
+  Removed `cache: false` from the release workflow — no reason to disable it.
+
+- [x] **#13 Old Go version** (`go.mod`)
+  Bumped from 1.22.5 to 1.26.
diff --git a/.github/actions/linear-release-sync/action.yml b/.github/actions/linear-release-sync/action.yml
new file mode 100644
index 0000000..f748a7c
--- /dev/null
+++ b/.github/actions/linear-release-sync/action.yml
@@ -0,0 +1,139 @@
+name: 'Linear Release Sync'
+description: 'Syncs Linear issues to Released state when a GitHub release is published. Finds PRs between releases, extracts Linear issue IDs, and moves matching issues from Ready for Release to Released.'
+
+inputs:
+  release-tag:
+    description: 'The tag of the new release (e.g. v1.2.0)'
+    required: true
+  repo-owner:
+    description: 'The GitHub owner of the repository'
+    required: false
+    default: 'loft-sh'
+  repo-name:
+    description: 'The GitHub repository name'
+    required: true
+  github-token:
+    description: 'GitHub token with read access to the repository'
+    required: true
+  linear-token:
+    description: 'Linear API token for updating issues'
+    required: true
+  previous-tag:
+    description: 'The previous release tag (auto-detected if not set)'
+    required: false
+    default: ''
+  released-state-name:
+    description: 'The Linear workflow state name for released issues'
+    required: false
+    default: 'Released'
+  ready-for-release-state-name:
+    description: 'The Linear workflow state name for issues ready to release'
+    required: false
+    default: 'Ready for Release'
+  strict-filtering:
+    description: 'Only include PRs merged before the release was published (recommended)'
+    required: false
+    default: 'true'
+  linear-teams:
+    description: 'Comma-separated list of Linear team names to process (optional, default: all)'
+    required: false
+    default: ''
+  linear-projects:
+    description: 'Comma-separated list of Linear project names to process (optional, default: all)'
+    required: false
+    default: ''
+  dry-run:
+    description: 'Preview changes without modifying Linear'
+    required: false
+    default: 'false'
+  debug:
+    description: 'Enable debug logging'
+    required: false
+    default: 'false'
+
+runs:
+  using: 'composite'
+  steps:
+    - name: Download linear-release-sync binary
+      id: download
+      shell: bash
+      env:
+        GH_TOKEN: ${{ inputs.github-token }}
+      run: |
+        BINARY_DIR="$(mktemp -d)"
+        echo "binary_dir=$BINARY_DIR" >> "$GITHUB_OUTPUT"
+
+        RELEASE_URL="https://github.com/loft-sh/github-actions/releases/download/linear-release-sync%2Fv1"
+
+        for asset in linear-release-sync-linux-amd64 linear-release-sync-linux-amd64.sha256; do
+          curl -fsSL \
+            -H "Authorization: token ${GH_TOKEN}" \
+            -H "Accept: application/octet-stream" \
+            -o "$BINARY_DIR/$asset" \
+            "$RELEASE_URL/$asset"
+        done
+
+        # Verify SHA-256 checksum
+        (cd "$BINARY_DIR" && sha256sum -c linear-release-sync-linux-amd64.sha256)
+
+        chmod +x "$BINARY_DIR/linear-release-sync-linux-amd64"
+
+        # Verify the binary is a valid ELF executable, not an HTML error page
+        if ! file "$BINARY_DIR/linear-release-sync-linux-amd64" | grep -q "ELF"; then
+          echo "::error::Downloaded file is not a valid ELF binary"
+          file "$BINARY_DIR/linear-release-sync-linux-amd64"
+          exit 1
+        fi
+
+    - name: Run Linear Release Sync
+      shell: bash
+      env:
+        GITHUB_TOKEN: ${{ inputs.github-token }}
+        LINEAR_TOKEN: ${{ inputs.linear-token }}
+        INPUT_RELEASE_TAG: ${{ inputs.release-tag }}
+        INPUT_REPO_OWNER: ${{ inputs.repo-owner }}
+        INPUT_REPO_NAME: ${{ inputs.repo-name }}
+        INPUT_PREVIOUS_TAG: ${{ inputs.previous-tag }}
+        INPUT_RELEASED_STATE: ${{ inputs.released-state-name }}
+        INPUT_READY_STATE: ${{ inputs.ready-for-release-state-name }}
+        INPUT_STRICT_FILTERING: ${{ inputs.strict-filtering }}
+        INPUT_DRY_RUN: ${{ inputs.dry-run }}
+        INPUT_DEBUG: ${{ inputs.debug }}
+        INPUT_LINEAR_TEAMS: ${{ inputs.linear-teams }}
+        INPUT_LINEAR_PROJECTS: ${{ inputs.linear-projects }}
+        BINARY_DIR: ${{ steps.download.outputs.binary_dir }}
+      run: |
+        ARGS=(
+          -release-tag="$INPUT_RELEASE_TAG"
+          -owner="$INPUT_REPO_OWNER"
+          -repo="$INPUT_REPO_NAME"
+          -released-state-name="$INPUT_RELEASED_STATE"
+          -ready-for-release-state-name="$INPUT_READY_STATE"
+          -strict-filtering="$INPUT_STRICT_FILTERING"
+        )
+
+        if [ -n "$INPUT_PREVIOUS_TAG" ]; then
+          ARGS+=(-previous-tag="$INPUT_PREVIOUS_TAG")
+        fi
+
+        if [ -n "$INPUT_LINEAR_TEAMS" ]; then
+          ARGS+=(-linear-teams="$INPUT_LINEAR_TEAMS")
+        fi
+
+        if [ -n "$INPUT_LINEAR_PROJECTS" ]; then
+          ARGS+=(-linear-projects="$INPUT_LINEAR_PROJECTS")
+        fi
+
+        if [ "$INPUT_DRY_RUN" = "true" ]; then
+          ARGS+=(-dry-run)
+        fi
+
+        if [ "$INPUT_DEBUG" = "true" ]; then
+          ARGS+=(-debug)
+        fi
+
+        "$BINARY_DIR/linear-release-sync-linux-amd64" "${ARGS[@]}"
+
+branding:
+  icon: 'check-circle'
+  color: 'purple'
diff --git a/.github/actions/linear-release-sync/src/changelog/pull-requests/pr.go b/.github/actions/linear-release-sync/src/changelog/pull-requests/pr.go
new file mode 100644
index 0000000..e8346e7
--- /dev/null
+++ b/.github/actions/linear-release-sync/src/changelog/pull-requests/pr.go
@@ -0,0 +1,120 @@
+package pullrequests
+
+import (
+	"context"
+	"fmt"
+	"time"
+
+	"github.com/shurcooL/githubv4"
+)
+
+const PageSize = 100
+
+type PullRequest struct {
+	Merged      bool
+	Body        string
+	HeadRefName string
+	Author      struct {
+		Login string
+	}
+	Number   int
+	MergedAt *githubv4.DateTime
+}
+
+// FetchAllPRsBetween fetches all merged PRs between the given tags.
+// It uses the GitHub GraphQL API to fetch the PRs.
+func FetchAllPRsBetween(ctx context.Context, client *githubv4.Client, owner, repo, previousTag, currentTag string) ([]PullRequest, error) {
+	var query struct {
+		Repository struct {
+			Ref struct {
+				Compare struct {
+					Commits struct {
+						PageInfo struct {
+							EndCursor   githubv4.String
+							HasNextPage bool
+						}
+						Nodes []struct {
+							AssociatedPullRequests struct {
+								PageInfo struct {
+									EndCursor   githubv4.String
+									HasNextPage bool
+								}
+								Nodes []PullRequest
+							} `graphql:"associatedPullRequests(first: $pageSize)"`
+						}
+					} `graphql:"commits(first: $pageSize, after: $cursor)"`
+				} `graphql:"compare(headRef: $currTag)"`
+			} `graphql:"ref(qualifiedName: $prevTag)"`
+		} `graphql:"repository(owner: $owner, name: $repo)"`
+	}
+
+	var cursor *githubv4.String
+	pullRequestsByNumber := map[int]PullRequest{}
+
+	// Paginate through the commits
+	for {
+		if err := client.Query(ctx, &query, map[string]interface{}{
+			"owner":    githubv4.String(owner),
+			"repo":     githubv4.String(repo),
+			"prevTag":  githubv4.String(previousTag),
+			"currTag":  githubv4.String(currentTag),
+			"pageSize": githubv4.Int(PageSize),
+			"cursor":   cursor,
+		}); err != nil {
+			return nil, fmt.Errorf("query repository: %w", err)
+		}
+
+		cursor = &query.Repository.Ref.Compare.Commits.PageInfo.EndCursor
+
+		for _, commit := range query.Repository.Ref.Compare.Commits.Nodes {
+			for _, pr := range commit.AssociatedPullRequests.Nodes {
+				if !pr.Merged {
+					continue
+				}
+
+				if _, ok := pullRequestsByNumber[pr.Number]; ok {
+					continue
+				}
+
+				pullRequestsByNumber[pr.Number] = pr
+			}
+		}
+
+		if !query.Repository.Ref.Compare.Commits.PageInfo.HasNextPage {
+			break
+		}
+	}
+
+	var pullRequests []PullRequest
+	for _, pr := range pullRequestsByNumber {
+		pullRequests = append(pullRequests, pr)
+	}
+	return pullRequests, nil
+}
+
+// FetchPRsForRelease fetches PRs that were actually included in a specific release
+// by filtering out PRs that were merged after the release was published.
+// This prevents PRs that were merged after the release tag was created from being included.
+func FetchPRsForRelease(ctx context.Context, client *githubv4.Client, owner, repo, previousTag, releaseTag string, releasePublishedAt time.Time) ([]PullRequest, error) {
+	// Fetch all PRs between the tags (now includes MergedAt)
+	prs, err := FetchAllPRsBetween(ctx, client, owner, repo, previousTag, releaseTag)
+	if err != nil {
+		return nil, fmt.Errorf("fetch all PRs between tags: %w", err)
+	}
+
+	// Filter based on merge time directly from the PR data
+	var filtered []PullRequest
+	for _, pr := range prs {
+		// Time-based filtering: exclude PRs merged after the release was published
+		if pr.MergedAt != nil && pr.MergedAt.After(releasePublishedAt) {
+			continue
+		}
+
+		// Include the PR only if it has a valid merge time before the release
+		if pr.MergedAt != nil {
+			filtered = append(filtered, pr)
+		}
+	}
+
+	return filtered, nil
+}
diff --git a/.github/actions/linear-release-sync/src/changelog/pull-requests/pr_test.go b/.github/actions/linear-release-sync/src/changelog/pull-requests/pr_test.go
new file mode 100644
index 0000000..c9dbff3
--- /dev/null
+++ b/.github/actions/linear-release-sync/src/changelog/pull-requests/pr_test.go
@@ -0,0 +1,372 @@
+package pullrequests
+
+import (
+	"context"
+	"encoding/json"
+	"net/http"
+	"net/http/httptest"
+	"testing"
+	"time"
+
+	"github.com/shurcooL/githubv4"
+	"golang.org/x/oauth2"
+)
+
+func newTestClient(handler http.Handler) *githubv4.Client {
+	srv := httptest.NewServer(handler)
+	httpClient := oauth2.NewClient(context.Background(), oauth2.StaticTokenSource(
+		&oauth2.Token{AccessToken: "test"},
+	))
+	return githubv4.NewEnterpriseClient(srv.URL, httpClient)
+}
+
+func TestFetchAllPRsBetween_DeduplicatesByPRNumber(t *testing.T) {
+	callCount := 0
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		callCount++
+		// Return the same PR number from two different commits to test deduplication
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"ref": map[string]any{
+						"compare": map[string]any{
+							"commits": map[string]any{
+								"pageInfo": map[string]any{
+									"endCursor":   "",
+									"hasNextPage": false,
+								},
+								"nodes": []any{
+									map[string]any{
+										"associatedPullRequests": map[string]any{
+											"pageInfo": map[string]any{
+												"endCursor":   "",
+												"hasNextPage": false,
+											},
+											"nodes": []any{
+												map[string]any{
+													"merged":      true,
+													"body":        "Fix ENG-1234",
+													"headRefName": "fix/bug",
+													"author":      map[string]any{"login": "dev1"},
+													"number":      1,
+													"mergedAt":    "2024-01-15T10:00:00Z",
+												},
+											},
+										},
+									},
+									map[string]any{
+										"associatedPullRequests": map[string]any{
+											"pageInfo": map[string]any{
+												"endCursor":   "",
+												"hasNextPage": false,
+											},
+											"nodes": []any{
+												// Same PR #1 from a different commit
+												map[string]any{
+													"merged":      true,
+													"body":        "Fix ENG-1234",
+													"headRefName": "fix/bug",
+													"author":      map[string]any{"login": "dev1"},
+													"number":      1,
+													"mergedAt":    "2024-01-15T10:00:00Z",
+												},
+												// Different PR #2
+												map[string]any{
+													"merged":      true,
+													"body":        "Add feature ENG-5678",
+													"headRefName": "feat/new",
+													"author":      map[string]any{"login": "dev2"},
+													"number":      2,
+													"mergedAt":    "2024-01-15T11:00:00Z",
+												},
+											},
+										},
+									},
+								},
+							},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	prs, err := FetchAllPRsBetween(context.Background(), client, "owner", "repo", "v1.0.0", "v1.1.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if len(prs) != 2 {
+		t.Errorf("expected 2 deduplicated PRs, got %d", len(prs))
+	}
+}
+
+func TestFetchAllPRsBetween_SkipsUnmergedPRs(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"ref": map[string]any{
+						"compare": map[string]any{
+							"commits": map[string]any{
+								"pageInfo": map[string]any{
+									"endCursor":   "",
+									"hasNextPage": false,
+								},
+								"nodes": []any{
+									map[string]any{
+										"associatedPullRequests": map[string]any{
+											"pageInfo": map[string]any{
+												"endCursor":   "",
+												"hasNextPage": false,
+											},
+											"nodes": []any{
+												map[string]any{
+													"merged":      true,
+													"body":        "Merged PR",
+													"headRefName": "fix/merged",
+													"author":      map[string]any{"login": "dev1"},
+													"number":      1,
+													"mergedAt":    "2024-01-15T10:00:00Z",
+												},
+												map[string]any{
+													"merged":      false,
+													"body":        "Open PR",
+													"headRefName": "fix/open",
+													"author":      map[string]any{"login": "dev2"},
+													"number":      2,
+												},
+											},
+										},
+									},
+								},
+							},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	prs, err := FetchAllPRsBetween(context.Background(), client, "owner", "repo", "v1.0.0", "v1.1.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if len(prs) != 1 {
+		t.Errorf("expected 1 merged PR, got %d", len(prs))
+	}
+	if len(prs) > 0 && prs[0].Number != 1 {
+		t.Errorf("expected PR #1, got #%d", prs[0].Number)
+	}
+}
+
+func TestFetchAllPRsBetween_Pagination(t *testing.T) {
+	callCount := 0
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		callCount++
+		var resp map[string]any
+		if callCount == 1 {
+			resp = map[string]any{
+				"data": map[string]any{
+					"repository": map[string]any{
+						"ref": map[string]any{
+							"compare": map[string]any{
+								"commits": map[string]any{
+									"pageInfo": map[string]any{
+										"endCursor":   "cursor1",
+										"hasNextPage": true,
+									},
+									"nodes": []any{
+										map[string]any{
+											"associatedPullRequests": map[string]any{
+												"pageInfo": map[string]any{
+													"endCursor":   "",
+													"hasNextPage": false,
+												},
+												"nodes": []any{
+													map[string]any{
+														"merged":      true,
+														"body":        "Page 1 PR",
+														"headRefName": "fix/page1",
+														"author":      map[string]any{"login": "dev1"},
+														"number":      1,
+														"mergedAt":    "2024-01-15T10:00:00Z",
+													},
+												},
+											},
+										},
+									},
+								},
+							},
+						},
+					},
+				},
+			}
+		} else {
+			resp = map[string]any{
+				"data": map[string]any{
+					"repository": map[string]any{
+						"ref": map[string]any{
+							"compare": map[string]any{
+								"commits": map[string]any{
+									"pageInfo": map[string]any{
+										"endCursor":   "cursor2",
+										"hasNextPage": false,
+									},
+									"nodes": []any{
+										map[string]any{
+											"associatedPullRequests": map[string]any{
+												"pageInfo": map[string]any{
+													"endCursor":   "",
+													"hasNextPage": false,
+												},
+												"nodes": []any{
+													map[string]any{
+														"merged":      true,
+														"body":        "Page 2 PR",
+														"headRefName": "fix/page2",
+														"author":      map[string]any{"login": "dev2"},
+														"number":      2,
+														"mergedAt":    "2024-01-15T11:00:00Z",
+													},
+												},
+											},
+										},
+									},
+								},
+							},
+						},
+					},
+				},
+			}
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	prs, err := FetchAllPRsBetween(context.Background(), client, "owner", "repo", "v1.0.0", "v1.1.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if callCount != 2 {
+		t.Errorf("expected 2 API calls for pagination, got %d", callCount)
+	}
+	if len(prs) != 2 {
+		t.Errorf("expected 2 PRs across pages, got %d", len(prs))
+	}
+}
+
+func TestFetchPRsForRelease_FiltersAfterReleaseTime(t *testing.T) {
+	releaseTime := time.Date(2024, 1, 15, 12, 0, 0, 0, time.UTC)
+
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"ref": map[string]any{
+						"compare": map[string]any{
+							"commits": map[string]any{
+								"pageInfo": map[string]any{
+									"endCursor":   "",
+									"hasNextPage": false,
+								},
+								"nodes": []any{
+									map[string]any{
+										"associatedPullRequests": map[string]any{
+											"pageInfo": map[string]any{
+												"endCursor":   "",
+												"hasNextPage": false,
+											},
+											"nodes": []any{
+												map[string]any{
+													"merged":      true,
+													"body":        "Before release",
+													"headRefName": "fix/before",
+													"author":      map[string]any{"login": "dev1"},
+													"number":      1,
+													"mergedAt":    "2024-01-15T10:00:00Z", // 2h before
+												},
+												map[string]any{
+													"merged":      true,
+													"body":        "After release",
+													"headRefName": "fix/after",
+													"author":      map[string]any{"login": "dev2"},
+													"number":      2,
+													"mergedAt":    "2024-01-15T13:00:00Z", // 1h after
+												},
+												map[string]any{
+													"merged":      true,
+													"body":        "Also before release",
+													"headRefName": "fix/also-before",
+													"author":      map[string]any{"login": "dev3"},
+													"number":      3,
+													"mergedAt":    "2024-01-15T11:30:00Z", // 30m before
+												},
+											},
+										},
+									},
+								},
+							},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	prs, err := FetchPRsForRelease(context.Background(), client, "owner", "repo", "v1.0.0", "v1.1.0", releaseTime)
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if len(prs) != 2 {
+		t.Errorf("expected 2 PRs merged before release, got %d", len(prs))
+	}
+
+	for _, pr := range prs {
+		if pr.Number == 2 {
+			t.Errorf("PR #2 (merged after release) should have been filtered out")
+		}
+	}
+}
+
+func TestFetchAllPRsBetween_EmptyResult(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"ref": map[string]any{
+						"compare": map[string]any{
+							"commits": map[string]any{
+								"pageInfo": map[string]any{
+									"endCursor":   "",
+									"hasNextPage": false,
+								},
+								"nodes": []any{},
+							},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	prs, err := FetchAllPRsBetween(context.Background(), client, "owner", "repo", "v1.0.0", "v1.1.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if len(prs) != 0 {
+		t.Errorf("expected 0 PRs, got %d", len(prs))
+	}
+}
diff --git a/.github/actions/linear-release-sync/src/changelog/releases/releases.go b/.github/actions/linear-release-sync/src/changelog/releases/releases.go
new file mode 100644
index 0000000..d502dce
--- /dev/null
+++ b/.github/actions/linear-release-sync/src/changelog/releases/releases.go
@@ -0,0 +1,135 @@
+package releases
+
+import (
+	"context"
+	"fmt"
+
+	semver "github.com/Masterminds/semver/v3"
+	"github.com/shurcooL/githubv4"
+)
+
+const PageSize = 100
+
+// LastStableRelease returns the last stable release for the given repository.
+// It returns the tag name and database id of the release.
+func LastStableRelease(ctx context.Context, client *githubv4.Client, owner, repo string) (string, int, error) {
+	var query struct {
+		Repository struct {
+			LatestRelease struct {
+				CreatedAt  githubv4.DateTime
+				TagName    string
+				DatabaseId int
+			}
+		} `graphql:"repository(owner: $owner, name: $repo)"`
+	}
+
+	if err := client.Query(ctx, &query, map[string]interface{}{
+		"owner": githubv4.String(owner),
+		"repo":  githubv4.String(repo),
+	}); err != nil {
+		return "", 0, fmt.Errorf("query latest release: %w", err)
+	}
+
+	return query.Repository.LatestRelease.TagName, query.Repository.LatestRelease.DatabaseId, nil
+}
+
+func LastStableReleaseBeforeTag(ctx context.Context, client *githubv4.Client, owner, repo, tag string) (string, error) {
+	tagSemver, err := semver.NewVersion(tag)
+	if err != nil {
+		return "", fmt.Errorf("failed to parse tag: %w", err)
+	}
+
+	return LatestStableSemverRange(ctx, client, owner, repo, "< "+tagSemver.String())
+}
+
+func LatestStableSemverRange(ctx context.Context, client *githubv4.Client, owner, repo, tagRangeExpr string) (string, error) {
+	tagRange, err := semver.NewConstraint(tagRangeExpr)
+	if err != nil {
+		return "", fmt.Errorf("failed to parse range: %w", err)
+	}
+
+	var query struct {
+		Repository struct {
+			Releases struct {
+				PageInfo struct {
+					EndCursor   githubv4.String
+					HasNextPage bool
+				}
+				Nodes []struct {
+					TagName      string
+					IsPrerelease bool
+				}
+			} `graphql:"releases(first: $pageSize, after: $cursor, orderBy: { direction: DESC, field: CREATED_AT})"`
+		} `graphql:"repository(owner: $owner, name: $repo)"`
+	}
+
+	var cursor *githubv4.String
+
+	// Paginate through the releases
+	for {
+		if err := client.Query(ctx, &query, map[string]interface{}{
+			"owner":    githubv4.String(owner),
+			"repo":     githubv4.String(repo),
+			"pageSize": githubv4.Int(PageSize),
+			"cursor":   cursor,
+		}); err != nil {
+			return "", fmt.Errorf("query repository: %w", err)
+		}
+
+		cursor = &query.Repository.Releases.PageInfo.EndCursor
+
+		for _, release := range query.Repository.Releases.Nodes {
+			releaseSemver, err := semver.NewVersion(release.TagName)
+			if err != nil {
+				continue
+			}
+
+			if releaseSemver.Prerelease() != "" {
+				continue
+			}
+
+			if release.IsPrerelease {
+				continue
+			}
+
+			if tagRange.Check(releaseSemver) {
+				return release.TagName, nil
+			}
+		}
+
+		if !query.Repository.Releases.PageInfo.HasNextPage {
+			break
+		}
+	}
+
+	return "", nil
+}
+
+type Release struct {
+	PublishedAt githubv4.DateTime
+	Description string
+	Name        string
+	TagName     string
+	DatabaseId  int64
+}
+
+// FetchReleaseByTag fetches a release by its tag name.
+// It returns the release or an error if the release could not be found.
+func FetchReleaseByTag(ctx context.Context, client *githubv4.Client, owner, repo, tag string) (Release, error) {
+	var query struct {
+		Repository struct {
+			Release Release `graphql:"release(tagName: $tag)"`
+		} `graphql:"repository(owner: $owner, name: $repo)"`
+	}
+
+	if err := client.Query(ctx, &query, map[string]interface{}{
+		"owner": githubv4.String(owner),
+		"repo":  githubv4.String(repo),
+		"tag":   githubv4.String(tag),
+	}); err != nil {
+		return Release{}, fmt.Errorf("query release by tag: %w", err)
+	}
+
+	return query.Repository.Release, nil
+}
diff --git a/.github/actions/linear-release-sync/src/changelog/releases/releases_test.go b/.github/actions/linear-release-sync/src/changelog/releases/releases_test.go
new file mode 100644
index 0000000..50d0fbb
--- /dev/null
+++ b/.github/actions/linear-release-sync/src/changelog/releases/releases_test.go
@@ -0,0 +1,300 @@
+package releases
+
+import (
+	"context"
+	"encoding/json"
+	"net/http"
+	"net/http/httptest"
+	"testing"
+
+	"github.com/shurcooL/githubv4"
+	"golang.org/x/oauth2"
+)
+
+func newTestClient(handler http.Handler) *githubv4.Client {
+	srv := httptest.NewServer(handler)
+	httpClient := oauth2.NewClient(context.Background(), oauth2.StaticTokenSource(
+		&oauth2.Token{AccessToken: "test"},
+	))
+	return githubv4.NewEnterpriseClient(srv.URL, httpClient)
+}
+
+func TestLastStableReleaseBeforeTag(t *testing.T) {
+	testCases := []struct {
+		name        string
+		tag         string
+		releases    []map[string]any
+		expected    string
+		expectError bool
+	}{
+		{
+			name: "finds previous stable release",
+			tag:  "v1.2.0",
+			releases: []map[string]any{
+				{"tagName": "v1.2.0", "isPrerelease": false},
+				{"tagName": "v1.1.0", "isPrerelease": false},
+				{"tagName": "v1.0.0", "isPrerelease": false},
+			},
+			expected: "v1.1.0",
+		},
+		{
+			name: "skips pre-releases",
+			tag:  "v1.2.0",
+			releases: []map[string]any{
+				{"tagName": "v1.2.0", "isPrerelease": false},
+				{"tagName": "v1.2.0-rc.1", "isPrerelease": true},
+				{"tagName": "v1.1.0", "isPrerelease": false},
+			},
+			expected: "v1.1.0",
+		},
+		{
+			name: "skips semver pre-releases even if isPrerelease is false",
+			tag:  "v1.2.0",
+			releases: []map[string]any{
+				{"tagName": "v1.2.0", "isPrerelease": false},
+				{"tagName": "v1.2.0-alpha.1", "isPrerelease": false},
+				{"tagName": "v1.1.0", "isPrerelease": false},
+			},
+			expected: "v1.1.0",
+		},
+		{
+			name:        "invalid tag returns error",
+			tag:         "not-semver",
+			releases:    []map[string]any{},
+			expectError: true,
+		},
+	}
+
+	for _, tc := range testCases {
+		t.Run(tc.name, func(t *testing.T) {
+			handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+				resp := map[string]any{
+					"data": map[string]any{
+						"repository": map[string]any{
+							"releases": map[string]any{
+								"pageInfo": map[string]any{
+									"endCursor":   "",
+									"hasNextPage": false,
+								},
+								"nodes": tc.releases,
+							},
+						},
+					},
+				}
+				json.NewEncoder(w).Encode(resp)
+			})
+
+			client := newTestClient(handler)
+			result, err := LastStableReleaseBeforeTag(context.Background(), client, "owner", "repo", tc.tag)
+
+			if tc.expectError {
+				if err == nil {
+					t.Fatal("expected error, got nil")
+				}
+				return
+			}
+
+			if err != nil {
+				t.Fatalf("unexpected error: %v", err)
+			}
+			if result != tc.expected {
+				t.Errorf("expected %q, got %q", tc.expected, result)
+			}
+		})
+	}
+}
+
+func TestLatestStableSemverRange_Pagination(t *testing.T) {
+	callCount := 0
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		callCount++
+		var resp map[string]any
+		if callCount == 1 {
+			resp = map[string]any{
+				"data": map[string]any{
+					"repository": map[string]any{
+						"releases": map[string]any{
+							"pageInfo": map[string]any{
+								"endCursor":   "cursor1",
+								"hasNextPage": true,
+							},
+							"nodes": []any{
+								map[string]any{"tagName": "v2.0.0", "isPrerelease": false},
+								map[string]any{"tagName": "v1.5.0-rc.1", "isPrerelease": true},
+							},
+						},
+					},
+				},
+			}
+		} else {
+			resp = map[string]any{
+				"data": map[string]any{
+					"repository": map[string]any{
+						"releases": map[string]any{
+							"pageInfo": map[string]any{
+								"endCursor":   "",
+								"hasNextPage": false,
+							},
+							"nodes": []any{
+								map[string]any{"tagName": "v1.4.0", "isPrerelease": false},
+								map[string]any{"tagName": "v1.3.0", "isPrerelease": false},
+							},
+						},
+					},
+				},
+			}
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	result, err := LatestStableSemverRange(context.Background(), client, "owner", "repo", "< 1.5.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if callCount != 2 {
+		t.Errorf("expected 2 API calls for pagination, got %d", callCount)
+	}
+	if result != "v1.4.0" {
+		t.Errorf("expected v1.4.0, got %q", result)
+	}
+}
+
+func TestLatestStableSemverRange_NoMatch(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"releases": map[string]any{
+						"pageInfo": map[string]any{
+							"endCursor":   "",
+							"hasNextPage": false,
+						},
+						"nodes": []any{
+							map[string]any{"tagName": "v2.0.0", "isPrerelease": false},
+							map[string]any{"tagName": "v1.5.0", "isPrerelease": false},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	result, err := LatestStableSemverRange(context.Background(), client, "owner", "repo", "< 1.0.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if result != "" {
+		t.Errorf("expected empty string for no match, got %q", result)
+	}
+}
+
+func TestFetchReleaseByTag(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"release": map[string]any{
+						"publishedAt": "2024-01-15T12:00:00Z",
+						"description": "Release notes",
+						"name":        "v1.2.0",
+						"tagName":     "v1.2.0",
+						"databaseId":  42,
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	release, err := FetchReleaseByTag(context.Background(), client, "owner", "repo", "v1.2.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if release.TagName != "v1.2.0" {
+		t.Errorf("expected tag v1.2.0, got %q", release.TagName)
+	}
+	if release.Name != "v1.2.0" {
+		t.Errorf("expected name v1.2.0, got %q", release.Name)
+	}
+	if release.DatabaseId != 42 {
+		t.Errorf("expected databaseId 42, got %d", release.DatabaseId)
+	}
+}
+
+func TestLatestStableSemverRange_InvalidConstraint(t *testing.T) {
+	client := newTestClient(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {}))
+
+	_, err := LatestStableSemverRange(context.Background(), client, "owner", "repo", "not a valid constraint !!!")
+	if err == nil {
+		t.Fatal("expected error for invalid constraint, got nil")
+	}
+}
+
+func TestLastStableRelease(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"latestRelease": map[string]any{
+						"createdAt":  "2024-01-15T12:00:00Z",
+						"tagName":    "v1.5.0",
+						"databaseId": 99,
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	tagName, dbID, err := LastStableRelease(context.Background(), client, "owner", "repo")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if tagName != "v1.5.0" {
+		t.Errorf("expected tag v1.5.0, got %q", tagName)
+	}
+	if dbID != 99 {
+		t.Errorf("expected databaseId 99, got %d", dbID)
+	}
+}
+
+func TestLatestStableSemverRange_SkipsInvalidTagNames(t *testing.T) {
+	handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
+		resp := map[string]any{
+			"data": map[string]any{
+				"repository": map[string]any{
+					"releases": map[string]any{
+						"pageInfo": map[string]any{
+							"endCursor":   "",
+							"hasNextPage": false,
+						},
+						"nodes": []any{
+							map[string]any{"tagName": "not-semver", "isPrerelease": false},
+							map[string]any{"tagName": "v1.0.0", "isPrerelease": false},
+						},
+					},
+				},
+			},
+		}
+		json.NewEncoder(w).Encode(resp)
+	})
+
+	client := newTestClient(handler)
+	result, err := LatestStableSemverRange(context.Background(), client, "owner", "repo", "< 2.0.0")
+	if err != nil {
+		t.Fatalf("unexpected error: %v", err)
+	}
+
+	if result != "v1.0.0" {
+		t.Errorf("expected v1.0.0, got %q", result)
+	}
+}
diff --git a/.github/actions/linear-release-sync/src/flow_test.go b/.github/actions/linear-release-sync/src/flow_test.go
new file mode 100644
index 0000000..1b1397b
--- /dev/null
+++ b/.github/actions/linear-release-sync/src/flow_test.go
@@ -0,0 +1,266 @@
+package main
+
+import (
+	"testing"
+	"time"
+
+	pullrequests "github.com/loft-sh/github-actions/linear-release-sync/changelog/pull-requests"
+	"github.com/shurcooL/githubv4"
+)
+
+func TestLinearPullRequestIssueExtraction(t *testing.T) {
+	// Test data with different issue ID patterns
+	testCases := []struct {
+		name           string
+		prBody         string
+		prBranch       string
+		expectedIssues []string
+		description    string
+	}{
+		{
+			name:           "Single issue in body",
+			prBody:         "This PR fixes ENG-1234",
+			prBranch:       "feature/some-feature",
+			expectedIssues: []string{"eng-1234"},
+			description:    "Should extract single issue ID from PR body",
+		},
+		{
+			name:           "Multiple issues in body",
+			prBody:         "This PR addresses ENG-1234 and also fixes ENG-5678",
+			prBranch:       "feature/multi-fix",
+			expectedIssues: []string{"eng-1234", "eng-5678"},
+			description:    "Should extract multiple issue IDs from PR body",
+		},
+		{
+			name:           "Issue in branch name",
+			prBody:         "Update documentation",
+			prBranch:       "ENG-9012/update-docs",
+			expectedIssues: []string{"eng-9012"},
+			description:    "Should extract issue ID from branch name",
+		},
+		{
+			name:           "Issue in both body and branch",
+			prBody:         "Fix bug ENG-1111",
+			prBranch:       "ENG-2222/fix-implementation",
+			expectedIssues: []string{"eng-1111", "eng-2222"},
+			description:    "Should extract issue IDs from both body and branch",
+		},
+		{
+			name: "No issues found",
prBody: "Simple documentation update", + prBranch: "feature/docs-update", + expectedIssues: []string{}, + description: "Should return empty list when no issues found", + }, + { + name: "Different issue patterns", + prBody: "Fixes ABC-1234, DEF-5678, and GHI-9012", + prBranch: "feature/multiple-patterns", + expectedIssues: []string{"abc-1234", "def-5678", "ghi-9012"}, + description: "Should handle different three-letter prefixes", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + pr := pullrequests.PullRequest{ + Number: 1, + Body: tc.prBody, + HeadRefName: tc.prBranch, + Merged: true, + } + + linearPR := LinearPullRequest{PullRequest: pr, validTeamKeys: nil} + extractedIssues := linearPR.IssueIDs() + + if len(extractedIssues) != len(tc.expectedIssues) { + t.Errorf("%s: expected %d issues, got %d issues", tc.description, len(tc.expectedIssues), len(extractedIssues)) + t.Errorf("Expected: %v, Got: %v", tc.expectedIssues, extractedIssues) + return + } + + // Check that all expected issues are present (order doesn't matter) + expectedMap := make(map[string]bool) + for _, issue := range tc.expectedIssues { + expectedMap[issue] = true + } + + for _, issue := range extractedIssues { + if !expectedMap[issue] { + t.Errorf("%s: unexpected issue ID found: %s", tc.description, issue) + } + delete(expectedMap, issue) + } + + // Check for any missing issues + for issue := range expectedMap { + t.Errorf("%s: expected issue ID not found: %s", tc.description, issue) + } + }) + } +} + +func TestStrictFilteringIntegration(t *testing.T) { + // Simulate the complete flow with mock data + releaseTime := time.Date(2024, 1, 15, 12, 0, 0, 0, time.UTC) + + allPRs := []pullrequests.PullRequest{ + { + Number: 1, + Body: "Fix critical bug ENG-1234", + HeadRefName: "hotfix/critical-bug", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(-2 * time.Hour)}, // Before release + }, + { + Number: 2, + Body: "Add new feature ENG-5678", + 
HeadRefName: "feature/new-feature", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(1 * time.Hour)}, // After release + }, + { + Number: 3, + Body: "Update config ENG-9012", + HeadRefName: "ENG-9012/update-config", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(-30 * time.Minute)}, // Before release + }, + { + Number: 4, + Body: "Documentation update", + HeadRefName: "docs/update", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(-1 * time.Hour)}, // Before release, no issue ID + }, + } + + testCases := []struct { + name string + strictFiltering bool + expectedPRs []int // PR numbers that should be included + expectedIssues []string // Issue IDs that should be extracted + description string + }{ + { + name: "Strict filtering enabled", + strictFiltering: true, + expectedPRs: []int{1, 3, 4}, // Only PRs merged before release + expectedIssues: []string{"eng-1234", "eng-9012"}, // PR 4 has no issue ID + description: "Should include only PRs merged before release", + }, + { + name: "Strict filtering disabled", + strictFiltering: false, + expectedPRs: []int{1, 2, 3, 4}, // All PRs + expectedIssues: []string{"eng-1234", "eng-5678", "eng-9012"}, // All issue IDs + description: "Should include all PRs between tags", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + var pullRequests []LinearPullRequest + + if tc.strictFiltering { + // Apply time-based filtering + var filteredPRs []pullrequests.PullRequest + for _, pr := range allPRs { + if pr.MergedAt != nil && pr.MergedAt.After(releaseTime) { + continue + } + if pr.MergedAt != nil { + filteredPRs = append(filteredPRs, pr) + } + } + pullRequests = NewLinearPullRequests(filteredPRs, nil) + } else { + // Use all PRs + pullRequests = NewLinearPullRequests(allPRs, nil) + } + + // Verify correct PRs are included + if len(pullRequests) != len(tc.expectedPRs) { + t.Errorf("%s: expected %d PRs, got %d PRs", tc.description, 
len(tc.expectedPRs), len(pullRequests)) + } + + // Extract issue IDs + issueSet := make(map[string]bool) + for _, pr := range pullRequests { + if issueIDs := pr.IssueIDs(); len(issueIDs) > 0 { + for _, issueID := range issueIDs { + issueSet[issueID] = true + } + } + } + + // Convert set to slice for comparison + var releasedIssues []string + for issueID := range issueSet { + releasedIssues = append(releasedIssues, issueID) + } + + // Verify correct issues are extracted + if len(releasedIssues) != len(tc.expectedIssues) { + t.Errorf("%s: expected %d issues, got %d issues", tc.description, len(tc.expectedIssues), len(releasedIssues)) + t.Errorf("Expected: %v, Got: %v", tc.expectedIssues, releasedIssues) + return + } + + // Check that all expected issues are present + expectedMap := make(map[string]bool) + for _, issue := range tc.expectedIssues { + expectedMap[issue] = true + } + + for _, issue := range releasedIssues { + if !expectedMap[issue] { + t.Errorf("%s: unexpected issue ID found: %s", tc.description, issue) + } + delete(expectedMap, issue) + } + + // Check for any missing issues + for issue := range expectedMap { + t.Errorf("%s: expected issue ID not found: %s", tc.description, issue) + } + }) + } +} + +// Benchmark test to ensure performance is reasonable +func BenchmarkLinearPullRequestIssueExtraction(b *testing.B) { + pr := pullrequests.PullRequest{ + Number: 1, + Body: "This PR fixes ENG-1234, ENG-5678, and addresses ENG-9012 along with ENG-3456", + HeadRefName: "ENG-7890/complex-fix", + Merged: true, + } + + linearPR := LinearPullRequest{PullRequest: pr, validTeamKeys: nil} + + b.ResetTimer() + for i := 0; i < b.N; i++ { + _ = linearPR.IssueIDs() + } +} + +func BenchmarkTimeBasedFiltering(b *testing.B) { + releaseTime := time.Now() + + // Create test data + prs := make([]pullrequests.PullRequest, 100) + for i := 0; i < 100; i++ { + prs[i] = pullrequests.PullRequest{ + Number: i + 1, + Body: "Test PR body", + Merged: true, + MergedAt: 
&githubv4.DateTime{Time: releaseTime.Add(time.Duration(i-50) * time.Hour)}, // Mix of before/after + } + } + + b.ResetTimer() + for i := 0; i < b.N; i++ { + _ = filterPRsByTime(prs, releaseTime) + } +} diff --git a/.github/actions/linear-release-sync/src/go.mod b/.github/actions/linear-release-sync/src/go.mod new file mode 100644 index 0000000..1562e15 --- /dev/null +++ b/.github/actions/linear-release-sync/src/go.mod @@ -0,0 +1,11 @@ +module github.com/loft-sh/github-actions/linear-release-sync + +go 1.26 + +require ( + github.com/shurcooL/githubv4 v0.0.0-20240120211514-18a1ae0e79dc + github.com/shurcooL/graphql v0.0.0-20230722043721-ed46e5a46466 + golang.org/x/oauth2 v0.25.0 +) + +require github.com/Masterminds/semver/v3 v3.4.0 diff --git a/.github/actions/linear-release-sync/src/go.sum b/.github/actions/linear-release-sync/src/go.sum new file mode 100644 index 0000000..a1db54c --- /dev/null +++ b/.github/actions/linear-release-sync/src/go.sum @@ -0,0 +1,10 @@ +github.com/Masterminds/semver/v3 v3.4.0 h1:Zog+i5UMtVoCU8oKka5P7i9q9HgrJeGzI9SA1Xbatp0= +github.com/Masterminds/semver/v3 v3.4.0/go.mod h1:4V+yj/TJE1HU9XfppCwVMZq3I84lprf4nC11bSS5beM= +github.com/google/go-cmp v0.5.9 h1:O2Tfq5qg4qc4AmwVlvv0oLiVAGB7enBSJ2x2DqQFi38= +github.com/google/go-cmp v0.5.9/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY= +github.com/shurcooL/githubv4 v0.0.0-20240120211514-18a1ae0e79dc h1:vH0NQbIDk+mJLvBliNGfcQgUmhlniWBDXC79oRxfZA0= +github.com/shurcooL/githubv4 v0.0.0-20240120211514-18a1ae0e79dc/go.mod h1:zqMwyHmnN/eDOZOdiTohqIUKUrTFX62PNlu7IJdu0q8= +github.com/shurcooL/graphql v0.0.0-20230722043721-ed46e5a46466 h1:17JxqqJY66GmZVHkmAsGEkcIu0oCe3AM420QDgGwZx0= +github.com/shurcooL/graphql v0.0.0-20230722043721-ed46e5a46466/go.mod h1:9dIRpgIY7hVhoqfe0/FcYp0bpInZaT7dc3BYOprrIUE= +golang.org/x/oauth2 v0.25.0 h1:CY4y7XT9v0cRI9oupztF8AgiIu99L/ksR/Xp/6jrZ70= +golang.org/x/oauth2 v0.25.0/go.mod h1:XYTD2NtWslqkgxebSiOHnXEap4TF09sJSc7H1sXbhtI= diff --git 
a/.github/actions/linear-release-sync/src/linear.go b/.github/actions/linear-release-sync/src/linear.go new file mode 100644 index 0000000..d6ff439 --- /dev/null +++ b/.github/actions/linear-release-sync/src/linear.go @@ -0,0 +1,353 @@ +package main + +import ( + "context" + "errors" + "fmt" + "log/slog" + "net/http" + "strings" + + semver "github.com/Masterminds/semver/v3" + "github.com/shurcooL/graphql" +) + +var ErrNoWorkflowFound = errors.New("no workflow state found") + +// AvailableWorkflowState lists all workflow states for a team (for debugging) +type AvailableWorkflowState struct { + Name string + Team string +} + +// AvailableTeam lists a team with its key (for debugging) +type AvailableTeam struct { + Name string + Key string +} + +type LinearClient struct { + client *graphql.Client + logger *slog.Logger +} + +var _ http.RoundTripper = (*transport)(nil) + +type transport struct { + token string +} + +// RoundTrip implements http.RoundTripper. +func (t *transport) RoundTrip(req *http.Request) (*http.Response, error) { + req.Header.Set("Authorization", t.token) + return http.DefaultTransport.RoundTrip(req) +} + +// NewLinearClient creates a new LinearClient. +func NewLinearClient(ctx context.Context, token string, logger *slog.Logger) LinearClient { + httpClient := &http.Client{ + Transport: &transport{token: token}, + } + client := graphql.NewClient("https://api.linear.app/graphql", httpClient) + + return LinearClient{client: client, logger: logger} +} + +// isStableRelease checks if a version is a stable release (no pre-release suffix). 
+// Returns true for stable releases like v0.26.1, v4.5.0 +// Returns false for pre-releases like v0.26.1-alpha.1, v0.26.1-rc.4, v4.5.0-beta.2 +func isStableRelease(version string) bool { + v, err := semver.NewVersion(version) + if err != nil { + return false + } + return v.Prerelease() == "" +} + +// ListTeams returns all available teams (for debugging workflow state lookup failures) +func (l *LinearClient) ListTeams(ctx context.Context) ([]AvailableTeam, error) { + var query struct { + Teams struct { + Nodes []struct { + Name string + Key string + } + } `graphql:"teams"` + } + + if err := l.client.Query(ctx, &query, nil); err != nil { + return nil, fmt.Errorf("query failed: %w", err) + } + + teams := make([]AvailableTeam, len(query.Teams.Nodes)) + for i, t := range query.Teams.Nodes { + teams[i] = AvailableTeam{Name: t.Name, Key: t.Key} + } + return teams, nil +} + +// ListWorkflowStates returns all workflow states for a team (for debugging workflow state lookup failures) +func (l *LinearClient) ListWorkflowStates(ctx context.Context, teamName string) ([]AvailableWorkflowState, error) { + var query struct { + WorkflowStates struct { + Nodes []struct { + Name string + Team struct { + Name string + } + } + } `graphql:"workflowStates(filter: { team: { name: { eq: $team } } })"` + } + + variables := map[string]any{ + "team": graphql.String(teamName), + } + + if err := l.client.Query(ctx, &query, variables); err != nil { + return nil, fmt.Errorf("query failed: %w", err) + } + + states := make([]AvailableWorkflowState, len(query.WorkflowStates.Nodes)) + for i, s := range query.WorkflowStates.Nodes { + states[i] = AvailableWorkflowState{Name: s.Name, Team: s.Team.Name} + } + return states, nil +} + +// WorkflowStateID returns the ID of the workflow state with the given name for the given team. +// If no matching state is found, it provides debugging information about available teams and states.
+func (l *LinearClient) WorkflowStateID(ctx context.Context, stateName, linearTeamName string) (string, error) { + var query struct { + WorkflowStates struct { + Nodes []struct { + Id string + } + } `graphql:"workflowStates(filter: { name: { eq: $name }, team: { name: { eq: $team } } })"` + } + + variables := map[string]any{ + "name": graphql.String(stateName), + "team": graphql.String(linearTeamName), + } + + if err := l.client.Query(ctx, &query, variables); err != nil { + return "", fmt.Errorf("query failed: %w", err) + } + + if len(query.WorkflowStates.Nodes) == 0 { + // Provide debugging information about available teams and states + debugInfo := fmt.Sprintf("searched for state %q in team %q", stateName, linearTeamName) + + // Try to list available teams + teams, err := l.ListTeams(ctx) + if err == nil && len(teams) > 0 { + teamNames := make([]string, len(teams)) + for i, t := range teams { + teamNames[i] = fmt.Sprintf("%s (%s)", t.Name, t.Key) + } + debugInfo += fmt.Sprintf("; available teams: %s", strings.Join(teamNames, ", ")) + } + + // Try to list available workflow states for the team + states, err := l.ListWorkflowStates(ctx, linearTeamName) + if err == nil && len(states) > 0 { + stateNames := make([]string, len(states)) + for i, s := range states { + stateNames[i] = s.Name + } + debugInfo += fmt.Sprintf("; available states for team: %s", strings.Join(stateNames, ", ")) + } else if err == nil { + debugInfo += "; no states found for team (team may not exist or may have been renamed)" + } + + return "", fmt.Errorf("%w: %s", ErrNoWorkflowFound, debugInfo) + } + + return query.WorkflowStates.Nodes[0].Id, nil +} + +// IssueState returns the current state ID of the issue. +func (l *LinearClient) IssueState(ctx context.Context, issueID string) (string, error) { + stateID, _, err := l.IssueStateDetails(ctx, issueID) + return stateID, err +} + +// IssueStateDetails returns the current state ID and name of the issue. 
+func (l *LinearClient) IssueStateDetails(ctx context.Context, issueID string) (string, string, error) { + details, err := l.GetIssueDetails(ctx, issueID) + if err != nil { + return "", "", err + } + return details.StateID, details.StateName, nil +} + +// IssueDetails contains state, team, and project information for an issue +type IssueDetails struct { + StateID string + StateName string + TeamName string + ProjectName string +} + +// GetIssueDetails returns state, team, and project information for an issue. +func (l *LinearClient) GetIssueDetails(ctx context.Context, issueID string) (*IssueDetails, error) { + var query struct { + Issue struct { + State struct { + Id string + Name string + } + Team struct { + Name string + } + Project struct { + Name string + } + } `graphql:"issue(id: $id)"` + } + + variables := map[string]any{ + "id": graphql.String(issueID), + } + + if err := l.client.Query(ctx, &query, variables); err != nil { + return nil, fmt.Errorf("query failed (issue ID: %v): %w", issueID, err) + } + + return &IssueDetails{ + StateID: query.Issue.State.Id, + StateName: query.Issue.State.Name, + TeamName: query.Issue.Team.Name, + ProjectName: query.Issue.Project.Name, + }, nil +} + +// IsIssueInState checks if an issue is in a specific state. +func (l *LinearClient) IsIssueInState(ctx context.Context, issueID string, stateID string) (bool, error) { + currentState, err := l.IssueState(ctx, issueID) + if err != nil { + return false, fmt.Errorf("get issue state: %w", err) + } + + return currentState == stateID, nil +} + +// IsIssueInStateByName checks if an issue is in a state with the specified name. 
+func (l *LinearClient) IsIssueInStateByName(ctx context.Context, issueID string, stateName string) (bool, error) { + _, currentStateName, err := l.IssueStateDetails(ctx, issueID) + if err != nil { + return false, fmt.Errorf("get issue state details: %w", err) + } + + return currentStateName == stateName, nil +} + +// MoveIssueToState moves the issue to the given state if it's not already there and if it's in the ready for release state. +// It also adds a comment to the issue about when it was first released and on which tag. +// For stable releases on already-released issues, it adds a "now available in stable" comment. +// issueDetails should be pre-fetched via GetIssueDetails to avoid redundant API calls. +func (l *LinearClient) MoveIssueToState(ctx context.Context, dryRun bool, issueID string, issueDetails *IssueDetails, releasedStateID, readyForReleaseStateName, releaseTagName, releaseDate string) error { + // (ThomasK33): Skip CVEs + if strings.HasPrefix(strings.ToLower(issueID), "cve") { + return nil + } + + logger := l.logger + + isStable := isStableRelease(releaseTagName) + + alreadyReleased := issueDetails.StateID == releasedStateID + + // If already in released state: + // - Pre-releases: skip entirely (already released in a previous pre-release) + // - Stable releases: skip state update but add "now available in stable" comment + if alreadyReleased { + if !isStable { + logger.Debug("Issue already has desired state", "issueID", issueID, "stateID", releasedStateID) + return nil + } + logger.Debug("Issue already released, adding stable release comment", "issueID", issueID) + } else { + // Skip issues not in ready for release state + if issueDetails.StateName != readyForReleaseStateName { + logger.Debug("Skipping issue not in ready for release state", "issueID", issueID, "currentState", issueDetails.StateName, "requiredState", readyForReleaseStateName) + return nil + } + + // Update issue state to Released + if !dryRun { + if err := l.updateIssueState(ctx, 
issueID, releasedStateID); err != nil { + return fmt.Errorf("update issue state: %w", err) + } + } else { + logger.Info("Would update issue state", "issueID", issueID, "releasedStateID", releasedStateID) + } + logger.Info("Moved issue to desired state", "issueID", issueID, "stateID", releasedStateID) + } + + // Add release comment + // Use different text for stable releases on already-released issues to avoid + // confusion with the "first released in" pattern used by linear-webhook-service + var releaseComment string + if alreadyReleased && isStable { + releaseComment = fmt.Sprintf("Now available in stable release %v (released %v)", releaseTagName, releaseDate) + } else { + releaseComment = fmt.Sprintf("This issue was first released in %v on %v", releaseTagName, releaseDate) + } + + if !dryRun { + if err := l.createComment(ctx, issueID, releaseComment); err != nil { + return fmt.Errorf("create comment: %w", err) + } + } else { + logger.Info("Would create comment on issue", "issueID", issueID, "comment", releaseComment) + } + + return nil +} + +// updateIssueState updates the state of the given issue. +func (l *LinearClient) updateIssueState(ctx context.Context, issueID, releasedStateID string) error { + var mutation struct { + IssueUpdate struct { + Success bool + } `graphql:"issueUpdate(input: { stateId: $stateID }, id: $issueID)"` + } + + variables := map[string]any{ + "issueID": graphql.String(issueID), + "stateID": graphql.String(releasedStateID), + } + + if err := l.client.Mutate(ctx, &mutation, variables); err != nil { + return fmt.Errorf("mutation failed: %w", err) + } + if !mutation.IssueUpdate.Success { + return fmt.Errorf("mutation failed: issue update returned success=false for %s", issueID) + } + + return nil +} + +// createComment creates a comment on the given issue. 
+func (l *LinearClient) createComment(ctx context.Context, issueID, releaseComment string) error { + var mutation struct { + CommentCreate struct { + Success bool + } `graphql:"commentCreate(input: { issueId: $issueID, body: $body, doNotSubscribeToIssue: true })"` + } + + variables := map[string]any{ + "issueID": graphql.String(issueID), + "body": graphql.String(releaseComment), + } + + if err := l.client.Mutate(ctx, &mutation, variables); err != nil { + return fmt.Errorf("mutation failed: %w", err) + } + if !mutation.CommentCreate.Success { + return fmt.Errorf("mutation failed: comment create returned success=false for %s", issueID) + } + + return nil +} diff --git a/.github/actions/linear-release-sync/src/linear_test.go b/.github/actions/linear-release-sync/src/linear_test.go new file mode 100644 index 0000000..7605cf4 --- /dev/null +++ b/.github/actions/linear-release-sync/src/linear_test.go @@ -0,0 +1,498 @@ +package main + +import ( + "context" + "fmt" + "regexp" + "strings" + "testing" + + pullrequests "github.com/loft-sh/github-actions/linear-release-sync/changelog/pull-requests" +) + +func TestMoveIssueLogic(t *testing.T) { + // Create mock issues with different states + mockIssues := []struct { + ID string + StateName string + StateID string + ShouldMove bool + }{ + {ID: "ENG-1234", StateName: "Ready for Release", StateID: "ready-state-id", ShouldMove: true}, + {ID: "ENG-5678", StateName: "In Progress", StateID: "in-progress-id", ShouldMove: false}, + {ID: "ENG-9012", StateName: "Released", StateID: "released-id", ShouldMove: false}, + {ID: "CVE-1234", StateName: "Ready for Release", StateID: "ready-state-id", ShouldMove: false}, + } + + readyForReleaseStateID := "ready-state-id" + releasedStateID := "released-id" + + for _, issue := range mockIssues { + t.Run(issue.ID, func(t *testing.T) { + shouldMoveIssue := false + + // Skip CVEs + if issue.ID[:3] == "CVE" { + shouldMoveIssue = false + } else if issue.StateID == releasedStateID { + // Already released 
+ shouldMoveIssue = false + } else if issue.StateID == readyForReleaseStateID { + // Ready for release + shouldMoveIssue = true + } else { + // Not in correct state + shouldMoveIssue = false + } + + if shouldMoveIssue != issue.ShouldMove { + t.Errorf("Issue %s: expected shouldMove=%v, got=%v", issue.ID, issue.ShouldMove, shouldMoveIssue) + } + }) + } +} + +// MockLinearClient is a mock implementation of the LinearClient interface for testing +type MockLinearClient struct { + mockIssueStates map[string]string + mockIssueStateNames map[string]string + mockWorkflowIDs map[string]string +} + +func NewMockLinearClient() *MockLinearClient { + return &MockLinearClient{ + mockIssueStates: map[string]string{ + "ENG-1234": "ready-state-id", + "ENG-5678": "in-progress-id", + "ENG-9012": "released-id", + "CVE-1234": "ready-state-id", + }, + mockIssueStateNames: map[string]string{ + "ENG-1234": "Ready for Release", + "ENG-5678": "In Progress", + "ENG-9012": "Released", + "CVE-1234": "Ready for Release", + }, + mockWorkflowIDs: map[string]string{ + "Ready for Release": "ready-state-id", + "Released": "released-id", + "In Progress": "in-progress-id", + }, + } +} + +func (m *MockLinearClient) WorkflowStateID(ctx context.Context, stateName, linearTeamName string) (string, error) { + return m.mockWorkflowIDs[stateName], nil +} + +func (m *MockLinearClient) IssueState(ctx context.Context, issueID string) (string, error) { + return m.mockIssueStates[issueID], nil +} + +func (m *MockLinearClient) IssueStateDetails(ctx context.Context, issueID string) (string, string, error) { + return m.mockIssueStates[issueID], m.mockIssueStateNames[issueID], nil +} + +func (m *MockLinearClient) IsIssueInState(ctx context.Context, issueID string, stateID string) (bool, error) { + currentState, _ := m.IssueState(ctx, issueID) + return currentState == stateID, nil +} + +func (m *MockLinearClient) IsIssueInStateByName(ctx context.Context, issueID string, stateName string) (bool, error) { + _, 
currentStateName, _ := m.IssueStateDetails(ctx, issueID) + return currentStateName == stateName, nil +} + +// MoveIssueToState implementation for tests +func (m *MockLinearClient) MoveIssueToState(ctx context.Context, dryRun bool, issueID, releasedStateID, readyForReleaseStateName, releaseTagName, releaseDate string) error { + // Skip CVEs + if strings.HasPrefix(strings.ToLower(issueID), "cve") { + return nil + } + + currentStateID, currentStateName, _ := m.IssueStateDetails(ctx, issueID) + + // Already in released state + if currentStateID == releasedStateID { + return nil + } + + // Skip if not in ready for release state + if currentStateName != readyForReleaseStateName { + return fmt.Errorf("issue %s not in ready for release state", issueID) + } + + // Only ENG-1234 is expected to be moved successfully + // Explicitly return errors for other issues to ensure the test only counts ENG-1234 + if issueID != "ENG-1234" { + return fmt.Errorf("would not move issue %s for test purposes", issueID) + } + + return nil +} + +func TestIsIssueInState(t *testing.T) { + mockClient := NewMockLinearClient() + ctx := context.Background() + + testCases := []struct { + IssueID string + StateID string + ExpectedResult bool + }{ + {"ENG-1234", "ready-state-id", true}, + {"ENG-1234", "released-id", false}, + {"ENG-5678", "in-progress-id", true}, + {"ENG-9012", "released-id", true}, + } + + for _, tc := range testCases { + t.Run(tc.IssueID+"_"+tc.StateID, func(t *testing.T) { + result, err := mockClient.IsIssueInState(ctx, tc.IssueID, tc.StateID) + if err != nil { + t.Errorf("Unexpected error: %v", err) + } + if result != tc.ExpectedResult { + t.Errorf("Expected IsIssueInState to return %v for issue %s and state %s, but got %v", + tc.ExpectedResult, tc.IssueID, tc.StateID, result) + } + }) + } +} + +func TestMoveIssueStateFiltering(t *testing.T) { + // Create a custom mock client for this test + mockClient := &MockLinearClient{ + mockIssueStates: map[string]string{ + "ENG-1234": 
"ready-state-id", // Ready for release + "ENG-5678": "in-progress-id", // In progress + "ENG-9012": "released-id", // Already released + "CVE-1234": "ready-state-id", // Ready but should be skipped as CVE + }, + mockIssueStateNames: map[string]string{ + "ENG-1234": "Ready for Release", + "ENG-5678": "In Progress", + "ENG-9012": "Released", + "CVE-1234": "Ready for Release", + }, + mockWorkflowIDs: map[string]string{ + "Ready for Release": "ready-state-id", + "Released": "released-id", + "In Progress": "in-progress-id", + }, + } + + ctx := context.Background() + + // Test cases for the overall filtering logic + issueIDs := []string{"ENG-1234", "ENG-5678", "ENG-9012", "CVE-1234"} + readyForReleaseStateName := "Ready for Release" + releasedStateID := "released-id" + + expectedToMove := []string{"ENG-1234"} + actualMoved := []string{} + + // Manually implement the filtering logic based on the actual conditions in LinearClient.MoveIssueToState + for _, issueID := range issueIDs { + // Skip CVEs + if strings.HasPrefix(strings.ToLower(issueID), "cve") { + continue + } + + currentStateID, currentStateName, _ := mockClient.IssueStateDetails(ctx, issueID) + + // Skip if already in released state + if currentStateID == releasedStateID { + continue + } + + // Skip if not in ready for release state + if currentStateName != readyForReleaseStateName { + continue + } + + // This issue would be moved + actualMoved = append(actualMoved, issueID) + } + + // Verify correct issues were selected + if len(actualMoved) != len(expectedToMove) { + t.Errorf("Expected %d issues to move, but got %d", len(expectedToMove), len(actualMoved)) + t.Errorf("Expected: %v, Got: %v", expectedToMove, actualMoved) + } + + // Check that each expected issue is in the actual moved set + for _, expectedID := range expectedToMove { + found := false + for _, actualID := range actualMoved { + if expectedID == actualID { + found = true + break + } + } + + if !found { + t.Errorf("Expected issue %s to be moved, but 
it wasn't in the result set", expectedID) + } + } +} + +func TestIssueIDsExtraction(t *testing.T) { + // Save original regex and restore it after the test + originalRegex := issuesInBodyREs + defer func() { + issuesInBodyREs = originalRegex + }() + + // For testing, use a regex that matches team keys of 2-10 chars and issue numbers 1-5 digits + issuesInBodyREs = []*regexp.Regexp{ + regexp.MustCompile(`(?P<id>\w{2,10}-\d{1,5})`), + } + + testCases := []struct { + name string + body string + headRefName string + expected []string + }{ + { + name: "No issue IDs", + body: "This is a regular PR", + headRefName: "feature/new-thing", + expected: []string{}, + }, + { + name: "Issue ID in body", + body: "This PR fixes ENG-1234", + headRefName: "feature/new-thing", + expected: []string{"eng-1234"}, + }, + { + name: "Issue ID in branch name", + body: "This is a regular PR", + headRefName: "feature/ENG-1234-new-thing", + expected: []string{"eng-1234"}, + }, + { + name: "Multiple issue IDs", + body: "This PR fixes ENG-1234 and ENG-5678", + headRefName: "feature/new-thing", + expected: []string{"eng-1234", "eng-5678"}, + }, + { + name: "Skip CVE IDs", + body: "This PR fixes CVE-1234", + headRefName: "security/fix", + expected: []string{}, + }, + { + name: "Long team key (DEVOPS)", + body: "This PR fixes DEVOPS-471", + headRefName: "feature/infra-update", + expected: []string{"devops-471"}, + }, + { + name: "Short team key (QA)", + body: "This PR fixes QA-42", + headRefName: "feature/test-fix", + expected: []string{"qa-42"}, + }, + { + name: "Mixed team keys", + body: "This PR fixes ENG-1234 and DEVOPS-471", + headRefName: "feature/QA-99-cross-team", + expected: []string{"eng-1234", "devops-471", "qa-99"}, + }, + { + name: "Issue with short number", + body: "This PR fixes ENG-1", + headRefName: "feature/quick-fix", + expected: []string{"eng-1"}, + }, + { + name: "Issue with long number", + body: "This PR fixes ENG-12345", + headRefName: "feature/big-project", + expected: 
[]string{"eng-12345"}, + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + pr := LinearPullRequest{ + PullRequest: pullrequests.PullRequest{ + Body: tc.body, + HeadRefName: tc.headRefName, + }, + validTeamKeys: nil, // nil disables team key filtering + } + + result := pr.IssueIDs() + + if len(result) != len(tc.expected) { + t.Errorf("Expected %d issues, got %d", len(tc.expected), len(result)) + t.Errorf("Expected: %v, Got: %v", tc.expected, result) + return + } + + // Check all expected IDs are found (ignoring order) + for _, expectedID := range tc.expected { + found := false + for _, id := range result { + if strings.EqualFold(id, expectedID) { + found = true + break + } + } + if !found { + t.Errorf("Expected to find issue ID %s but it was not found in %v", expectedID, result) + } + } + }) + } +} + +func TestIsStableRelease(t *testing.T) { + testCases := []struct { + version string + expected bool + }{ + // Stable releases + {"v0.26.1", true}, + {"v4.5.0", true}, + {"v1.0.0", true}, + {"0.26.1", true}, // without v prefix + {"v27.0.0", true}, + + // Invalid versions + {"not-a-version", false}, + {"", false}, + + // Pre-releases + {"v0.26.1-alpha.1", false}, + {"v0.26.1-alpha.5", false}, + {"v0.26.1-beta.1", false}, + {"v0.26.1-rc.1", false}, + {"v0.26.1-rc.4", false}, + {"v0.26.1-dev.1", false}, + {"v0.26.1-pre.1", false}, + {"v0.26.1-next.1", false}, + {"v4.5.0-beta.2", false}, + {"0.27.0-alpha.1", false}, // without v prefix + } + + for _, tc := range testCases { + t.Run(tc.version, func(t *testing.T) { + result := isStableRelease(tc.version) + if result != tc.expected { + t.Errorf("isStableRelease(%q) = %v, want %v", tc.version, result, tc.expected) + } + }) + } +} + +func TestStableReleaseCommentText(t *testing.T) { + // Test the comment text logic for different scenarios + testCases := []struct { + name string + alreadyReleased bool + isStable bool + releaseTag string + releaseDate string + expectedContains string + }{ + { + 
name: "First release (pre-release)", + alreadyReleased: false, + isStable: false, + releaseTag: "v0.27.0-alpha.1", + releaseDate: "2025-01-15", + expectedContains: "first released in", + }, + { + name: "First release (stable)", + alreadyReleased: false, + isStable: true, + releaseTag: "v0.27.0", + releaseDate: "2025-02-01", + expectedContains: "first released in", + }, + { + name: "Stable release on already-released issue", + alreadyReleased: true, + isStable: true, + releaseTag: "v0.27.0", + releaseDate: "2025-02-01", + expectedContains: "Now available in stable release", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + var releaseComment string + if tc.alreadyReleased && tc.isStable { + releaseComment = fmt.Sprintf("Now available in stable release %v (released %v)", tc.releaseTag, tc.releaseDate) + } else { + releaseComment = fmt.Sprintf("This issue was first released in %v on %v", tc.releaseTag, tc.releaseDate) + } + + if !strings.Contains(releaseComment, tc.expectedContains) { + t.Errorf("Comment %q does not contain expected text %q", releaseComment, tc.expectedContains) + } + }) + } +} + +func TestMoveIssueToState_PreReleaseAlreadyReleased(t *testing.T) { + // When an issue is already in Released state and the release is a pre-release, + // MoveIssueToState should skip entirely (no state change, no comment). + // This is tested by replicating the logic since the real method requires a live GraphQL client. 
+ + issueDetails := &IssueDetails{ + StateID: "released-id", + StateName: "Released", + TeamName: "Engineering", + } + releasedStateID := "released-id" + releaseTagName := "v0.27.0-alpha.1" + + isStable := isStableRelease(releaseTagName) + alreadyReleased := issueDetails.StateID == releasedStateID + + if !alreadyReleased { + t.Fatal("expected issue to be already released") + } + if isStable { + t.Fatal("expected pre-release tag to not be stable") + } + + // The code returns nil early for pre-release + already-released — no comment added. + // This is the desired behavior: pre-releases should not add duplicate comments. + if alreadyReleased && !isStable { + // This is the expected early-return path + return + } + t.Error("should have returned early for pre-release on already-released issue") +} + +func TestMoveIssueToState_SkipsWrongState(t *testing.T) { + // Issues not in "Ready for Release" and not already released should be skipped. + issueDetails := &IssueDetails{ + StateID: "in-progress-id", + StateName: "In Progress", + TeamName: "Engineering", + } + releasedStateID := "released-id" + readyForReleaseStateName := "Ready for Release" + + alreadyReleased := issueDetails.StateID == releasedStateID + if alreadyReleased { + t.Fatal("issue should not be in released state") + } + + if issueDetails.StateName == readyForReleaseStateName { + t.Fatal("issue should not be in ready for release state") + } + + // The code skips this issue — no state change, no comment. 
+} diff --git a/.github/actions/linear-release-sync/src/main.go b/.github/actions/linear-release-sync/src/main.go new file mode 100644 index 0000000..ba4da8f --- /dev/null +++ b/.github/actions/linear-release-sync/src/main.go @@ -0,0 +1,298 @@ +package main + +import ( + "context" + "errors" + "flag" + "fmt" + "io" + "log/slog" + "os" + "os/signal" + "strings" + "syscall" + + pullrequests "github.com/loft-sh/github-actions/linear-release-sync/changelog/pull-requests" + "github.com/loft-sh/github-actions/linear-release-sync/changelog/releases" + "github.com/shurcooL/githubv4" + "golang.org/x/oauth2" +) + +var ( + ErrMissingGitHubToken = errors.New("github token must be set") + ErrMissingLinearToken = errors.New("linear token must be set") + ErrMissingReleaseTag = errors.New("release tag must be set") + ErrMissingRepo = errors.New("repo must be set") +) + +func main() { + if err := run(context.Background(), io.Writer(os.Stderr), os.Args); err != nil { + fmt.Fprintf(os.Stderr, "%s\n", err) + os.Exit(1) + } +} + +func run( + ctx context.Context, + stderr io.Writer, + args []string, +) error { + flagset := flag.NewFlagSet(args[0], flag.ExitOnError) + var ( + owner = flagset.String("owner", "loft-sh", "The GitHub owner of the repository") + repo = flagset.String("repo", "", "The GitHub repository name") + githubToken = flagset.String("token", "", "The GitHub token to use for authentication") + previousTag = flagset.String("previous-tag", "", "The previous tag to generate the changelog for (if not set, the last stable release will be used)") + releaseTag = flagset.String("release-tag", "", "The tag of the new release") + debug = flagset.Bool("debug", false, "Enable debug logging") + linearToken = flagset.String("linear-token", "", "The Linear token to use for authentication") + releasedStateName = flagset.String("released-state-name", "Released", "The name of the state to use for the released state") + readyForReleaseStateName = 
flagset.String("ready-for-release-state-name", "Ready for Release", "The name of the state that indicates an issue is ready to be released") + dryRun = flagset.Bool("dry-run", false, "Do not actually move issues to the released state") + strictFiltering = flagset.Bool("strict-filtering", true, "Only include PRs that were actually merged before the release was published (recommended to avoid false positives)") + linearTeams = flagset.String("linear-teams", "", "Comma-separated list of Linear team names to process (optional, default: all)") + linearProjects = flagset.String("linear-projects", "", "Comma-separated list of Linear project names to process (optional, default: all)") + ) + if err := flagset.Parse(args[1:]); err != nil { + return fmt.Errorf("parse flags: %w", err) + } + + if *githubToken == "" { + *githubToken = os.Getenv("GITHUB_TOKEN") + } + + if *linearToken == "" { + *linearToken = os.Getenv("LINEAR_TOKEN") + } + + if *githubToken == "" { + return ErrMissingGitHubToken + } + + if *repo == "" { + return ErrMissingRepo + } + + if *releaseTag == "" { + return ErrMissingReleaseTag + } + + if *linearToken == "" { + return ErrMissingLinearToken + } + + leveler := slog.LevelVar{} + leveler.Set(slog.LevelInfo) + if *debug { + leveler.Set(slog.LevelDebug) + } + + logger := slog.New(slog.NewTextHandler(stderr, &slog.HandlerOptions{ + Level: &leveler, + })) + + teamFilter := parseCSV(*linearTeams) + projectFilter := parseCSV(*linearProjects) + + if len(teamFilter) > 0 { + logger.Info("Filtering by teams", "teams", *linearTeams) + } + if len(projectFilter) > 0 { + logger.Info("Filtering by projects", "projects", *linearProjects) + } + + ctx, stop := signal.NotifyContext(ctx, os.Interrupt, syscall.SIGTERM) + defer stop() + + httpClient := oauth2.NewClient(ctx, oauth2.StaticTokenSource( + &oauth2.Token{ + AccessToken: *githubToken, + }, + )) + + gqlClient := githubv4.NewClient(httpClient) + + var stableTag string + + if *previousTag != "" { + release, err := 
releases.FetchReleaseByTag(ctx, gqlClient, *owner, *repo, *previousTag) + if err != nil { + return fmt.Errorf("fetch release by tag: %w", err) + } + + stableTag = release.TagName + } else { + if prevRelease, err := releases.LastStableReleaseBeforeTag(ctx, gqlClient, *owner, *repo, *releaseTag); err != nil { + return fmt.Errorf("get last stable release before tag: %w", err) + } else if prevRelease != "" { + stableTag = prevRelease + } else { + stableTag, _, err = releases.LastStableRelease(ctx, gqlClient, *owner, *repo) + if err != nil { + return fmt.Errorf("get last stable release: %w", err) + } + } + } + + if stableTag == "" { + return errors.New("no stable release found") + } + + logger.Info("Last stable release", "stableTag", stableTag) + + currentRelease, err := releases.FetchReleaseByTag(ctx, gqlClient, *owner, *repo, *releaseTag) + if err != nil { + return fmt.Errorf("fetch release by tag: %w", err) + } + + if currentRelease.TagName != *releaseTag { + return fmt.Errorf("release not found: %s", *releaseTag) + } + + prs, err := pullrequests.FetchAllPRsBetween(ctx, gqlClient, *owner, *repo, stableTag, *releaseTag) + if err != nil { + return fmt.Errorf("fetch all PRs until: %w", err) + } + + // Create Linear client and fetch valid team keys early to filter false positive issue IDs + linearClient := NewLinearClient(ctx, *linearToken, logger) + teams, err := linearClient.ListTeams(ctx) + if err != nil { + return fmt.Errorf("fetch linear teams: %w", err) + } + validTeamKeys := make(ValidTeamKeys) + for _, team := range teams { + validTeamKeys[strings.ToLower(team.Key)] = struct{}{} + } + logger.Debug("Loaded valid team keys", "count", len(validTeamKeys), "keys", teams) + + var pullRequests []LinearPullRequest + if *strictFiltering { + // Filter PRs to only include those that were actually part of this release + filteredPRs, err := pullrequests.FetchPRsForRelease(ctx, gqlClient, *owner, *repo, stableTag, *releaseTag, currentRelease.PublishedAt.Time) + if err != nil { 
+ return fmt.Errorf("filter PRs for release: %w", err) + } + pullRequests = NewLinearPullRequests(filteredPRs, validTeamKeys) + logger.Info("Found merged pull requests for release", "total", len(prs), "filtered", len(pullRequests), "previous", stableTag, "current", *releaseTag) + } else { + // Use all PRs between tags (original behavior) + pullRequests = NewLinearPullRequests(prs, validTeamKeys) + logger.Info("Found merged pull requests between releases", "count", len(pullRequests), "previous", stableTag, "current", *releaseTag) + } + + releasedIssues := []string{} + + for _, pr := range pullRequests { + if issueIDs := pr.IssueIDs(); len(issueIDs) > 0 { + for _, issueID := range issueIDs { + releasedIssues = append(releasedIssues, issueID) + logger.Debug("Found issue in pull request", "issueID", issueID, "pr", pr.Number) + } + } + } + + // Deduplicate issue IDs - same issue can appear in both PR body and branch name, + // or across multiple PRs referencing the same issue + releasedIssues = deduplicateIssueIDs(releasedIssues) + + logger.Info("Found issues in pull requests", "count", len(releasedIssues)) + + // Cache of team name -> released state ID (looked up on demand) + releasedStateIDByTeam := make(map[string]string) + + // Helper to get released state ID for a team (with caching) + getReleasedStateID := func(teamName string) (string, error) { + if stateID, ok := releasedStateIDByTeam[teamName]; ok { + return stateID, nil + } + stateID, err := linearClient.WorkflowStateID(ctx, *releasedStateName, teamName) + if err != nil { + return "", err + } + releasedStateIDByTeam[teamName] = stateID + logger.Debug("Found released workflow ID for team", "team", teamName, "workflowID", stateID) + return stateID, nil + } + + currentReleaseDateStr := currentRelease.PublishedAt.Format("2006-01-02") + + releasedCount := 0 + skippedCount := 0 + + for _, issueID := range releasedIssues { + // Get issue details including team + issueDetails, err := linearClient.GetIssueDetails(ctx, 
issueID) + if err != nil { + logger.Error("Failed to get issue details", "issueID", issueID, "error", err) + skippedCount++ + continue + } + + // Filter by team if specified + if len(teamFilter) > 0 && !teamFilter.Contains(issueDetails.TeamName) { + logger.Debug("Skipping issue from different team", "issueID", issueID, "team", issueDetails.TeamName, "filter", *linearTeams) + continue + } + + // Filter by project if specified + if len(projectFilter) > 0 && !projectFilter.Contains(issueDetails.ProjectName) { + logger.Debug("Skipping issue from different project", "issueID", issueID, "project", issueDetails.ProjectName, "filter", *linearProjects) + continue + } + + // Get the released state ID for this issue's team + releasedStateID, err := getReleasedStateID(issueDetails.TeamName) + if err != nil { + logger.Error("Failed to get released state for team", "issueID", issueID, "team", issueDetails.TeamName, "error", err) + skippedCount++ + continue + } + + if err := linearClient.MoveIssueToState(ctx, *dryRun, issueID, issueDetails, releasedStateID, *readyForReleaseStateName, currentRelease.TagName, currentReleaseDateStr); err != nil { + logger.Error("Failed to move issue to state", "issueID", issueID, "error", err) + skippedCount++ + } else { + releasedCount++ + } + } + + logger.Info("Linear sync completed", "processed", len(releasedIssues), "released", releasedCount, "skipped", skippedCount) + + return nil +} + +// caseInsensitiveSet is a set of strings that supports case-insensitive lookup. +type caseInsensitiveSet map[string]struct{} + +// parseCSV parses a comma-separated string into a caseInsensitiveSet. +// Returns an empty set for empty input. +func parseCSV(csv string) caseInsensitiveSet { + s := make(caseInsensitiveSet) + for _, item := range strings.Split(csv, ",") { + item = strings.TrimSpace(item) + if item != "" { + s[strings.ToLower(item)] = struct{}{} + } + } + return s +} + +// Contains checks if the set contains the given value (case-insensitive). 
+func (s caseInsensitiveSet) Contains(value string) bool { + _, ok := s[strings.ToLower(value)] + return ok +} + +// deduplicateIssueIDs removes duplicate issue IDs from the slice while preserving order +func deduplicateIssueIDs(issueIDs []string) []string { + seen := make(map[string]bool) + result := make([]string, 0, len(issueIDs)) + for _, id := range issueIDs { + if !seen[id] { + seen[id] = true + result = append(result, id) + } + } + return result +} diff --git a/.github/actions/linear-release-sync/src/main_test.go b/.github/actions/linear-release-sync/src/main_test.go new file mode 100644 index 0000000..ccfb0c8 --- /dev/null +++ b/.github/actions/linear-release-sync/src/main_test.go @@ -0,0 +1,702 @@ +package main + +import ( + "bytes" + "context" + "flag" + "io" + "os" + "strings" + "testing" + "time" + + pullrequests "github.com/loft-sh/github-actions/linear-release-sync/changelog/pull-requests" + "github.com/loft-sh/github-actions/linear-release-sync/changelog/releases" + "github.com/shurcooL/githubv4" +) + +func TestStrictFilteringFlag(t *testing.T) { + testCases := []struct { + name string + args []string + expectedValue bool + description string + }{ + { + name: "Default strict filtering (true)", + args: []string{"linear-sync", "--release-tag", "v1.0.0"}, + expectedValue: true, + description: "Default should be strict filtering enabled", + }, + { + name: "Explicit strict filtering true", + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--strict-filtering=true"}, + expectedValue: true, + description: "Explicitly setting strict filtering to true", + }, + { + name: "Explicit strict filtering false", + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--strict-filtering=false"}, + expectedValue: false, + description: "Explicitly setting strict filtering to false", + }, + { + name: "Explicit strict filtering false with equals", + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--strict-filtering=false"}, + expectedValue: false, + 
description: "Using equals form for boolean flag", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + // Parse flags to test the strict-filtering flag + flagset := flag.NewFlagSet("test", flag.ContinueOnError) + flagset.SetOutput(io.Discard) // Suppress flag parsing output + + var ( + releaseTag = flagset.String("release-tag", "", "The tag of the new release") + strictFiltering = flagset.Bool("strict-filtering", true, "Only include PRs that were actually merged before the release was published") + ) + + err := flagset.Parse(tc.args[1:]) + if err != nil { + t.Fatalf("Failed to parse flags: %v", err) + } + + if *strictFiltering != tc.expectedValue { + t.Errorf("%s: expected strict-filtering=%v, got=%v", tc.description, tc.expectedValue, *strictFiltering) + } + + // Verify release-tag is parsed correctly + if *releaseTag != "v1.0.0" { + t.Errorf("Expected release-tag to be v1.0.0, got %s", *releaseTag) + } + }) + } +} + +func TestLinearSyncLogic_StrictFiltering(t *testing.T) { + // This test simulates the core logic flow with strict filtering + releaseTime := time.Date(2024, 1, 15, 12, 0, 0, 0, time.UTC) + + // Mock data + allPRs := []pullrequests.PullRequest{ + { + Number: 1, + Body: "Fix bug ENG-1234", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(-2 * time.Hour)}, // Before release + }, + { + Number: 2, + Body: "Add feature ENG-5678", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(1 * time.Hour)}, // After release + }, + { + Number: 3, + Body: "Update docs ENG-9012", + Merged: true, + MergedAt: &githubv4.DateTime{Time: releaseTime.Add(-30 * time.Minute)}, // Before release + }, + } + + currentRelease := releases.Release{ + PublishedAt: githubv4.DateTime{Time: releaseTime}, + TagName: "v1.2.0", + } + + testCases := []struct { + name string + strictFiltering bool + expectedPRCount int + expectedIssueCount int + description string + }{ + { + name: "With strict filtering", + strictFiltering: 
true, + expectedPRCount: 2, // Only PRs 1 and 3 (merged before release) + expectedIssueCount: 2, // ENG-1234 and ENG-9012 + description: "Should filter out PRs merged after release", + }, + { + name: "Without strict filtering", + strictFiltering: false, + expectedPRCount: 3, // All PRs + expectedIssueCount: 3, // All issues + description: "Should include all PRs between tags", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + var pullRequests []LinearPullRequest + + if tc.strictFiltering { + // Simulate filtered PRs (would come from FetchPRsForRelease) + filteredPRs := filterPRsByTime(allPRs, currentRelease.PublishedAt.Time) + pullRequests = NewLinearPullRequests(filteredPRs, nil) + } else { + // Use all PRs (original behavior) + pullRequests = NewLinearPullRequests(allPRs, nil) + } + + if len(pullRequests) != tc.expectedPRCount { + t.Errorf("%s: expected %d PRs, got %d PRs", tc.description, tc.expectedPRCount, len(pullRequests)) + } + + // Extract issue IDs + var releasedIssues []string + for _, pr := range pullRequests { + if issueIDs := pr.IssueIDs(); len(issueIDs) > 0 { + releasedIssues = append(releasedIssues, issueIDs...) 
+ } + } + + if len(releasedIssues) != tc.expectedIssueCount { + t.Errorf("%s: expected %d issues, got %d issues", tc.description, tc.expectedIssueCount, len(releasedIssues)) + } + }) + } +} + +// Helper function to simulate the filtering logic +func filterPRsByTime(prs []pullrequests.PullRequest, releaseTime time.Time) []pullrequests.PullRequest { + var filtered []pullrequests.PullRequest + for _, pr := range prs { + if pr.MergedAt != nil && pr.MergedAt.After(releaseTime) { + continue + } + if pr.MergedAt != nil { + filtered = append(filtered, pr) + } + } + return filtered +} + +func TestRunFunction_FlagValidation(t *testing.T) { + testCases := []struct { + name string + envVars map[string]string + args []string + expectError bool + expectedError string + description string + }{ + { + name: "Missing GitHub token", + envVars: map[string]string{ + "LINEAR_TOKEN": "test-linear-token", + }, + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--repo", "vcluster"}, + expectError: true, + expectedError: "github token must be set", + description: "Should fail when GitHub token is missing", + }, + { + name: "Missing Linear token", + envVars: map[string]string{ + "GITHUB_TOKEN": "test-github-token", + }, + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--repo", "vcluster"}, + expectError: true, + expectedError: "linear token must be set", + description: "Should fail when Linear token is missing", + }, + { + name: "Missing release tag", + envVars: map[string]string{ + "GITHUB_TOKEN": "test-github-token", + "LINEAR_TOKEN": "test-linear-token", + }, + args: []string{"linear-sync", "--repo", "vcluster"}, + expectError: true, + expectedError: "release tag must be set", + description: "Should fail when release tag is missing", + }, + { + name: "Missing repo", + envVars: map[string]string{ + "GITHUB_TOKEN": "test-github-token", + "LINEAR_TOKEN": "test-linear-token", + }, + args: []string{"linear-sync", "--release-tag", "v1.0.0"}, + expectError: true, + 
expectedError: "repo must be set", + description: "Should fail when repo is missing", + }, + { + name: "All required parameters provided", + envVars: map[string]string{ + "GITHUB_TOKEN": "test-github-token", + "LINEAR_TOKEN": "test-linear-token", + }, + args: []string{"linear-sync", "--release-tag", "v1.0.0", "--repo", "vcluster"}, + expectError: false, + description: "Should succeed when all required parameters are provided", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + // Set environment variables + for key, value := range tc.envVars { + os.Setenv(key, value) + defer os.Unsetenv(key) + } + + // Clear any existing env vars not in test case + if _, exists := tc.envVars["GITHUB_TOKEN"]; !exists { + os.Unsetenv("GITHUB_TOKEN") + } + if _, exists := tc.envVars["LINEAR_TOKEN"]; !exists { + os.Unsetenv("LINEAR_TOKEN") + } + + var stderr bytes.Buffer + err := run(context.Background(), &stderr, tc.args) + + if tc.expectError { + if err == nil { + t.Errorf("%s: expected error but got none", tc.description) + } else if !strings.Contains(err.Error(), tc.expectedError) { + t.Errorf("%s: expected error containing '%s', got '%s'", tc.description, tc.expectedError, err.Error()) + } + } else { + if err != nil { + // For successful cases, we expect to fail later in the process (API calls) + // but not during initial validation + if strings.Contains(err.Error(), "github token must be set") || + strings.Contains(err.Error(), "linear token must be set") || + strings.Contains(err.Error(), "release tag must be set") || + strings.Contains(err.Error(), "repo must be set") { + t.Errorf("%s: unexpected validation error: %s", tc.description, err.Error()) + } + // Other errors (like API failures) are expected in this test environment + } + } + }) + } +} + +func TestDeduplicateIssueIDs(t *testing.T) { + testCases := []struct { + name string + input []string + expected []string + }{ + { + name: "no duplicates", + input: []string{"eng-1234", "eng-5678", 
"eng-9012"}, + expected: []string{"eng-1234", "eng-5678", "eng-9012"}, + }, + { + name: "with duplicates within single PR (body + branch)", + input: []string{"eng-8061", "eng-8061"}, + expected: []string{"eng-8061"}, + }, + { + name: "with duplicates across multiple PRs", + input: []string{"eng-1234", "eng-5678", "eng-1234", "eng-9012", "eng-5678"}, + expected: []string{"eng-1234", "eng-5678", "eng-9012"}, + }, + { + name: "empty list", + input: []string{}, + expected: []string{}, + }, + { + name: "all duplicates", + input: []string{"eng-1234", "eng-1234", "eng-1234"}, + expected: []string{"eng-1234"}, + }, + { + name: "preserves order", + input: []string{"eng-3333", "eng-1111", "eng-2222", "eng-1111"}, + expected: []string{"eng-3333", "eng-1111", "eng-2222"}, + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + result := deduplicateIssueIDs(tc.input) + + if len(result) != len(tc.expected) { + t.Errorf("expected %d items, got %d", len(tc.expected), len(result)) + return + } + + for i, v := range result { + if v != tc.expected[i] { + t.Errorf("at index %d: expected %q, got %q", i, tc.expected[i], v) + } + } + }) + } +} + +func TestIssueIDs_DuplicateAcrossBodyAndBranch(t *testing.T) { + pr := LinearPullRequest{ + PullRequest: pullrequests.PullRequest{ + Number: 1, + Body: "Fixes ENG-1234", + HeadRefName: "eng-1234/fix-bug", + }, + validTeamKeys: nil, + } + + ids := pr.IssueIDs() + // Same issue appears in both body and branch — IssueIDs returns both, + // deduplication happens later in deduplicateIssueIDs + if len(ids) != 2 { + t.Errorf("expected 2 raw matches (dedup happens upstream), got %d: %v", len(ids), ids) + } + + deduped := deduplicateIssueIDs(ids) + if len(deduped) != 1 { + t.Errorf("expected 1 after dedup, got %d: %v", len(deduped), deduped) + } + if deduped[0] != "eng-1234" { + t.Errorf("expected eng-1234, got %s", deduped[0]) + } +} + +func TestParseCSV(t *testing.T) { + testCases := []struct { + input string + expected int 
+ contains []string + }{ + {"", 0, nil}, + {"Engineering", 1, []string{"Engineering"}}, + {"Engineering,DevOps", 2, []string{"Engineering", "DevOps"}}, + {"Engineering, DevOps, Docs", 3, []string{"Engineering", "DevOps", "Docs"}}, + {" Engineering , ", 1, []string{"Engineering"}}, + {",,", 0, nil}, + } + + for _, tc := range testCases { + t.Run(tc.input, func(t *testing.T) { + result := parseCSV(tc.input) + if len(result) != tc.expected { + t.Errorf("parseCSV(%q): expected %d items, got %d", tc.input, tc.expected, len(result)) + } + for _, v := range tc.contains { + if !result.Contains(v) { + t.Errorf("parseCSV(%q): expected to contain %q", tc.input, v) + } + } + }) + } +} + +func TestCaseInsensitiveSet_Contains(t *testing.T) { + s := parseCSV("Engineering,DevOps") + + if !s.Contains("engineering") { + t.Error("should match lowercase") + } + if !s.Contains("ENGINEERING") { + t.Error("should match uppercase") + } + if !s.Contains("Engineering") { + t.Error("should match mixed case") + } + if s.Contains("Docs") { + t.Error("should not match absent value") + } +} + +func TestTeamAndProjectFiltering(t *testing.T) { + issues := []struct { + ID string + TeamName string + ProjectName string + }{ + {"ENG-1", "Engineering", "vCluster"}, + {"ENG-2", "Engineering", "Platform"}, + {"DOC-1", "Docs", "vCluster"}, + {"DEVOPS-1", "DevOps", ""}, + } + + testCases := []struct { + name string + teamFilter string + projectFilter string + expectedIssueIDs []string + }{ + { + name: "no filters passes everything", + teamFilter: "", + projectFilter: "", + expectedIssueIDs: []string{"ENG-1", "ENG-2", "DOC-1", "DEVOPS-1"}, + }, + { + name: "filter by single team", + teamFilter: "Engineering", + projectFilter: "", + expectedIssueIDs: []string{"ENG-1", "ENG-2"}, + }, + { + name: "filter by multiple teams", + teamFilter: "Engineering,Docs", + projectFilter: "", + expectedIssueIDs: []string{"ENG-1", "ENG-2", "DOC-1"}, + }, + { + name: "filter by project", + teamFilter: "", + projectFilter: 
"vCluster", + expectedIssueIDs: []string{"ENG-1", "DOC-1"}, + }, + { + name: "filter by team and project", + teamFilter: "Engineering", + projectFilter: "Platform", + expectedIssueIDs: []string{"ENG-2"}, + }, + { + name: "filter excludes all", + teamFilter: "NonExistentTeam", + projectFilter: "", + expectedIssueIDs: []string{}, + }, + { + name: "empty project does not match project filter", + teamFilter: "", + projectFilter: "vCluster", + expectedIssueIDs: []string{"ENG-1", "DOC-1"}, + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + teamFilter := parseCSV(tc.teamFilter) + projectFilter := parseCSV(tc.projectFilter) + + var result []string + for _, issue := range issues { + if len(teamFilter) > 0 && !teamFilter.Contains(issue.TeamName) { + continue + } + if len(projectFilter) > 0 && !projectFilter.Contains(issue.ProjectName) { + continue + } + result = append(result, issue.ID) + } + + if len(result) != len(tc.expectedIssueIDs) { + t.Errorf("expected %d issues, got %d: %v", len(tc.expectedIssueIDs), len(result), result) + return + } + + for i, id := range result { + if id != tc.expectedIssueIDs[i] { + t.Errorf("at index %d: expected %q, got %q", i, tc.expectedIssueIDs[i], id) + } + } + }) + } +} + +func TestIssueIDs_InvalidPatterns(t *testing.T) { + testCases := []struct { + name string + body string + branch string + teamKeys ValidTeamKeys + expectedLen int + }{ + { + name: "single char prefix rejected by regex", + body: "Fixes A-1", + branch: "main", + teamKeys: nil, + expectedLen: 0, // regex requires 2+ char prefix + }, + { + name: "too-long prefix matches substring", + body: "Fixes VERYLONGTEAMK-1", + branch: "main", + teamKeys: nil, + expectedLen: 1, // regex matches substring YLONGTEAMK-1 + }, + { + name: "number too long matches substring", + body: "Fixes ENG-123456", + branch: "main", + teamKeys: nil, + expectedLen: 1, // regex matches ENG-12345 (first 5 digits) + }, + { + name: "valid pattern but unknown team filtered out", 
+ body: "Fixes FAKE-1234", + branch: "main", + teamKeys: ValidTeamKeys{"eng": {}}, + expectedLen: 0, + }, + { + name: "numeric-only prefix rejected by regex", + body: "Fixes 123-456", + branch: "main", + teamKeys: nil, + expectedLen: 0, // regex requires [A-Z] prefix + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + pr := LinearPullRequest{ + PullRequest: pullrequests.PullRequest{ + Number: 1, + Body: tc.body, + HeadRefName: tc.branch, + }, + validTeamKeys: tc.teamKeys, + } + + ids := pr.IssueIDs() + if len(ids) != tc.expectedLen { + t.Errorf("expected %d issue IDs, got %d: %v", tc.expectedLen, len(ids), ids) + } + }) + } +} + +func TestExtractTeamKey_NoHyphen(t *testing.T) { + result := extractTeamKey("nodash") + if result != "nodash" { + t.Errorf("expected %q, got %q", "nodash", result) + } +} + +func TestTeamKeyFiltering(t *testing.T) { + // Test that issue IDs are filtered by valid team keys + validKeys := ValidTeamKeys{ + "eng": {}, + "doc": {}, + "devops": {}, + } + + testCases := []struct { + name string + prBody string + prBranch string + validTeamKeys ValidTeamKeys + expectedIssues []string + description string + }{ + { + name: "Filter out invalid team keys", + prBody: "Fixes ENG-1234 and pr-3354", + prBranch: "feature/update", + validTeamKeys: validKeys, + expectedIssues: []string{"eng-1234"}, + description: "Should filter out pr-3354 as 'pr' is not a valid team key", + }, + { + name: "Filter out multiple invalid patterns", + prBody: "Fixes snap-1, ENG-5678, and build-123", + prBranch: "feature/update", + validTeamKeys: validKeys, + expectedIssues: []string{"eng-5678"}, + description: "Should filter out snap-1 and build-123", + }, + { + name: "Allow all valid team keys", + prBody: "Fixes ENG-1234, DOC-567, and DEVOPS-890", + prBranch: "feature/update", + validTeamKeys: validKeys, + expectedIssues: []string{"eng-1234", "doc-567", "devops-890"}, + description: "Should allow all issues with valid team keys", + }, + { + name: 
"Case insensitive team keys", + prBody: "Fixes eng-1234 and ENG-5678", + prBranch: "DOC-999/update", + validTeamKeys: validKeys, + expectedIssues: []string{"eng-1234", "eng-5678", "doc-999"}, + description: "Should match team keys case-insensitively", + }, + { + name: "No filtering when validTeamKeys is nil", + prBody: "Fixes pr-3354 and snap-1", + prBranch: "feature/update", + validTeamKeys: nil, + expectedIssues: []string{"pr-3354", "snap-1"}, + description: "Should not filter when validTeamKeys is nil", + }, + { + name: "Empty validTeamKeys filters everything", + prBody: "Fixes ENG-1234", + prBranch: "feature/update", + validTeamKeys: ValidTeamKeys{}, + expectedIssues: []string{}, + description: "Should filter all issues when validTeamKeys is empty", + }, + } + + for _, tc := range testCases { + t.Run(tc.name, func(t *testing.T) { + pr := LinearPullRequest{ + PullRequest: pullrequests.PullRequest{ + Number: 1, + Body: tc.prBody, + HeadRefName: tc.prBranch, + Merged: true, + }, + validTeamKeys: tc.validTeamKeys, + } + + extractedIssues := pr.IssueIDs() + + if len(extractedIssues) != len(tc.expectedIssues) { + t.Errorf("%s: expected %d issues, got %d issues", tc.description, len(tc.expectedIssues), len(extractedIssues)) + t.Errorf("Expected: %v, Got: %v", tc.expectedIssues, extractedIssues) + return + } + + // Check that all expected issues are present + expectedMap := make(map[string]bool) + for _, issue := range tc.expectedIssues { + expectedMap[issue] = true + } + + for _, issue := range extractedIssues { + if !expectedMap[issue] { + t.Errorf("%s: unexpected issue ID found: %s", tc.description, issue) + } + delete(expectedMap, issue) + } + + for issue := range expectedMap { + t.Errorf("%s: expected issue ID not found: %s", tc.description, issue) + } + }) + } +} + +func TestExtractTeamKey(t *testing.T) { + testCases := []struct { + issueID string + expectedKey string + }{ + {"eng-1234", "eng"}, + {"DOC-567", "doc"}, + {"DEVOPS-890", "devops"}, + {"pr-3354", 
"pr"}, + {"a-1", "a"}, + {"", ""}, + } + + for _, tc := range testCases { + t.Run(tc.issueID, func(t *testing.T) { + result := extractTeamKey(tc.issueID) + if result != tc.expectedKey { + t.Errorf("extractTeamKey(%q) = %q, want %q", tc.issueID, result, tc.expectedKey) + } + }) + } +} diff --git a/.github/actions/linear-release-sync/src/pr.go b/.github/actions/linear-release-sync/src/pr.go new file mode 100644 index 0000000..92fad2d --- /dev/null +++ b/.github/actions/linear-release-sync/src/pr.go @@ -0,0 +1,93 @@ +package main + +import ( + "regexp" + "strings" + + pullrequests "github.com/loft-sh/github-actions/linear-release-sync/changelog/pull-requests" +) + +var issuesInBodyREs = []*regexp.Regexp{ + regexp.MustCompile(`(?i)(?P[A-Z]{2,10}-\d{1,5})`), +} + +// ValidTeamKeys holds a set of known Linear team keys (lowercase) for filtering +type ValidTeamKeys map[string]struct{} + +type LinearPullRequest struct { + pullrequests.PullRequest + validTeamKeys ValidTeamKeys +} + +func NewLinearPullRequests(prs []pullrequests.PullRequest, validTeamKeys ValidTeamKeys) []LinearPullRequest { + linearPRs := make([]LinearPullRequest, 0, len(prs)) + + for _, pr := range prs { + linearPRs = append(linearPRs, LinearPullRequest{ + PullRequest: pr, + validTeamKeys: validTeamKeys, + }) + } + + return linearPRs +} + +// IssueIDs extracts the Linear issue IDs from either the pull requests body +// or it's branch name. +// +// Returns only issue IDs that match known Linear team keys (e.g., ENG-1234, DOC-567). +// Filters out false positives like pr-3354, snap-1 that match the regex pattern +// but aren't actual Linear issues. +// +// Will return an empty slice if no valid issues are found. 
+func (p LinearPullRequest) IssueIDs() []string { + issueIDs := []string{} + + for _, re := range issuesInBodyREs { + for _, body := range []string{p.Body, p.HeadRefName} { + matches := re.FindAllStringSubmatch(body, -1) + if len(matches) == 0 { + continue + } + + for _, match := range matches { + for i, name := range re.SubexpNames() { + issueID := "" + + switch name { + case "issue": + issueID = strings.ToLower(match[i]) + issueID = strings.TrimSpace(issueID) + } + + if strings.HasPrefix(strings.ToLower(issueID), "cve") { + issueID = "" + } + + // Filter by valid team keys if provided + if issueID != "" && p.validTeamKeys != nil { + teamKey := extractTeamKey(issueID) + if _, valid := p.validTeamKeys[teamKey]; !valid { + issueID = "" + } + } + + if issueID != "" { + issueIDs = append(issueIDs, issueID) + } + } + } + } + } + + return issueIDs +} + +// extractTeamKey extracts the team key from an issue ID (e.g., "eng" from "eng-1234") +func extractTeamKey(issueID string) string { + parts := strings.SplitN(issueID, "-", 2) + if len(parts) < 1 { + return "" + } + return strings.ToLower(parts[0]) +} diff --git a/.github/workflows/release-linear-release-sync.yaml b/.github/workflows/release-linear-release-sync.yaml new file mode 100644 index 0000000..23122ab --- /dev/null +++ b/.github/workflows/release-linear-release-sync.yaml @@ -0,0 +1,62 @@ +name: Release linear-release-sync + +on: + push: + tags: + - 'linear-release-sync/v*' + workflow_dispatch: + inputs: + tag: + description: 'Release tag (e.g. linear-release-sync/v1)' + required: true + +permissions: + contents: write + +jobs: + release: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 + with: + persist-credentials: false + + - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # v6.3.0 + with: + go-version-file: .github/actions/linear-release-sync/src/go.mod + cache: false + + - name: Run tests + run: go test ./... 
+ working-directory: .github/actions/linear-release-sync/src + + - name: Build binary + working-directory: .github/actions/linear-release-sync/src + run: >- + CGO_ENABLED=0 GOOS=linux GOARCH=amd64 + go build -trimpath -ldflags="-s -w" + -o ../linear-release-sync-linux-amd64 . + + - name: Generate checksum + working-directory: .github/actions/linear-release-sync + run: sha256sum linear-release-sync-linux-amd64 > linear-release-sync-linux-amd64.sha256 + + - name: Create or update release + env: + GH_TOKEN: ${{ secrets.GITHUB_TOKEN }} + TAG: ${{ github.event.inputs.tag || github.ref_name }} + BINARY: .github/actions/linear-release-sync/linear-release-sync-linux-amd64 + CHECKSUM: .github/actions/linear-release-sync/linear-release-sync-linux-amd64.sha256 + run: | + if gh release view "$TAG" --repo "$GITHUB_REPOSITORY" >/dev/null 2>&1; then + gh release upload "$TAG" \ + "$BINARY" "$CHECKSUM" \ + --repo "$GITHUB_REPOSITORY" \ + --clobber + else + gh release create "$TAG" \ + "$BINARY" "$CHECKSUM" \ + --repo "$GITHUB_REPOSITORY" \ + --title "$TAG" \ + --notes "Automated release for linear-release-sync action." + fi diff --git a/.github/workflows/test-linear-release-sync.yaml b/.github/workflows/test-linear-release-sync.yaml new file mode 100644 index 0000000..c1b8099 --- /dev/null +++ b/.github/workflows/test-linear-release-sync.yaml @@ -0,0 +1,29 @@ +name: Test linear-release-sync + +on: + push: + branches: [main] + paths: + - '.github/actions/linear-release-sync/**' + pull_request: + paths: + - '.github/actions/linear-release-sync/**' + +jobs: + test: + runs-on: ubuntu-latest + permissions: + contents: read + steps: + - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2 + with: + persist-credentials: false + - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # v6.3.0 + with: + go-version-file: .github/actions/linear-release-sync/src/go.mod + - name: Run tests + run: go test ./... 
+        working-directory: .github/actions/linear-release-sync/src
+      - name: Verify binary builds
+        working-directory: .github/actions/linear-release-sync/src
+        run: CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -trimpath -ldflags="-s -w" -o /dev/null .
diff --git a/CLAUDE.md b/CLAUDE.md
index 8f82ece..9bfa611 100644
--- a/CLAUDE.md
+++ b/CLAUDE.md
@@ -6,6 +6,8 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
 - Run all tests: `make test`
 - Test semver-validation: `make test-semver-validation`
 - Test linear-pr-commenter: `make test-linear-pr-commenter`
+- Test linear-release-sync: `make test-linear-release-sync`
+- Build linear-release-sync binary: `make build-linear-release-sync`
 - Lint workflows: `make lint` (requires actionlint and zizmor)
 
 ## Code Style Guidelines
@@ -19,6 +21,15 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co
 - Use input validation and provide helpful error messages
 
 ## Release Process
+
+Actions use per-action tags (e.g. `semver-validation/v1`, `linear-release-sync/v1`).
+
+### YAML-only / Node.js actions (semver-validation, release-notification, etc.)
 - Update code and commit changes
-- Tag the release: `git tag -f v1`
-- Push tag: `git push origin v1 --force`
\ No newline at end of file
+- Tag the release: `git tag -f <action>/v1`
+- Push tag: `git push origin <action>/v1 --force`
+
+### Go actions with pre-built binaries (linear-release-sync)
+These actions download a pre-built binary from a GitHub release at runtime.
+- **New version**: push a new tag (`git tag linear-release-sync/v1 && git push origin linear-release-sync/v1`) — triggers `release-linear-release-sync.yaml` automatically +- **Update existing version**: force-pushing a tag does NOT trigger workflows — use `workflow_dispatch` instead: `gh workflow run release-linear-release-sync.yaml -f tag=linear-release-sync/v1` \ No newline at end of file diff --git a/Makefile b/Makefile index 81bc093..582f41f 100644 --- a/Makefile +++ b/Makefile @@ -1,11 +1,11 @@ -.PHONY: test test-semver-validation test-linear-pr-commenter lint help +.PHONY: test test-semver-validation test-linear-pr-commenter test-linear-release-sync build-linear-release-sync lint help ACTIONS_DIR := .github/actions help: ## show this help @grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf " %-30s %s\n", $$1, $$2}' -test: test-semver-validation test-linear-pr-commenter ## run all action tests +test: test-semver-validation test-linear-pr-commenter test-linear-release-sync ## run all action tests test-semver-validation: ## run semver-validation unit tests cd $(ACTIONS_DIR)/semver-validation && npm ci --silent && NODE_OPTIONS=--experimental-vm-modules npx jest --ci --coverage --watchAll=false @@ -13,6 +13,12 @@ test-semver-validation: ## run semver-validation unit tests test-linear-pr-commenter: ## run linear-pr-commenter unit tests cd $(ACTIONS_DIR)/linear-pr-commenter/src && go test -v ./... +test-linear-release-sync: ## run linear-release-sync unit tests + cd $(ACTIONS_DIR)/linear-release-sync/src && go test -v ./... + +build-linear-release-sync: ## build linear-release-sync binary (linux/amd64) + cd $(ACTIONS_DIR)/linear-release-sync/src && CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -trimpath -ldflags="-s -w" -o ../linear-release-sync-linux-amd64 . 
+ lint: ## run actionlint and zizmor on workflows actionlint .github/workflows/*.yaml zizmor .github/ diff --git a/README.md b/README.md index 382d0a0..23efa92 100644 --- a/README.md +++ b/README.md @@ -38,6 +38,26 @@ Validates whether a given version string follows semantic versioning (semver) fo See [semver-validation README](./.github/actions/semver-validation/README.md) for detailed documentation. +### Linear Release Sync Action + +Syncs Linear issues to the "Released" state when a GitHub release is published. Finds PRs between releases, extracts Linear issue IDs, and moves matching issues from "Ready for Release" to "Released". + +**Location:** `.github/actions/linear-release-sync` + +**Usage:** + +```yaml +- name: Sync Linear issues + uses: loft-sh/github-actions/.github/actions/linear-release-sync@linear-release-sync/v1 + with: + release-tag: ${{ needs.publish.outputs.release_version }} + repo-name: my-repo + github-token: ${{ secrets.GH_ACCESS_TOKEN }} + linear-token: ${{ secrets.LINEAR_TOKEN }} +``` + +See [linear-release-sync README](./.github/actions/linear-release-sync/README.md) for detailed documentation. 
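The extraction and filtering behavior described above can be sketched in isolation. The snippet below is a minimal, hypothetical rendering of the logic in `src/pr.go` (the helper name `extractIssueIDs` is not part of the action's API; the regex mirrors the one the action uses):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// issueIDRe mirrors the action's pattern: a 2-10 letter team key, a hyphen,
// and a 1-5 digit issue number, matched case-insensitively.
var issueIDRe = regexp.MustCompile(`(?i)[A-Z]{2,10}-\d{1,5}`)

// extractIssueIDs lowercases each match, drops CVE identifiers, and keeps only
// IDs whose team key is present in validKeys (a nil map disables filtering).
func extractIssueIDs(text string, validKeys map[string]struct{}) []string {
	ids := []string{}
	for _, m := range issueIDRe.FindAllString(text, -1) {
		id := strings.ToLower(m)
		if strings.HasPrefix(id, "cve") {
			continue // e.g. CVE-2024 matches the pattern but is not a Linear issue
		}
		if validKeys != nil {
			teamKey := strings.SplitN(id, "-", 2)[0]
			if _, ok := validKeys[teamKey]; !ok {
				continue // e.g. pr-3354 from an auto-generated branch name
			}
		}
		ids = append(ids, id)
	}
	return ids
}

func main() {
	valid := map[string]struct{}{"eng": {}, "devops": {}}
	fmt.Println(extractIssueIDs("Fixes ENG-1234, pr-3354 and DEVOPS-471", valid))
	// prints [eng-1234 devops-471]
}
```

As the action's unit tests show, passing nil team keys disables filtering entirely, while an empty (non-nil) set filters out every candidate ID.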
+ ## Available Reusable Workflows ### Validate Renovate Config @@ -97,6 +117,7 @@ Run tests for a specific action: ```bash make test-semver-validation make test-linear-pr-commenter +make test-linear-release-sync ``` Run linters (actionlint + zizmor): @@ -118,6 +139,8 @@ the action's files change: - `test-semver-validation.yaml` - triggers on `.github/actions/semver-validation/**` - `test-linear-pr-commenter.yaml` - triggers on `.github/actions/linear-pr-commenter/**` +- `test-linear-release-sync.yaml` - triggers on `.github/actions/linear-release-sync/**` +- `release-linear-release-sync.yaml` - builds and publishes the binary on tag push or `workflow_dispatch` ### Writing tests for new actions diff --git a/renovate.json b/renovate.json index 443f2ba..1d229ef 100644 --- a/renovate.json +++ b/renovate.json @@ -15,7 +15,14 @@ { "description": "Group linear-pr-commenter Go dependencies", "matchManagers": ["gomod"], + "matchFileNames": [".github/actions/linear-pr-commenter/**"], "groupName": "linear-pr-commenter" + }, + { + "description": "Group linear-release-sync Go dependencies", + "matchManagers": ["gomod"], + "matchFileNames": [".github/actions/linear-release-sync/**"], + "groupName": "linear-release-sync" } ], "customManagers": [