Description
What do we want to do about code coverage generation and analysis for PHP unit tests running in GitHub workflows?
The previous drone system ran PHP unit tests via a Makefile target, `make test-php-unit-dbg`, which did the PHP unit test run with coverage enabled. That target was run in multiple pipelines (usually one per database).
The coverage data from each pipeline was copied into some shared storage. Then a coverage analysis pipeline was run that used the coverage reports from all the pipelines.
That system required keeping some shared storage available. GitHub workflows have a feature where you can upload and download "artifacts" (files, directories...) to pass them from one job to another (or to store them after the workflow is finished).
https://docs.github.com/en/actions/tutorials/store-and-share-data
We could upload the coverage data from each PHP unit test job as an artifact, then have a coverage analysis job that downloads all the coverage artifacts and runs the analysis.
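That artifact-based approach could look roughly like the sketch below. This is not taken from any existing configuration: the matrix values, the coverage output path, and the `coverage-report` target are assumptions; `actions/upload-artifact@v4` and `actions/download-artifact@v4` (with its `pattern` and `merge-multiple` inputs) are the real actions.

```yaml
jobs:
  php-unit:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # hypothetical per-database matrix, mirroring the old drone pipelines
        database: [sqlite, mysql, pgsql]
    steps:
      - uses: actions/checkout@v4
      # run the existing coverage-enabled target (assumed output path below)
      - run: make test-php-unit-dbg
      # upload this job's coverage data; one artifact per matrix entry
      - uses: actions/upload-artifact@v4
        with:
          name: coverage-${{ matrix.database }}
          path: tests/output/coverage

  coverage-analysis:
    needs: php-unit
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # download every coverage-* artifact into a single directory
      - uses: actions/download-artifact@v4
        with:
          pattern: coverage-*
          merge-multiple: true
          path: coverage
      # hypothetical target that analyses the combined coverage data
      - run: make coverage-report
```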
Or we could gather coverage for only one PHP unit test run, and analyse it directly in that workflow job.
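The single-run alternative would be simpler, since no artifacts need to be passed between jobs. A sketch, assuming PHPUnit is installed via Composer and Xdebug is available for coverage (the job name and report filename are placeholders):

```yaml
jobs:
  php-unit-coverage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # run the tests once with coverage enabled (one database only)
      # and produce a Clover report for analysis within this same job
      - run: XDEBUG_MODE=coverage vendor/bin/phpunit --coverage-clover clover.xml
```

The trade-off is that coverage would reflect only one database backend, so any database-specific code paths exercised only in the other pipelines would show up as uncovered.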
Do we keep using SonarCloud for the coverage analysis and reporting?