Description
The Problem
When AI agents use gws drive +upload or gws gmail +send to push files to Google Workspace, those files carry zero provenance metadata. There's no way for downstream consumers to know:
- Was this file AI-generated?
- Which model created it?
- What sources were used?
- Has it been human-reviewed?
With the EU AI Act Article 50 taking effect August 2, 2026, AI-generated content published for public interest must carry transparency metadata. Files flowing through Google Workspace are a primary vector for this.
Current Gap
gws drive +upload report.pdf uploads the file as-is. The Google Drive file properties don't indicate AI provenance. A team member who opens this file in Google Docs has no trust context.
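To make the gap concrete: a downstream consumer who fetches the file's Drive metadata today has nothing to act on. A minimal sketch of the check such a consumer might run (the property names `ai_generated`, `agent`, and `trust_score` are illustrative assumptions, not part of gws or the Drive API):

```python
from typing import Dict, Optional

def trust_context(app_properties: Optional[Dict[str, str]]) -> str:
    """Summarize AI provenance from a Drive file's appProperties dict.

    `app_properties` is what you'd get back from e.g.
    service.files().get(fileId=..., fields="appProperties").
    Today, files uploaded via `gws drive +upload` return None here.
    """
    if not app_properties or app_properties.get("ai_generated") != "true":
        return "no provenance metadata"
    return (f"agent={app_properties.get('agent', 'unknown')}, "
            f"trust={app_properties.get('trust_score', '?')}")
```

Every file pushed by the current `+upload` falls into the "no provenance metadata" branch.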
Possible Approaches
- Pre-upload hook — a skill that stamps files with provenance before `+upload` runs
- Google Drive custom properties — set `ai_generated`, `trust_score`, `source` as file properties via the Drive API
- Built-in `--provenance` flag — `gws drive +upload report.pdf --provenance agent=claude,trust=0.85`
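Approach #2 maps cleanly onto the Drive v3 API's `appProperties` field, which takes string key/value pairs (each key+value pair is limited to 124 bytes). A sketch of building such a payload — the key set beyond `ai_generated`/`trust_score`/`source` and the helper name are assumptions, not an existing gws API:

```python
from typing import Dict, List, Optional

def build_provenance_properties(agent: str, trust: float,
                                sources: Optional[List[str]] = None,
                                human_reviewed: bool = False) -> Dict[str, str]:
    """Build a Drive appProperties dict carrying AI provenance.

    Drive custom properties are string-to-string only, so every value
    is serialized to a short string.
    """
    props = {
        "ai_generated": "true",
        "agent": agent,
        "trust_score": f"{trust:.2f}",
        "human_reviewed": str(human_reviewed).lower(),
    }
    if sources:
        # Truncate to stay within Drive's per-property size limit.
        props["source"] = ",".join(sources)[:100]
    return props

# With google-api-python-client, attaching these after upload would be:
#   service.files().update(
#       fileId=file_id,
#       body={"appProperties": build_provenance_properties("claude", 0.85)},
#   ).execute()
```

Using `appProperties` (visible only to the requesting app) versus `properties` (visible to all apps) is itself a design decision; public transparency metadata probably wants the latter.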
We built a set of skills that implement approach #1 using AKF (an open metadata format): https://github.com/HMAKT99/akf-gws-skills
But curious if the gws team has thought about provenance natively — especially given the EU AI Act timeline.
Context
- EU AI Act Article 50: August 2, 2026
- 47% of developers report distrusting AI-generated outputs (2026 survey data)
- Google Workspace is where most enterprise AI content lands
- https://akf.dev — open format, MIT licensed