Automation workflows for qui — a qBittorrent automation manager. These workflows manage the complete torrent lifecycle: tagging, maintenance, share limits, and cleanup.
21 automations organized by function, designed for a hardlink-aware setup with cross-seed support.
- qui instance with API access
- qBittorrent with hardlink detection enabled (save path and hardlink target on the same filesystem)
- `curl` and `python3` (for the export script)
Import individual JSON files through the qui web UI, or use the API:
```bash
curl -X POST "http://your-qui:7474/api/instances/${QUI_INSTANCE_ID:-1}/automations" \
  -H "X-API-Key: YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d @tagging/Tag\ -\ tracker\ name.json
```

The `id` field in each JSON is from the source instance and will be reassigned on import. Sort order and conditions are preserved.
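Importing files one at a time gets tedious. A batch loop like the one below can help — this is a sketch, not part of the repo; the `import_cmd` helper is a hypothetical name, and it prints each curl command so it can be reviewed (or piped to `sh`) rather than firing blind:

```shell
#!/bin/sh
# Build the curl import command for one exported automation file.
# Assumes QUI_URL, QUI_API_KEY, and QUI_INSTANCE_ID are already exported.
import_cmd() {
  printf 'curl -fsS -X POST "%s/api/instances/%s/automations" -H "X-API-Key: %s" -H "Content-Type: application/json" -d @"%s"\n' \
    "$QUI_URL" "${QUI_INSTANCE_ID:-1}" "$QUI_API_KEY" "$1"
}

# One command per exported JSON file, in category order
for f in tagging/*.json maintenance/*.json limits/*.json cleanup/*.json; do
  [ -e "$f" ] || continue   # skip globs that matched nothing
  import_cmd "$f"
done
```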
```bash
# Set your qui URL and API key
export QUI_URL="http://your-qui:7474"
export QUI_API_KEY="your-api-key"

# Run the export script
./scripts/export.sh
```

The export script fetches all automations from the API, strips instance-specific fields (`instanceId`, `createdAt`, `updatedAt`), and writes individual JSON files to the categorized directories. When adding new automations, update the `FILE_MAP` in the script.
```
qui_workflows/
├── tagging/      # Tracker name, noHL, stalledDL, tracker issue
├── maintenance/  # Resume, delete unregistered, recheck missing
├── limits/       # Share limits and speed limits by category
├── cleanup/      # Delete rules by category with guards
├── scripts/
│   └── export.sh # Export automations from live instance
└── README.md
```
Limits and cleanup automations are interleaved by sort order (limits at even positions, cleanup at odd) so that each category's cleanup rule runs immediately after its limits rule.
All limit and cleanup automations run on a 1-hour interval (`intervalSeconds: 3600`), except HL-remove-limits, which uses qui's default interval.
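For orientation, an automation JSON might look roughly like this. Field names are reconstructed from the ones this README mentions (`intervalSeconds`, `shareLimits.seedingTimeMinutes`, `conditions.schemaVersion`, `dryRun`); `ratioLimit` and the overall shape are assumptions, so treat this as a sketch, not a template:

```json
{
  "name": "noHL-movies-limits",
  "sortOrder": 10,
  "intervalSeconds": 3600,
  "dryRun": true,
  "conditions": { "schemaVersion": "1" },
  "shareLimits": {
    "ratioLimit": 33,
    "seedingTimeMinutes": 21600
  }
}
```

The values mirror the noHL-movies row below: ratio 33, 15 days of seed time (21600 minutes), sort position 10.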
| Sort | Name | Interval | Action |
|---|---|---|---|
| 1 | Tag: tracker name | 30min | Auto-tag with tracker display name |
| 4 | noHL | 6h | Tag noHL when no external hardlinks and completed 90+ minutes ago (excludes MAM, cross-seeds) |
| 5 | Tag: stalledDL | 30min | Tag stalledDL for stalled downloads |
| 6 | Tag: tracker issue | 30min | Tag issue for errored/down/missing files |
| Sort | Name | Interval | Action |
|---|---|---|---|
| 2 | Resume Incomplete | 6h | Resume stopped torrents that aren't complete |
| 3 | Resume: min seed failsafe | 30min | Resume any non-downloading, non-active torrent seeding < 15 days; also resets share limits to unlimited (re-applied by later limit automations) |
| 7 | Delete: unregistered | 15min | Delete unregistered torrents (90min age guard, tracker not down) |
| 8 | Recheck: missing files | default | Built-in qui action — rechecks torrents with missing files (empty conditions is intentional) |
| Sort | Name | Ratio | Seed Time | Upload Cap |
|---|---|---|---|---|
| 9 | HL-remove-limits | unlimited | unlimited | — |
| 10 | noHL-movies-limits | 33 | 15 days | — |
| 12 | noHL-tv-limits | unlimited | 21 days | — |
| 14 | TL-limits | unlimited | 365 days | ~25 MB/s (25390 KiB/s) |
| 16 | TLnoHL-limits | unlimited | 15 days | ~25 MB/s (25390 KiB/s) |
| 18 | noHL-catchall-limits | 15 | 35 days | — |
| 20 | noHL-xseed-limits | unlimited | 15 days | — |
| Sort | Name | Guard | Delete Trigger |
|---|---|---|---|
| 11 | noHL-movies-cleanup | seed >= 15d | ratio >= 33 OR seed >= 15d |
| 13 | noHL-tv-cleanup | seed >= 15d | ratio >= 3 OR seed >= 21d |
| 15 | TL-cleanup | — | seed >= 365d |
| 17 | TLnoHL-cleanup | seed >= 15d | ratio >= 1.25 OR seed >= 12d |
| 19 | noHL-catchall-cleanup | seed >= 15d | ratio >= 15 OR seed >= 35d |
| 21 | noHL-xseed-cleanup | — | seed >= 15d |
All matching automations fire in `sortOrder` sequence. For conflicting share limits, the last matching automation wins — earlier limits are overwritten by later ones.
This is why limits are ordered from most specific to least specific:
- HL-remove-limits (hardlinked → unlimited)
- Category-specific limits (movies, TV, TL)
- Catchall (everything else)
Delete automations cannot combine with other actions. Each delete rule is a separate automation. Share limits + speed limits can combine in a single automation.
Torrents are routed to rule groups based on conditions. Routing uses HARDLINK_SCOPE directly (not the noHL tag), so it applies immediately without waiting for the 90-minute tagging grace period.
| Condition | Group |
|---|---|
| `HARDLINK_SCOPE = outside_qbittorrent` + not TL + not cross-seed | HL (remove limits, seed forever) |
| `CATEGORY CONTAINS movie` + not hardlinked + not TL + not cross-seed | Movies |
| `CATEGORY CONTAINS tv` + not hardlinked + not TL + not cross-seed | TV |
| `TAGS CONTAINS TorrentLeech` + hardlinked | TL |
| `TAGS CONTAINS TorrentLeech` + not hardlinked | TLnoHL |
| `CATEGORY CONTAINS cross` + not hardlinked | Cross-seed |
| Not hardlinked + not MAM + not TL + not movie + not tv + not cross | Catchall |
- MyAnonamouse (MAM): Excluded from noHL tagging and catchall — seeds forever
- TorrentLeech (TL): Speed-limited to ~25 MB/s upload (25390 KiB/s), separate HL/noHL rules
- Cross-seeds: Dedicated rules — seed 15 days then delete. Cross-seeds with hardlinks are naturally exempt (treated as hardlinked)
- All deletes: Use `deleteWithFilesPreserveCrossSeeds` mode — deletes the torrent but preserves files if a cross-seed instance references them. Exception: unregistered uses `deleteWithFiles` (no cross-seed check since the torrent is already dead on the tracker)
The "Resume: min seed failsafe" (sort 3) catches torrents that get paused or stopped prematurely. It resumes any torrent that is not downloading, not active, and has been seeding less than 15 days.
Important: Cleanup guards should be <= the failsafe threshold to prevent stuck torrents in the gap between guard and failsafe.
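That guard/failsafe relationship can be checked mechanically. A minimal sketch — the `check_guard` helper and `FAILSAFE_DAYS` variable are illustrative, not part of qui or this repo:

```shell
#!/bin/sh
# Hypothetical sanity check: every cleanup guard, in days, should be <=
# the failsafe resume threshold, or a torrent can sit stuck in the gap
# between guard and failsafe.
FAILSAFE_DAYS=15

check_guard() {
  # $1 = rule name, $2 = guard in days (use 0 for rules with no guard)
  if [ "$2" -le "$FAILSAFE_DAYS" ]; then
    echo "$1: OK (guard ${2}d <= failsafe ${FAILSAFE_DAYS}d)"
  else
    echo "$1: WARNING (guard ${2}d > failsafe ${FAILSAFE_DAYS}d)"
  fi
}

# Guards from the cleanup table above
check_guard noHL-movies-cleanup 15
check_guard TLnoHL-cleanup 15
check_guard TL-cleanup 0
```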
The `tag` and `tags` fields in tagging automations contain identical conditions — this is a qui schema requirement (both must be present and match). The `conditions.schemaVersion` field is always `"1"`.
Used in `SEEDING_TIME`, `ADDED_ON_AGE`, `COMPLETION_ON` condition fields:
| Seconds | Human |
|---|---|
| 900 | 15 minutes |
| 1800 | 30 minutes |
| 5400 | 90 minutes |
| 21600 | 6 hours |
| 86400 | 1 day |
| 1036800 | 12 days |
| 1296000 | 15 days |
| 1814400 | 21 days |
| 3024000 | 35 days |
| 31536000 | 365 days |
Used in `shareLimits.seedingTimeMinutes`:
| Minutes | Human |
|---|---|
| 21600 | 15 days |
| 30240 | 21 days |
| 50400 | 35 days |
| 525600 | 365 days |
Warning: Condition fields use seconds while seedingTimeMinutes uses minutes. Mixing these up is an easy mistake that results in dramatically wrong behavior.
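A tiny conversion helper makes the unit split explicit. `days_to_condition_seconds` and `days_to_limit_minutes` are illustrative names, not qui functions:

```shell
#!/bin/sh
# Convert days to the two units qui expects: seconds for condition fields
# (SEEDING_TIME, ADDED_ON_AGE, COMPLETION_ON) and minutes for
# shareLimits.seedingTimeMinutes.
days_to_condition_seconds() { echo $(( $1 * 86400 )); }
days_to_limit_minutes()     { echo $(( $1 * 1440  )); }

days_to_condition_seconds 15    # 1296000, matches the seconds table
days_to_limit_minutes 15        # 21600, matches the minutes table
```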
This setup uses category-based routing (movies/tv/TL/catchall) while TRaSH's workflows use tracker-tier-based routing (Tier 1/2/3). Neither approach is wrong — they reflect different philosophies.
Key differences:
- No tier system: We route by category + tracker tag rather than tracker tier
- No upload speed throttling for non-TL: TRaSH throttles Tier 2 (2 MB/s) and Tier 3 (1 MB/s) after ratio > 2.0. We only throttle TL at 25 MB/s
- No stalled download cleanup: TRaSH deletes stalled downloads < 9.5% progress after 4h
- No problem cross-seed detection: TRaSH tags and deletes cross-seeds that are stalled/missing after 24h
Issues identified during review (documented, not yet applied):
- noHL-movies-cleanup redundant OR: Guard `seed >= 15d` + trigger `ratio >= 33 OR seed >= 15d` — the seed branch is always true when the guard passes, making ratio irrelevant. Every noHL movie deletes at exactly 15d.
- TLnoHL-cleanup guard/trigger mismatch: Guard is `seed >= 15d` but the OR fallback is `seed >= 12d`. The 12d branch can never fire because the 15d guard blocks it. Either lower the guard to `seed >= 12d` (earliest delete at 12d) or raise the OR fallback to `seed >= 15d` (earliest delete at ratio OR 15d).
- Consider adding stalled download cleanup and problem cross-seed detection per TRaSH patterns.
- baker-scripts/StarrScripts — includes `qui-xseed.sh` for cross-seed automation
- TRaSH-Guides/qui_workflows — TRaSH's tier-based reference implementation
These automations are provided as-is with no warranty. They will delete torrents and files based on the configured conditions. Always review the JSON conditions before importing, and test with `dryRun: true` first. The authors are not responsible for any data loss resulting from the use of these workflows.