A lightweight, automated PostgreSQL backup service that creates compressed backups and uploads them to S3-compatible storage.
## Features

- Scheduled Backups: Configurable cron-style scheduling using robfig/cron
- Compression: Automatic gzip compression to save storage space
- S3 Integration: Upload backups to AWS S3 or S3-compatible storage (MinIO, etc.)
- Retry Logic: Automatic retry on S3 upload failures
- Local Cleanup: Removes local backup files after successful S3 upload
- Flexible Configuration: Configure via YAML file or environment variables
- Docker Support: Ready-to-use Docker container
## Quick Start

Run with Docker:

```bash
docker run -v $(pwd)/config.yaml:/app/config.yaml \
  ghcr.io/010codingcollective/postgres-backup:latest
```

Or with Docker Compose:

```yaml
version: '3.8'
services:
  postgres-backup:
    image: ghcr.io/010codingcollective/postgres-backup:latest
    volumes:
      - ./config.yaml:/app/config.yaml
    environment:
      - POSTGRES_PASSWORD=your_password
      - S3_SECRET_ACCESS_KEY=your_secret_key
```

## Configuration

Configuration can be provided via:
- YAML file (`config.yaml`)
- Environment variables (highest priority)
```yaml
schedule: "@daily"       # Cron format: @daily, @hourly, or "0 2 * * *"
run_at_startup: false    # Run backup immediately on startup
postgres_database: mydb
postgres_user: postgres
postgres_password: secret
postgres_host: localhost
postgres_port: "5432"
postgres_extra_opts: "--schema=public --blobs"
s3:
  endpoint: ""           # Leave empty for AWS S3, or set for MinIO/S3-compatible
  region: us-east-1
  bucket: my-backup-bucket
  access_key_id: AKIAIOSFODNN7EXAMPLE
  secret_access_key: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
  use_path_style: false  # Set to true for MinIO and most S3-compatible services
```

All configuration can be overridden with environment variables:
```bash
SCHEDULE="@daily"
RUN_AT_STARTUP="false"
POSTGRES_DATABASE="mydb"
POSTGRES_USER="postgres"
POSTGRES_PASSWORD="secret"
POSTGRES_HOST="localhost"
POSTGRES_PORT="5432"
POSTGRES_EXTRA_OPTS="--schema=public --blobs"
S3_ENDPOINT=""
S3_REGION="us-east-1"
S3_BUCKET="my-backup-bucket"
S3_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
S3_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
S3_USE_PATH_STYLE="false"
```

The `schedule` field uses cron format. Common examples:
- `@daily` - Run once per day at midnight
- `@hourly` - Run once per hour
- `@every 6h` - Run every 6 hours
- `0 2 * * *` - Run at 2:00 AM daily
- `0 */6 * * *` - Run every 6 hours
- `0 2 * * 0` - Run at 2:00 AM every Sunday
## Backup Storage

Backups are stored in S3 with the following structure:

```
s3://your-bucket/backups/{database}-backup-{timestamp}.sql.gz
```

Example: `backups/mydb-backup-20250128-020000.sql.gz`
## Development

Requirements:

- Go 1.23+
- PostgreSQL client tools (`pg_dump`)

Build the binary:

```bash
go build -o pg-backup ../pg-backup
```

Build the Docker image:

```bash
docker build -t postgres-backup .
```

Or using buildah:

```bash
buildah bud -f Dockerfile -t postgres-backup .
```

## How It Works

- Scheduler starts based on the configured cron schedule
- `pg_dump` creates a PostgreSQL backup
- Compression compresses the backup with gzip
- S3 Upload uploads the compressed backup to S3 (with retry)
- Cleanup removes the local backup file after a successful upload
## Error Handling

- If `pg_dump` fails, the backup job is aborted
- If the S3 upload fails, it retries once
- If both S3 upload attempts fail, the local backup file is retained and the failure is logged
- If S3 is misconfigured, the application fails at startup
## License

MIT License - see the LICENSE file for details.

## Contributing

Contributions are welcome! Please open an issue or submit a pull request.