@ejscheepers ejscheepers commented Nov 13, 2025

This pull request enhances the S3 backup workflow in three ways: local backup files can be purged after a successful upload to save disk space, all backup operations fall back to S3 when the local file is gone, and uploads are organized into per-connection S3 folders.


Part 1: S3 Purge Local Backup After Upload

  • New Configuration: Adds an s3_purge_local boolean setting, integrated across the database, backend, and frontend.
  • Behavior: When enabled, the local .sql backup file is deleted immediately after a successful upload to S3, freeing server disk space.
  • User Control: A new toggle on the S3 Storage Settings page, labeled "Purge Local Backup After Upload", lets administrators opt in to this behavior.

Part 2: Intelligent S3 Fallback for All Backup Operations

To ensure zero disruption when local files are purged, a robust fallback mechanism has been implemented across all backup operations:

  • DownloadBackup(), RestoreBackup(), and CompareBackups() now:

    1. Check for the existence of the local backup file.
    2. If missing but an s3_object_key is stored:
      • Automatically download the file from S3 to a secure temporary location.
      • Proceed with the operation using the downloaded file.
      • Clean up the temporary file upon completion.
    3. If neither local file nor S3 key exists → return a clear, user-friendly error.
  • New Helper: ensureBackupFileAvailable() in backup_service.go centralizes this logic for consistency and maintainability.


Part 3: Automatic S3 Folder Organization by Database Connection

Backups are now automatically organized into subfolders based on the sanitized connection name, transforming flat S3 structures into a clean, scalable hierarchy:

Before:

s3://my-bucket/backups/backup_2025-11-14.sql

After:

s3://my-bucket/backups/production-postgres/backup_2025-11-14.sql
s3://my-bucket/backups/staging-mysql/backup_2025-11-14.sql
  • Backend Implementation:

    • New UploadFileWithPath() method in S3Storage accepts a subfolder.
    • New getObjectKeyWithPath() builds full S3 object keys using connection name.
    • uploadToS3IfEnabled() now passes sanitized connection name as subfolder.
    • All upload callers updated to include connection context.
  • Key Benefits:

    • Easy identification of backups per database.
    • Enables per-connection S3 lifecycle policies (e.g., 30-day retention for prod, 7-day for dev).
    • Fully backward compatible — download/restore uses stored s3_object_key with full path.
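The key-building step can be sketched like this. The exact sanitization rules (lowercase, alphanumerics and hyphens only) are an assumption for illustration, not necessarily what getObjectKeyWithPath() in the PR does:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// unsafeChars matches anything outside lowercase alphanumerics and hyphens.
var unsafeChars = regexp.MustCompile(`[^a-z0-9-]+`)

// sanitizeConnectionName turns a display name into a safe S3 subfolder name.
func sanitizeConnectionName(name string) string {
	s := strings.ToLower(strings.TrimSpace(name))
	s = unsafeChars.ReplaceAllString(s, "-")
	return strings.Trim(s, "-")
}

// getObjectKeyWithPath joins the base prefix, the per-connection subfolder,
// and the backup file name into a full S3 object key.
func getObjectKeyWithPath(prefix, connectionName, fileName string) string {
	folder := sanitizeConnectionName(connectionName)
	if folder == "" {
		return prefix + "/" + fileName // fall back to the flat layout
	}
	return prefix + "/" + folder + "/" + fileName
}

func main() {
	fmt.Println(getObjectKeyWithPath("backups", "Production Postgres", "backup_2025-11-14.sql"))
	// → backups/production-postgres/backup_2025-11-14.sql
}
```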

Frontend Enhancements (TypeScript/React)

  • New Field: s3_object_key?: string added to Backup interface in backup.ts.
  • Visual Indicators:
    • Added blue cloud badge with S3 label and cloud icon.
    • Displayed in:
      • Backup History List (mobile & desktop)
      • Dashboard Recent Activity
    • Consistent styling across light/dark themes.

Key Features & Guarantees

  • Space Saving: enabling purge eliminates local storage bloat.
  • Seamless Fallback: all operations work even with S3-only storage.
  • Visual Clarity: blue S3 badges instantly identify cloud backups.
  • Scalability: supports unlimited database connections.
  • Backward Compatibility: no breaking changes; local workflows are unchanged.
  • Cleanup Safety: temporary files are always removed after the operation.
  • Error Resilience: clear messages if a backup is unavailable.

Result for Users

Administrators can now:

  1. Enable "Purge Local Backup After Upload" in S3 settings.
  2. Upload backups to S3 → local file is automatically deleted.
  3. Continue using Download, Restore, and Compare — system transparently pulls from S3.
  4. See at a glance which backups are in S3 via blue cloud badges.
  5. Save significant server disk space while maintaining full functionality.

@vercel

vercel bot commented Nov 13, 2025

@ejscheepers is attempting to deploy a commit to the Dendi Team on Vercel.

A member of the Team first needs to authorize it.

@ejscheepers
Contributor Author

Apologies for the AI PR description; it was just easier to summarise using it.

@dendianugerah
Owner

Doesn’t matter man haha. I’ve already seen the code, looks really solid. I’ll test it out tomorrow

@ejscheepers
Contributor Author

ejscheepers commented Nov 13, 2025

Well done on the project by the way! I've been looking for a tool like this without bloat for a while now. Currently I use pg_back on the server with a cron job, but the lack of a UI is frustrating.

@dendianugerah dendianugerah merged commit ce8b88c into dendianugerah:main Nov 16, 2025
2 of 3 checks passed
@dendianugerah
Owner

lgtm thanks

@ejscheepers
Contributor Author

Thanks for merging! Just checking — when is the next Docker image build/push scheduled, or does it happen automatically on merge to main?

@dendianugerah
Owner

yeah it happens automatically when merged into main
