A specialized tool for managing PostgreSQL upgrades in Docker Compose environments. This tool uses Docker Compose's own resolution engine to analyze your project configuration and help you identify and select services and their associated volumes for PostgreSQL upgrade operations.
- Smart Configuration Parsing: Uses `docker compose config` for accurate, resolved configuration analysis
- Upgrade-Focused: Specifically designed for PostgreSQL upgrade workflows with dedicated commands
- No File Path Dependencies: Works from any Docker Compose project directory
- Intuitive Interface: Interactive prompts with arrow-key navigation
- Automated Workflow: Single command performs complete upgrade sequence
- Data Verification: Pre-backup validation, post-upgrade verification for complete workflows, and import statistics for standalone imports
- Volume Verification: Two-tier backup volume mounting verification with lightweight Docker API reconnection and container restart fallback
- Enhanced Volume Validation: Strict validation ensures only proper Docker volumes are used, rejecting bind mounts and requiring complete volume definitions for production safety
- Flexible Commands: Separate commands for export, import, and full upgrade workflows
- Clean Architecture: Separation of CLI concerns from business logic for better maintainability and testability
- Automated Backup Creation: With integrity verification before upgrades
- Docker Image Management: For new PostgreSQL versions (pull and build)
- Service Orchestration: Complete stop/start PostgreSQL container lifecycle
- Backup Import and Restoration: With comprehensive verification for upgrades and statistics display for imports
- Database Statistics Collection: For upgrade verification and import monitoring
- Rich Terminal Output: With colored progress indicators and status messages
- Well-Tested: Comprehensive test suite covering error handling, edge cases, and volume verification
```bash
# Using uv
git clone https://github.com/exilesprx/postgres-upgrader
cd postgres-upgrader
uv sync --dev

# Using pip
git clone https://github.com/exilesprx/postgres-upgrader
cd postgres-upgrader
pip install -e .[dev]

# Using Poetry
git clone https://github.com/exilesprx/postgres-upgrader
cd postgres-upgrader
poetry install --with dev
```
# Check if the tool works (console script)
uv run postgres-upgrader --help
# Or with main.py entry point
uv run main.py --help
# Or with package entry point
uv run python -m postgres_upgrader --help
The tool requires:
- Docker Compose: Must be installed and accessible via `docker compose config`
- Docker Compose Project: Run the tool from a directory containing a `docker-compose.yml` file
- PostgreSQL Credentials: Can be provided via:
  - A `.env` file (recommended): `POSTGRES_USER=your_postgres_user` and `POSTGRES_DB=your_database_name`
  - Docker Compose environment variables: The tool automatically reads resolved credentials from your Docker Compose configuration
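For example, the resolved credentials can be read programmatically through the same helpers used elsewhere in this README (a minimal sketch; it assumes a service named `postgres` and the `parse_docker_compose` API shown in the usage examples below):

```python
from postgres_upgrader import parse_docker_compose

# Resolve the Docker Compose configuration for the current project directory
compose_data = parse_docker_compose()

# Read the credentials Docker Compose resolved for the "postgres" service
# ("postgres" is an example; use the service name from your compose file)
user = compose_data.get_postgres_user("postgres")
database = compose_data.get_postgres_db("postgres")
print(f"Resolved credentials: user={user}, database={database}")
```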
The tool provides three main commands for different workflow needs:
# Navigate to your Docker Compose project directory
cd /path/to/your/docker-compose-project
# Run complete upgrade workflow (console script)
uv run postgres-upgrader upgrade
# Or with main.py entry point
uv run main.py upgrade
# Or use package entry point
uv run python -m postgres_upgrader upgrade
# Create backup without performing upgrade (console script)
uv run postgres-upgrader export
# Or with main.py entry point
uv run main.py export
# Package entry point
uv run python -m postgres_upgrader export
# Import data from existing backup (console script)
uv run postgres-upgrader import
# Or with main.py entry point
uv run main.py import
# Package entry point
uv run python -m postgres_upgrader import
# Show all available commands (console script)
uv run postgres-upgrader --help
# Or with main.py entry point
uv run main.py --help
# Package entry point
uv run python -m postgres_upgrader --help
Each command follows these patterns:
Export Command:
- Analyze Docker Compose configuration
- Prompt for service and volume selection
- Collect baseline database statistics
- Create backup and verify integrity
- Display backup statistics and location
Import Command:
- Analyze Docker Compose configuration
- Prompt for service and volume selection
- Start PostgreSQL service container
- Verify backup file integrity
- Verify backup volume mounting
- Import data from backup
- Update collation version
- Display import statistics
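The collation update step corresponds to PostgreSQL's `ALTER DATABASE ... REFRESH COLLATION VERSION` statement. A minimal sketch of how it can be issued inside the running container with the Docker SDK (illustrative only; the container lookup and error handling in `docker.py` will differ):

```python
import docker

def refresh_collation_version(container_name: str, user: str, database: str) -> None:
    """Refresh the database collation version inside the PostgreSQL container (sketch)."""
    client = docker.from_env()
    container = client.containers.get(container_name)
    exit_code, output = container.exec_run(
        [
            "psql", "-U", user, "-d", database,
            "-c", f'ALTER DATABASE "{database}" REFRESH COLLATION VERSION;',
        ],
        user="postgres",  # container user, typically "postgres"
    )
    if exit_code != 0:
        raise RuntimeError(output.decode())
```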
Upgrade Command (Complete Workflow):
- Analyze Docker Compose configuration
- Prompt for service and volume selection
- Collect baseline database statistics
- Create backup and verify integrity
- Stop and remove PostgreSQL service container
- Update image and rebuild service container
- Remove old data volume
- Start service with new PostgreSQL version
- Verify backup volume mounting
- Import data and verify upgrade restoration success
- Update collation version
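The final verification compares post-import statistics against the baseline collected before the backup. A simplified illustration of that comparison (hypothetical helper; the actual check in `postgres.py` may compare additional fields such as database size):

```python
def verify_restoration(baseline: dict, restored: dict) -> bool:
    """Compare pre-backup and post-import statistics (illustrative only)."""
    checks = {
        "tables": baseline.get("tables") == restored.get("tables"),
        "estimated_rows": baseline.get("estimated_rows") == restored.get("estimated_rows"),
    }
    for name, matched in checks.items():
        print(f"{name}: {'ok' if matched else 'MISMATCH'}")
    return all(checks.values())

# Values matching the example output shown below
baseline = {"tables": 5, "estimated_rows": 1000}
restored = {"tables": 5, "estimated_rows": 1000}
assert verify_restoration(baseline, restored)
```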
[?] Select a service to inspect::
nginx
> postgres
[?] Select the main volume::
> database:/var/lib/postgresql/data
backups:/tmp/postgresql/backups
Collecting database statistics...
Current database: 5 tables, 25 MB
Creating backup...
Backup created successfully: /tmp/postgresql/backups/backup-20251001_165130.sql
Verifying backup integrity...
Backup verified: 12345 bytes, ~5 tables
# Import command output:
Import statistics:
Tables imported: 5
Estimated rows: 1000
Database size: 25 MB
# Upgrade command output:
Upgrade verification successful:
Tables: 5 (original: 5)
Estimated rows: 1000
Database size: 25 MB
PostgreSQL upgrade completed successfully!
This tool uses Docker Compose's own configuration resolution via the `docker compose config` command to get the exact same configuration that Docker Compose would use, including:
- Environment Variables: Automatically resolves all variable substitutions
- Volume Prefixes: Gets actual volume names with project prefixes (e.g., `postgres-upgrader_database`)
- Network Resolution: Handles complex networking configurations
- Real-time Configuration: Always reflects current project state
- Error Prevention: Eliminates manual parsing inconsistencies
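A minimal sketch of this resolution step, assuming the tool shells out to `docker compose config` and parses the YAML output with `pyyaml` (as the project structure suggests); the actual `compose_inspector.py` implementation may differ:

```python
import subprocess

import yaml  # pyyaml is a project dependency

def resolve_compose_config() -> dict:
    """Ask Docker Compose for the fully resolved project configuration."""
    result = subprocess.run(
        ["docker", "compose", "config"],
        capture_output=True,
        text=True,
        check=True,  # raise if docker compose reports an error
    )
    return yaml.safe_load(result.stdout)

config = resolve_compose_config()
# Resolved names include the project prefix, e.g. "postgres-upgrader_database"
print(config.get("name"), list(config.get("volumes", {}).keys()))
```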
The project follows a clean architecture with separation of concerns:
- Separation of Concerns: CLI logic separated from business logic through a dedicated `Postgres` orchestration class
- Context Manager: Automatic Docker client lifecycle management with proper resource cleanup
- Instance Variables: Methods use stored credentials rather than requiring parameters, reducing errors and improving consistency
- Command-Based Interface: Dedicated commands for export, import, and upgrade workflows allowing flexible usage patterns

The main modules are:
- `main.py`: CLI entry point handling argument parsing and command routing
- `postgres.py`: Business logic orchestration class managing upgrade workflows
- `docker.py`: Docker infrastructure operations and PostgreSQL database interactions
- `compose_inspector.py`: Docker Compose configuration parsing and resolution with enhanced volume validation
- `prompt.py`: User interaction and service/volume selection interfaces
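A condensed sketch of how the CLI layer can route subcommands to the orchestration class (illustrative only; it assumes argparse-style parsing in `main.py` and uses the `Postgres` handler methods shown in the API examples below):

```python
import argparse

from rich.console import Console

from postgres_upgrader.postgres import Postgres

def main() -> None:
    parser = argparse.ArgumentParser(prog="postgres-upgrader")
    subparsers = parser.add_subparsers(dest="command", required=True)
    for name in ("export", "import", "upgrade"):
        subparsers.add_parser(name)

    args = parser.parse_args()
    postgres = Postgres(Console())

    # Route each subcommand to the matching business-logic handler
    handlers = {
        "export": postgres.handle_export_command,
        "import": postgres.handle_import_command,
        "upgrade": postgres.handle_upgrade_command,
    }
    handlers[args.command](args)

if __name__ == "__main__":
    main()
```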
The tool implements strict volume validation to ensure production safety:
- Volume Type Enforcement: Only Docker volumes are supported; bind mounts are rejected to prevent accidental host filesystem access
- Complete Volume Definitions: All volume configurations must have resolved names and proper definitions in the Docker Compose volumes section
- Early Error Detection: Configuration validation happens before any Docker operations begin, providing clear error messages for misconfigurations
- Production-Safe Defaults: The validation ensures all volume operations are container-safe and don't accidentally access host directories
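A simplified sketch of what this validation amounts to (hypothetical helper; the real checks live in `compose_inspector.py` and may be stricter):

```python
def validate_volume(volume_entry: dict, declared_volumes: dict) -> str:
    """Accept only named Docker volumes with complete definitions (illustrative).

    volume_entry: one resolved entry from a service's "volumes" list
    declared_volumes: the top-level "volumes" section of the resolved config
    """
    if volume_entry.get("type") != "volume":
        # Bind mounts ("bind") would expose the host filesystem, so reject them
        raise ValueError(f"Unsupported volume type: {volume_entry.get('type')!r}")

    name = volume_entry.get("source")
    if not name or name not in declared_volumes:
        raise ValueError("Volume must be declared in the top-level 'volumes' section")

    return name
```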
```python
from postgres_upgrader import parse_docker_compose

# Parse the resolved Docker Compose configuration
compose_data = parse_docker_compose()

# Get services and volumes
services = compose_data.services
print("Available services:", list(services.keys()))

volumes = compose_data.get_volumes("postgres")
print("Postgres volumes:", [v.raw for v in volumes])

# Access volume information
backup_volume = next((v for v in volumes if v.name == "backups"), None)
data_volume = next((v for v in volumes if v.name == "database"), None)

if backup_volume:
    print(f"Backup volume path: {backup_volume.path}")
    print(f"Resolved volume name: {backup_volume.resolved_name}")  # e.g., "postgres-upgrader_backups"

if data_volume:
    print(f"Data volume path: {data_volume.path}")
    print(f"Resolved volume name: {data_volume.resolved_name}")  # e.g., "postgres-upgrader_database"
```
```python
from postgres_upgrader import parse_docker_compose, identify_service_volumes

compose_data = parse_docker_compose()
volume_config = identify_service_volumes(compose_data)

if volume_config:
    service_name = volume_config.name
    if volume_config.selected_backup_volume:
        backup_path = volume_config.selected_backup_volume.path
        print(f"Selected service: {service_name}")
        print(f"Backup directory: {backup_path}")
```
```python
# Use the Postgres orchestration class for programmatic workflows
from postgres_upgrader.postgres import Postgres
from rich.console import Console

console = Console()
postgres = Postgres(console)

# Run individual workflows
try:
    # Create backup only
    postgres.handle_export_command(args=None)

    # Import from backup
    postgres.handle_import_command(args=None)

    # Complete upgrade workflow
    postgres.handle_upgrade_command(args=None)
except Exception as e:
    console.print(f"Error: {e}", style="bold red")
```
```python
# Direct backup workflow using DockerManager
from postgres_upgrader import (
    parse_docker_compose,
    identify_service_volumes,
    DockerManager,
    prompt_container_user,
)

compose_data = parse_docker_compose()
selected_service = identify_service_volumes(compose_data)

if selected_service:
    service_name = selected_service.name

    # Get credentials from Docker Compose environment
    user = compose_data.get_postgres_user(service_name)
    database = compose_data.get_postgres_db(service_name)

    # Get container user (typically "postgres")
    container_user = prompt_container_user()

    # Create backup using DockerManager with all required parameters
    with DockerManager(compose_data.name, selected_service, container_user, user, database) as docker_mgr:
        backup_path = docker_mgr.create_postgres_backup()
        print(f"Backup created: {backup_path}")
```
# Run all tests
uv run pytest
# Run with verbose output
uv run pytest -v
# Run specific test file
uv run pytest tests/test_parse_docker_compose.py
# Run with coverage
uv run pytest --cov=src/postgres_upgrader
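A minimal example of the mocking style such tests can use (a sketch only; it assumes `parse_docker_compose` shells out via `subprocess.run` and exposes a `.services` mapping as shown in the API examples, which may not match the real test suite):

```python
from unittest.mock import MagicMock, patch

from postgres_upgrader import parse_docker_compose

RESOLVED_CONFIG = """
name: postgres-upgrader
services:
  postgres:
    image: postgres:16
volumes:
  database: {}
"""

def test_parse_docker_compose_reads_resolved_config():
    fake_result = MagicMock(stdout=RESOLVED_CONFIG, returncode=0)
    with patch("subprocess.run", return_value=fake_result):
        compose_data = parse_docker_compose()
    assert "postgres" in compose_data.services
```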
The project includes development dependencies for code quality:
# Install with dev dependencies (includes ruff)
uv sync --group dev
# Code linting and formatting (ruff is included as dev dependency)
uv run ruff check # Linting
uv run ruff format # Code formatting
# Optional: Install additional tools
uv add --dev mypy coverage
# With additional tools:
uv run mypy src/ # Type checking
uv run pytest --cov=src/postgres_upgrader # Coverage reporting
```
postgres-upgrader/
├── src/postgres_upgrader/             # Main package
│   ├── __init__.py                    # Package exports
│   ├── compose_inspector.py           # Docker Compose config parsing via subprocess
│   ├── prompt.py                      # User interaction and volume selection
│   ├── docker.py                      # Docker operations and PostgreSQL infrastructure
│   └── postgres.py                    # Business logic orchestration and workflow management
├── tests/                             # Test suite
│   ├── test_docker.py                 # Docker infrastructure tests (including volume verification)
│   ├── test_parse_docker_compose.py   # Config resolution tests
│   ├── test_postgres.py               # Business logic orchestration tests
│   ├── test_subprocess_integration.py # Docker Compose subprocess tests
│   └── test_user_interaction.py       # User interaction tests
├── main.py                            # CLI entry point with command parsing
├── pyproject.toml                     # Project configuration
├── uv.lock                            # Dependency lock file (uv)
└── README.md                          # This file
```
- Python 3.13+
- Docker Compose v2+ (accessible via the `docker compose config` command)
- PostgreSQL credentials either in a `.env` file or Docker Compose environment variables
- A Docker Compose project with a `docker-compose.yml` file
- Dependencies: `pyyaml`, `inquirer`, `docker`, `rich`
- Dev Dependencies: `pytest`, `ruff`
See CONTRIBUTING.md for detailed guidelines on how to contribute to this project.
This project is licensed under the MIT License - see the LICENSE file for details.