DuplicAid is a CLI tool for managing PostgreSQL backups using SQL dumps. It provides a unified interface for creating, listing, and restoring backups from PostgreSQL instances running in Docker containers.
The tool supports both local and remote execution modes and uses the tiredofit/docker-db-backup:4.1.21 image for backup operations.
- SQL Dumps: Create and restore database backups using pg_dump/pg_restore
- S3 Integration: Store and retrieve backups from S3-compatible storage
- Dual Execution Modes: Manage backups locally or on remote servers via SSH
Install duplicaid using uv:
# Install from PyPI
uv add duplicaid
# Or install from source
git clone <repository-url>
cd duplicaid
uv sync --extra dev

DuplicAid stores configuration in .duplicaid.yml in your current working directory by default. You can specify a different location using the --config flag.
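For example, assuming --config is accepted as a global option (the path below is illustrative):

# Point duplicaid at a config file outside the working directory
duplicaid --config /etc/duplicaid/config.yml status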
Remote Mode (default):
- Manages PostgreSQL containers on a remote server via SSH
- Requires SSH key authentication
- All Docker commands executed on remote server
Local Mode:
- Manages PostgreSQL containers on the local machine
- No SSH connection required
- Docker commands executed locally
Initialize configuration interactively:
duplicaid config init

This prompts for:
- Execution Mode: remote or local
- Remote Server (remote mode only): SSH connection details (host, user, port, key path)
- Container Names: PostgreSQL and backup container names
- PostgreSQL Credentials: Database user and password
- Paths: Docker Compose file location
Remote Mode:

execution_mode: remote
remote:
  host: your-server.example.com
  user: root
  port: 22
  ssh_key_path: /home/user/.ssh/id_rsa
containers:
  postgres: postgres
  backup: db-backup
postgres:
  user: postgres
  password: your_secure_password
  host: postgres
s3:
  endpoint: https://s3.amazonaws.com
  bucket: my-backups
  path: postgres/backups
  # access_key and secret_key can be set here or via env vars:
  # AWS_ACCESS_KEY_ID / S3_ACCESS_KEY
  # AWS_SECRET_ACCESS_KEY / S3_SECRET_KEY
paths:
  docker_compose: /home/user/postgres/docker-compose.yml
databases:
  - myapp
  - analytics

Local Mode:
execution_mode: local
containers:
  postgres: postgres
  backup: db-backup
postgres:
  user: postgres
  password: your_secure_password
  host: postgres
s3:
  endpoint: http://localhost:9000
  bucket: my-backups
  path: postgres/backups
paths:
  docker_compose: /home/user/postgres/docker-compose.yml

S3 Credentials:
S3 credentials can be configured in two ways:

1. In the config file (less secure):

s3:
  access_key: YOUR_ACCESS_KEY
  secret_key: YOUR_SECRET_KEY

2. Via environment variables (recommended):

export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
# or
export S3_ACCESS_KEY=YOUR_ACCESS_KEY
export S3_SECRET_KEY=YOUR_SECRET_KEY
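With credentials exported this way, nothing sensitive needs to live in .duplicaid.yml; a minimal session sketch (key values are placeholders):

# Export credentials for this shell session only
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
# Subsequent commands pick them up from the environment
duplicaid backup create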
Quick Start:

1. Initialize Configuration:
duplicaid config init

2. Check Status:
duplicaid status

3. Create a Backup:
duplicaid backup create

4. List Backups:
duplicaid list backups

5. Restore a Backup:
duplicaid restore mydb backup_file.sql.bz2
- Storage: S3-compatible storage (compressed with bzip2)
- Scope: All configured databases backed up automatically
- Format: pgsql_hostname_database_YYYYMMDD-HHMMSS.sql.bz2
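For example, a dump of the myapp database taken from host postgres on 2024-06-01 at 14:30 would carry the name below (timestamp and names are illustrative) and can be passed straight to the restore command:

# Filename produced by the pattern above:
#   pgsql_postgres_myapp_20240601-143000.sql.bz2
duplicaid restore myapp pgsql_postgres_myapp_20240601-143000.sql.bz2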
Backups can be listed from:
- S3-compatible storage (if configured)
- Local backup directory (fallback)
- Source: Automatically downloads from S3 if not found locally
- Scope: Database-specific restoration
- Compatibility: Works across PostgreSQL versions
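Conceptually, restoring a plain SQL dump boils down to decompressing it and piping it into psql inside the PostgreSQL container. A rough manual equivalent, not the tool's exact invocation (container, user, and database names follow the example config above):

# Decompress the dump and stream it into the postgres container
bunzip2 -c pgsql_postgres_myapp_20240601-143000.sql.bz2 \
  | docker exec -i postgres psql -U postgres -d myapp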
Requirements:

- Python 3.12+
- Docker and Docker Compose
- PostgreSQL container
- tiredofit/db-backup container for backup operations

Remote mode:
- SSH access to the remote server
- SSH key authentication configured

Local mode:
- Docker daemon running locally
- Access to the local Docker socket
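To sanity-check the container prerequisites, standard Docker commands suffice (container names follow the example config):

# List running containers and look for the expected names
docker ps --format '{{.Names}}' | grep -E 'postgres|db-backup'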
Development Setup:

1. Clone and set up:
git clone <repository-url>
cd duplicaid
uv sync --extra dev

2. Install pre-commit hooks:
uv run pre-commit install

3. Run tests:
uv run pytest

Project Structure:
duplicaid/
├── pyproject.toml # Project configuration and dependencies
├── README.md # This file
├── src/
│ └── duplicaid/ # Main package
│ ├── __init__.py
│ ├── cli.py # CLI interface
│ ├── config.py # Configuration management
│ ├── backup.py # Backup operations
│ ├── ssh.py # SSH connectivity
│ ├── executor.py # Command execution
│ ├── discovery.py # Database discovery
│ └── local.py # Local operations
└── tests/ # Test suite
├── conftest.py
├── test_cli.py
├── test_config.py
├── test_integration.py
└── test_local_executor.py
The test suite includes:
- Unit tests: Test individual components
- Integration tests: Test component interactions
- CLI tests: Test command-line interface
Run specific test types:
# All tests
uv run pytest
# Unit tests only
uv run pytest -k "not integration"
# Integration tests only
uv run pytest -m integration
# With coverage
uv run pytest --cov=duplicaid

This project uses automated releases with semantic commits.
# 1. Create feature branch
git checkout -b feat/new-feature
# 2. Push and create PR
git push origin feat/new-feature
# 3. Merge PR → Auto-release to PyPI

Semantic commit messages determine the release type:

git commit -m "fix: resolve timeout"  # → patch release
git commit -m "feat: add encryption"  # → minor release
git commit -m "feat!: redesign API"   # → major release

- PRs: Auto-test, lint, format
- Main branch: Auto-version, auto-publish to PyPI
- Pre-commit: Enforce quality and commit format
# Manual build (for testing)
uv build
# Automated publishing (via GitHub Actions)
# → Happens automatically on main branch pushes
# → No manual PyPI uploads needed
# Emergency manual publish (not recommended)
uv publish