Database Backup Utility

A comprehensive CLI tool for database backup and restore operations supporting MySQL, PostgreSQL, and MongoDB with multiple storage backends.

Features

  • 🗄️ Multiple Database Support: MySQL, PostgreSQL, and MongoDB
  • 📦 Backup Types: Full, incremental, and differential backups
  • 🗜️ Automatic Compression: Gzip compression with configurable levels
  • ☁️ Cloud Storage: Support for AWS S3, Google Cloud Storage, and Azure Blob Storage
  • 💾 Local Storage: Traditional local filesystem storage
  • 🔄 Restore Operations: Full database and selective table/collection restore
  • 📊 Database Information: View database size and table/collection counts
  • 🔒 Secure: Password-protected connections and encrypted cloud storage
  • 📝 Comprehensive Logging: Betterstack-compatible structured logging
  • 🌐 Cross-Platform: Compatible with Windows, Linux, and macOS
  • Large Database Support: Efficient handling of large databases with chunked operations
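
The compression and large-database features come down to streaming data through gzip in fixed-size chunks instead of holding a whole dump in memory. The snippet below is a minimal, hypothetical sketch of that technique in Python; the function name and chunk size are illustrative, not the project's actual code.

import gzip
import shutil

def compress_file(src_path, dest_path, level=6, chunk_size=64 * 1024):
    """Stream src_path into a gzip archive in fixed-size chunks (illustrative sketch)."""
    with open(src_path, "rb") as src, gzip.open(dest_path, "wb", compresslevel=level) as dest:
        shutil.copyfileobj(src, dest, chunk_size)

# Example (hypothetical paths): compress a dump with the default level from the sample configs
# compress_file("mydb_full_20240118_120000.sql", "mydb_full_20240118_120000.sql.gz", level=6)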

Installation

Prerequisites

  • Python 3.8 or higher
  • Database command-line tools (depending on your database):
    • MySQL: mysql and mysqldump
    • PostgreSQL: psql, pg_dump, and pg_restore
    • MongoDB: mongodump and mongorestore

Install from Source

  1. Clone the repository:
git clone https://github.com/kisugez/Database-Backup-Utility.git
cd Database-Backup-Utility
  2. Install dependencies:
pip install -r requirements.txt
  3. Install the package:
pip install -e .

Install Dependencies Only

pip install -r requirements.txt

Quick Start

Test Database Connection

Before performing any backup or restore operations, test your database connection:

dbbackup test-connection \
  --db-type mysql \
  --host localhost \
  --port 3306 \
  --username root \
  --password your_password \
  --database mydb

Perform a Backup

MySQL Backup to Local Storage

dbbackup backup \
  --db-type mysql \
  --host localhost \
  --port 3306 \
  --username root \
  --password your_password \
  --database mydb \
  --storage-path ./backups

PostgreSQL Backup to AWS S3

dbbackup backup \
  --db-type postgresql \
  --host localhost \
  --port 5432 \
  --username postgres \
  --password your_password \
  --database mydb \
  --storage-type s3 \
  --storage-path my-backup-bucket \
  --aws-access-key YOUR_ACCESS_KEY \
  --aws-secret-key YOUR_SECRET_KEY \
  --s3-region us-east-1

MongoDB Collection Backup

dbbackup backup \
  --db-type mongodb \
  --host localhost \
  --port 27017 \
  --username admin \
  --password your_password \
  --database mydb \
  --table my_collection \
  --storage-path ./backups
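
Under the hood, backups like the ones above drive the native dump tools listed under Prerequisites. As a rough illustration only (not the tool's actual implementation), a MySQL full backup is roughly equivalent to running mysqldump and compressing its output stream:

import gzip
import subprocess

def dump_mysql(host, port, user, password, database, out_path):
    """Hypothetical sketch: run mysqldump and gzip its output as it streams."""
    cmd = [
        "mysqldump",
        f"--host={host}", f"--port={port}",
        f"--user={user}", f"--password={password}",
        database,
    ]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    with gzip.open(out_path, "wb") as out:
        for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
            out.write(chunk)
    if proc.wait() != 0:
        raise RuntimeError("mysqldump exited with a non-zero status")

# dump_mysql("localhost", 3306, "root", "your_password", "mydb", "./backups/mydb_full.sql.gz")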

Restore from Backup

Full Database Restore

dbbackup restore \
  --db-type mysql \
  --host localhost \
  --port 3306 \
  --username root \
  --password your_password \
  --database mydb \
  --backup-file mydb_full_20240118_120000.sql.gz \
  --storage-path ./backups

Restore Specific Table

dbbackup restore \
  --db-type mysql \
  --host localhost \
  --port 3306 \
  --username root \
  --password your_password \
  --database mydb \
  --backup-file users_full_20240118_120000.sql.gz \
  --storage-path ./backups \
  --table users

Restore from Local File

dbbackup restore \
  --db-type postgresql \
  --host localhost \
  --port 5432 \
  --username postgres \
  --password your_password \
  --database mydb \
  --backup-file /path/to/backup.dump.gz \
  --local-file

List Available Backups

# List local backups
dbbackup list-backups \
  --storage-type local \
  --storage-path ./backups

# List S3 backups with prefix filter
dbbackup list-backups \
  --storage-type s3 \
  --storage-path my-backup-bucket \
  --prefix mydb_ \
  --aws-access-key YOUR_ACCESS_KEY \
  --aws-secret-key YOUR_SECRET_KEY

Configuration File

Instead of passing every option on the command line, you can use a configuration file:

JSON Configuration

Create a config.json file:

{
  "database": {
    "db_type": "mysql",
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "your_password",
    "database": "mydb"
  },
  "storage": {
    "storage_type": "local",
    "local_path": "./backups"
  },
  "backup": {
    "backup_type": "full",
    "compress": true,
    "compression_level": 6,
    "retention_days": 30
  },
  "log_file": "backup.log",
  "betterstack_token": null
}

YAML Configuration

Create a config.yaml file:

database:
  db_type: postgresql
  host: localhost
  port: 5432
  username: postgres
  password: your_password
  database: mydb

storage:
  storage_type: s3
  s3_bucket: my-backup-bucket
  s3_region: us-east-1
  aws_access_key: YOUR_ACCESS_KEY
  aws_secret_key: YOUR_SECRET_KEY

backup:
  backup_type: full
  compress: true
  compression_level: 9
  retention_days: 60

log_file: backup.log

Using Configuration File

dbbackup --config config.json backup
dbbackup --config config.yaml restore --backup-file mydb_full_20240118_120000.dump.gz
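
Internally, a configuration file only needs to be parsed into a dictionary before its values are applied. The sketch below shows one straightforward way to load either format; it assumes PyYAML is installed (as it would be for YAML support) and is not necessarily how the tool's config module is written.

import json
from pathlib import Path

import yaml  # PyYAML, assumed available for YAML support

def load_config(path):
    """Hypothetical sketch: parse a JSON or YAML config file based on its extension."""
    text = Path(path).read_text()
    if path.endswith((".yaml", ".yml")):
        return yaml.safe_load(text)
    return json.loads(text)

# config = load_config("config.json")
# db = config["database"]  # matches the "database" block shown above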

Environment Variables

You can also configure the tool using environment variables:

# Database configuration
export DB_TYPE=mysql
export DB_HOST=localhost
export DB_PORT=3306
export DB_USER=root
export DB_PASSWORD=your_password
export DB_NAME=mydb

# Storage configuration
export STORAGE_TYPE=local
export STORAGE_PATH=./backups

# For AWS S3
export STORAGE_TYPE=s3
export S3_BUCKET=my-backup-bucket
export S3_REGION=us-east-1
export AWS_ACCESS_KEY=YOUR_ACCESS_KEY
export AWS_SECRET_KEY=YOUR_SECRET_KEY

# Backup configuration
export BACKUP_TYPE=full
export BACKUP_COMPRESS=true
export BACKUP_COMPRESSION_LEVEL=6

# Logging
export LOG_FILE=backup.log
export BETTERSTACK_TOKEN=your_betterstack_token

Cloud Storage Configuration

AWS S3

dbbackup backup \
  --db-type mysql \
  --host localhost \
  --username root \
  --password your_password \
  --database mydb \
  --storage-type s3 \
  --storage-path my-bucket-name \
  --aws-access-key YOUR_ACCESS_KEY \
  --aws-secret-key YOUR_SECRET_KEY \
  --s3-region us-east-1
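
For reference, uploading a finished backup to S3 with the credentials shown above reduces to a single boto3 call. This is a hedged illustration of what an S3 storage backend typically does, not the project's exact code:

import boto3

def upload_to_s3(local_path, bucket, key, access_key, secret_key, region):
    """Hypothetical sketch: push a backup file to an S3 bucket with boto3."""
    s3 = boto3.client(
        "s3",
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        region_name=region,
    )
    s3.upload_file(local_path, bucket, key)

# upload_to_s3("./backups/mydb_full.sql.gz", "my-bucket-name", "mydb_full.sql.gz",
#              "YOUR_ACCESS_KEY", "YOUR_SECRET_KEY", "us-east-1")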

Google Cloud Storage

dbbackup backup \
  --db-type postgresql \
  --host localhost \
  --username postgres \
  --password your_password \
  --database mydb \
  --storage-type gcp \
  --storage-path my-bucket-name \
  --gcp-project my-project-id \
  --gcp-credentials /path/to/credentials.json

Azure Blob Storage

dbbackup backup \
  --db-type mongodb \
  --host localhost \
  --username admin \
  --password your_password \
  --database mydb \
  --storage-type azure \
  --storage-path my-container-name \
  --azure-account-name mystorageaccount \
  --azure-account-key YOUR_ACCOUNT_KEY

Logging

Console Logging

By default, logs are output to the console with colored formatting:

dbbackup --log-level DEBUG backup [options]

File Logging

Save logs to a file:

dbbackup --log-file backup.log backup [options]

Betterstack Integration

For centralized logging with Betterstack (Logtail):

  1. Install the Betterstack library:
pip install logtail-python
  2. Set your Betterstack token:
export BETTERSTACK_TOKEN=your_source_token

Or in config file:

{
  "betterstack_token": "your_source_token"
}
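
With the token in place, structured logging to Betterstack is just an extra handler on a standard Python logger. The snippet below follows the logtail-python package's documented LogtailHandler usage; treat it as a sketch, since the project may wire the handler up differently.

import logging
import os

from logtail import LogtailHandler  # provided by the logtail-python package

logger = logging.getLogger("dbbackup")
logger.setLevel(logging.INFO)

token = os.environ.get("BETTERSTACK_TOKEN")
if token:
    logger.addHandler(LogtailHandler(source_token=token))

logger.info("backup finished", extra={"database": "mydb", "backup_type": "full"})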

Command Reference

Global Options

  • --config, -c: Path to configuration file (JSON or YAML)
  • --log-level: Logging level (DEBUG, INFO, WARNING, ERROR)
  • --log-file: Path to log file

backup Command

Perform a database backup.

Options:

  • --db-type: Database type (mysql, postgresql, mongodb) [required]
  • --host: Database host (default: localhost)
  • --port: Database port
  • --username, -u: Database username [required]
  • --password, -p: Database password [required]
  • --database, -d: Database name [required]
  • --backup-type: Backup type (full, incremental, differential) (default: full)
  • --storage-type: Storage type (local, s3, gcp, azure) (default: local)
  • --storage-path: Storage path or bucket/container name
  • --compress/--no-compress: Enable/disable compression (default: enabled)
  • --table: Specific table/collection to backup

Cloud Storage Options:

  • AWS S3: --aws-access-key, --aws-secret-key, --s3-region
  • GCP: --gcp-credentials, --gcp-project
  • Azure: --azure-account-name, --azure-account-key

restore Command

Restore a database from backup.

Options:

  • --db-type: Database type (mysql, postgresql, mongodb) [required]
  • --host: Database host (default: localhost)
  • --port: Database port
  • --username, -u: Database username [required]
  • --password, -p: Database password [required]
  • --database, -d: Database name [required]
  • --backup-file: Backup file name or path [required]
  • --storage-type: Storage type (local, s3, gcp, azure) (default: local)
  • --storage-path: Storage path or bucket/container name
  • --local-file: Backup file is a local file (not in storage)
  • --table: Specific table/collection to restore

Cloud Storage Options: (same as backup command)

test-connection Command

Test database connection.

Options:

  • --db-type: Database type [required]
  • --host: Database host (default: localhost)
  • --port: Database port
  • --username, -u: Database username [required]
  • --password, -p: Database password [required]
  • --database, -d: Database name [required]

list-backups Command

List available backups in storage.

Options:

  • --storage-type: Storage type (default: local)
  • --storage-path: Storage path or bucket/container name [required]
  • --prefix: Filter backups by prefix

Cloud Storage Options: (same as backup command)

Troubleshooting

Database Command-Line Tools Not Found

Ensure the required command-line tools are installed and in your PATH:

MySQL:

# Ubuntu/Debian
sudo apt-get install mysql-client

# macOS
brew install mysql-client

# Windows
# Download from https://dev.mysql.com/downloads/mysql/

PostgreSQL:

# Ubuntu/Debian
sudo apt-get install postgresql-client

# macOS
brew install postgresql

# Windows
# Download from https://www.postgresql.org/download/windows/

MongoDB:

# Ubuntu/Debian
sudo apt-get install mongodb-database-tools

# macOS
brew install mongodb-database-tools

# Windows
# Download from https://www.mongodb.com/try/download/database-tools
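
A quick way to confirm these tools are actually visible on your PATH is Python's shutil.which; the snippet below is purely a diagnostic example, independent of the tool itself:

import shutil

required = {
    "mysql": ["mysql", "mysqldump"],
    "postgresql": ["psql", "pg_dump", "pg_restore"],
    "mongodb": ["mongodump", "mongorestore"],
}

for db, tools in required.items():
    missing = [t for t in tools if shutil.which(t) is None]
    if missing:
        print(f"{db}: missing {', '.join(missing)} - install the client tools listed above")
    else:
        print(f"{db}: all tools found")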

Connection Errors

  1. Verify database is running
  2. Check host and port settings
  3. Verify username and password
  4. Check firewall settings
  5. Use test-connection command to diagnose

Permission Errors

Ensure the database user has appropriate permissions:

MySQL:

GRANT SELECT, LOCK TABLES, SHOW VIEW, TRIGGER ON mydb.* TO 'user'@'host';

PostgreSQL:

GRANT CONNECT ON DATABASE mydb TO user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO user;

MongoDB:

use admin
db.grantRolesToUser("user", ["backup", "restore"])

Git Workflow - Manual Push Instructions

This project was created on the copilot/create-cli-database-tool branch. To commit and push your changes manually to GitHub:

1. Check Current Status

git status

2. Stage Your Changes

# Stage all changes
git add .

# Or stage specific files
git add db_backup_utility/ README.md requirements.txt setup.py

3. Commit Your Changes

git commit -m "Your commit message describing the changes"

4. Push to GitHub

# Push to the current branch
git push origin copilot/create-cli-database-tool

# Or if you've set up tracking
git push

5. Create a Pull Request (Optional)

If you want to merge this branch into main:

  1. Go to https://github.com/kisugez/Database-Backup-Utility
  2. Click "Pull requests" tab
  3. Click "New pull request"
  4. Select copilot/create-cli-database-tool as the source branch
  5. Select main as the target branch
  6. Review changes and create the pull request
  7. Merge when ready

6. Switch to Main Branch (After Merge)

git checkout main
git pull origin main

Project Structure

Database-Backup-Utility/
├── db_backup_utility/          # Main package
│   ├── __init__.py
│   ├── cli.py                  # CLI interface
│   ├── core/                   # Core functionality
│   │   ├── config.py           # Configuration management
│   │   └── logger.py           # Logging setup
│   ├── databases/              # Database connectors
│   │   ├── base.py             # Base connector interface
│   │   ├── mysql_connector.py
│   │   ├── postgresql_connector.py
│   │   ├── mongodb_connector.py
│   │   └── factory.py          # Connector factory
│   ├── storage/                # Storage handlers
│   │   ├── base.py             # Base storage interface
│   │   ├── local_storage.py
│   │   ├── s3_storage.py
│   │   ├── gcp_storage.py
│   │   ├── azure_storage.py
│   │   └── factory.py          # Storage factory
│   ├── backup/                 # Backup operations
│   │   └── manager.py          # Backup manager
│   ├── restore/                # Restore operations
│   │   └── manager.py          # Restore manager
│   └── utils/                  # Utilities
│       ├── compression.py      # Compression utilities
│       └── helpers.py          # Helper functions
├── requirements.txt            # Python dependencies
├── setup.py                    # Package setup
├── .gitignore                  # Git ignore rules
└── README.md                   # This file

License

MIT License. See the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Support

For issues and questions, please open an issue on GitHub: https://github.com/kisugez/Database-Backup-Utility/issues
