A robust and well-tested migration tool specifically designed to migrate CoreProtect data from SQLite to MySQL/MariaDB/Percona. While built for CoreProtect, this tool can be adapted for any SQLite to MySQL migration with minimal modifications.
This tool performs destructive operations including TRUNCATE statements on MySQL tables.
- Always backup your databases before running
- Test on non-production data first
- Understand what the tool does before executing
- Use at your own risk
1. Configure the environment:
   ```shell
   cp _env .env  # Edit .env with your database credentials
   ```
2. Load environment variables:
   ```shell
   source .env
   ```
3. Install dependencies:
   ```shell
   bundle install
   ```
4. Run the migration:
   ```shell
   ruby ./migration.rb
   ```
MySQL/MariaDB/Percona:
- Target database must exist
- CoreProtect tables must be pre-created with identical structure to SQLite
- Sufficient privileges for INSERT, TRUNCATE, and ALTER operations
SQLite:
- Source database file (`database.db`) must be accessible
- File permissions allowing read access
- Ruby 3.0+ (tested with Ruby 3.0)
- Required gems:
  - `sqlite3` - SQLite database interface
  - `mysql2` - MySQL database interface
  - `rspec` (development/testing)
- Sufficient disk space for temporary operations
- Network connectivity to MySQL server
- Memory proportional to batch size (OFFSET value)
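A minimal Gemfile matching the requirements above might look like the following; the version constraints are illustrative, not the project's actual pins:

```ruby
# Gemfile (hypothetical versions; adjust to your environment)
source "https://rubygems.org"

gem "sqlite3", "~> 1.6"   # SQLite database interface
gem "mysql2",  "~> 0.5"   # MySQL database interface

group :development, :test do
  gem "rspec", "~> 3.12"  # test framework
end
```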
Create a `.env` file based on the provided `_env` template:

| Variable | Description | Example | Required |
|---|---|---|---|
| `MYSQL_HOST` | MySQL server hostname | `localhost` | ✅ |
| `MYSQL_USER` | MySQL username | `root` | ✅ |
| `MYSQL_PASS` | MySQL password | `password123` | ✅ |
| `MYSQL_DATABASE` | Target database name | `minecraft` | ✅ |
| `MYSQL_PORT` | MySQL port number | `3306` | ✅ |
| `OFFSET` | Batch size for processing | `200000` | ✅ |
| `SQLITE_DATABASE` | Path to SQLite file | `./database.db` | ✅ |
Linux/macOS:

```shell
source ./.env
```

Windows (PowerShell):

```powershell
Get-Content .env | ForEach-Object {
  if ($_ -match '^([^=]+)=(.*)$') {
    [System.Environment]::SetEnvironmentVariable($matches[1], $matches[2])
  }
}
```

The `OFFSET` value controls batch processing size:
- Small batches (10,000-50,000): Slower but more memory efficient
- Large batches (200,000-500,000): Faster but requires more memory
- Very large tables: Consider smaller batches for stability
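The trade-off above comes down to how many rows are held in memory per read. A minimal sketch of the paging pattern, assuming `db` is an open database handle with an `execute(sql, params)` method (as the `sqlite3` gem provides); the method name `each_batch` is illustrative:

```ruby
# Read a table in fixed-size pages so memory use stays proportional
# to batch_size rather than total table size.
def each_batch(db, table, batch_size)
  offset = 0
  loop do
    rows = db.execute("SELECT * FROM #{table} LIMIT ? OFFSET ?",
                      [batch_size, offset])
    break if rows.empty?   # no more rows: table fully consumed
    yield rows
    offset += batch_size
  end
end
```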
For better performance, consider these MySQL settings (note that `SET GLOBAL` requires byte values, not the `2G` shorthand accepted in `my.cnf`):

```sql
SET GLOBAL innodb_buffer_pool_size = 2147483648;  -- 2 GB
SET GLOBAL max_allowed_packet = 1073741824;       -- 1 GB
SET GLOBAL innodb_flush_log_at_trx_commit = 2;
```

This project includes comprehensive test coverage:
```shell
# Run all tests
bundle exec rspec

# Run specific test categories
bundle exec rspec spec/processors/
bundle exec rspec spec/formatters/
bundle exec rspec spec/validators/

# Run with detailed per-example output
bundle exec rspec --format documentation
```

Test Coverage:
- ✅ 122 examples, 0 failures
- ✅ Unit tests for all core components
- ✅ Integration tests for migration workflows
- ✅ Error handling and edge cases
- ✅ Schema validation and compatibility
1. Validation Phase:
   - Validates environment configuration
   - Checks database connections
   - Verifies table schema compatibility
2. Preparation Phase:
   - Optimizes MySQL settings for bulk operations
   - Disables foreign key checks and constraints
   - Sets up progress tracking
3. Migration Phase:
   - Processes tables in batches (`OFFSET` size)
   - Handles data type conversions (Base64 for binary data)
   - Maps SQLite `id` columns to MySQL `rowid`
   - Provides real-time progress reporting
4. Finalization Phase:
   - Re-enables constraints and checks
   - Restores MySQL settings
   - Generates migration summary
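The Base64 conversion in the migration phase can be sketched as follows. This is a hedged illustration, not the tool's actual `ValueFormatter`: it assumes binary blobs arrive as `ASCII-8BIT` strings (as the `sqlite3` gem returns them) and encodes them so MySQL can decode with `FROM_BASE64` on insert:

```ruby
require "base64"

# Format a single SQLite value as a MySQL SQL literal.
def format_value(value)
  case value
  when nil     then "NULL"
  when Numeric then value.to_s
  when String
    if value.encoding == Encoding::ASCII_8BIT
      # Binary blob: Base64-encode so it survives transport as text.
      "FROM_BASE64('#{Base64.strict_encode64(value)}')"
    else
      "'#{value.gsub("'", "''")}'"  # escape single quotes for SQL
    end
  end
end
```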
The tool provides detailed progress information:
- Current table being processed
- Records migrated vs total records
- Estimated time remaining (ETA)
- Overall migration percentage
- Automatic Resume: Continue from last successful batch
- Schema Mismatch Handling: Skip incompatible tables or migrate partial data
- Error Recovery: Graceful handling of connection issues and data errors
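The automatic-resume idea can be sketched as persisting the last completed batch per table, so a rerun skips straight past finished work. The class name, file name, and JSON layout here are assumptions for illustration, not the tool's actual state format:

```ruby
require "json"

# Track per-table migration progress in a small JSON state file.
class ProgressTracker
  def initialize(path = "migration_state.json")
    @path = path
    @state = File.exist?(path) ? JSON.parse(File.read(path)) : {}
  end

  # Offset of the last completed batch for a table (0 if untouched).
  def completed_offset(table)
    @state.fetch(table, 0)
  end

  # Record a finished batch; persist immediately so a crash loses nothing.
  def record(table, offset)
    @state[table] = offset
    File.write(@path, JSON.generate(@state))
  end
end
```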
1. Packets larger than max_allowed_packet are not allowed

   Solution: Increase the MySQL packet size:

   ```sql
   SET GLOBAL max_allowed_packet = 1073741824;
   ```

   Or reduce `OFFSET` in the .env file.

2. undefined method 'map' for nil

   Status: ✅ Fixed - This critical bug has been resolved in the current version.

3. Memory Issues with Large Tables

   Solution: Reduce the `OFFSET` value:

   ```shell
   # In .env file
   OFFSET=50000  # Reduced from 200000
   ```

4. Connection Timeouts

   Solution: Increase MySQL timeout settings:

   ```sql
   SET GLOBAL wait_timeout = 28800;
   SET GLOBAL interactive_timeout = 28800;
   ```

5. Schema Mismatch Errors

   The tool will:
   - Report missing/extra columns
   - Offer to migrate compatible columns only
   - Allow skipping problematic tables
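The column comparison behind that schema check can be sketched with simple set operations; extracting the column lists from the real databases is out of scope here, and the function name is illustrative:

```ruby
# Compare column-name lists from the SQLite source and MySQL target.
def compare_schemas(sqlite_columns, mysql_columns)
  {
    missing_in_mysql: sqlite_columns - mysql_columns,  # would lose data
    extra_in_mysql:   mysql_columns - sqlite_columns,  # left untouched
    compatible:       sqlite_columns & mysql_columns,  # safe to migrate
  }
end
```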
| Table Size | Records | Estimated Time |
|---|---|---|
| Small | < 100K | 1-5 minutes |
| Medium | 100K-1M | 5-30 minutes |
| Large | 1M-10M | 30min-2 hours |
| Very Large | 10M+ | 2-8+ hours |
Note: `co_block` tables with 89M+ records may require 5+ hours.
- Use SSD storage for both source and target databases
- Increase MySQL buffer pool size
- Run during low-traffic periods
- Monitor system resources during migration
- Consider table-specific batch sizes for problematic tables
- `TableProcessor` - Handles individual table migration logic
- `ValueFormatter` - Converts SQLite data types to MySQL format
- `SchemaValidator` - Validates table compatibility
- `ProgressTracker` - Manages migration state and recovery
- `RecoveryManager` - Handles errors and resume functionality
- `DatabaseOptimizer` - Manages MySQL performance settings
- Defensive Programming: Extensive error handling and validation
- Testability: Comprehensive test suite with high coverage
- Recoverability: Ability to resume from failures
- Observability: Detailed logging and progress reporting
- Modularity: Clean separation of concerns
Contributions are welcome! Please ensure:
- All tests pass: `bundle exec rspec`
- Follow Ruby conventions: Use consistent style
- Add tests for new features: Maintain test coverage
- Update documentation: Keep README current
This project is open source and available under the MIT License.
If you encounter issues:
- Check the troubleshooting section above
- Review test results: `bundle exec rspec`
- Enable debug logging for detailed output
- Open an issue with complete error details and environment info
Built with ❤️ for the Minecraft community