A Node.js-based solution for data migration between MSSQL databases with an interactive interface and standalone executable support.
- Interactive Interface: User-friendly menu system for easy operation
- Standalone Executable: Run without a Node.js installation
- Multilingual Support: English and Korean interfaces
- Progress Monitoring: Real-time migration progress tracking with detailed history
- MSSQL Data Migration: High-performance batch processing
- XML Configuration Support: Flexible XML-based configuration
- Column Overrides: Modify/add column values during migration
- Pre/Post Processing: Execute SQL scripts before/after migration
- Dynamic Variables: Extract and use data at runtime
- Transaction Support: Ensure data consistency
- Detailed Logging: 5-level log system with password masking
- DRY RUN Mode: Simulation without actual changes
- SELECT * Auto Processing: Automatic IDENTITY column exclusion
- Global Timezone System: Support for 22 timezones worldwide with `${DATE.TIMEZONE:format}`
- Case-Insensitive Column Matching: No need to worry about column name case
- Large Dataset Support: Handles the SQL Server 2100-parameter limit automatically (see the sketch after this list)
- Enhanced Debugging: Detailed diagnostics for delete operations
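For context on the Large Dataset Support item above: SQL Server caps a single request at 2100 parameters, so a parameterized batch INSERT must keep rows × columns under that limit. A rough sketch of the arithmetic (illustrative only, not the tool's actual code; the headroom value is an assumption):

```js
// SQL Server allows at most 2100 parameters per request, so the effective
// rows-per-batch must satisfy rowsPerBatch * columnCount <= 2100 (minus a
// little headroom), regardless of the configured batchSize.
const SQL_SERVER_MAX_PARAMS = 2100;

function effectiveBatchSize(configuredBatchSize, columnCount) {
  const maxRowsByParams = Math.floor((SQL_SERVER_MAX_PARAMS - 100) / columnCount);
  return Math.max(1, Math.min(configuredBatchSize, maxRowsByParams));
}

// Example: a 25-column table with batchSize 1000
// effectiveBatchSize(1000, 25) -> 80 rows per INSERT (80 * 25 = 2000 parameters)
```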
- config-manager: Loads and parses dbinfo and query XML; validates attributes and parses global overrides, processes, and dynamic variables.
- query-processor: Expands SELECT *, excludes IDENTITY columns, queries target schema.
- variable-manager: Variable substitution, date/timezone functions, applies global column override values (including JSON mapping).
- script-processor: Executes pre/post scripts with variable substitution.
- mssql-connection-manager: Source/target DB connections, query execution, deletes/batching.
- mssql-data-migrator-modular: Orchestrates migration, runs global/per-query processes, applies selective global overrides.
- ConfigManager → Load/parse config
- ScriptProcessor → Global preProcess
- VariableManager → Dynamic variable extraction/substitution
- QueryProcessor → SELECT * expansion + columns ready
- DataMigrator → Deletes (deleteBeforeInsert) → selective global overrides → batch inserts
- ScriptProcessor → Global postProcess
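A simplified sketch of this flow, with the collaborators passed in explicitly; the method names are illustrative and not necessarily the tool's internal API:

```js
// Illustrative orchestration of the flow above; transactions, DRY RUN handling,
// error handling, and progress tracking are omitted. Names are hypothetical.
async function runMigration({ configManager, scriptProcessor, variableManager, queryProcessor, dataMigrator }, queryFile) {
  const config = await configManager.load(queryFile);               // load/parse config
  await scriptProcessor.run(config.globalPreProcess);               // global preProcess
  await variableManager.extractDynamicVariables(config);            // dynamic variables
  for (const query of config.queries.filter(q => q.enabled)) {
    const prepared = await queryProcessor.expandSelectStar(query);  // SELECT * expansion
    await dataMigrator.deleteBeforeInsert(prepared);                // optional deletes
    await dataMigrator.applyGlobalOverrides(prepared);              // selective global overrides
    await dataMigrator.insertInBatches(prepared);                   // batch inserts
  }
  await scriptProcessor.run(config.globalPostProcess);              // global postProcess
}
```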
- Non-interactive CLI: Run tasks directly with `app.js --mode` (no menu)
  - Modes: `validate`, `test`, `migrate`, `help`
  - Works in both Node and the packaged EXE
  - See the "Non-interactive CLI (New in v0.9.1)" section below for examples
- Consistent `getTableColumns()`: Returns `{ name }[]` to align types across modules
- Separated selective global overrides:
  - Policy phase: Intersect `applyGlobalColumns` with the target schema to choose columns
  - Apply phase: VariableManager safely applies values only to existing row columns (with substitution/JSON mapping)
- Robustness: Handles mixed `{ name }`/string column arrays; reinforced case-insensitive matching
- Operations: Recommend `sp_updatestats` / `UPDATE STATISTICS ... WITH FULLSCAN` for post-migration statistics
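A small sketch of what the `{ name }[]` normalization and mixed-array handling above amount to (illustrative, not the tool's actual code):

```js
// Normalize a mixed array of column descriptors ({ name } objects and plain
// strings) into the consistent { name }[] shape, and do case-insensitive lookups.
function normalizeColumns(columns) {
  return columns.map(col => (typeof col === 'string' ? { name: col } : { name: col.name }));
}

function hasColumn(columns, name) {
  const target = name.toLowerCase();
  return normalizeColumns(columns).some(col => col.name.toLowerCase() === target);
}

// normalizeColumns(['ID', { name: 'Email' }]) -> [{ name: 'ID' }, { name: 'Email' }]
// hasColumn(['ID', { name: 'Email' }], 'email') -> true
```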
1. Download Release Package
   - Download `sql2db-v0.9.1-win-x64.zip`
   - Extract it to your desired location
2. Configure Database Connection
   - Edit `config/dbinfo.json` with your database settings
   - Add query definition files to the `queries/` folder
3. Run

```bash
# English version
run.bat

# Korean version
실행하기.bat

# Or directly (set language via environment variable)
set LANGUAGE=en && sql2db.exe
set LANGUAGE=kr && sql2db.exe
```
To run from source, install the dependencies:

```bash
npm install
```

Create a `config/dbinfo.json` file:

```json
{
"dbs": {
"sourceDB": {
"server": "source-server.com",
"database": "source_db",
"user": "username",
"password": "password",
"isWritable": false
},
"targetDB": {
"server": "target-server.com",
"database": "target_db",
"user": "username",
"password": "password",
"isWritable": true
}
}
}
```

Run the tool:

```bash
# English version
npm start
# or
run.bat

# Korean version
npm run start:kr
# or
실행하기.bat
```

Or run the migration CLI directly:

```bash
node src/migrate-cli.js migrate --query ./queries/migration-queries.xml
```

Run tasks directly without the interactive menu using `app.js --mode` (works with Node and the packaged EXE):

```bash
# Validate configuration
node app.js --lang=kr --mode=validate --query=queries/migration-queries.xml
# Test database connections
node app.js --lang=kr --mode=test
# Run migration
node app.js --lang=kr --mode=migrate --query=queries/migration-queries.xml
# Help
node app.js --mode=help
# Standalone EXE
sql2db.exe --lang=kr --mode=validate --query=queries/migration-queries.xml
sql2db.exe --lang=kr --mode=test
sql2db.exe --lang=kr --mode=migrate --query=queries/migration-queries.xml
```

Running the tool without `--mode` opens the interactive menu:

```
=========================================
MSSQL Data Migration Tool
Version 0.9.1
=========================================
1. Validate Query Definition File
2. Test Database Connection
3. Execute Data Migration
4. Check Migration Progress
5. Show Help
0. Exit
Please select (0-5):
```
- Validate Query Definition File: Check XML syntax and attribute names
- Test Database Connection: Verify database connectivity (see the sketch after this list)
- Execute Data Migration: Run data migration with the selected query file
- Check Migration Progress: View migration history and detailed status
  - The 3 most recent migrations are displayed by default
  - Press 'A' to view all migrations
  - Enter a number to view detailed progress information
- Show Help: Display usage information
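The connection test is handled by the tool itself; for reference, a minimal standalone connectivity check with the `mssql` driver against an entry in `config/dbinfo.json` might look roughly like this (a sketch, not the tool's code; the `trustServerCertificate` option is an assumption about your environment):

```js
const fs = require('fs');
const sql = require('mssql');

// Minimal connectivity check for one entry in config/dbinfo.json.
async function testConnection(dbKey) {
  const { dbs } = JSON.parse(fs.readFileSync('config/dbinfo.json', 'utf8'));
  const db = dbs[dbKey];
  const pool = await sql.connect({
    server: db.server,
    database: db.database,
    user: db.user,
    password: db.password,
    options: { trustServerCertificate: true }, // adjust for your environment
  });
  await pool.request().query('SELECT 1 AS ok');
  await pool.close();
  console.log(`${dbKey}: connection OK`);
}

testConnection('sourceDB').catch(err => console.error(err.message));
```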
| Command | Description |
|---|---|
| `npm start` or `run.bat` | Interactive menu (English) |
| `npm run start:kr` or `실행하기.bat` | Interactive menu (Korean) |
| `node src/migrate-cli.js validate` | Configuration validation |
| `node src/migrate-cli.js test` | Connection test |
| `node src/migrate-cli.js migrate --dry-run` | Simulation execution |
| `node src/migrate-cli.js list-dbs` | List databases |
| `npm run build` | Build standalone executable |
| `npm run release` | Create release package |
Example query definition file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<migration>
<settings>
<sourceDatabase>sourceDB</sourceDatabase>
<targetDatabase>targetDB</targetDatabase>
<batchSize>1000</batchSize>
</settings>
<queries>
<query id="migrate_users" targetTable="users" enabled="true">
<sourceQuery>
<![CDATA[SELECT * FROM users WHERE status = 'ACTIVE']]>
</sourceQuery>
<columnOverrides>
<override column="migration_flag">1</override>
<override column="updated_by">MIGRATION_TOOL</override>
<override column="processed_at">${CURRENT_TIMESTAMP}</override>
<override column="migration_date">${CURRENT_DATE}</override>
</columnOverrides>
</query>
</queries>
<!-- Dynamic Variables -->
<dynamicVariables>
<dynamicVar id="active_customers" description="Active customer list">
<query>
<![CDATA[SELECT CustomerID, CustomerName FROM Customers WHERE IsActive = 1]]>
</query>
<extractType>column_identified</extractType>
</dynamicVar>
</dynamicVariables>
</migration>
```

The tool supports dynamic variables that can extract data at runtime and use it in queries:
| Type | Description | Access Pattern | Default |
|---|---|---|---|
| `column_identified` | Extract all columns as arrays keyed by column name | `${varName.columnName}` | Yes |
| `key_value_pairs` | Extract first two columns as key-value pairs | `${varName.key}` | No |
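For intuition, a minimal sketch (not the tool's implementation) of how result rows could be reshaped into these two forms:

```js
// column_identified: every column becomes an array keyed by column name, so a
// reference like ${customer_data.CustomerID} can expand to an IN list.
function toColumnIdentified(rows) {
  const result = {};
  for (const row of rows) {
    for (const [column, value] of Object.entries(row)) {
      (result[column] ??= []).push(value);
    }
  }
  return result;
}

// key_value_pairs: the first column is the key, the second is the value.
function toKeyValuePairs(rows) {
  const result = {};
  for (const row of rows) {
    const [key, value] = Object.values(row);
    result[key] = value;
  }
  return result;
}

// toColumnIdentified([{ CustomerID: 1, Region: 'EU' }, { CustomerID: 2, Region: 'US' }])
//   -> { CustomerID: [1, 2], Region: ['EU', 'US'] }
```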
```xml
<!-- Using column_identified (default) from source database -->
<dynamicVar id="customer_data" description="Customer information">
<query>SELECT CustomerID, CustomerName, Region FROM Customers</query>
<!-- extractType omitted - defaults to column_identified -->
<!-- database omitted - defaults to sourceDB -->
</dynamicVar>
<!-- Using key_value_pairs from source database -->
<dynamicVar id="status_mapping" description="Status mapping">
<query>SELECT StatusCode, StatusName FROM StatusCodes</query>
<extractType>key_value_pairs</extractType>
<database>sourceDB</database>
</dynamicVar>
<!-- Using single_value from target database -->
<dynamicVar id="max_order_id" description="Maximum order ID">
<query>SELECT MAX(OrderID) as max_id FROM Orders</query>
<extractType>single_value</extractType>
<database>targetDB</database>
</dynamicVar>
<!-- Using single_column from source database -->
<dynamicVar id="active_user_ids" description="Active user IDs">
<query>SELECT UserID FROM Users WHERE Status = 'ACTIVE'</query>
<extractType>single_column</extractType>
<columnName>UserID</columnName>
<database>sourceDB</database>
</dynamicVar>
```

```sql
-- In your migration queries
SELECT * FROM Orders
WHERE CustomerID IN (${customer_data.CustomerID})
AND Status IN (${status_mapping.StatusCode})
```

The tool supports global column overrides that apply to all queries during migration. This feature supports both simple values and JSON values for dynamic configuration.

```xml
<globalColumnOverrides>
<override column="migration_date">${CURRENT_DATE}</override>
<override column="processed_at">GETDATE()</override>
<override column="data_version">2.1</override>
<override column="migration_flag">1</override>
<override column="updated_by">MIGRATION_TOOL</override>
</globalColumnOverrides>
```

You can define JSON values that change based on specific conditions:

```xml
<globalColumnOverrides>
<!-- Simple value -->
<override column="migration_flag">1</override>
<!-- JSON value: Different data_version per table -->
<override column="data_version">{"users": "2.1", "orders": "2.2", "products": "2.3", "default": "2.0"}</override>
<!-- JSON value: Different values based on database -->
<override column="migration_date">{"sourceDB": "${CURRENT_DATE}", "targetDB": "2024-12-31", "default": "${CURRENT_DATE}"}</override>
<!-- JSON value: Different values based on time -->
<override column="batch_id">{"09": "BATCH_MORNING", "18": "BATCH_EVENING", "00": "BATCH_NIGHT", "default": "BATCH_DEFAULT"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Different priority levels per table -->
<override column="priority_level">{"users": "HIGH", "orders": "MEDIUM", "products": "LOW", "default": "NORMAL"}</override>
<!-- Different status codes per table -->
<override column="status_code">{"users": "ACTIVE", "orders": "PENDING", "products": "INACTIVE", "config": "SYSTEM", "default": "UNKNOWN"}</override>
<!-- Different data sources per table -->
<override column="data_source">{"users": "LEGACY_SYSTEM", "orders": "NEW_SYSTEM", "products": "EXTERNAL_API", "default": "MIGRATION_TOOL"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Different timestamps per database -->
<override column="created_at">{"sourceDB": "${CURRENT_TIMESTAMP}", "targetDB": "2024-12-31 23:59:59", "default": "${CURRENT_TIMESTAMP}"}</override>
<!-- Different user IDs per database -->
<override column="created_by">{"sourceDB": "LEGACY_USER", "targetDB": "MIGRATION_USER", "default": "SYSTEM"}</override>
<!-- Different environment flags per database -->
<override column="environment">{"sourceDB": "PRODUCTION", "targetDB": "STAGING", "default": "UNKNOWN"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Different batch IDs based on hour -->
<override column="batch_id">{"09": "BATCH_MORNING", "12": "BATCH_NOON", "18": "BATCH_EVENING", "00": "BATCH_NIGHT", "default": "BATCH_DEFAULT"}</override>
<!-- Different processing flags based on time -->
<override column="processing_flag">{"06": "EARLY_BATCH", "14": "DAY_BATCH", "22": "LATE_BATCH", "default": "REGULAR_BATCH"}</override>
<!-- Different time zones based on hour -->
<override column="timezone">{"00": "UTC", "09": "KST", "18": "EST", "default": "UTC"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Multi-level conditions: database + table -->
<override column="migration_type">{"sourceDB.users": "FULL_MIGRATION", "sourceDB.orders": "INCREMENTAL", "targetDB.users": "VALIDATION", "default": "STANDARD"}</override>
<!-- Conditional values with dynamic variables -->
<override column="customer_segment">{"premium": "VIP", "standard": "REGULAR", "basic": "BASIC", "default": "UNKNOWN"}</override>
<!-- Environment-specific configurations -->
<override column="config_version">{"dev": "1.0", "staging": "2.0", "prod": "3.0", "default": "1.0"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Using dynamic variables in JSON values -->
<override column="department_code">{"${active_departments.DepartmentID}": "${active_departments.DepartmentCode}", "default": "UNKNOWN"}</override>
<!-- Conditional values based on extracted data -->
<override column="region_code">{"${region_mapping.RegionID}": "${region_mapping.RegionCode}", "default": "GLOBAL"}</override>
<!-- Status mapping using dynamic variables -->
<override column="status_id">{"${status_codes.StatusName}": "${status_codes.StatusID}", "default": "0"}</override>
</globalColumnOverrides>
```

```xml
<globalColumnOverrides>
<!-- Complex nested JSON for configuration -->
<override column="config_data">{"users": {"priority": "HIGH", "batch_size": 500, "retry_count": 3}, "orders": {"priority": "MEDIUM", "batch_size": 1000, "retry_count": 2}, "default": {"priority": "NORMAL", "batch_size": 2000, "retry_count": 1}}</override>
<!-- Metadata with multiple properties -->
<override column="metadata">{"source": {"version": "1.0", "type": "legacy"}, "target": {"version": "2.0", "type": "modern"}, "default": {"version": "1.0", "type": "unknown"}}</override>
</globalColumnOverrides>
```

| Context | Key Priority | Example | Result |
|---|---|---|---|
| Table Name | tableName → default → first key | `{"users": "2.1", "default": "2.0"}` | `users` table → "2.1" |
| Database | database → default → first key | `{"sourceDB": "DATE1", "default": "DATE2"}` | sourceDB → "DATE1" |
| No Match | default → first key | `{"users": "2.1", "default": "2.0"}` | Unknown table → "2.0" |
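A hedged sketch of the selection order in the table above (the function name and parsing details are illustrative, not the tool's internal API):

```js
// Resolve a JSON override value against a context key (table or database name).
// Priority: exact context key -> "default" -> first key in the mapping.
function resolveJsonOverride(jsonValue, contextKey) {
  const mapping = JSON.parse(jsonValue);
  if (contextKey in mapping) return mapping[contextKey];
  if ('default' in mapping) return mapping.default;
  return Object.values(mapping)[0];
}

// Migrating the "users" table:
// resolveJsonOverride('{"users": "2.1", "default": "2.0"}', 'users')   -> "2.1"
// resolveJsonOverride('{"users": "2.1", "default": "2.0"}', 'orders')  -> "2.0"
```

The XML snippet below shows the same table-keyed pattern in override definitions.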
<override column="priority_level">{"users": "HIGH", "orders": "MEDIUM", "products": "LOW", "default": "NORMAL"}</override>
<override column="status_code">{"users": "ACTIVE", "orders": "PENDING", "products": "INACTIVE", "config": "SYSTEM", "default": "UNKNOWN"}</override>Control which global overrides apply to specific queries:
<!-- Apply all global overrides -->
<sourceQuery applyGlobalColumns="all">
<![CDATA[SELECT * FROM users WHERE status = 'ACTIVE']]>
</sourceQuery>
<!-- Apply only specific global overrides -->
<sourceQuery applyGlobalColumns="migration_date,processed_at,updated_by">
<![CDATA[SELECT * FROM orders WHERE order_date >= '2024-01-01']]>
</sourceQuery>
<!-- Don't apply any global overrides -->
<sourceQuery applyGlobalColumns="none">
<![CDATA[SELECT * FROM config WHERE is_active = 1]]>
</sourceQuery>
```

- User Manual: Complete usage guide
- Installation Guide: Detailed installation instructions
- Change Log: Version-specific changes
- Implementation Summary: Technical implementation details
The project includes various database scripts:
- `create-sample-tables.sql`: Sample tables for testing
- `create-example-table.sql`: Example table with various data types
- `insert-sample-data.sql`: Sample data insertion
To create an example table with various data types and constraints for migration testing:
```
-- Execute in SQL Server Management Studio
-- Or run from the command line
sqlcmd -S your-server -d your-database -i resources/create-example-table.sql
```

This table includes:
- Various data types (string, numeric, date, boolean, JSON, binary)
- Computed columns (full_name, age_group)
- Check constraints (age, salary, email format, etc.)
- Performance optimization indexes
- Useful views and stored procedures
- Sample data in multiple languages
Starting from v0.1, real-time progress tracking and monitoring features have been added:
```bash
# List progress
node src/progress-cli.js list
# Show specific migration details
node src/progress-cli.js show migration-2024-12-01-15-30-00
# Real-time monitoring
node src/progress-cli.js monitor migration-2024-12-01-15-30-00
# Resume information
node src/progress-cli.js resume migration-2024-12-01-15-30-00
# Restart interrupted migration
node src/migrate-cli.js resume migration-2024-12-01-15-30-00 --query ./queries/migration-queries.xml
# Overall summary
node src/progress-cli.js summary
# Clean up old files
node src/progress-cli.js cleanup 7
```

- Real-time Tracking: Real-time migration progress monitoring
- Performance Metrics: Processing speed and estimated completion time
- Detailed Analysis: Phase-, query-, and batch-level information
- Interruption Recovery: Resume interrupted migrations from the last completed point
- Permanent Storage: Progress files for history management
- CLI Tools: Various query and management commands
Added functionality to automatically exclude IDENTITY columns when using SELECT *:
- Auto Detection: Automatically detects `SELECT * FROM table_name` patterns
- IDENTITY Column Exclusion: Automatically identifies and excludes IDENTITY columns from target tables
- Automatic Column List Generation: Automatically sets `targetColumns`
- Source Query Transformation: Converts `SELECT *` to explicit column lists
<query id="migrate_users" targetTable="users" enabled="true">
<sourceQuery>
<![CDATA[SELECT * FROM users WHERE status = 'ACTIVE']]>
</sourceQuery>
<!-- targetColumns is automatically set (IDENTITY columns excluded) -->
</query>
```

- Detect the `SELECT *` pattern
- Query all columns from the target table
- Identify and exclude IDENTITY columns
- Automatically set `targetColumns`
- Transform the source query to an explicit column list
Example log output:

```
SELECT * detected. Automatically retrieving column information for table users.
IDENTITY column auto-excluded: id
Auto-set column list (15 columns, IDENTITY excluded): name, email, status, created_date, ...
Modified source query: SELECT name, email, status, created_date, ... FROM users WHERE status = 'ACTIVE'
```
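For reference, IDENTITY columns can be read from SQL Server metadata; a minimal sketch with the `mssql` driver of building the explicit, IDENTITY-free column list described above (illustrative, not the tool's internal code):

```js
const sql = require('mssql');

// Return the non-IDENTITY column names of a table, in column order.
async function getNonIdentityColumns(pool, table) {
  const result = await pool.request()
    .input('table', sql.NVarChar, table)
    .query(`
      SELECT c.name
      FROM sys.columns AS c
      WHERE c.object_id = OBJECT_ID(@table)
        AND c.is_identity = 0
      ORDER BY c.column_id
    `);
  return result.recordset.map(row => row.name);
}

// Usage (assuming an open connection pool):
// const columns = await getNonIdentityColumns(pool, 'dbo.users');
// const sourceQuery = `SELECT ${columns.join(', ')} FROM users WHERE status = 'ACTIVE'`;
```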
Global column overrides (a Map) are "selectively applied" per query based on the XML `applyGlobalColumns` policy. Only the selected columns are safely applied to the actual data rows.
- Policy phase: Intersect the `applyGlobalColumns` value (`all`, `none`, `created_date,col1,col2`, etc.) with the target table schema to choose applicable columns.
- Apply phase: Only the selected columns are applied, and only to rows that actually contain those columns.
- Column matching is case-insensitive.
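A sketch of the two phases described above; the function names are hypothetical, not the tool's actual API:

```js
// Policy phase: intersect the applyGlobalColumns policy with the global
// overrides (a Map of column -> value) and the target table schema.
function selectApplicableOverrides(applyGlobalColumns, globalOverrides, targetSchemaColumns) {
  if (applyGlobalColumns === 'none') return [];
  const schema = new Set(targetSchemaColumns.map(c => c.toLowerCase()));
  const requested = applyGlobalColumns === 'all'
    ? [...globalOverrides.keys()]
    : applyGlobalColumns.split(',').map(s => s.trim());
  return [...globalOverrides.keys()].filter(key =>
    requested.some(r => r.toLowerCase() === key.toLowerCase()) &&
    schema.has(key.toLowerCase()));
}

// Apply phase: write override values only onto columns the row actually has
// (case-insensitive match against the row's own keys).
function applyOverridesToRow(row, selectedKeys, globalOverrides) {
  for (const key of selectedKeys) {
    const actual = Object.keys(row).find(k => k.toLowerCase() === key.toLowerCase());
    if (actual) row[actual] = globalOverrides.get(key);
  }
  return row;
}
```

The XML example below exercises exactly this path with `applyGlobalColumns="created_date"`.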
```xml
<globalColumnOverrides>
<override column="processed_at">GETDATE()</override>
<override column="data_version">2.1</override>
<override column="CREATED_DATE">${DATE:yyyy-MM-dd HH:mm:ss}</override>
<override column="company_code">{"COMPANY01":"APPLE","COMPANY02":"AMAZON"}</override>
<override column="email">{"a@company.com":"a@gmail.com"}</override>
</globalColumnOverrides>
<query id="migrate_users_all" enabled="true">
<sourceQuery targetTable="users" targetColumns="*" applyGlobalColumns="created_date">
<![CDATA[
SELECT * FROM users
]]>
</sourceQuery>
</query>
```

With the above, only `created_date` is applied to this query, even though multiple overrides exist globally. Matching is case-insensitive, so the override applies whether the actual column is created_date, CREATED_DATE, or any other case variant.
- Time functions like `${DATE:...}` and `${DATE.UTC:...}` are supported (22 timezones); see the sketch after the example log output below
- JSON mapping is applied only when the original value matches a mapping key; otherwise the original value is preserved
Example log output:

```
Selective global overrides: created_date
Applied columns: created_date
Rows processed: N
```
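As noted above, the `${DATE:...}` and `${DATE.UTC:...}` variables format the current time per timezone. A minimal sketch of the underlying idea using Node's built-in `Intl` API; the fixed `yyyy-MM-dd HH:mm:ss` layout and the function name are illustrative, not the tool's exact token syntax:

```js
// Format "now" in a given IANA timezone, roughly what a ${DATE.TIMEZONE:format}
// style variable resolves to (layout fixed to yyyy-MM-dd HH:mm:ss here).
function formatNow(timeZone = 'UTC') {
  const parts = new Intl.DateTimeFormat('en-CA', {
    timeZone,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit', second: '2-digit',
    hourCycle: 'h23',
  }).formatToParts(new Date());
  const get = type => parts.find(p => p.type === type).value;
  return `${get('year')}-${get('month')}-${get('day')} ${get('hour')}:${get('minute')}:${get('second')}`;
}

// formatNow('UTC')        -> e.g. "2024-12-01 06:30:00"
// formatNow('Asia/Seoul') -> the same instant in Korea Standard Time
```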
In some environments, ALTER DATABASE ... SET AUTO_UPDATE_STATISTICS ON can generate warnings. For post-migration stats refresh, prefer:
```sql
EXEC sp_updatestats;
-- Or for specific tables
UPDATE STATISTICS [dbo].[users] WITH FULLSCAN;
UPDATE STATISTICS [dbo].[products] WITH FULLSCAN;
```

The project includes batch files for testing various features:

```
test-xml-migration.bat # XML configuration test
test-dry-run.bat # DRY RUN mode test
test-dbid-migration.bat # DB ID reference test
test-log-levels.bat # Log level test
test-select-star-identity.bat # SELECT * IDENTITY exclusion test
test-dynamic-variables.js # Dynamic variables test
```

To build the standalone executable from source:

```bash
npm install
npm run build
```

This will create a standalone executable in the `dist/` directory:
- `dist/sql2db.exe` (Windows 64-bit)
The build process uses pkg to bundle the Node.js application:
- Target: Windows x64 (Node.js 18)
- Compression: GZip
- Assets Included:
  - All source files (`src/**/*.js`)
  - Configuration files (`config/**/*.json`)
  - Query definition files (`queries/**/*.xml`, `queries/**/*.json`, `queries/**/*.sql`)
  - Example files (`examples/**/*.xml`)
  - Resource files (`resources/**/*.sql`)
  - Documentation files (README, USER_MANUAL, CHANGELOG)
```bash
# Run the executable directly (default: English)
dist\sql2db.exe
# Or use with language option (via environment variable)
set LANGUAGE=kr && dist\sql2db.exe
set LANGUAGE=en && dist\sql2db.exe
```

The standalone executable includes everything needed to run the application without requiring a Node.js installation.
- Fork this repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Create a Pull Request
- Issue Reports: GitHub Issues
- Documentation: Refer to the documents in the project root
- Bug Fixes: Contribute via Pull Request
MIT License
Copyright (c) 2024 MSSQL Data Migration Tool
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
- Contact: sql2db.nodejs@gmail.com
- Website: sql2db.com