Export JSON/XML/YAML/CSV/MYSQL/PSQL/SQLITE/SQLSERVER/MONGODB #249
Workflow file for this run
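The workflow below loads sql/world.sql into MySQL, records the row counts, fans the exports out across a matrix of format groups (JSON/XML/YAML, CSV, SQL dumps, SQLite, SQL Server/MongoDB), and opens a pull request with the regenerated files and updated documentation.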
name: Export All Database Formats
on:
  push:
    branches:
      - master
    paths:
      - 'sql/**'
      - 'bin/Commands/Export/**'
      - '.github/workflows/export.yml'
  workflow_dispatch:
    inputs:
      pass:
        description: "Passcode"
        required: true
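        # Note: the "pass" input is collected on manual runs but is not referenced
        # anywhere else in this workflow, so it does not currently gate execution.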
env:
  MYSQL_ROOT_PASSWORD: root
  POSTGRES_PASSWORD: postgres
  MONGODB_VERSION: '6.0'
jobs:
  validate:
    name: Setup Database and Get Counts
    runs-on: ubuntu-24.04
    outputs:
      region_count: ${{ steps.counts.outputs.region_count }}
      subregion_count: ${{ steps.counts.outputs.subregion_count }}
      country_count: ${{ steps.counts.outputs.country_count }}
      state_count: ${{ steps.counts.outputs.state_count }}
      city_count: ${{ steps.counts.outputs.city_count }}
      current_date: ${{ steps.counts.outputs.current_date }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 2
      - name: Setup PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: 8.2
          extensions: intl, pdo_mysql
          coverage: none
          ini-values: "post_max_size=256M, memory_limit=512M"
      - name: Cache Composer dependencies
        uses: actions/cache@v4
        with:
          path: bin/vendor
          key: ${{ runner.os }}-composer-${{ hashFiles('bin/composer.lock') }}
          restore-keys: ${{ runner.os }}-composer-
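      # The Ubuntu 24.04 runner image ships with MySQL preinstalled (root/root),
      # so the service only needs to be started, not installed.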
      - name: Start MySQL service
        run: |
          sudo systemctl start mysql.service
          mysql --version
          # Wait for MySQL to be ready
          while ! mysqladmin ping -h"127.0.0.1" --silent; do
            echo "Waiting for MySQL..."
            sleep 1
          done
      - name: Setup MySQL Database
        run: |
          mysql -uroot -proot -e "CREATE DATABASE IF NOT EXISTS world CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"
          mysql -uroot -proot -e "SHOW DATABASES;"
          # Import the seed data and fail the job if the import errors out
          echo "Importing sql/world.sql..."
          if ! mysql -uroot -proot --default-character-set=utf8mb4 world < sql/world.sql; then
            echo "❌ SQL import failed"
            exit 1
          fi
          echo "✅ SQL import successful"
      - name: Install Composer Dependencies
        working-directory: ./bin
        run: |
          composer install --no-dev --optimize-autoloader
          php console list
      - name: Get Data Counts
        id: counts
        run: |
          # Get counts for documentation (-sN: print bare values, no headers)
          region_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.regions;' -sN)
          subregion_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.subregions;' -sN)
          country_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.countries;' -sN)
          state_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.states;' -sN)
          city_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.cities;' -sN)
          # Build a human-readable date with the correct ordinal suffix (1st, 2nd, 3rd, ...)
          day=$(date +'%-d')
          case $day in
            1|21|31) suffix="st" ;;
            2|22)    suffix="nd" ;;
            3|23)    suffix="rd" ;;
            *)       suffix="th" ;;
          esac
          current_date="${day}${suffix} $(date +'%b %Y')"
          echo "Data counts:"
          echo "  Regions: $region_count"
          echo "  Subregions: $subregion_count"
          echo "  Countries: $country_count"
          echo "  States: $state_count"
          echo "  Cities: $city_count"
          # Expose the counts to downstream jobs
          echo "region_count=$region_count" >> $GITHUB_OUTPUT
          echo "subregion_count=$subregion_count" >> $GITHUB_OUTPUT
          echo "country_count=$country_count" >> $GITHUB_OUTPUT
          echo "state_count=$state_count" >> $GITHUB_OUTPUT
          echo "city_count=$city_count" >> $GITHUB_OUTPUT
          echo "current_date=$current_date" >> $GITHUB_OUTPUT
  export:
    name: Export Database Formats
    runs-on: ubuntu-24.04
    needs: validate
    strategy:
      matrix:
        format: [json-xml-yaml, csv, sql-dumps, sqlite, sqlserver-mongodb]
      fail-fast: false
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: 8.2
          extensions: intl, pdo_mysql, pdo_pgsql
          coverage: none
          ini-values: "post_max_size=256M, memory_limit=512M"
      - name: Setup Node.js
        if: matrix.format == 'sql-dumps'
        uses: actions/setup-node@v4
        with:
          node-version: 20.x
          cache: 'npm'
          cache-dependency-path: nmig/package-lock.json
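      # Node.js is only needed by the sql-dumps group, which runs nmig to migrate
      # the MySQL data into PostgreSQL before the pg_dump exports.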
      - name: Cache Composer dependencies
        uses: actions/cache@v4
        with:
          path: bin/vendor
          key: ${{ runner.os }}-composer-${{ hashFiles('bin/composer.lock') }}
          restore-keys: ${{ runner.os }}-composer-
      - name: Setup Databases
        run: |
          # MySQL
          sudo systemctl start mysql.service
          while ! mysqladmin ping -h"127.0.0.1" --silent; do sleep 1; done
          mysql -uroot -proot -e "CREATE DATABASE world CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"
          mysql -uroot -proot --default-character-set=utf8mb4 world < sql/world.sql
          # PostgreSQL (only for sql-dumps)
          if [ "${{ matrix.format }}" = "sql-dumps" ]; then
            sudo systemctl start postgresql.service
            while ! pg_isready; do sleep 1; done
            sudo -u postgres psql -c "CREATE DATABASE world;"
            sudo -u postgres psql -c "ALTER USER postgres PASSWORD 'postgres';"
          fi
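      # The supercharge action only provides the MongoDB server (as a single-node
      # replica set); mongoimport/mongodump come from the database-tools package
      # installed in the following step.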
      - name: Setup MongoDB
        if: matrix.format == 'sqlserver-mongodb'
        uses: supercharge/mongodb-github-action@1.10.0
        with:
          mongodb-version: ${{ env.MONGODB_VERSION }}
          mongodb-replica-set: rs0
      - name: Install MongoDB Tools
        if: matrix.format == 'sqlserver-mongodb'
        run: |
          wget -q https://fastdl.mongodb.org/tools/db/mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
          sudo dpkg -i mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
          rm mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
          mongoimport --version
      - name: Install Python Dependencies
        if: matrix.format == 'sqlite'
        run: |
          python -m pip install --upgrade pip
          pip install mysql-to-sqlite3
          mysql2sqlite --version
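      # Note: mysql2sqlite (from mysql-to-sqlite3) reads straight from the running
      # MySQL server and writes standalone .sqlite3 files, so no intermediate dump
      # is needed for the SQLite exports.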
      - name: Install Composer Dependencies
        working-directory: ./bin
        run: composer install --no-dev --optimize-autoloader
      - name: Export JSON/XML/YAML
        if: matrix.format == 'json-xml-yaml'
        working-directory: ./bin
        run: |
          echo "Exporting JSON..."
          php console export:json
          echo "Exporting XML..."
          php console export:xml
          echo "Exporting YAML..."
          php console export:yaml
          echo "✅ JSON/XML/YAML export completed"
      - name: Export CSV
        if: matrix.format == 'csv'
        working-directory: ./bin
        run: |
          echo "Exporting CSV..."
          php console export:csv
          echo "✅ CSV export completed"
      - name: Export SQL Dumps
        if: matrix.format == 'sql-dumps'
        run: |
          echo "Exporting MySQL dumps..."
          mkdir -p sql
          mysqldump -uroot -proot --single-transaction --routines --triggers \
            --add-drop-table --disable-keys --set-charset --skip-add-locks \
            world regions > sql/regions.sql
          mysqldump -uroot -proot --single-transaction --routines --triggers \
            --add-drop-table --disable-keys --set-charset --skip-add-locks \
            world subregions > sql/subregions.sql
          mysqldump -uroot -proot --single-transaction --routines --triggers \
            --add-drop-table --disable-keys --set-charset --skip-add-locks \
            world countries > sql/countries.sql
          mysqldump -uroot -proot --single-transaction --routines --triggers \
            --add-drop-table --disable-keys --set-charset --skip-add-locks \
            world states > sql/states.sql
          mysqldump -uroot -proot --single-transaction --routines --triggers \
            --add-drop-table --disable-keys --set-charset --skip-add-locks \
            world cities > sql/cities.sql
          echo "Setting up PostgreSQL migration..."
          cp nmig.config.json nmig/config/config.json
          cd nmig
          npm ci
          npm run build
          npm start
          cd ..
          echo "Exporting PostgreSQL dumps..."
          mkdir -p psql
          export PGPASSWORD=postgres
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists -t regions > psql/regions.sql
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists -t subregions > psql/subregions.sql
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists -t countries > psql/countries.sql
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists -t states > psql/states.sql
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists -t cities > psql/cities.sql
          pg_dump --dbname=postgresql://postgres:postgres@localhost/world \
            -Fp --inserts --clean --if-exists > psql/world.sql
          echo "✅ SQL dumps completed"
      - name: Export SQLite
        if: matrix.format == 'sqlite'
        run: |
          echo "Exporting SQLite databases..."
          mkdir -p sqlite
          # Individual table exports with error handling
          export_table() {
            local table=$1
            echo "Exporting $table..."
            if ! mysql2sqlite -d world -t "$table" --mysql-password root -u root -f "sqlite/${table}.sqlite3"; then
              echo "❌ Failed to export $table"
              return 1
            fi
            echo "✅ $table exported successfully"
          }
          export_table "regions"
          export_table "subregions"
          export_table "countries"
          export_table "states"
          export_table "cities"
          echo "Exporting combined database..."
          mysql2sqlite -d world --mysql-password root -u root -f sqlite/world.sqlite3
          echo "✅ SQLite export completed"
      - name: Export SQL Server & MongoDB
        if: matrix.format == 'sqlserver-mongodb'
        working-directory: ./bin
        run: |
          echo "Exporting SQL Server..."
          php console export:sql-server
          echo "Exporting MongoDB..."
          php console export:mongodb
          echo "Setting up MongoDB import..."
          cd ../mongodb
          sleep 5  # Wait for MongoDB
          # Import with error handling
          import_collection() {
            local collection=$1
            echo "Importing $collection..."
            if ! mongoimport --host localhost:27017 --db world \
              --collection "$collection" --file "${collection}.json" --jsonArray; then
              echo "❌ Failed to import $collection"
              return 1
            fi
            echo "✅ $collection imported successfully"
          }
          import_collection "regions"
          import_collection "subregions"
          import_collection "countries"
          import_collection "states"
          import_collection "cities"
          echo "Creating MongoDB dump..."
          mongodump --host localhost:27017 --db world --out mongodb-dump
          tar -czf world-mongodb-dump.tar.gz mongodb-dump
          rm -rf mongodb-dump *.json
          echo "✅ MongoDB export completed"
      - name: Verify Exports
        run: |
          echo "Verifying exports..."
          case "${{ matrix.format }}" in
            "json-xml-yaml")
              ls -la json/ xml/ yml/
              echo "JSON files: $(find json/ -name "*.json" | wc -l)"
              echo "XML files: $(find xml/ -name "*.xml" | wc -l)"
              echo "YAML files: $(find yml/ -name "*.yml" | wc -l)"
              ;;
            "csv")
              ls -la csv/
              echo "CSV files: $(find csv/ -name "*.csv" | wc -l)"
              ;;
            "sql-dumps")
              ls -la sql/ psql/
              echo "MySQL dumps: $(find sql/ -name "*.sql" | wc -l)"
              echo "PostgreSQL dumps: $(find psql/ -name "*.sql" | wc -l)"
              ;;
            "sqlite")
              ls -la sqlite/
              echo "SQLite files: $(find sqlite/ -name "*.sqlite3" | wc -l)"
              for file in sqlite/*.sqlite3; do
                echo "Tables in $(basename $file): $(sqlite3 "$file" ".tables" | wc -w)"
              done
              ;;
            "sqlserver-mongodb")
              ls -la sqlserver/ mongodb/
              echo "SQL Server files: $(find sqlserver/ -name "*.sql" | wc -l)"
              echo "MongoDB files: $(find mongodb/ -name "*.tar.gz" | wc -l)"
              ;;
          esac
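      # Each matrix job produces only a subset of these paths; with the default
      # if-no-files-found: warn, missing directories do not fail the upload.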
      - name: Upload Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: export-${{ matrix.format }}
          path: |
            json/
            xml/
            yml/
            csv/
            sql/
            psql/
            sqlite/
            sqlserver/
            mongodb/
          retention-days: 30
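  # With no "name" filter, download-artifact@v4 fetches every export-* artifact
  # into its own subdirectory under exports/, which the merge step below flattens
  # back into the repository layout.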
  finalize:
    name: Update Documentation & Create PR
    runs-on: ubuntu-24.04
    needs: [validate, export]
    if: always() && needs.validate.result == 'success'
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Download All Artifacts
        uses: actions/download-artifact@v4
        with:
          path: exports/
      - name: Merge Artifacts
        run: |
          echo "Merging export artifacts..."
          for dir in exports/*/; do
            echo "Processing $dir"
            cp -r "$dir"* . 2>/dev/null || true
          done
          rm -rf exports/
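      # The sed replacements below assume README.md keeps the exact
      # "Total <X> : <number> <br>" and "Last Updated On : ..." wording.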
      - name: Update README.md
        run: |
          echo "Updating README.md..."
          sed -i "s/Total Regions : [0-9]* <br>/Total Regions : ${{ needs.validate.outputs.region_count }} <br>/" README.md
          sed -i "s/Total Sub Regions : [0-9]* <br>/Total Sub Regions : ${{ needs.validate.outputs.subregion_count }} <br>/" README.md
          sed -i "s/Total Countries : [0-9]* <br>/Total Countries : ${{ needs.validate.outputs.country_count }} <br>/" README.md
          sed -i "s/Total States\/Regions\/Municipalities : [0-9]* <br>/Total States\/Regions\/Municipalities : ${{ needs.validate.outputs.state_count }} <br>/" README.md
          sed -i "s/Total Cities\/Towns\/Districts : [0-9]* <br>/Total Cities\/Towns\/Districts : ${{ needs.validate.outputs.city_count }} <br>/" README.md
          sed -i "s/Last Updated On : .*$/Last Updated On : ${{ needs.validate.outputs.current_date }}/" README.md
      - name: Generate Export Summary
        run: |
          echo "Generating export summary..."
          cat > EXPORT_SUMMARY.md << EOF
          # Export Summary - ${{ needs.validate.outputs.current_date }}
          ## Data Statistics
          - **Regions**: ${{ needs.validate.outputs.region_count }}
          - **Subregions**: ${{ needs.validate.outputs.subregion_count }}
          - **Countries**: ${{ needs.validate.outputs.country_count }}
          - **States/Regions**: ${{ needs.validate.outputs.state_count }}
          - **Cities**: ${{ needs.validate.outputs.city_count }}
          ## Export Status
          $(if [ -d "json" ]; then echo "✅ JSON exports completed"; else echo "❌ JSON exports failed"; fi)
          $(if [ -d "xml" ]; then echo "✅ XML exports completed"; else echo "❌ XML exports failed"; fi)
          $(if [ -d "yml" ]; then echo "✅ YAML exports completed"; else echo "❌ YAML exports failed"; fi)
          $(if [ -d "csv" ]; then echo "✅ CSV exports completed"; else echo "❌ CSV exports failed"; fi)
          $(if [ -d "sql" ]; then echo "✅ MySQL dumps completed"; else echo "❌ MySQL dumps failed"; fi)
          $(if [ -d "psql" ]; then echo "✅ PostgreSQL dumps completed"; else echo "❌ PostgreSQL dumps failed"; fi)
          $(if [ -d "sqlite" ]; then echo "✅ SQLite exports completed"; else echo "❌ SQLite exports failed"; fi)
          $(if [ -d "sqlserver" ]; then echo "✅ SQL Server exports completed"; else echo "❌ SQL Server exports failed"; fi)
          $(if [ -d "mongodb" ]; then echo "✅ MongoDB exports completed"; else echo "❌ MongoDB exports failed"; fi)
          ## File Counts
          - JSON files: $(find . -name "*.json" 2>/dev/null | wc -l)
          - XML files: $(find . -name "*.xml" 2>/dev/null | wc -l)
          - YAML files: $(find . -name "*.yml" 2>/dev/null | wc -l)
          - CSV files: $(find . -name "*.csv" 2>/dev/null | wc -l)
          - SQL files: $(find . -name "*.sql" 2>/dev/null | wc -l)
          - SQLite files: $(find . -name "*.sqlite3" 2>/dev/null | wc -l)
          EOF
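      # Note: the EOF delimiter above is unquoted, so the $(...) status checks run
      # while the summary is written; the ${{ }} expressions are substituted by the
      # runner before the script executes.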
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v6
        with:
          commit-message: |
            Export database formats - ${{ needs.validate.outputs.current_date }}
            - Updated all export formats (JSON, XML, YAML, CSV, SQL, SQLite, MongoDB)
            - Total records: ${{ needs.validate.outputs.country_count }} countries, ${{ needs.validate.outputs.state_count }} states, ${{ needs.validate.outputs.city_count }} cities
            - Export validation passed
          committer: Darshan Gada <gadadarshan@gmail.com>
          author: GitHub Actions <actions@github.com>
          signoff: true
          branch: export/automated-${{ github.run_number }}
          delete-branch: true
          title: "Automated Database Export - ${{ needs.validate.outputs.current_date }}"
          body: |
            ## Automated Database Export
            This PR contains the latest export of all database formats.
            ### Statistics
            - **Regions**: ${{ needs.validate.outputs.region_count }}
            - **Subregions**: ${{ needs.validate.outputs.subregion_count }}
            - **Countries**: ${{ needs.validate.outputs.country_count }}
            - **States**: ${{ needs.validate.outputs.state_count }}
            - **Cities**: ${{ needs.validate.outputs.city_count }}
            ### Export Formats
            - ✅ JSON (structured data)
            - ✅ XML (markup format)
            - ✅ YAML (human-readable)
            - ✅ CSV (spreadsheet format)
            - ✅ MySQL dumps
            - ✅ PostgreSQL dumps
            - ✅ SQLite databases
            - ✅ SQL Server scripts
            - ✅ MongoDB collections
            ### Quality Checks
            - Data validation passed
            - File integrity verified
            - Documentation updated
            ---
            *Generated automatically by GitHub Actions*
          labels: |
            exports
            automated
            data-update
          reviewers: dr5hn
          draft: false