In this task, you create a Cloud Composer 3 environment.
On the Google Cloud console title bar, in the Search field, type cloud composer, and then click Composer.
On the Environments page, for Create environment, select Composer 3.
For Name, type cepf-dagify-migration-lab.
For Location, select us-east4.
Scroll to the bottom and click Show Advanced configuration.
For Environment bucket, select Custom bucket.
For Bucket name, click Browse, select qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket, and then click Select.
Leave all other fields as default.
Click Create to create the environment.
NOTE: It can take up to 20 minutes to create the environment. You can continue with the lab and return to recheck this task before you deploy the DAG to Cloud Composer.
Click Check my progress to verify the objective.
Create a Cloud Composer Environment
In this task, you download and set up the DAGify tool.
In the console, on the Navigation menu (Navigation menu icon), click Compute Engine > VM instances.
For the instance named lab-setup, in the Connect column, click SSH to open an SSH-in-browser terminal window.
In the terminal window, run the following command to clone the DAGify repository:

```bash
git clone https://github.com/GoogleCloudPlatform/dagify
```

Navigate to the dagify folder and execute the command below:

```bash
cd ~/dagify
make dagify-clean
```

Wait a minute for the Python packages to be installed.

Once completed, activate the Python virtual environment by running the command:

```bash
source venv/bin/activate
```

Verify the DAGify setup by running the following command:

```bash
python ./DAGify.py -h
```

Click Check my progress to verify the objective.
Download and configure DAGify
Task 3. Run the DAGify command with sample data in default mode

In this task, you use DAGify in default mode to convert the sample Control-M export file in XML format into a Python native DAG.

In the lab-setup terminal, view the Control-M job XML file 001-tfatf.xml by running the following command:

```bash
cat ./sample_data/control-m/001-tfatf.xml
```

Output:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<DEFTABLE xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="Folder.xsd">
  <SMART_FOLDER FOLDER_NAME="fx_fld_001">
    <!-- Folder 001, Application 001, Sub Application 001, Job 001 -->
    <JOB APPLICATION="fx_fld_001_app_001" SUB_APPLICATION="fx_fld_001_app_001_subapp_001" JOBNAME="fx_fld_001_app_001_subapp_001_job_001" DESCRIPTION="fx_fld_001_app_001_subapp_001_job_001_reports" TASKTYPE="Command" CMDLINE="echo I am task A" PARENT_FOLDER="fx_fld_001">
      <OUTCOND NAME="fx_fld_001_app_001_subapp_001_job_001_ok" ODATE="ODAT" SIGN="+" />
    </JOB>
    <!-- Folder 001, Application 001, Sub Application 001, Job 002 -->
    <JOB APPLICATION="fx_fld_001_app_001" SUB_APPLICATION="fx_fld_001_app_001_subapp_001" JOBNAME="fx_fld_001_app_001_subapp_001_job_002" DESCRIPTION="fx_fld_001_app_001_subapp_001_job_002_reports" TASKTYPE="Command" CMDLINE="echo I am task B" PARENT_FOLDER="fx_fld_001">
      <INCOND NAME="fx_fld_001_app_001_subapp_001_job_001_ok" ODATE="ODAT" AND_OR="A" />
      <OUTCOND NAME="fx_fld_001_app_001_subapp_001_job_002_ok" ODATE="ODAT" SIGN="+" />
    </JOB>
  </SMART_FOLDER>
</DEFTABLE>
```

Run DAGify using the default configuration:
```bash
python ./DAGify.py -s ./sample_data/control-m/001-tfatf.xml -o ./output/lab-output-task-3/
```

This converts the sample Control-M export file in XML format into a Python native DAG and stores it in the output folder lab-output-task-3.
It also stores the final DAG Python file within a subfolder named after the Control-M project folder, which in this case is 001-tfatf.
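DAGify infers task ordering from the condition elements in the export: a job's OUTCOND names a condition it raises, and any job whose INCOND references that same condition becomes a downstream task. The snippet below is a minimal illustrative sketch of that idea (not DAGify's actual implementation), using a simplified inline export in place of the sample file:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a Control-M export like 001-tfatf.xml
SAMPLE = """<DEFTABLE>
  <SMART_FOLDER FOLDER_NAME="fx_fld_001">
    <JOB JOBNAME="job_001">
      <OUTCOND NAME="job_001_ok" ODATE="ODAT" SIGN="+"/>
    </JOB>
    <JOB JOBNAME="job_002">
      <INCOND NAME="job_001_ok" ODATE="ODAT" AND_OR="A"/>
      <OUTCOND NAME="job_002_ok" ODATE="ODAT" SIGN="+"/>
    </JOB>
  </SMART_FOLDER>
</DEFTABLE>"""

def dependency_edges(xml_text):
    """Map each OUTCOND condition to the job that raises it, then link
    every job whose INCOND references that condition as downstream."""
    root = ET.fromstring(xml_text)
    producers = {}  # condition name -> job that raises it
    for job in root.iter("JOB"):
        for out in job.findall("OUTCOND"):
            producers[out.get("NAME")] = job.get("JOBNAME")
    edges = []
    for job in root.iter("JOB"):
        for inc in job.findall("INCOND"):
            upstream = producers.get(inc.get("NAME"))
            if upstream:
                edges.append((upstream, job.get("JOBNAME")))
    return edges

print(dependency_edges(SAMPLE))  # [('job_001', 'job_002')]
```

In the sample file, job_002's INCOND matches job_001's OUTCOND, so the converted DAG runs task B only after task A succeeds.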
List the output DAG files in the ./output/lab-output-task-3/001-tfatf/ directory by running the command below:

```bash
ls ./output/lab-output-task-3/001-tfatf/
```

Output:

```
fx_fld_001.py
```
This is your converted Python DAG and it's ready for deployment within Cloud Composer.
View the converted Python DAG file fx_fld_001.py by running the command below:

```bash
cat ./output/lab-output-task-3/001-tfatf/fx_fld_001.py
```

Output:
```python
# Apache Airflow Base Imports
from airflow import DAG
from airflow.decorators import task
from airflow.sensors.external_task import ExternalTaskMarker
from airflow.sensors.external_task import ExternalTaskSensor
import datetime

# Apache Airflow Custom & DAG/Task Specific Imports
from airflow.operators.bash import BashOperator

default_args = {
    'owner': 'airflow',
}

with DAG(
    dag_id="fx_fld_001",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval="@daily", # TIMEFROM not found, default schedule set to @daily,
    catchup=False,
) as dag:
    # DAG Tasks
    fx_fld_001_app_001_subapp_001_job_001 = BashOperator(
        task_id="fx_fld_001_app_001_subapp_001_job_001",
        bash_command="echo I am task A",
        trigger_rule="all_success",
        dag=dag,
    )
```

Click Check my progress to verify the objective.
Run DAGify command with sample data in the default mode
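The output above shows the first converted task; in an Airflow DAG, ordering between tasks is typically wired with the bitshift dependency operator (for example, `job_001 >> job_002`). The toy mock below (not Airflow itself, just an illustration) shows what that operator records:

```python
class MockTask:
    """Toy stand-in for an Airflow operator, only to illustrate
    what the >> dependency operator records."""
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []

    def __rshift__(self, other):
        # a >> b marks b as downstream of a, like Airflow's set_downstream
        self.downstream.append(other)
        return other  # returning the right side allows chaining a >> b >> c

job_a = MockTask("fx_fld_001_app_001_subapp_001_job_001")
job_b = MockTask("fx_fld_001_app_001_subapp_001_job_002")
job_a >> job_b

print([t.task_id for t in job_a.downstream])
# ['fx_fld_001_app_001_subapp_001_job_002']
```

In the real converted DAG, the INCOND/OUTCOND pair from the Control-M export translates into exactly this kind of upstream/downstream relationship between the two BashOperator tasks.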
In this task, you check that the Cloud Composer environment is up and then deploy the converted DAG file into the environment.

In the console title bar, type cloud composer in the Search field, and then click Composer. Validate that your Cloud Composer environment cepf-dagify-migration-lab is up and running and ready for use.
Note: If the Cloud Composer environment is still being created, wait for the creation process to complete.
From the environment list, click cepf-dagify-migration-lab to open the Environment details page.
Click Open DAGs folder to access the DAGs folder in the Cloud Storage bucket qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket.
Here you can see the default example DAG airflow_monitoring.py, which is created when a Composer environment is provisioned.
Return to the terminal window.
Upload the new Python DAG fx_fld_001.py to the Cloud Storage bucket qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket associated with the Cloud Composer environment by using the gcloud storage cp command:

```bash
gcloud storage cp -r ~/dagify/output/lab-output-task-3/001-tfatf/* gs://qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket/dags/lab-output-task-3/
```

Verify that the Python DAG file has been uploaded into the correct storage bucket:

```bash
gcloud storage ls gs://qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket/dags/lab-output-task-3/*
```

Output:

```
gs://qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket/dags/lab-output-task-3/fx_fld_001.py
```

Go back to the Environment details page for the cepf-dagify-migration-lab environment.
Authenticate using student-00-18a02f1357d9@qwiklabs.net user.
Verify that fx_fld_001 DAG is visible in the DAGs list.
NOTE: It can take 2-3 minutes for uploaded DAGs to synchronize into Cloud Composer.

Explore Cloud Composer

In this task, you explore the detailed view of fx_fld_001 in Cloud Composer.
Click fx_fld_001 to open the detailed view of the DAG.
Click Code to view the converted Python source code.
Click Graph to view the graph of all tasks and dependencies.
Click Check my progress to verify the objective. Deploy DAG to Cloud Composer
In this task, you use DAGify with the DAG divider flag (-d) to divide the Control-M workflow into multiple DAG files based on the XML key APPLICATION.
Switch to the lab-setup terminal window.
View the Control-M job XML file 002-tftf.xml by running the command:

```bash
cat ./sample_data/control-m/002-tftf.xml
```

Output:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<DEFTABLE xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="Folder.xsd">
  <SMART_FOLDER FOLDER_NAME="fx_fld_001">
    <!-- Folder 001, Application 001, Sub Application 001, Job 001 -->
    <JOB APPLICATION="fx_fld_001_app_001" SUB_APPLICATION="fx_fld_001_app_001_subapp_001" JOBNAME="fx_fld_001_app_001_subapp_001_job_001" DESCRIPTION="fx_fld_001_app_001_subapp_001_job_001_reports" TASKTYPE="Command" CMDLINE="" PARENT_FOLDER="fx_fld_001">
      <OUTCOND NAME="fx_fld_001_app_001_subapp_001_job_001_ok" ODATE="ODAT" SIGN="+" />
    </JOB>
    <!-- Folder 001, Application 001, Sub Application 001, Job 002 -->
    <JOB APPLICATION="fx_fld_001_app_001" SUB_APPLICATION="fx_fld_001_app_001_subapp_001" JOBNAME="fx_fld_001_app_001_subapp_001_job_002" DESCRIPTION="fx_fld_001_app_001_subapp_001_job_002_reports" TASKTYPE="Command" CMDLINE="" PARENT_FOLDER="fx_fld_001">
      <INCOND NAME="fx_fld_001_app_001_subapp_001_job_001_ok" ODATE="ODAT" AND_OR="A" />
      <OUTCOND NAME="fx_fld_001_app_001_subapp_001_job_002_ok" ODATE="ODAT" SIGN="+" />
    </JOB>
  </SMART_FOLDER>
</DEFTABLE>
```

Run the DAGify command using the DAG divider flag:
```bash
python ./DAGify.py -s ./sample_data/control-m/002-tftf.xml -o ./output/lab-output-task-5/ -d APPLICATION
```

This converts the sample Control-M export file in XML format into multiple Python native DAGs and stores them in the output folder lab-output-task-5.

The final DAG Python files are stored within a subfolder named after the Control-M project folder; in this case, that folder is 002-tftf.
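The divider flag groups jobs by the value of the given XML attribute and emits one DAG file per distinct value. As a rough sketch of that grouping (an illustration with a simplified, hypothetical two-application export, not DAGify's actual code):

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

# Hypothetical simplified export with jobs spread across two applications
SAMPLE = """<DEFTABLE>
  <SMART_FOLDER FOLDER_NAME="fx_fld_001">
    <JOB APPLICATION="fx_fld_001_app_001" JOBNAME="job_001"/>
    <JOB APPLICATION="fx_fld_001_app_001" JOBNAME="job_002"/>
    <JOB APPLICATION="fx_fld_001_app_002" JOBNAME="job_003"/>
  </SMART_FOLDER>
</DEFTABLE>"""

def divide_by(xml_text, key):
    """Group jobs by an XML attribute; each group would become its own
    DAG file named after the attribute value (e.g. fx_fld_001_app_001.py)."""
    groups = defaultdict(list)
    for job in ET.fromstring(xml_text).iter("JOB"):
        groups[job.get(key)].append(job.get("JOBNAME"))
    return dict(groups)

print(sorted(divide_by(SAMPLE, "APPLICATION")))
# ['fx_fld_001_app_001', 'fx_fld_001_app_002']
```

With `-d APPLICATION`, each distinct APPLICATION value becomes its own DAG file, which is why the next step lists two Python files instead of one.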
List the output DAG files by running the command:

```bash
ls ./output/lab-output-task-5/002-tftf/
```

You should see multiple Python files in the directory ./output/lab-output-task-5/002-tftf/, similar to the following:

Output:

```
fx_fld_001_app_001.py
fx_fld_001_app_002.py
```

These are new Python DAGs that are ready to deploy within Cloud Composer.
View a converted Python DAG file, in this case fx_fld_001_app_001.py, by running the command:

```bash
cat ./output/lab-output-task-5/002-tftf/fx_fld_001_app_001.py
```

Output:
```python
# Apache Airflow Base Imports
from airflow import DAG
from airflow.decorators import task
from airflow.sensors.external_task import ExternalTaskMarker
from airflow.sensors.external_task import ExternalTaskSensor
import datetime

# Apache Airflow Custom & DAG/Task Specific Imports
from airflow.operators.bash import BashOperator

default_args = {
    'owner': 'airflow',
}

with DAG(
    dag_id="fx_fld_001_app_001",
    start_date=datetime.datetime(2024, 1, 1),
    schedule_interval="@daily", # TIMEFROM not found, default schedule set to @daily,
    catchup=False,
) as dag:
    # DAG Tasks
    fx_fld_001_app_001_subapp_001_job_001 = BashOperator(
        task_id="fx_fld_001_app_001_subapp_001_job_001",
        bash_command="",
        trigger_rule="all_success",
        dag=dag,
    )
...
```
Deploy the converted Python DAG files by running the gcloud storage cp command to copy them to the Cloud Storage bucket associated with the Cloud Composer environment:

```bash
gcloud storage cp -r ~/dagify/output/lab-output-task-5/002-tftf/* gs://qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket/dags/lab-output-task-5/
```

Verify that the Python DAG files are uploaded into the correct storage bucket by running the command:

```bash
gcloud storage ls gs://qwiklabs-gcp-01-a52d1a55ade6-dagify-bucket/dags/lab-output-task-5/*
```