diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image003.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image003.png
new file mode 100644
index 00000000..f002682b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image003.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image004.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image004.png
new file mode 100644
index 00000000..28f1d798
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image004.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image005.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image005.png
index a1768fba..eee58a60 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image005.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image005.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image006.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image006.png
index 8ce3d02e..29d854fb 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image006.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image006.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image007.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image007.png
new file mode 100644
index 00000000..f0f503b3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image007.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image008.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image008.png
new file mode 100644
index 00000000..d4f314fe
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image008.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image009.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image009.png
index fbdc8cdc..286221a5 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image009.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image009.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image010.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image010.png
new file mode 100644
index 00000000..c5cc0483
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image010.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image011.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image011.png
new file mode 100644
index 00000000..2d5dadcd
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image011.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image013.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image013.png
index aec9e6b8..356dc54a 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image013.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image013.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image015.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image015.png
new file mode 100644
index 00000000..c456829b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image015.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image016.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image016.png
new file mode 100644
index 00000000..76f73441
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image016.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image017.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image017.png
new file mode 100644
index 00000000..731f28a2
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image017.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image018.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image018.png
new file mode 100644
index 00000000..f045c2b6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image018.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image019.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image019.png
new file mode 100644
index 00000000..c9b7802a
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image019.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image020.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image020.png
new file mode 100644
index 00000000..149c822f
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image020.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image021.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image021.png
new file mode 100644
index 00000000..08758de3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image021.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image022.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image022.png
new file mode 100644
index 00000000..c5ce6660
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image022.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image023.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image023.png
new file mode 100644
index 00000000..5c73a73d
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image023.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image024.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image024.png
new file mode 100644
index 00000000..5eb221f3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image024.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image025.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image025.png
new file mode 100644
index 00000000..3b663b16
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image025.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image026.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image026.png
new file mode 100644
index 00000000..7bc17c6b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image026.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image027.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image027.png
new file mode 100644
index 00000000..514e3a33
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image027.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image029.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image029.png
new file mode 100644
index 00000000..d69cdbf3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image029.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image031.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image031.png
new file mode 100644
index 00000000..0fc85389
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image031.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image032.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image032.png
new file mode 100644
index 00000000..e75e44c0
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image032.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image034.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image034.png
new file mode 100644
index 00000000..30385815
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image034.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image036.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image036.png
new file mode 100644
index 00000000..3ac122cb
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image036.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image037.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image037.png
new file mode 100644
index 00000000..2718bc8b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image037.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image038.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image038.png
new file mode 100644
index 00000000..239471af
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image038.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image039.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image039.png
new file mode 100644
index 00000000..13874070
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image039.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image040.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image040.png
new file mode 100644
index 00000000..52a6405b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image040.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image041.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image041.png
new file mode 100644
index 00000000..c0f7b52c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image041.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image042.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image042.png
index 0d7556d5..59c02168 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image042.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image042.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image043.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image043.png
new file mode 100644
index 00000000..ba03b9d1
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image043.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image044.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image044.png
index 10691517..0f18b4d0 100644
Binary files a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image044.png and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image044.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image045.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image045.png
new file mode 100644
index 00000000..e618ae4c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image045.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image046.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image046.png
new file mode 100644
index 00000000..0bf249d6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image046.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image047.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image047.png
new file mode 100644
index 00000000..0fe40f4c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image047.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image048.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image048.png
new file mode 100644
index 00000000..6b0953f3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image048.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image052.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image052.png
new file mode 100644
index 00000000..c2f69e3e
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image052.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image054.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image054.png
new file mode 100644
index 00000000..4b32f512
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image054.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image055.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image055.png
new file mode 100644
index 00000000..7514c91f
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image055.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image056.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image056.png
new file mode 100644
index 00000000..6257634e
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image056.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image057.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image057.png
new file mode 100644
index 00000000..b41034cf
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image057.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image058.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image058.png
new file mode 100644
index 00000000..6329d128
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image058.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image059.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image059.png
new file mode 100644
index 00000000..041656c1
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image059.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image060.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image060.png
new file mode 100644
index 00000000..902f18bb
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image060.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image061.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image061.png
new file mode 100644
index 00000000..70e55d7b
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image061.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image062.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image062.png
new file mode 100644
index 00000000..02c867f1
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image062.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image063.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image063.png
new file mode 100644
index 00000000..ac34e108
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image063.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image064.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image064.png
new file mode 100644
index 00000000..383ea348
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image064.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image065.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image065.png
new file mode 100644
index 00000000..2f56077a
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image065.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image066.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image066.png
new file mode 100644
index 00000000..570ce3e6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image066.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image067.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image067.png
new file mode 100644
index 00000000..c855a9a1
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image067.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image068.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image068.png
new file mode 100644
index 00000000..21b2a2c3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image068.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image069.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image069.png
new file mode 100644
index 00000000..7eb533d2
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image069.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image070.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image070.png
new file mode 100644
index 00000000..e8b423a0
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image070.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image073.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image073.png
new file mode 100644
index 00000000..3aa6df09
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image073.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image074.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image074.png
new file mode 100644
index 00000000..5153600d
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image074.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image075.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image075.png
new file mode 100644
index 00000000..a04ba93a
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image075.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image076.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image076.png
new file mode 100644
index 00000000..d88a814d
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image076.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image077.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image077.png
new file mode 100644
index 00000000..312915cf
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image077.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image078.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image078.png
new file mode 100644
index 00000000..e693a50e
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image078.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image079.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image079.png
new file mode 100644
index 00000000..dce31c7f
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image079.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image100.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image100.png
new file mode 100644
index 00000000..c3fdc9b2
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image100.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image101.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image101.png
new file mode 100644
index 00000000..f3d064f7
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image101.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image102.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image102.png
new file mode 100644
index 00000000..b78d859f
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image102.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image103.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image103.png
new file mode 100644
index 00000000..0725b7a0
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image103.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image104.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image104.png
new file mode 100644
index 00000000..16ec9dff
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image104.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image105.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image105.png
new file mode 100644
index 00000000..9991b0e6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image105.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image107.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image107.png
new file mode 100644
index 00000000..321d4310
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image107.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image108.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image108.png
new file mode 100644
index 00000000..9657c765
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image108.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image109.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image109.png
new file mode 100644
index 00000000..90a733b0
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image109.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image110.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image110.png
new file mode 100644
index 00000000..dfc86ab6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image110.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image111.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image111.png
new file mode 100644
index 00000000..bf2d6215
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image111.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image112.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image112.png
new file mode 100644
index 00000000..308bdefe
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image112.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image113.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image113.png
new file mode 100644
index 00000000..1c6fb5b6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image113.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image114.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image114.png
new file mode 100644
index 00000000..2b8f2cb6
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image114.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image115.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image115.png
new file mode 100644
index 00000000..6d189b32
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image115.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image116.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image116.png
new file mode 100644
index 00000000..5026c77c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image116.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image119.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image119.png
new file mode 100644
index 00000000..6a0fb47c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image119.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image130.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image130.png
new file mode 100644
index 00000000..6bc80390
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image130.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image131.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image131.png
new file mode 100644
index 00000000..d5b55bda
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image131.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image132.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image132.png
new file mode 100644
index 00000000..e88ed7a3
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image132.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image133.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image133.png
new file mode 100644
index 00000000..79b8f33c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image133.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image134.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image134.png
new file mode 100644
index 00000000..aec1e9b7
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image134.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image135.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image135.png
new file mode 100644
index 00000000..fa6bd4ea
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image135.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image136.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image136.png
new file mode 100644
index 00000000..7b4464e2
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image136.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image137.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image137.png
new file mode 100644
index 00000000..ec15aad2
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image137.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image138.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image138.png
new file mode 100644
index 00000000..fc9493cc
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image138.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image139.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image139.png
new file mode 100644
index 00000000..4dc44ca1
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image139.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image140.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image140.png
new file mode 100644
index 00000000..badc9083
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image140.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image141.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image141.png
new file mode 100644
index 00000000..be66d177
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image141.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image142.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image142.png
new file mode 100644
index 00000000..fc88a184
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image142.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image143.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image143.png
new file mode 100644
index 00000000..158aaafa
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image143.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image144.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image144.png
new file mode 100644
index 00000000..cfada348
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image144.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image145.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image145.png
new file mode 100644
index 00000000..1dc42fda
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image145.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image146.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image146.png
new file mode 100644
index 00000000..d7a5ce66
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image146.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image147.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image147.png
new file mode 100644
index 00000000..e94a4c59
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image147.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image148.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image148.png
new file mode 100644
index 00000000..9805e8f7
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image148.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image149.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image149.png
new file mode 100644
index 00000000..8b11e560
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image149.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image150.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image150.png
new file mode 100644
index 00000000..63bdbbf5
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image150.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image151.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image151.png
new file mode 100644
index 00000000..64aec751
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image151.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image152.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image152.png
new file mode 100644
index 00000000..d6de9b25
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image152.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image153.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image153.png
new file mode 100644
index 00000000..7093bc02
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image153.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image154.png b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image154.png
new file mode 100644
index 00000000..4dbdaf9c
Binary files /dev/null and b/03-Azure/01-02 Data/03-Talk_to_your_data/Images/image154.png differ
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/README.md b/03-Azure/01-02 Data/03-Talk_to_your_data/README.md
index dad43799..d2942ced 100644
--- a/03-Azure/01-02 Data/03-Talk_to_your_data/README.md
+++ b/03-Azure/01-02 Data/03-Talk_to_your_data/README.md
@@ -1,6 +1,6 @@
 ![image](./Images/Preview.png)
-# Microsoft Migrate & Modernize MicroHack Day - Ask Analyze Act - Talk to Your Data in the Era of AI
+# Ask Analyze Act - Talk to Your Data in the Era of AI
 - [**MicroHack introduction**](#MicroHack-introduction)
 - [**MicroHack context**](#microhack-context)
 - [**Objectives**](#objectives)
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-02.md b/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-02.md
index 767f5c8b..97aa011e 100644
--- a/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-02.md
+++ b/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-02.md
@@ -4,21 +4,23 @@
 
 ## Goal
 
-The goal of this exercise is to build a unified, analytics‑ and AI‑ready data foundation in Microsoft Fabric that enables governed reporting, natural language querying, and intelligent data interaction through a Data Agent.
+The goal of this exercise is to build a report-ready and AI-ready semantic layer in Microsoft Fabric by creating and optimizing a Semantic Model, refining a Power BI report, and preparing model metadata for natural language analysis with Copilot.
 
 ## Actions
 
-* Combine mirrored Azure SQL Managed Instance databases and external CSV files into a single Lakehouse using shortcuts
-* Create and optimize a Semantic Model from the Lakehouse tables, including relationships and time‑based logic
-* Prepare the data for AI by simplifying the schema and providing AI instructions for business context
-* Set up a Data Agent using the prepared Lakehouse and Semantic Model as trusted data sources
+* Enable the required Power BI / Fabric trial to unlock reporting and AI capabilities
+* Create and optimize a Semantic Model from the Lakehouse tables, including key relationships and cross-filter behavior
+* Auto-create and refine a Power BI report using Copilot prompts and manual adjustments
+* Prepare the semantic model for AI by simplifying the schema, reviewing verified answers, and adding AI instructions
+* Explore Power BI Copilot with the prepared model to validate business-friendly responses
 
 ## Success criteria
 
-* You have successfully unified operational and external data in a single Lakehouse that is accessible via Microsoft Fabric
-* You have successfully created and optimized a Semantic Model that supports reliable reporting and efficient query performance
-* You have successfully prepared the data schema and AI instructions to enable accurate natural language queries
-* You have successfully validated that the Data Agent returns relevant, context‑aware answers based on trusted analytics data
+* You have successfully enabled the Power BI / Fabric trial and can access reporting and Copilot features
+* You have successfully created and optimized a Semantic Model that supports reliable reporting and efficient analysis
+* You have successfully generated and refined a report that reflects key business insights
+* You have successfully prepared the model for AI by simplifying schema exposure and adding business-focused AI instructions
+* You have successfully validated Power BI Copilot responses against the prepared semantic model
 
 [Open the step-by-step solution for Challenge 2](../walkthrough/challenge-02/solution-02.md)
\ No newline at end of file
diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-03.md b/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-03.md
index 959a703c..d9278cf5 100644
--- a/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-03.md
+++ b/03-Azure/01-02 Data/03-Talk_to_your_data/challenges/challenge-03.md
@@ -4,23 +4,25 @@
 
 ## Goal
 
-The goal of this exercise is to understand how data context and instructions influence the behavior, reasoning quality, and trustworthiness of a Data Agent, and validate how a well‑configured agent performs in a real M365 Copilot user experience.
+The goal of this exercise is to understand how instruction quality and data context influence Data Agent response quality, and to validate those improvements step by step, from baseline behavior to production-ready guidance and publishing in M365 Copilot.
 
 ## Actions
 
-* Interact with the Data Agent using predefined prompts without any instructions to observe default behavior
-* Design and apply custom Data Agent instructions
-* Test the Data Agent again and evaluate response improvements
-* Compare custom instructions with the provided Agent instructions and Data Source instructions
-* Publish the Data Agent and access it through M365 Copilot chat
+* Set up a Data Agent connected to the Lakehouse as the trusted data source
+* Use prompt engineering principles and run prompts without instructions to observe baseline behavior
+* Write your own Agent Instructions and Data Source Description/Instructions
+* Compare your setup with the lab reference instructions across stepwise maturity levels (Step 0 to Step 4)
+* Re-test the Data Agent to confirm improvements in clarity, structure, and reliability
+* Publish the Data Agent and open it in M365 Copilot chat
 
 ## Success criteria
 
-* You have successfully observed clear differences between uninstructed and instructed agent behavior
-* You have successfully demonstrated that custom instructions improve the accuracy, clarity, and usefulness of agent responses
-* You have successfully understood the impact of Agent instructions versus Data Source instructions
-* You have successfully validated that the Data Agent behaves consistently when accessed via M365 Copilot
+* You have successfully observed clear differences between baseline (no instructions) and instructed behavior
+* You have successfully demonstrated that stronger instructions improve response quality step by step
+* You have successfully understood the distinct impact of Agent Instructions versus Data Source Description/Instructions
+* You have successfully validated improved Data Agent behavior with repeated prompt testing
+* You have successfully published the agent and verified it in M365 Copilot
 
 [Open the step-by-step solution for Challenge 3](../walkthrough/challenge-03/solution-03.md)
diff --git a/03-Azure/01-02 
Data/03-Talk_to_your_data/walkthrough/challenge-01/solution-01.md b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-01/solution-01.md index 3401a2c4..f5775012 100644 --- a/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-01/solution-01.md +++ b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-01/solution-01.md @@ -24,10 +24,10 @@ # Generic Migration Content | **Narrative** | **Notes** | |:-----|:-------| -| *Notes for outside of the workshop:* *Familiarise yourself with Microsoft migration tools and the Azure Database Migration Guide* | Azure Database Migration Guide: [https://www.microsoft.com/en-us/download/default.aspx](https://azure.microsoft.com/en-gb/services/database-migration/) DMA & download link: Azure Data Studio and Migration Extension download Links: [Download and install Azure Data Studio - Azure Data Studio \| Microsoft Learn](https://learn.microsoft.com/en-us/sql/azure-data-studio/download-azure-data-studio?view=sql-server-ver16&tabs=redhat-install%2Credhat-uninstall) [Azure SQL migration extension for Azure Data Studio - Azure Data Studio \| Microsoft Learn](https://learn.microsoft.com/en-us/sql/azure-data-studio/extensions/azure-sql-migration-extension?view=sql-server-ver16) Microsoft Migration Portal: [https://datamigration.microsoft.com/](https://www.microsoft.com/en-us/download/default.aspx) Identify the right Azure SQL Database, Azure SQL Managed Instance or SQL Server on Azure VM SKU for your on-premises database | +| *Notes for outside of the workshop: Familiarise yourself with Microsoft Fabric, Data Agents, M365 Copilot and Azure databases* | **Microsoft Fabric & Mirroring Resources:**
- [Microsoft Fabric Overview](https://learn.microsoft.com/en-us/fabric/fundamentals/microsoft-fabric-overview)
- [OneLake Overview](https://learn.microsoft.com/en-us/fabric/onelake/onelake-overview)
- [Mirroring in Fabric](https://learn.microsoft.com/en-us/fabric/mirroring/overview)
- [OneLake Shortcuts](https://learn.microsoft.com/en-us/fabric/onelake/onelake-shortcuts)
- [OneLake Explorer](https://learn.microsoft.com/en-us/fabric/onelake/onelake-file-explorer)
- [Fabric Data Agent](https://learn.microsoft.com/en-us/fabric/data-science/concept-data-agent)
- [Fabric Data Agent in Microsoft Copilot Studio](https://learn.microsoft.com/en-us/fabric/data-science/data-agent-microsoft-copilot-studio)
| # Lab Overview -In this lab, you will set up database mirroring from Azure SQL Managed Instance to Microsoft Fabric. This process enables you to replicate operational data into Fabric’s OneLake for real-time analytics, reporting, and AI—without affecting source database performance. +In this walkthrough, you will set up database mirroring from Azure SQL Managed Instance to Microsoft Fabric. This process enables you to replicate operational data into Fabric’s OneLake for real-time analytics, reporting, and AI—without affecting source database performance. By using System Assigned Managed Identity (SAMI) for secure access, you will configure a mirrored database in Fabric and start the mirroring process. You will learn to monitor synchronization and repeat the setup for additional databases. @@ -58,18 +58,18 @@ In this task, you will enable managed identity authentication on your Azure SQL - + In the Choose a database connection to get started window, select Azure SQL Managed Instance as the data source. Confirm that Azure SQL Managed Instance appears under New sources, then proceed to configure the connection. - + - In the Server (1) field, paste the Azure SQL Managed Instance server name: sqlmi-ttyd-01.d6a4157f03ba.database.windows.net so that Microsoft Fabric knows exactly which database to connect to for mirroring. This step directs Fabric to the correct SQL Managed Instance, ensuring secure authentication and accurate data synchronization from your operational system to the analytics environment. - + In the Server (1) field, paste the Azure SQL Managed Instance server name: sqlhackmi-z5v5uebsfrojm.8b4846304eec.database.windows.net so that Microsoft Fabric knows exactly which database to connect to for mirroring. This step directs Fabric to the correct SQL Managed Instance, ensuring secure authentication and accurate data synchronization from your operational system to the analytics environment. + Copy and paste directly.
@@ -89,32 +89,32 @@ In this task, you will enable managed identity authentication on your Azure SQL - For Authentication kind, select Basic. In the Username (4) field, enter the SQL login username: DemoUser and in the Password (5) field, enter the corresponding password: Demo@pass1234567. - + For Authentication Kind (4), select Basic. In Username (5), enter: demouser. In Password (6), enter: Demo@pass1234567. + Username and password can be copied and pasted directly. - Ensure Use encrypted connection is checked to protect your data during transfer between Azure SQL Managed Instance and Microsoft Fabric. Click Connect (6) to validate the connection and continue. - This may take a few minutes to complete. + Ensure Use encrypted connection is checked to protect your data during transfer between Azure SQL Managed Instance and Microsoft Fabric. Click Connect (7) to validate the connection and continue. + This may take a few minutes to complete. - + Once you are successfully connected, on the Choose data screen, review the list of available tables. Select the tables you want to replicate (or select Select all if required for the lab). Click Connect to proceed. This may take a few minutes to connect. - + In the Destination screen, review the Name of the mirrored database. Verify that Azure SQL Database Managed Instance is shown as the source. Click Create mirrored database to start the mirroring process. - + Please remain on this page and avoid refreshing while the system completes the database mirroring process. @@ -143,7 +143,7 @@ In this task, you will initiate the mirroring process between your Azure SQL Man After 2-5 minutes, select Monitor Replication to see the replication status. - + Replicating Status:
Running: Replication is currently running, bringing snapshot and change data into OneLake.
Running with warning: Replication is running with transient errors.
Stopping/Stopped: Replication is stopped.
Error: Fatal error in replication that can't be recovered. @@ -151,7 +151,7 @@ In this task, you will initiate the mirroring process between your Azure SQL Man - + @@ -171,28 +171,28 @@ In this task, you will repeat the setup and monitoring procedures for additional - + In the Choose a database connection to get started window, select Azure SQL Managed Instance as the data source. Confirm that Azure SQL Managed Instance appears under New sources, then proceed to configure the connection. - + - In the Server (1) field, paste the Azure SQL Managed Instance server name: sqlmi-ttyd-01.d6a4157f03ba.database.windows.net + In the Server (1) field, paste the Azure SQL Managed Instance server name: sqlhackmi-z5v5uebsfrojm.8b4846304eec.database.windows.net - + In the Database (2) field, enter the source database name: TailspinToysFeedback_User### Replace ### with your ttyd user postfix - + Select existing Data gateway (3)
Ensure Use encrypted connection is checked. Click Connect (6) to validate the connection and continue. + For Authentication kind (4), select Basic. In the Username (5) field, enter the SQL login username: demouser and in the Password (6) field, enter the corresponding password: Demo@pass1234567.
Ensure Use encrypted connection is checked. Click Connect to validate the connection and continue. - + + + + +
+ Troubleshooting
+ If you see the error "The specified connection name already exists. Try choosing a different name.", go back to the connection name and enter a new unique connection name at the end (for example: TailspinToysFeedback_User###_2).
+ Keep the same Server, Database, Data gateway, Authentication kind, Username, and Password values, then click Connect again. +
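The renaming rule in this troubleshooting tip can be sketched as a small helper; the function name and sample inputs are hypothetical, for illustration only:

```python
def unique_connection_name(base: str, existing: set[str]) -> str:
    """Append _2, _3, ... to the base name until it no longer collides."""
    name, suffix = base, 2
    while name in existing:
        name = f"{base}_{suffix}"
        suffix += 1
    return name

# A connection with this exact name already exists, so a suffix is added.
taken = {"TailspinToysFeedback_User001"}
print(unique_connection_name("TailspinToysFeedback_User001", taken))
# TailspinToysFeedback_User001_2
```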
+ + This happens when a connection with the same name already exists in your workspace. + + + + Select all tables from the database. This may take a few minutes to complete. - + + + + In the Destination screen, review the Name of the mirrored database. Verify that Azure SQL Database Managed Instance is shown as the source. Click Create mirrored database to start the mirroring process. + + + + @@ -229,21 +250,21 @@ In this task, you will integrate operational data mirrored from Azure SQL Manage NarrativeNotes Recap: You mirrored two databases in Challenge 1.
You need to create a Lakehouse after mirroring two databases so you can combine and centralize data from multiple sources. Open your Workspace and click New Item (1). In the search bar, search for Lakehouse (2) and select the Lakehouse (3). - + You need to enter a lakehouse name and create it.Example Name: TailspinToysAnalytics and leave Lakehouse schemas enabled. - + Open your new Lakehouse. Within the Tables folder navigate to dbo schema. By clicking the ellipsis (...) create shortcuts to the required tables (mirrored databases). Select New table shortcut - + In the New shortcut window, under Internal sources, select Microsoft OneLake.This option allows you to create shortcuts from mirrored databases stored in OneLake. - + In the Select a data source type screen, locate the mirrored database TailspinToys_User###. Select TailspinToys_User### created in Challenge 1. - + In the new shortcut screen, expand tables → dbo. Select all tables except zzVersion (1). Click Next (2) to proceed.Important: Do NOT select zzVersion (this table is not required). - + Review the Summary screen and confirm the selected tables. Verify that the shortcut location is your current Lakehouse. Click Create to create the shortcuts. - + In the Explorer pane, confirm that the tables from TailspinToys_User### now appear under Tables. This confirms that the shortcuts were created successfully. - + @@ -259,36 +280,38 @@ In this task, you will integrate operational data mirrored from Azure SQL Manage -This time select select the mirrored database TailspinToysFeedback_User###. Click Next to continue. - +This time select the mirrored database TailspinToysFeedback_User###. Click Next to continue. + Expand Tables → dbo. Select all tables except Customer (1). Click Next (2) to proceed.Important: Do NOT select the Customer table, as it already exists from the first database. - + Review the Summary screen and confirm the selected tables. Verify the shortcut destination is the same Lakehouse.
Click Create to finalize the shortcuts. Next, you create a shortcut from a file. To do this, click the ellipsis (...) next to the files folder in the explorer pane. This lets you easily integrate external data, such as CSV files, alongside your mirrored databases within the same Lakehouse for centralized analysis. Select New shortcut (2) to start creating a shortcut to an external data source. - + In the New shortcut window, choose the source type and under External sources, select Azure Data Lake Storage Gen2 so you can directly link and access files stored in your organization's data lake, enabling seamless integration of external datasets with your Lakehouse environment for unified analytics. -Select New connection and Paste DataLake Storage URL: https://adls52026614.dfs.core.windows.net/ (1). Click Next (2) to continue.This ensures you are establishing a secure and direct connection to your organization’s Data Lake for accessing external files. - +Select New connection and paste the Data Lake Storage URL: https://adlsgen2employeedata.dfs.core.windows.net/ (1). Click Next (2) to continue.This ensures you are establishing a secure and direct connection to your organization’s Data Lake for accessing external files. - + Select the folder with your user number (1). Click Next (2) to continue.Selecting your user-specific folder ensures you only access and work with the files intended for your lab activities. - + Review the selected folder (1) and click Skip (2) as you do not need to apply any transformation before creating the shortcut. Click Create to create the shortcut in your Lakehouse. - + Check shortcut creationIf the shortcut does not appear immediately, click the three dots next to Files and select Refresh. +Click container### and verify that the CSV file is visible in this container.Replace ### with your user number prefix/postfix. + Expand the Files folder. Click the Shortcut folder with your user name. Locate the CSV file and click the three dots (⋯) (e.g.
employees_user_data.csv). - + Select Load to Tables (1) from the context menu and Choose New table (2). - + Change the table name to employees (1). Set the separator to , (comma) (2). Review the settings, then click Load (3).CSV files use commas to separate values, so selecting a comma ensures data loads into the correct columns. - + In the Explorer pane, locate Tables. Click the three dots (⋯) next to Tables (1). Select Refresh (2). Confirm that a table named employees appears under Tables (3).This confirms that the CSV data has been successfully converted into a Lakehouse table. - + ## Summary diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-02/solution-02.md b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-02/solution-02.md index ec966550..36aa6ce7 100644 --- a/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-02/solution-02.md +++ b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-02/solution-02.md @@ -36,7 +36,7 @@ In this step, you activate the required Power BI / Fabric trial so you can use r NarrativeNotes -Click your profile image and start the Free Trial.You need a Power BI license to create and use AI-driven reports directly in Microsoft Fabric. +Click your profile image (1), start the Fabric Free Trial (2), select Sweden Central as the trial region (3) to align with your Lakehouse and databases, and confirm activation (4).You need a Power BI license to create and use AI-driven reports directly in Microsoft Fabric. @@ -51,13 +51,19 @@ In this task, you will create a semantic model from the Lakehouse tables. A sema NarrativeNotes In the Lakehouse view, select New semantic model from the top menu. - + In the New semantic model dialog: enter the semantic model name (1), click Select all (2) to include all available tables, and click Confirm (3).Example semantic model name: TalkToYourDataSemanticModel. 
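The separator choice in the earlier Load to Tables step determines how each CSV line is split into columns. A minimal sketch with Python's csv module; the sample rows are invented, since the employees file's actual columns are not shown in this walkthrough:

```python
import csv
import io

# Hypothetical sample mirroring a comma-separated employees file.
raw = "EmployeeName,Department,Salary\nAda,Analytics,72000\nLin,Sales,65000\n"

# With the correct comma separator, each value lands in its own column.
rows = list(csv.reader(io.StringIO(raw), delimiter=","))
print(rows[0])  # ['EmployeeName', 'Department', 'Salary']

# With the wrong separator (e.g. ';'), every line collapses into one column.
bad = list(csv.reader(io.StringIO(raw), delimiter=";"))
print(len(bad[0]))  # 1
```

This is why the walkthrough has you explicitly set the separator to a comma before loading the table.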
- + +Before clicking Confirm, make sure newly created tables (for example employees) are included in the selection.Use the screenshot below to verify that the table is selected. + +Click on Confirm. + Wait until the semantic model is successfully created. You will then see it listed in the workspace.If creation fails, switch once to the SQL Analytics Endpoint view and create it from there. +Navigate back to the workspace by clicking the workspace icon in the left navigation. + Verify that the semantic model was created successfully. - + ## 3. Create the Relationships @@ -71,31 +77,29 @@ In this task, you will optimize the semantic model by creating the required rela NarrativeNotes Confirm the semantic model is in Edit mode before optimization work.Relationship and metadata changes are only available in Editing mode. - + Click Manage relationships from the ribbon while in Editing mode.This opens the relationship manager for creating and adjusting joins. - + In the Manage relationships dialog, click New relationship.Start defining core fact-to-dimension joins. - + Create Sales[CustomerID] -> Customer[CustomerID] and click Save.Enables customer-level analysis of sales. - + Create Sales[OrderDate] -> DimDate[Date].Required for month, quarter, and year reporting. - + Create ProductFeedback[ProductID] -> Product[ProductID].Connects feedback records to the product dimension. - - + Create State[RegionID] -> Region[RegionID].Builds the state-to-region geography hierarchy. - + Create Sales[CustomerStateID] -> State[StateID].Maps sales to customer location. - + Create Sales[ProductID] -> Product[ProductID].This is the core product join for sales analytics. - -Create employees[SalesOfficeID] -> SalesOffice[SalesOfficeID].Enables organizational analysis by office and region. - + +Create employees[SalesOfficeID] -> SalesOffice[SalesOfficeID].Enables organizational analysis by office and region.
Fabric may auto-select Email first because it is a common column, so make sure you choose SalesOfficeID before saving the relationship. + Review all listed relationships and confirm they are Active, then close the dialog.Quick validation to ensure no required join is missing. - + Verify the model diagram now shows connected relationship lines.The model should now reflect the intended star / multi-fact shape. - + In this semantic model, you use a multi-fact model, which means the semantic model contains more than one fact table, where each fact table represents a different business process. This approach is common when multiple types of business activities need to be analyzed together while still sharing common dimensions. @@ -116,11 +120,8 @@ In this semantic model, you use a multi-fact model, which means the seman - + - - + Apply the same setting between the Sales table and the Product table by changing the Cross-filter direction value to Both (1). This allows filters to flow in both directions. Click Save (2) to apply the updated relationship settings. Sales metrics can now be analyzed together with product attributes using bidirectional filtering. - + Recheck the model diagram after cross-filter updates.Confirm relationship behavior is stable before metadata edits. - - + Switch from Editing back to Viewing mode after optimization is complete. - + Congratulations! Your data model is now ready to use! @@ -153,10 +150,12 @@ In this step, use the semantic model to automatically generate the first report NarrativeNotes +Navigate back to the workspace by clicking the workspace icon in the left navigation. + Open the semantic model and select Explore from the top menu, then click Auto-create a report.This generates a first draft of the report automatically based on the available fields and data patterns. - + Review the generated Quick summary page and check the suggested visuals.You can also adjust the field selection in the Your data pane on the right if needed. - + ## 5.
Refine the Report with Prompts @@ -170,7 +169,7 @@ In this step, use prompts or Copilot suggestions to improve the auto-generated r NarrativeNotes Switch to Edit mode from the report menu and confirm with Continue.Use this mode to manually refine and optimize the generated report. - + First click the Copilot icon (1). Then a sidebar opens to the left where you can add your prompt (2) to add something on the report page, like: Provide a detailed, insight-focused overview of "Sum of Quantity by RegionName" visual's data.
Use Copilot prompts to refine the auto-generated report with better visuals, clearer wording, or additional insights.This screenshot shows an example of how Power BI Copilot can help improve the report. @@ -185,12 +184,9 @@ If you prefer you can also manually adjust visuals, formatting, and layout, and NarrativeNotes - - -Now try yourself edditing the report!
+Now try editing the report yourself!
When you are done, click Save, enter a report name, and confirm.This stores the generated and refined report in your workspace. - + ## 7. Prepare the Data for AI @@ -204,23 +200,23 @@ In this task, you will simplify the data schema and prepare the model for AI and NarrativeNotes Open your workspace and open the semantic model in Microsoft Fabric. - + From the top menu, select Prep data for AI. - + In the Prep data for AI dialog, review and configure the steps: simplify the data schema, verified answers, and AI instructions. - + Select Simplify the data schema from the left navigation and expand the semantic model to view all tables and columns.This step keeps only the relevant tables and columns available to AI, improving query quality and response accuracy. - + Review the table selections carefully.Customer: uncheck CustomerID and DateCreated
DimDate: keep all columns checked
Employees: uncheck EmployeeID and SalesOfficeID
FeedbackMedia and FeedbackVote: uncheck all columns - + Continue simplifying the model based on business relevance.Product: uncheck ProductID
ProductFeedback: uncheck CustomerID, FeedbackID, and ProductID
Region: uncheck RegionID - + Review the remaining fact and dimension tables.Sales: uncheck CustomerID, CustomerStateID, OrderDate, ProductID, and ShipDate
SalesOffice: uncheck SalesOfficeID and StateID - + Finalize the schema cleanup.State: uncheck RegionID and StateID - + After reviewing your selections, click Apply. - + ## 8. Optional: Add Verified Answers @@ -234,7 +230,7 @@ Verified answers are optional and are created from Power BI reports, not NarrativeNotes In the left navigation, select Verified answers (preview).No configuration is required at this step. Leave this section unchanged and continue. - + ## 9. Set Up AI Instructions @@ -248,10 +244,11 @@ In this task, you will add business-focused AI instructions so Copilot and other NarrativeNotes In the left navigation, select Add AI instructions (preview). In the text box, add clear, business-focused instructions, then click Apply. - + See the example section below for a sample instruction set. - -Example of AI Instructions

+ +
+Example of AI Instructions (click to expand)
You help users explore and understand this semantic model by answering questions using the correct business logic, relationships, and terminology. Provide clear, concise insights without exposing SQL unless explicitly requested. Model Overview @@ -270,15 +267,6 @@ ProductFeedback: Ratings & text reviews. FeedbackMedia: Images/videos tied to feedback. FeedbackVote: Helpful/unhelpful votes. - - Behavior Rules Infer the correct facts and dimensions based on natural language. Compute metrics when needed (for example, TotalPrice = Quantity * UnitPrice – DiscountAmount). @@ -288,17 +276,12 @@ Use business-friendly wording instead of column names unless asked. Avoid making up fields or relationships not present in the schema. Guidance for Common Requests -“Top products” → Use Sales.Quantity or TotalPrice. -“Customer trends” → Use Customer and Sales. -“Rating analysis” → Use ProductFeedback.Rating. -“Regional performance” → Use State / Region with Sales. -“Employee counts or salaries” → Use Employees grouped by SalesOffice or Department. - - - +"Top products" -> Use Sales.Quantity or TotalPrice. +"Customer trends" -> Use Customer and Sales. +"Rating analysis" -> Use ProductFeedback.Rating. +"Regional performance" -> Use State / Region with Sales. +"Employee counts or salaries" -> Use Employees grouped by SalesOffice or Department. +
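The derived metric named in the AI instructions above (TotalPrice = Quantity * UnitPrice - DiscountAmount) can be sketched directly; the sample order lines below are invented for illustration:

```python
def total_price(quantity: int, unit_price: float, discount_amount: float) -> float:
    """Derived metric from the AI instructions: Quantity * UnitPrice - DiscountAmount."""
    return quantity * unit_price - discount_amount

# Hypothetical order lines; real values come from the Sales table.
order_lines = [
    {"Quantity": 2, "UnitPrice": 30.0, "DiscountAmount": 5.0},   # 55.0
    {"Quantity": 1, "UnitPrice": 80.0, "DiscountAmount": 0.0},   # 80.0
]
total = sum(total_price(l["Quantity"], l["UnitPrice"], l["DiscountAmount"]) for l in order_lines)
print(total)  # 135.0
```

Spelling the formula out in the instructions means the agent computes this metric consistently instead of guessing which columns constitute revenue.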
## 10. Explore Power BI Copilot @@ -311,8 +294,12 @@ In this step, open Power BI Copilot from the report and ask natural language que NarrativeNotes +Navigate back to the workspace by clicking the workspace icon in the left navigation. + +Open the previously created report. + Click Copilot in the report and select Get started.You can now ask natural language questions directly against the prepared report and semantic model. - + ## Summary diff --git a/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-03/solution-03.md b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-03/solution-03.md index 5432661e..95be5561 100644 --- a/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-03/solution-03.md +++ b/03-Azure/01-02 Data/03-Talk_to_your_data/walkthrough/challenge-03/solution-03.md @@ -11,25 +11,22 @@ 1. [Prompt Scenarios](#31-prompt-scenarios) 2. [Advanced Prompt Scenarios](#32-advanced-prompt-scenarios) 4. [Write some instructions yourself](#4-write-some-instructions-yourself) - 1. [Data agent instructions](#41-data-agent-instructions) - 2. [Agent instructions](#42-agent-instructions) - 3. [Example: Evolving Agent Instructions](#43-example-evolving-agent-instructions) -5. [Compare our Agent instruction and Data source instruction to yours](#5-compare-our-agent-instruction-and-data-source-instruction-to-yours) + 1. [Agent instructions](#41-agent-instructions) + 2. [Data source description and instructions](#42-data-source-description-and-instructions) +5. [Compare our agent instructions and data source guidance with yours](#5-compare-our-agent-instructions-and-data-source-guidance-with-yours) 6. [Try the data agent and check for improvements](#6-try-the-data-agent-and-check-for-improvements) 7.
[Publish and Open Data Agent to M365 Copilot](#7-publish-and-open-data-agent-to-m365-copilot) [Summary](#summary) # Lab Overview -In this lab, you will explore how instructions and data context influence the behavior and quality of responses generated by a Data Agent in Microsoft Fabric and M365 Copilot. +In this walkthrough, you will learn how instruction quality and data context directly affect the quality of Data Agent responses in Microsoft Fabric and M365 Copilot. -You will begin by setting up a Data Agent connected to your Lakehouse. Next, you will learn about prompt engineering principles and run a set of predefined prompts, covering sales performance, time intelligence, feedback analysis, chain-of-thought reasoning, and persona-based queries without providing any instructions, in order to observe the default behavior of the agent. +You will first set up a Data Agent connected to your Lakehouse, then run prompts without custom instructions to observe baseline behavior. Next, you will create your own Agent Instructions and Data Source Description/Instructions. -You will then write your own instructions, including a data schema description and custom agent instructions that define the agent's role, tone, output structure, language, and rules for handling uncertainty. You will compare your instructions with the reference Agent instructions and Data Source descriptions provided in the lab. +Using the same example prompt, you will compare responses step by step (from no instructions to production-ready guidance) and see how structure, consistency, and reliability improve as instructions become clearer. -After applying instructions, you will re-run the same prompts and compare the before and after results side by side across five prompt examples to evaluate the improvements in accuracy, clarity, and usefulness. 
- -Finally, you will publish the Data Agent to M365 Copilot and access it through the M365 Copilot chat experience, allowing you to see how your configured agent behaves in a real user-facing environment. +After validating the improvements, you will publish the Data Agent to M365 Copilot and test it in a real chat experience. # 1. Data Agent Setup In this task, you will set up a Data Agent to enable intelligent data interactions with your Lakehouse. The agent will use the unified Lakehouse as its data source, allowing it to answer user questions based on the most up-to-date and trusted data. @@ -40,16 +37,16 @@ In this task, you will set up a Data Agent to enable intelligent data interactio NarrativeNotes -Click New item (Top Left). Search Data agent in search bar (1) and select Data agent (preview) (2). - +Click New item (Top Left). Search for Data agent in the search bar (1) and select Data agent (2). + In the Create data agent dialog: Enter a name for the agent (1) and click Create (2) Example Name: SpaceRangerAgent - + On the Set up your data agent screen, click Add data source. - + Select the Lakehouse by clicking the checkbox next to it. Click Add to attach the Lakehouse to the Data Agent. - + In the Explorer pane on the left, confirm the Lakehouse is listed as a data sourceLeave the page open; you will work with this agent in the next chapter. - + # 2. Prompt Engineering for Data Agents @@ -71,17 +68,13 @@ Every strong prompt directed at a Data Agent contains four building blocks. Unde # 3. Try our prompts without any instructions given ## 3.1. Prompt Scenarios -### Sales Performance (Warm-up) +### Sales Performance & Time Intelligence (Warm-up) - Which top 5 toys generated the highest revenue this year? - -### Time Intelligence & Trends - Show monthly revenue trend for 2025. -- Compare revenue year-over-year by quarter. Include a % growth compared to the previous quarter. ### Feedback & Ratings - Which toys have high sales but low average ratings?
- What is the average rating per toy and how many reviews exist? Sort by Review Count DESC. -- Tell me which customers have given the most feedback. Show me how many of these reviews were positive. ## 3.2. Advanced Prompt Scenarios ### Chain of Thoughts @@ -89,97 +82,110 @@ Every strong prompt directed at a Data Agent contains four building blocks. Unde - Identify the months with the highest toy sales. Then analyze whether those months also show higher customer satisfaction scores. - Identify toys with declining sales performance over the last 6 months. Then compare their customer ratings to determine if dissatisfaction may explain the decline. -- Identify the top 5 toys by revenue this year. Then compare their average rating. Highlight any toy with rating below 3.5. ### Persona -**Persona prompting** assigns the agent a specific role or professional identity before asking the question. By framing the agent as *"a Senior Business Analyst"* or *"a Regional Sales Director"*, you shift the tone, depth, and format of the response to match what that role would produce - structured reports, strategic recommendations, or executive summaries - rather than raw data tables. +**Persona prompting** assigns the agent a specific role or professional identity before asking the question. By framing the agent as *"a Regional Sales Director"*, you shift the tone, depth, and format of the response to match what that role would produce - structured reports, strategic recommendations, or executive summaries - rather than raw data tables. - Act as a Regional Sales Director. Draft a strategic plan for next month’s regional event. Use the current event data to rank the regions by sales potential and provide a specific 'Action Plan' for the lowest performing region to boost their sales. Include a table of target sales goals. # 4. Write some instructions yourself -## 4.1. Data agent instructions -Describe data schema -## 4.2. Agent instructions +## 4.1. 
Agent instructions Consider the agent's role and purpose, tone, structure, verbosity, language, and rules for uncertainty and data gaps. -## 4.3. Example: Evolving Agent Instructions -Before moving to the full advanced version in section 5, use this progression to see how instruction quality improves step by step while keeping data and prompts constant. +| **Narrative** | **Screenshot** | +|:------------|:--------------| +|Agent Instruction|![](../../Images/image074.png)| -
Step 0 - Minimal +## 4.2. Data source description and instructions +Consider the data schema. +- Data Source Description: what data you have (tables, keys, relationships). +- Data Source Instructions: how the agent should use that data (joins, time logic, safety rules). -```text -You are a Microsoft Fabric Data Agent. +| **Narrative** | **Screenshot** | +|:------------|:--------------| +|Data Source Description|![](../../Images/image073.png)| +|Data Source Instruction (Box bottom)|![](../../Images/image075.png)| -Your task is to answer questions using the connected data sources. -``` +# 5. Compare our agent instructions and data source guidance with yours +In this task, you will see how answer quality improves based on how clearly you define the agent instructions and the data source description/instructions. +You can also optionally compare your own setup with the reference instructions used in this lab. -
+💡 **Important:** AI-generated responses may differ each time, even for the same question. -
Step 1 - Add analyst role, thinking process, and answer rules +**Example Prompt:** +Identify the months with the highest toy sales. Then analyze whether those months also show higher customer satisfaction scores. -```text -You are a Microsoft Fabric Data Agent acting as a professional data analyst. - -Your goal is to answer business questions clearly, accurately, and based only on the available data. - -Thinking process: -Before answering, explicitly consider: -- The metric being requested -- The required level of detail (grain) -- The dimensions and time context -- Which tables are required and why - -Answer rules: -- Do not invent data or assumptions. -- If the question is ambiguous, ask one short clarifying question. -- If data is missing, explain the limitation and suggest what is possible. -- Use clear, structured language suitable for business users. -- Keep verbosity moderate: no long explanations unless necessary. -``` +### Step 0 - No Instructions -
+| Agent Instructions | Data Source Description and Instructions | +|:--|:--| +|

Default state:
- No agent instruction text is provided.
- The agent must infer intent on its own. |

Default state:
- No data source description or instruction text is provided.
- The agent has no routing context for the data. | -
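For orientation, the work the example prompt demands (aggregate sales by month, then line those months up against satisfaction scores) can be sketched directly in code. The table and column names below are illustrative stand-ins, not the lab's actual schema:

```python
import pandas as pd

# Illustrative stand-ins for the two fact tables at a monthly grain (not the lab schema).
sales = pd.DataFrame({
    "Month": ["2025-01", "2025-01", "2025-02", "2025-03", "2025-03"],
    "TotalPrice": [120.0, 80.0, 50.0, 200.0, 150.0],
})
feedback = pd.DataFrame({
    "Month": ["2025-01", "2025-02", "2025-03"],
    "Rating": [4.0, 2.0, 4.5],
})

# Step 1 of the prompt: months ranked by revenue.
revenue = sales.groupby("Month")["TotalPrice"].sum().rename("Revenue")
# Step 2: join the satisfaction signal onto the same months.
avg_rating = feedback.groupby("Month")["Rating"].mean().rename("AvgRating")
comparison = pd.concat([revenue, avg_rating], axis=1).sort_values("Revenue", ascending=False)
print(comparison)
```

Without instructions, the agent must infer this entire plan (grain, joins, metrics) on its own, which is why the baseline answer often fails.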
Step 2 - Add response structure and consistency +**Data Agent Answer** + +What to notice: +- The agent reports no matching data for the request. +- It cannot provide month-by-month sales and satisfaction analysis. +- This shows how no instruction/context can lead to unusable output. -```text -You are a Microsoft Fabric Data Agent acting as a professional data analyst. - -Your objective is to answer business questions clearly, accurately, and consistently, -using only the connected data sources. - -Before answering, explicitly think through: -- The metric being requested and how it should be calculated -- The required level of detail (grain) -- The relevant dimensions and time context -- Which tables and relationships will be used, and why - -Response guidelines: -- Start with a direct, single-sentence answer to the business question. -- Then present the results in a clear, structured format (for example, a table). -- Do not invent data or assumptions. -- If the question is ambiguous, ask one short clarifying question. -- If data is missing or unavailable, clearly explain the limitation and suggest what can be answered instead. -- Use a professional, business-friendly tone. -- Control verbosity: be concise and structured; avoid unnecessary explanation. -``` +### Step 1 - Minimal -
+| Agent Instructions | Data Source Description and Instructions | +|:--|:--| +|

You are a Microsoft Fabric Data Agent.
Answer questions using the connected data.

Included in this step:
- A basic role is introduced.

|

Data Source Description
The dataset contains sales, products, customers, and dates.
Data Source Instructions
Use the dataset to answer questions.

Included in this step:
- A minimal data source description and instructions are added.
| -
Step 3 - Full advanced instructions +**Data Agent Answer** + +What to notice: +- The agent now returns top sales months with numeric values. +- The satisfaction analysis is still inconsistent. +- Output is improved, but still mostly narrative and not fully structured. -In section 5, you will compare your version with the complete advanced Agent Instructions (routing guide, canonical joins, terminology, guardrails, and formatting conventions). +### Step 2 - Added Role, Scope, Joins, and Response Structure -
+| Agent Instructions | Data Source Description and Instructions | +|:--|:--| +|

You are a Microsoft Fabric Data Agent.
Answer business questions using only the connected tables.

Consider the data schema:
- Sales contains transactional data.
- Product represents toys.
- Customer contains customer attributes.
- DimDate must be used for all time-based filtering.

Avoid guessing when data is not available.

Included in this step:
- Role and scope are enforced for connected tables only.
- Schema-aware guidance is added (Sales, Product, Customer, DimDate).
- The agent is told to avoid guessing when data is missing.
|

Data Source Description
Dataset includes Sales, Product (toys), Customer, and DimDate.

Sales is the main fact table and is connected to Product, Customer,
and DimDate via foreign keys.

Data Source Instructions
Use Sales for revenue and quantity.
Join Product for toy names.
Use DimDate for filtering by time.
Do not expose customer emails.

Included in this step:
- Main dataset entities and relationships are clarified.
- Usage rules are added for joins and time filtering.
- Privacy guidance is added to avoid exposing customer emails.
| -# 5. Compare our Agent instruction and Data source instruction to yours -| **Narrative** | **Screenshot** | -|:------------|:--------------| -|Agent Instruction|![](../../Images/image074.jpg)| -|Example of Agent Instructions|![](../../Images/image072.jpg)| +**Data Agent Answer** + +What to notice: +- The answer clearly separates sales results and satisfaction results. +- It explicitly flags missing rating data for one top-sales month. +- Conclusion quality improves because schema/use rules reduce ambiguity. + +### Step 3 - Enhanced Guardrails and Routing Guidance + +| Agent Instructions | Data Source Description and Instructions | +|:--|:--| +|

You are a Microsoft Fabric Data Agent acting as a data analyst.
Before answering:
- Identify the metric (Revenue, Quantity, Orders, Avg Rating).
- Identify the grain (Product, Customer, State, Month, etc.).
- Identify required filters (date, geography, category).
Always:
- Use Sales for performance metrics.
- Use Product.ProductName as the toy name.
- Use DimDate for all time logic.
- Explain clearly if requested data is not available.

Included in this step:
- Guardrails for behavior and wording are strengthened.
- The agent is guided to avoid over-claiming. |

Data Source Description
The dataset follows a star-schema-like structure.

Sales is the central fact table.
Dimensions include Product (toys), Customer, State/Region, and DimDate.
Feedback-related tables are used only for ratings and reviews.

Data Source Instructions
- Revenue = SUM(Sales.TotalPrice)
- Quantity = SUM(Sales.Quantity)
- Orders = COUNT(DISTINCT Sales.OrderNumber)

Join rules:
Sales -> Product via ProductID
Sales -> DimDate via OrderDate

Avoid PII fields unless explicitly requested.

Included in this step:
- Routing and usage guidance is expanded.
- More explicit rules are added for safer interpretation. | + +**Data Agent Answer** + +What to notice: +- The response is concise and avoids over-claiming when feedback data is missing. +- It gives a clear summary of what is known vs unknown. +- Coverage is safer, but narrower than Step 2 (focuses on fewer months). + +### Step 4 - Production-ready + +| Agent Instructions | Data Source Description and Instructions | +|:--|:--| +|

Included in this step:
- Full planning, validation, and response guardrails are defined.
- Behavior is tuned for consistency under complex prompts. |

Included in this step:
- Data model and usage rules are comprehensive.
- Time logic, routing, and safety constraints are explicit. | + +**Data Agent Answer** + +What to notice: +- The answer includes a direct comparison table for sales vs ratings. +- Numeric evidence is clear and easy to verify. +- The final takeaway is explicit and business-ready. + +
+Optional: Production-ready instruction text for Step 4
-Example of Full advanced Agent instructions +Agent Instructions ```text General Agent Instructions (Always-On System Prompt) @@ -257,13 +263,8 @@ Performance: Push filters to the source (dates/keys), avoid unnecessary joins, a
- -| **Narrative** | **Screenshot** | -|:------------|:--------------| -|Example of Data Source Description|![](../../Images/image073.jpg)| -
-Example of Data Source Description details +Data Source Description ```text Dataset contains Customers, Products (toys), Sales, Feedback, Votes, Media, Dates, Regions, States, Sales Offices, and Employees. Core keys: ProductID, CustomerID, OrderNumber, StateID, RegionID, and FeedbackID. @@ -272,19 +273,15 @@ Connected via: Sales.ProductID→Product.ProductID; Sales.CustomerID→Customer.CustomerID; Sales.CustomerStateID→State.StateID; State.RegionID→Region.RegionID; Sales.OrderDate→DimDate.Date; ProductFeedback.ProductID→Product.ProductID; ProductFeedback.CustomerID→Customer.CustomerID; ProductFeedback.FeedbackDate→DimDate.Date; FeedbackVote.FeedbackID→ProductFeedback.FeedbackID; FeedbackVote.CustomerID→Customer.CustomerID; FeedbackVote.VoteDate→DimDate.Date; FeedbackMedia.FeedbackID→ProductFeedback.FeedbackID; FeedbackMedia.UploadedDate→DimDate.Date; employees.SalesOfficeID→SalesOffice ``` +
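One way to make a join graph like the one above checkable is to encode it as data. The sketch below hard-codes a few of the relationships listed in the description; the helper name and lookup structure are ours, not part of the lab:

```python
# A few canonical joins from the description, keyed as (fact_table, fact_column) -> dimension_table.
JOINS = {
    ("Sales", "ProductID"): "Product",
    ("Sales", "CustomerID"): "Customer",
    ("Sales", "CustomerStateID"): "State",
    ("Sales", "OrderDate"): "DimDate",
    ("ProductFeedback", "ProductID"): "Product",
    ("ProductFeedback", "FeedbackDate"): "DimDate",
}

def dimension_for(table: str, column: str) -> str:
    """Look up which dimension a fact column joins to; fail loudly for undeclared joins."""
    key = (table, column)
    if key not in JOINS:
        raise KeyError(f"No canonical join declared for {table}.{column}")
    return JOINS[key]

print(dimension_for("Sales", "OrderDate"))  # DimDate
```

Declaring the joins once keeps the agent's time logic consistent: every date filter routes through DimDate instead of ad-hoc date parsing.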
-| **Narrative** | **Screenshot** | -|:------------|:--------------| -|Example of Data Source Instruction Box bottom|![](../../Images/image075.jpg)|
-Example of Data Source Instruction Box bottom details +Data Source Instructions ```text Rules - - Use the dataset to answer questions about toy sales, customer behavior, product attributes, feedback, geographic distribution, and organizational structure. Always treat Product.ProductName as the canonical toy name. @@ -293,10 +290,6 @@ Time filtering must always be done using DimDate by joining on the fact table’ Avoid using or exposing PII (Email fields) or binary data (Product.Photo) unless explicitly requested. - - - - 1. Customer Purpose: Customer master data. @@ -307,8 +300,6 @@ Important fields: Name, Country, DateCreated, optional Email and Birthday. How to use: - - Join from Sales, Feedback, and Votes via CustomerID. Use for segmentation (country, creation date). @@ -319,10 +310,6 @@ Do not expose emails by default. Birthday should only be used for aggregated insights, not individual reporting. - - - - 2. Product (Toys) Purpose: Full product (toy) catalog. @@ -333,8 +320,6 @@ Important fields: ProductName, SKU, Category, ItemGroup, KitType, Demographic, C How to use: - - Treat ProductName as the mandatory toy label in all outputs. Join from Sales and Feedback via ProductID. @@ -345,10 +330,6 @@ Notes: Do not return binary Photo unless explicitly required. - - - - 3. Sales Purpose: Core fact table for orders. @@ -357,8 +338,6 @@ Key: OrderNumber Relationships: - - ProductID → Product CustomerID → Customer @@ -381,10 +360,6 @@ Notes: Filter early by date and keys for performance. - - - - 4. DimDate Purpose: Calendar dimension for time intelligence. @@ -395,8 +370,6 @@ Important fields: Year, Quarter, Month, MonthNumber, CalendarWeek, Day, WeekDay. How to use: - - Always join facts to DimDate for weekly, monthly, quarterly, and yearly analysis. Use MonthNumber for numeric month filtering. @@ -405,10 +378,6 @@ Notes: Do not parse dates manually—always rely on DimDate. - - - - 5. ProductFeedback Purpose: Customer feedback on products. 
@@ -417,8 +386,6 @@ Key: FeedbackID Relationships: - - ProductID → Product CustomerID → Customer @@ -435,10 +402,6 @@ Notes: ReviewText is free-form; do not infer meaning unless asked. - - - - 6. FeedbackVote Purpose: Votes on feedback entries. @@ -447,8 +410,6 @@ Key: VoteID Relationships: - - FeedbackID → ProductFeedback CustomerID → Customer @@ -465,10 +426,6 @@ Notes: VoteType is categorical—do not assume positive/negative unless user defines it. - - - - 7. FeedbackMedia Purpose: Media attached to feedback entries. @@ -481,18 +438,12 @@ Important fields: MediaType, MediaUrl, UploadedDate. How to use: - - Join to ProductFeedback for media counts and type distribution. Notes: Do not expose MediaUrl unless requested. - - - - 8. State Purpose: Geographic dimension for states. @@ -505,8 +456,6 @@ Important fields: StateCode, StateName, TimeZone. How to use: - - Join from Sales via CustomerStateID. Roll up to Region when applicable. @@ -515,10 +464,6 @@ Notes: RegionID may be null; use left joins when navigating to Region. - - - - 9. Region Purpose: Geographic roll-up for states. @@ -529,14 +474,8 @@ Important fields: RegionName. How to use: - - Join from State to support region-level analytics. - - - - 10. SalesOffice Purpose: Office/branch information. @@ -549,18 +488,12 @@ Important fields: Address fields, PostalCode, Telephone, Facsimile, Email. How to use: - - Use for organizational/territory analysis and mapping employees to geography. Notes: Email is sensitive; do not expose without explicit user request. - - - - 11. employees Purpose: Employee directory. @@ -573,8 +506,6 @@ Important fields: FirstName, LastName, Title, Department, HireDate, Salary, Emai How to use: - - Analyze headcount by office/department, tenure cohorts (via HireDate), compensation aggregates. Use SalesOffice to join into geographic hierarchy. @@ -583,10 +514,6 @@ Notes: Treat Email as sensitive; Salary should only be aggregated, not shown individually unless explicitly requested. 
- - - - Join & Routing Summary (Quick Reference) Sales analysis: @@ -617,46 +544,16 @@ Time analytics: Always use DimDate joined via the fact’s date. ``` -
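The metric definitions the instructions pin down (Revenue = SUM(Sales.TotalPrice), Quantity = SUM(Sales.Quantity), Orders = COUNT(DISTINCT Sales.OrderNumber)) are easy to verify against a small sample. A sketch over made-up rows; only the column names come from the instructions:

```python
import pandas as pd

# Made-up Sales rows for illustration; order A1 spans two line items on purpose.
sales = pd.DataFrame({
    "OrderNumber": ["A1", "A1", "A2", "A3"],
    "Quantity": [2, 1, 4, 3],
    "TotalPrice": [20.0, 15.0, 40.0, 9.0],
})

revenue = sales["TotalPrice"].sum()        # Revenue = SUM(Sales.TotalPrice)
quantity = sales["Quantity"].sum()         # Quantity = SUM(Sales.Quantity)
orders = sales["OrderNumber"].nunique()    # Orders = COUNT(DISTINCT Sales.OrderNumber)
print(revenue, quantity, orders)  # 84.0 10 3
```

The distinct count matters: order A1 has two line items, so counting rows instead of distinct order numbers would over-report orders.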
- -# 6. Try the data agent and check for improvements -Run the same prompts again after adding your Agent instructions and Data source descriptions. Compare the baseline and improved responses side by side. - -### Prompt Example 1 -**Prompt:** Which top 5 toys generated the highest revenue this year? - -| **Without Instructions and Data Source Descriptions** | **With Instructions and Data Source Descriptions** | -|:----------------------------------------------------|:-------------------------------------------------| -| The agent returns the correct top 5 items, but the output is mostly a plain list. It is readable, but the response gives less structure and makes comparison harder for the user.

| The agent still returns the correct ranking, but now the answer is formatted as a table with clear columns for `ProductName` and `TotalRevenue`. This makes the output easier to scan and more useful for business reporting.

| - -### Prompt Example 2 -**Prompt:** Compare revenue year-over-year by quarter. Include a % growth compared to the previous quarter. - -| **Without Instructions and Data Source Descriptions** | **With Instructions and Data Source Descriptions** | -|:----------------------------------------------------|:-------------------------------------------------| -| The agent provides the numbers, but the answer is presented as narrative bullet points grouped by year. This works, but it makes quarter-by-quarter comparison less direct and puts more effort on the reader.

| The improved version organizes the same information into a table with `Year`, `Quarter`, `Total Revenue`, and `% Growth from Previous Quarter`. This is a much clearer format for trend analysis and makes the growth pattern immediately visible.

| -### Prompt Example 3 -**Prompt:** Which toys have high sales but low average ratings? - -| **Without Instructions and Data Source Descriptions** | **With Instructions and Data Source Descriptions** | -|:----------------------------------------------------|:-------------------------------------------------| -| Without guidance, the agent concludes that no products match the condition. The answer is cautious, but it does not produce a useful product-level analysis for follow-up action.

| With instructions, the agent identifies specific products that combine strong sales with weak ratings and presents them in a table with `ProductName`, `TotalUnitsSold`, `AvgRating`, and a remark. This makes the result actionable for product and sales teams.

| - -### Prompt Example 4 -**Prompt:** Identify the months with the highest toy sales. Then analyze whether those months also show higher customer satisfaction scores. +
-| **Without Instructions and Data Source Descriptions** | **With Instructions and Data Source Descriptions** | -|:----------------------------------------------------|:-------------------------------------------------| -| The agent fails to connect the sales and satisfaction data and falls back to a limitation message. This shows the model is not able to do about joining the two perspectives without guidance.

| After adding instructions and data context, the agent identifies the peak sales month and compares its customer satisfaction score with other months. The response now produces an actual conclusion instead of stopping at a limitation notice.

| + +# 6. Try the data agent and check for improvements +In Section 5, you saw how answer quality improves as the instructions become clearer and more structured. -### Prompt Example 5 -**Prompt:** Act as a Regional Sales Director. Draft a strategic plan for next month’s regional event. Use the current event data to rank the regions by sales potential and provide a specific 'Action Plan' for the lowest performing region to boost their sales. Include a table of target sales goals. +Now try it yourself with the same prompt and confirm the improvement step by step in your own Data Agent. -| **Without Instructions and Data Source Descriptions** | **With Instructions and Data Source Descriptions** | -|:----------------------------------------------------|:-------------------------------------------------| -| The agent cannot retrieve the regional sales data, so it falls back to a generic framework. While still helpful, the answer is not truly data-driven and does not satisfy the ranking requirement fully.

| The improved response ranks the regions using actual sales values, identifies the lowest-performing region, and proposes a focused action plan with next-month target goals. This is much closer to the kind of business-ready output attendees should aim for.

| # 7. Publish and Open Data Agent to M365 Copilot In this task, you will publish and open the Data Agent so it can be used for testing, sharing, and future reuse. @@ -668,13 +565,13 @@ In this task, you will publish and open the Data Agent so it can be used for tes NarrativeNotes In the Data Agent editor, locate the Publish button in the top menu (1). Enable Also publish to the Agent Store in Microsoft 365 Copilot by turning the toggle On (2). Click Publish (3) to complete the publishing process. Once published, the agent moves from Draft to Published status. - + Open the Notifications panel from the top right corner (1). Verify the message "Successfully published data agent" appears (2). This confirms the agent was published without errors. - + Return to the Data Agent main page. In the top right status indicator (1), confirm: Status shows Published. A published version is available. The agent is now ready to be used for testing natural language questions, sharing with others, and integration with Copilot experiences. - + Open the Fabric Agent at m365.cloud.microsoft/chat. - + @@ -684,7 +581,7 @@ In this lab, you have accomplished the following: - Observed the default behavior of a Data Agent in Microsoft Fabric and M365 Copilot by interacting with it without any custom instructions. -- Designed and applied custom agent instructions to clearly define the data schema, agent role, tone, language, and rules for handling uncertainty and missing data. +- Designed and applied custom agent instructions to clearly define the data schema, agent role, tone, language, and rules. - Compared custom agent instructions with built-in Agent and Data Source instructions to understand how different instruction layers influence reasoning and output.