Be smarter with every change
Semantic Link Labs Edition
This provides a quick and automated way to identify where and how specific fields, measures, and tables are used across Power BI reports in all workspaces by analyzing the Report Layer. It also breaks down the details of your models, reports, and dataflows for easy review, giving you an all-in-one Power BI & Fabric governance solution. All of the metadata is written to a Fabric Lakehouse, and updates can be scheduled to run automatically.
This is the Semantic Link Labs Python & Fabric Lakehouse edition. It requires a Fabric workspace. If you're looking for Impact IQ's One-Click, Designed-for-Everyone edition that runs on any computer, provides report, model, and dataflow backups, and leverages the Power BI + Fabric Service and REST API across all workspaces, check out: https://github.com/BeSmarterWithData/ImpactIQ
Have specific Reports and/or Models downloaded you want to analyze? Don't have direct access to the Workspace but have the PBIX? Check out Impact IQ's Local edition here: https://github.com/BeSmarterWithData/ImpactIQ-Local
- Impact Analysis: Fully understand the downstream impact of data model changes, ensuring you don't accidentally break visuals or dashboards, especially when reports connected to a model span multiple workspaces.
- Used and Unused Objects: Identify which tables, columns, and measures are actively used and where. Equally important, see what isn't used and can be safely removed from your model to reduce size and complexity.
- Comprehensive Environment Overview: Gain a clear, detailed view of your entire Power BI environment, including complete breakdowns of your models, reports, and dataflows and their dependencies.
- User-Friendly Output: Results are presented in a Power BI model, making them easy to explore, analyze, and share with your team.
- Workspace Selector → Only want to run this against 1, 2, or 10 workspaces? A popup now lets you choose which workspaces to scan. Select All still runs against everything, and a built-in timer ensures that if no selection is made, the run proceeds against everything.
- Unused Model Objects → Identify model fields/measures not used in any visuals, measures, calculated columns, or relationships.
- Broken Visuals (with Page Links) → See all broken visuals/filters and jump directly to the impacted report page.
- Report-Level Measures Inventory → Surface report-only measures with full DAX and usage details.
- New Report Layouts & Wireframe → See where your visuals sit on the page with a wireframe layout - thanks to @stephbruno for this feature!
1. **Download the notebook**
   - Download `GovernanceNotebook.ipynb` from this repository

2. **Import into your workspace**
   - In your Fabric workspace, click **Import** → **Notebook**
   - Choose **Upload from this computer**
   - Select the downloaded `GovernanceNotebook.ipynb` file
   - Click **Open**

3. **Create a new Lakehouse**
   - Once in the notebook, click **Add lakehouse** on the left panel
   - Select **New lakehouse**
   - Give it a descriptive name (e.g., "PowerBIGovernance")
   - **Important:** Enable the **Lakehouse schemas** checkbox
   - Click **Create**
By default, the notebook comes pre-configured with the following defaults:

- Lakehouse Schema: `dbo` (the default schema)
- Workspaces: `["All"]` (scans all workspaces you have access to)
- Parallel Workers: `5` (number of parallel API calls)

You can modify these settings at the top of the notebook if needed:

```python
LAKEHOUSE_SCHEMA = "dbo"        # Schema name in your Lakehouse
WORKSPACE_NAMES = ["All"]       # ["All"] or ["Workspace1", "Workspace2"]
MAX_PARALLEL_WORKERS = 5        # 1-10 (higher = faster but more API load)
```
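The parallel-workers setting caps how many workspace scans run at once. The notebook's internals aren't reproduced here, but the idea can be sketched with Python's standard `ThreadPoolExecutor`; `scan_workspace` below is a hypothetical placeholder for the real per-workspace metadata pull:

```python
from concurrent.futures import ThreadPoolExecutor

MAX_PARALLEL_WORKERS = 5  # same knob as in the notebook config

def scan_workspace(name):
    # Hypothetical stand-in for the real per-workspace metadata scan,
    # which would call the Power BI / Fabric APIs for this workspace.
    return {"workspace": name, "status": "scanned"}

workspaces = ["Finance", "Sales", "Marketing"]
with ThreadPoolExecutor(max_workers=MAX_PARALLEL_WORKERS) as pool:
    # map() preserves input order while at most 5 scans run concurrently
    results = list(pool.map(scan_workspace, workspaces))
```

Raising the worker count speeds the run up but puts more load on the REST APIs, which is why the config comment suggests staying between 1 and 10.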
1. **Open the template**
   - Download `Semantic Link Power BI Governance Model.pbit` from this repository
   - Open it with Power BI Desktop

2. **Get the Lakehouse SQL Connection String**
   - Go back to your Fabric workspace
   - Find your Lakehouse in the workspace items list
   - Look for the item with type **SQL analytics endpoint** (same name as your Lakehouse)
   - Click on it to open the SQL analytics endpoint
   - At the bottom left, click **Copy SQL connection string**
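The copied connection string can also be used to query the same Lakehouse tables from Python, outside the `.pbit` template. This is an optional sketch, not part of the setup: the endpoint value is a placeholder, the `lakehouse_odbc_string` helper is illustrative, and actually connecting assumes `pyodbc` plus the Microsoft ODBC Driver 18 are installed:

```python
def lakehouse_odbc_string(sql_endpoint, lakehouse_name):
    # Build an ODBC connection string for a Fabric SQL analytics endpoint.
    # Azure AD interactive sign-in mirrors the template's
    # "Organizational account" authentication.
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={sql_endpoint};"
        f"Database={lakehouse_name};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

conn_str = lakehouse_odbc_string(
    "example.datawarehouse.fabric.microsoft.com",  # placeholder endpoint
    "PowerBIGovernance",
)
# import pyodbc; pyodbc.connect(conn_str) would then open the connection
```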
3. **Enter connection parameters**
   - The template will prompt you for:
     - SQL Connection String: Paste the connection string from the previous step
     - Lakehouse Name: Enter the exact name of your Lakehouse (e.g., "PowerBIGovernance")
   - Click **Load**

4. **Authenticate**
   - Choose **Organizational account** authentication
   - Sign in with your Microsoft/Azure credentials
   - Click **Connect**

5. **Wait for data to load**
   - Power BI will load all the metadata from your Lakehouse
   - This may take a few minutes depending on data volume
To automate regular metadata extraction:

1. **Create or open a Data Pipeline**
   - In your workspace, click **New** → **Data pipeline**
   - Give it a name (e.g., "Power BI Governance Extraction")

2. **Add the notebook activity**
   - In the pipeline canvas, search for the **Notebook** activity
   - Drag it onto the canvas
   - Configure the activity:
     - Workspace: Select your workspace
     - Notebook: Select your governance notebook

3. **Schedule the pipeline**
   - Click **Schedule** at the top
   - Enable the schedule
   - Set your desired frequency (e.g., daily at 2 AM)
   - Click **Apply**

4. **Save and run**
   - Click **Run** to test immediately
   - Monitor the run status in the pipeline view
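As an alternative to a pipeline schedule, the notebook can also be triggered programmatically through the Fabric REST API's on-demand job endpoint. A minimal sketch, assuming you already have the workspace and notebook item IDs and a valid Azure AD access token for the Fabric API (token acquisition not shown):

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def notebook_job_url(workspace_id, notebook_id):
    # Fabric Job Scheduler: run an item job on demand, jobType=RunNotebook
    return (
        f"{FABRIC_API}/workspaces/{workspace_id}"
        f"/items/{notebook_id}/jobs/instances?jobType=RunNotebook"
    )

def run_governance_notebook(workspace_id, notebook_id, token):
    req = urllib.request.Request(
        notebook_job_url(workspace_id, notebook_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # expects 202 Accepted
        # The Location header points at the created job instance for polling
        return resp.headers.get("Location")
```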